Issue No. 3, May/June 2011 (vol. 13)
Published by the IEEE Computer Society
Bruce Potter, Ponte Technologies
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MITP.2011.39
IT security has evolved dramatically over the last few decades. Initially, it consisted of password management and basic network security controls. However, as enterprises have come to depend on IT for every aspect of daily operations, the need for security has transcended the datacenter and now is woven through all aspects of IT operations.
The product space has become more complicated as organizations continue to expand how users can access and manipulate their data. Network perimeter controls aren't good enough to protect modern enterprises. Data-loss prevention software, intrusion-prevention systems, and multifactor authentication are moving from "nice to have" to "need to have" for many companies.
Unfortunately, the threats against our systems have also steadily increased over the last few decades. The tools at an attacker's disposal have become more sophisticated and effective, even in the face of our increased defenses. Over the last few years, several high-profile intrusions have received media attention. Operation Aurora, the name given to a series of attacks that Google revealed in late 2009, has become the standard playbook for attackers targeting an enterprise.
Rather than focusing on the perimeter system—looking for holes in our firewalls and finding unpatched servers—attackers are now focusing on workstations. Modern adversaries compromise workstations through malicious emails and websites. Then, once inside the network, the attacker can access other workstations, steal credentials, and start exfiltrating valuable information.
Several studies have reviewed the effectiveness of modern IT security defenses. Niels Provos and his colleagues examined the volume and success rate of drive-by Web-based attacks on URLs randomly chosen from the Google database. 1 They found that, on average, 5 percent of the URLs they indexed contained malicious content. Furthermore, even with antivirus software in place, those drive-bys had a 50 percent chance of successfully exploiting the remote workstation.
That's an unfortunately high success rate for nontargeted attacks. Attackers going after a specific organization or target are likely to have an even higher rate of success.
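The cited figures can be combined into a rough per-visit risk estimate. The sketch below is illustrative only: the independence assumption and the visit count are mine, not the study's.

```python
# Back-of-the-envelope estimate based on the figures cited above.
# Assumptions (illustrative, not from the Provos study): visits are
# independent, and the per-URL numbers hold uniformly across browsing.

p_malicious = 0.05  # ~5 percent of indexed URLs served malicious content
p_exploit = 0.50    # ~50 percent exploit success even with antivirus

# Probability that a single random URL visit compromises the workstation.
p_per_visit = p_malicious * p_exploit

def p_compromise(n_visits: int) -> float:
    """Probability of at least one compromise over n independent visits."""
    return 1 - (1 - p_per_visit) ** n_visits

print(f"per-visit risk: {p_per_visit:.1%}")
print(f"risk over 100 visits: {p_compromise(100):.1%}")
```

Even at a modest 2.5 percent per-visit risk, ordinary browsing volume drives the cumulative odds of compromise toward near-certainty, which is why the nontargeted success rate is so troubling.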
Given the sophistication of modern adversaries and the difficulty most enterprises have locking down their workstation environments, we have to assume not only that our systems will suffer from successful exploits but also that they likely already have. We can't successfully defend our systems given our current technology and operational strategies; too many gaps exist in today's IT security landscape.
First, our current detection capability lags behind what enterprises really need. IT security spending has focused mostly on defensive products—products that we presume, if deployed successfully, will stop an attack. The lack of spending in the detection space has caused detection products to fall behind their defensive counterparts. Without stronger detection capabilities, attackers will linger inside networks, continuing to leverage their access for nefarious activities.
Another issue is that most of the defensive products we use are simple evolutions of products we've relied on for years. Firewalls and antivirus tools are built on the same basic technology they used a decade ago; the state of the art in defensive security products hasn't advanced as the threats have changed.
Even with the best products, there's still a knowledge gap to contend with. IT security is one of the fastest-changing aspects of IT operations. Threats change daily, best practices evolve, and even the core technologies used often aren't standardized. Even worse, for those who have gone through security training and education, the curriculum can vary wildly. Two individuals with degrees in the same security-related field can have vastly different knowledge about the same topics. Finding and retaining staff who can keep pace with the constantly changing landscape is difficult. And if your staff can't keep up, they become a liability instead of an asset.
In This Issue
This all requires a new way of thinking about old problems. As computing power has increased and networks have become faster and more sophisticated, federated systems offer new opportunities for security services. In "A Mutualistic Security Service Model: Supporting Large-Scale Virtualized Environments," John Quan, Kara Nance, and Brian Hay present a plan to provide security services to huge, diverse installations of systems. Using virtual machine introspection (VMI), a security service can monitor virtualized systems without threat of subversion from malware in the guest virtual machines. VMI offers exciting new security capabilities, and the model the authors present leverages those capabilities to help entice organizations to federate systems to provide greater computational power for less cost.
The next article focuses on how to better share knowledge in the ever-changing IT security landscape. In "A Community Knowledge Base for IT Security," Stefan Fenz, Simon Parkin, and Aad van Moorsel propose a project to formalize knowledge in the IT security community. However, rather than focusing on the low-level technical knowledge, this team proposes a more holistic approach designed to benefit IT security managers. The team has carefully considered how to capture and represent security and operational information in a model aimed at helping IT managers make better decisions faster and with more confidence.
Finally, the last article explores an area where the functional capability of a technology far outpaces our ability to understand and secure it. Web services offer many benefits, chief among them the ability to rapidly develop complex software systems that meet business needs. Unfortunately, systems built on Web services can be difficult to secure because they span many standards and administrative domains and combine a wide variety of functionality from different sources. In "Forensic Web Services Framework," Murat Gunestas, Murad Mehmet, Duminda Wijesekera, and Anoop Singhal propose a set of services designed to create a valid forensic trail for complex Web service systems. The proposed service will help bridge the gap between contemporary forensic activities, which are usually targeted at the host level, and the need for forensic evidence in real-world service-oriented-architecture systems.
IT security is a moving target. The one constant is that attackers will change tactics based on our evolving defenses. What was a valid defense a few years ago can quickly become outdated and a liability. It's important for us as IT professionals to constantly challenge our assumptions about how we're securing our systems and the threats we're defending against. If we keep an open mind and push forward new technologies and operational strategies, we'll start making positive strides in securing our enterprises.
Selected CS articles and columns are available for free at http://ComputingNow.computer.org.
Many thanks to Rick Kuhn, co-guest editor of the issue. Kuhn is a computer scientist at the US National Institute of Standards and Technology (email@example.com).
Bruce Potter is founder of The Shmoo Group of security, crypto, and privacy professionals. He helps organize the yearly ShmooCon security conference held each winter in Washington, DC. He's also the co-founder of Ponte Technologies, a company specializing in wireless security, IT security operations, and advanced network defense techniques. Contact him at firstname.lastname@example.org.