One of the amazing – and at times, also challenging – things about extremely smart people is that they routinely pack a ton of wisdom and insight into just two or three sentences. The rest of us can receive it, and even think we understand it, but not necessarily to its full depth of meaning.
Such is the case with RSA President Amit Yoran’s predictions for 2016, which include the following thoughts about the increasing risk in our collective industrial control systems (ICS) security:
“Intrusions into systems that control operations in the chemical, electrical, water, and transport sectors have increased 17-fold over the last three years. The advent of connected and automated sensors aggressively exacerbates these issues. The growth in the use of cyber technology for terrorism, hacktivists, and other actors, combined with the weakness of ICS security generally, combined with the potential impact of bringing down a power facility or water treatment plant (hello, California), makes the critical breach of an ICS in 2016 extremely concerning and increasingly likely.”
Interestingly, he doesn’t actually use the word “risk” at all in making this particular point – but his prediction does include statements about both likelihood and impact, which are the two essential elements in any proper discussion about risk.
To more fully appreciate what Amit’s prediction is saying, let’s unpack it and re-frame it in the context of the generic risk model described in the National Institute of Standards and Technology (NIST) Guide for Conducting Risk Assessments (SP 800-30, Revision 1), as shown in the following diagram:
Qualitatively, Amit’s prediction speaks to all of the elements of framing the risk in ICS security:
- Threat Sources and Events: “Growth in the use of cyber technology for terrorism, hacktivists, and other actors”
- Exploits, with Likelihood of Success: “Intrusions into systems that control operations in the chemical, electrical, water, and transport sectors have increased 17-fold over the last three years”
- Vulnerability, in the Context of Predisposing Conditions: “The advent of connected and automated sensors aggressively exacerbates these issues”
- Security Controls, with Effectiveness: “The weakness of ICS security generally”
- Adverse Effect: “The potential impact of bringing down a power facility or water treatment plant (hello, California)”
- Risk, as a Combination of Impact and Likelihood: “The critical breach of an ICS in 2016 [is] extremely concerning and increasingly likely”
In less than 100 words, Amit has effectively framed the risk to our industrial control systems, using all the elements of this generally accepted risk model. It’s a clear sign of deep knowledge, extensive experience, and disciplined use of correct language and frameworks.
It’s also an example of how the rest of us should aspire to communicate properly about risk – as opposed to, for example, the natural tendencies of highly technical security practitioners to focus on the low-level details of threats, vulnerabilities, exploits, and controls.
Quantitative, it’s not. NIST SP 800-30R1 actually goes into great detail about how to transform this kind of qualitative risk assessment into a “semi-quantitative” measure of risk – e.g., one based on a scale of 0 to 100, or 0 to 10. Many security practitioners love the idea of coloring risks green, yellow, or red – primarily because the business decision-makers they are trying to advise seem to “get” the simplicity of communicating risks that way.
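As a rough illustration, a semi-quantitative score on that 0-to-100 scale can be mapped to color bands with a few lines of Python. The specific thresholds below are assumptions for the sketch, not values prescribed by SP 800-30R1:

```python
def risk_band(score: int) -> str:
    """Map a semi-quantitative risk score (0-100) to a color band.

    The 0-100 scale follows the style described in NIST SP 800-30R1;
    the threshold values below are illustrative assumptions only.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 80:
        return "red"      # high / very high risk
    if score >= 50:
        return "yellow"   # moderate risk
    return "green"        # low / very low risk

print(risk_band(72))  # a "72" lands in the yellow band on this sketch's scale
```

The simplicity is the whole appeal – and, as the next paragraphs suggest, also the hazard: the thresholds are judgment calls hidden behind a single color.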
That said, not everyone agrees that describing industrial control systems security risks as “72” or “yellow” really helps the owners of those risks make better-informed business decisions – nodding heads and a lack of questions do not necessarily indicate detailed understanding, as this breakdown of one of Amit’s predictions helps to illustrate.
In fact, some would say that these semi-quantitative approaches can help business decision-makers make bad decisions about security risks, even faster. In an ideal world, these risks would be described in terms of likelihood and impact, with full accounting for the inherent uncertainties – i.e., “there’s a 20% likelihood of a critical breach of an industrial control system in 2016 that results in more than $X billion in damage to the California economy.” But that’s a different soapbox, for a different time.
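That quantitative framing boils down to a back-of-the-envelope expected-loss calculation. The likelihood and impact figures below are illustrative placeholders, not actual estimates for any real system:

```python
# Annualized expected loss as likelihood x impact.
# Both inputs are hypothetical placeholders for the sake of the sketch.
likelihood = 0.20           # assumed 20% chance of a critical ICS breach this year
impact_usd = 2_000_000_000  # assumed economic damage if the breach occurs

expected_loss = likelihood * impact_usd
print(f"Annualized expected loss: ${expected_loss:,.0f}")
```

Even this simple product carries more decision-relevant information than a color band, because the uncertainty in each input can be stated and challenged separately.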
In the meantime, a more practical consideration is what steps organizations can take, right now, to manage the risk of critical breaches to industrial control systems to a more acceptable level. Resources such as this good practice guide for managing third-party risks to industrial control systems provide some excellent suggestions.
Here are three high-level takeaways, which should sound familiar, as they are consistent with the recommended approach to virtually every other aspect of information security:
- Understand what third-party relationships you have in your environment: Identify all of the vendors, service providers, subcontractors, support staff, and other third parties associated with the ICS value chain.
- Embed expectations about security and risk into all these relationships: Deal with issues of security, privacy, compliance, and risk throughout the lifecycle of working with third parties.
- Ensure that risks throughout the value chain are being managed on an ongoing basis: Even if expectations are clearly established up front, managing third-party security risks means ongoing validation to ensure that they are being maintained.
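As a minimal sketch of how these three steps might be tracked in practice – the party names, fields, and 90-day review interval are all assumptions for illustration:

```python
from datetime import date, timedelta

# Step 1: an inventory of third parties in the ICS value chain.
# Step 2: whether security/risk expectations have been formalized with each.
# Step 3: a check for relationships overdue for ongoing validation.
third_parties = [
    {"name": "SCADA vendor",    "expectations_signed": True,  "last_review": date(2016, 1, 10)},
    {"name": "HVAC contractor", "expectations_signed": False, "last_review": date(2015, 6, 1)},
]

REVIEW_INTERVAL = timedelta(days=90)  # assumed review cadence

def overdue_reviews(parties, today):
    """Return third parties missing signed expectations or an up-to-date review."""
    return [
        p["name"]
        for p in parties
        if not p["expectations_signed"] or today - p["last_review"] > REVIEW_INTERVAL
    ]

print(overdue_reviews(third_parties, date(2016, 3, 1)))
```

The point of even a toy register like this is the third takeaway: expectations set up front are only useful if something periodically checks that they are still being met.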