“I don’t get it!” said the CEO as he dropped the 300-page report on the conference table. Something was very wrong.
It was 2010 and my team had just completed a large, enterprise risk assessment for a financial services company. We followed a traditional assessment methodology and delivered a robust report filled with worksheets, diagrams, charts, graphs, and detailed explanations of risk…none of which made a bit of sense to the executive leadership. The CEO threw the report down on the table and dismissed all our work.
Risk management is a complex and highly nuanced aspect of information security. It is also largely inaccessible to executives who are not immersed in the language, philosophy, and theories of risk. Risk management is also the cornerstone of any good security program. Executive leadership needs to understand the risk the business faces if they are ever going to make informed decisions.
We need a new way to communicate risk to executive leadership. Fortunately, there are steps you can take to improve your risk management communication. However, to understand these steps, we must understand the problem with current risk management techniques.
The Challenges of Communicating Risk
After the aforementioned incident, we reflected on the whole risk assessment process. We had numerous conversations with industry peers and clients. We cataloged our notes and identified a number of trends. Business leaders were frustrated with risk management. Common complaints we heard included:
- “Why does it take so long?”
- “I thought we had security controls in place to take care of this stuff?”
- “How do we fix these problems?”
- “What do these risk numbers mean? Are we in danger or not?”
- “This is just busywork to keep the regulators happy.”
When we analyzed how risk assessments are conducted, we identified the following challenges with current risk management techniques.
Challenge 1 – Difficulty in Assigning Value
In a 2012 article in NetworkWorld, Richard Stiennon describes some of the problems with assigning value to IT assets (Stiennon, 2012). Stiennon argues that IT assets have extremely volatile value. Moreover, how do you assign value to something like a single email? The value of any one email can fluctuate dramatically over time and based on its content.
Stiennon goes on to describe how risk assessment efforts often devolve into “protect everything” efforts, which are equally impossible. There are ample examples of organizations that had massive budgets for security controls and risk management, but failed to actually prevent attacks. This suggests that such measures are ultimately ineffective, since they do not adequately manage risk.
Modern IT environments are incredibly complex and volatile. When you consider all the possible IT assets, such as mobile devices, apps, data, and networks, the list of assets is huge, even for a small company. Compounding this problem is the near impossibility of assembling the people in an organization who can properly value IT assets, particularly when such an exercise would be viewed as a waste of time.
The ultimate problem is that traditional risk assessment methodologies are extremely dependent upon reliable valuations. Therefore, we need a new approach that focuses on threats rather than asset valuation.
Challenge 2 – Risk and Security Language is Incomprehensible to Leadership
Language affects not only comprehension, but acceptance. When people do not understand the language of a risk assessment, they are unlikely to accept its conclusions.
Consider this definition for Defined Evaluation Activities from the OCTAVE risk methodology:
Implementing defined evaluation activities helps to institutionalize the evaluation process in the organization, ensuring some level of consistency in the application of the process. It also provides a basis upon which the activities can be tailored to fit the needs of a particular business line or group.
While the concept of defined evaluation activities may be relevant to a risk assessor, this is a foreign language to executives.
The challenge here is that most people (especially executives) are not immersed in the vocabulary of risk management. The terminology of risk is foreign and misunderstood. Most leaders do not understand, for example, that risk is not a problem, but a measurement of the potential of a threat to cause harm.
As such, we need simplified language if we want executives to both understand and accept risk assessment information.
Challenge 3 – Numbers Can Deceive
IT security risk assessments are comprised of complex and often disparate data types. Vulnerabilities can range from trivial to wildly complex. Impact rankings are dynamic and highly variable. Probability values are, at best, guess work since there are limited statistics on breaches and attacks. Moreover, as mentioned earlier, assessing the value of an IT asset is nearly impossible.
IT security risk assessments are ultimately subjective efforts. That is to say, most of the data points that go into a risk assessment are the result of an educated guess from an assessor (or team of assessors).
Probability is a good example of this problem. What is the probability of a single server getting hacked? The factors behind that probability are, to say the least, complex. While there are published statistics, those statistics make a lot of assumptions. Published statistics are also based on reported incidents, which comprise a minuscule fraction of the total number of attacks.
At best, any numerical value assigned to that probability is an informed guess. That guess is also dependent upon the assessors’ skill and experience in information security. Less experienced assessors are likely to overestimate sensational threats and underestimate the seriousness of less interesting threats (Schneier, 2008).
Many risk assessors are fond of using complex equations and metrics in their risk management reports. This gives the assessment the illusion of accuracy and attempts to disguise the subjective nature of the assessment. In fact, risk assessment numbers can be very misleading.
If an IT asset has a risk ranking of 61, what does that mean? Is that good, or bad? Also, if that number was derived from other numbers, which at best were guesses, then the final risk ranking is also a guess.
Numbers skew how people view risk. It makes risk too impersonal and unnecessarily sterile. Risk is a human issue and people relate to risk in a human way. As such, we need a method of communicating risk that does not rely on meaningless numbers and equations. We need to use simplified, plain language to describe risk if we want executives to understand it.
Challenge 4 – Risk Data Gets Stale Quickly
The threat landscape of information technology is volatile. The vulnerabilities, tactics, tools, and motivations of attackers are constantly changing and evolving. Couple this with the rapid pace of change in IT, and you have a target that is not just moving, but moving very quickly.
Unfortunately, current risk assessment practices are extremely time consuming. NIST and OCTAVE estimate two to three months of full-time work for an assessor to complete a comprehensive risk assessment. The time participants must commit to these assessments is onerous as well. Anecdotally, we know of companies that need 10 to 12 months to complete their organizational risk assessment. The complex worksheets and data matrices consume immense amounts of time. Moreover, risk assessment processes can easily devolve into a melee of competing opinions and statements. The process lacks focus and therefore consumes more time.
Any risk data that is over 90 days old is stale. This adds another layer of complexity to communicating risk to leadership. Risk assessors must be able to complete assessments quickly so the intelligence from those risk assessments is timely.
Talk Like an Executive
Communicating risk to executives necessitates an understanding of how leaders view risk as well as how they consume information. While there are many different types of leaders, there are some basic steps you can take to improve your odds of communicating the complexities of risk to leadership.
Use Emotional Words Sparingly
Risk is an emotional thing for everybody. We all rely on our instincts to evaluate risk and determine our tolerances. Unfortunately, this instinct is easily skewed when people do not understand the landscape of the risk. When people do not understand risk, fear takes over and decisions become quick and irrational. Furthermore, emotionalism can lead to an overemphasis of improbable sensational threats, while ignoring more serious (and likely) boring ones (Schneier, 2008).
As such, risk assessments must downplay sensationalism without entirely discounting the inherently instinctual nature of risk evaluation. One strategy for reducing sensationalism is to avoid aggressive, fear-laden words like “war,” “terror,” and “catastrophe” while embracing more positive, reassuring words such as “safe,” “protect,” and “enable.”
Unfortunately, the IT security industry is very fond of promoting and exaggerating sensationalist threats. Recent stories about hacking cars and medical device implants are good examples of threats that are interesting and have terrifying consequences, but not very probable in the real world.
To avoid sensationalism, focus exclusively on probable threats, rather than merely possible ones. Probable threats are those that have a reasonable chance of happening to the organization and causing significant damage.
Deliver Intelligence, Not Data
Risk assessments are ultimately subjective efforts that are full of ambiguity. While most leaders are comfortable with “shades of gray,” they need to be able to see there is a way through all the grayness to something better.
Many risk management efforts fail because they present data rather than intelligence. Intelligence is data that has been rendered down into insight and action. Leaders want (and need) intelligence; they do not want binders full of data. Data gets ignored; intelligence gets attention.
The way to do this is quite simple. Explain what the data says, not what it is. Executives look to security professionals who can tell them what all the data means. Interpret the data and give a definitive assessment of what it means. The point of collecting data is to support intelligence, not replace it.
Communicate in the Now
How you express risk is just as important as what you communicate. Consider these phrases.
- We will have to implement security controls if we want to protect data.
- We should have implemented controls to protect our customer data.
- Data compromise is a serious threat to our business. We must implement security controls to reduce this threat.
Those sentences are, respectively, in future, past, and present tense. Notice how they read. The first sounds optional and a little like a threat. The second sounds like a complaint. The final example is in present tense. It puts the threat first, as the subject. Moreover, the implementation of controls is a response to the threat; the control is not the subject.
Present tense is more definitive. It does not have the attitude and weakness of past and future tense. Those tenses should be reserved for times when you genuinely need to place something in the past or future. Otherwise, all risk should be stated as if it is a risk right this very moment.
Present tense also can focus risk conversations correctly. Security practitioners are fond of making new security controls and technologies the focus of their risk communication, with threats being the reason for implementing a new control. Notice that in the first two examples above, controls are the subject and the threat is the object.
This is the exact opposite of how executives view risk. For executives, the threat is the focus of risk and controls are a way to reduce the threat. Controls are dependent upon threats, not the other way around. Forcing yourself to use present tense will also force you to reorganize your communication to place threat and risk at the center of the discussion, with controls as a dependency.
How to Improve the Risk Conversation
To improve the risk conversation, we must begin with the basics and build a new approach to risk assessment. In this section, we will explore six tangible steps you can take with executives to improve your risk assessment efforts and make risk conversations more effective.
Step 1 – Agree on Six Words
Communicating risk to leadership begins with perhaps the most troublesome challenge: the word “risk.” Ask a room of 20 people to define the word risk and you are likely to get 20 different definitions. Many people conflate risk with threat or vulnerability. This leads to a misunderstanding of the risk assessment process and what risk really is.
Therefore, the first step to communicating risk to executive leadership is to ensure everybody agrees upon the meaning of the following six words.
- Threat: Something bad that could happen.
- Vulnerability: A weakness that could let a threat happen.
- Control: A protection that helps fix vulnerabilities and stop threats from happening.
- Impact: How bad it will be if a threat happens.
- Probability: How likely a threat is to happen.
- Risk: An assessment of a threat based upon the vulnerabilities, controls, impact, and probability associated with it.
Once everybody agrees to these simple definitions, it becomes significantly less difficult to communicate risk. It is important to keep this list to these six words. If you add any more words, it will make people lose interest. Also, notice that the definitions above are simple. This ensures maximum comprehension among a diverse audience of people.
It is vital that risk assessors understand that most people do not care about the nuance of risk management. Complex language and constructs make risk management more confusing and inaccessible. To communicate to leadership effectively, use simple, plain language.
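For teams that track their assessment work in code or spreadsheets, the six-word vocabulary above can be mirrored in a minimal data model. This is an illustrative sketch only; the class names, field names, and low/medium/high values are assumptions, not part of any standard methodology.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Threat:
    """Something bad that could happen."""
    name: str

@dataclass
class Risk:
    """An assessment of a threat, based upon the vulnerabilities,
    controls, impact, and probability associated with it."""
    threat: Threat
    vulnerabilities: List[str] = field(default_factory=list)  # weaknesses that could let the threat happen
    controls: List[str] = field(default_factory=list)         # protections that help stop the threat
    impact: str = "low"        # how bad it will be if the threat happens
    probability: str = "low"   # how likely the threat is to happen

# Hypothetical example record built from the shared vocabulary.
r = Risk(threat=Threat("Malware infection"),
         vulnerabilities=["Unpatched workstations"],
         controls=["Endpoint antivirus"],
         impact="high",
         probability="medium")
print(r.threat.name, r.impact, r.probability)
```

Keeping the model this small mirrors the advice above: six terms, plain meanings, nothing more.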
Step 2 – Establish a Lens
If you want to communicate something complex, it must be broken down into pieces an audience can understand. Risk is a big, complex issue that is difficult to understand even for skilled security practitioners. Risk data demands structures that organize and categorize data for easier comprehension.
A “lens” is a method of categorizing threat data to make it more comprehensible. A lens can be any attribute that defines an environment, such as data, system, or application types. The most common lens we use is data type: the various types of data present in an environment. For most companies, there are only a few types, such as confidential data, regulated data, security data, public data, and so forth.
A lens, such as data type, not only helps organize threats, it also aids the process of analyzing them. A lens forces the definition and explanation of a threat into the context of that lens. For example, the threat of malware infection is no longer some vague possibility; it is a malware infection that could threaten confidential data or regulated data. In the context of a data type lens, the threat of malware becomes easier to evaluate and discuss.
A lens also provides a more efficient way to present risk intelligence to leadership. If leadership is concerned about attacks against confidential data, risk intelligence can be quickly organized to show the threats that are relevant to this data type.
The process of creating and enforcing a lens has a very useful impact on the risk analysis process. It is also difficult to do. Risk assessors must constantly reinforce the lens and continue to frame discussions into the lens. This requires discipline, but the benefits to communicating risk are immeasurable.
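Applying a data-type lens can be sketched as a simple grouping operation. The threat records and data-type names below are invented for illustration; any attribute that defines your environment could serve as the lens key.

```python
from collections import defaultdict

# Hypothetical threat records, each tagged with the data types
# (the lens categories) it could affect.
threats = [
    {"name": "Malware infection", "data_types": ["confidential", "regulated"]},
    {"name": "Lost laptop",       "data_types": ["confidential"]},
    {"name": "Web defacement",    "data_types": ["public"]},
]

def apply_lens(threat_list, key="data_types"):
    """Group threats by lens category so each category can be
    analyzed, discussed, and reported on its own."""
    by_category = defaultdict(list)
    for threat in threat_list:
        for category in threat[key]:
            by_category[category].append(threat["name"])
    return dict(by_category)

grouped = apply_lens(threats)
print(grouped["confidential"])  # → ['Malware infection', 'Lost laptop']
```

If leadership asks about confidential data, the relevant threats are already gathered under that category, which is exactly the reporting benefit described above.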
Step 3 – Express Security Issues in Terms of Threat
Executive leadership always wants to know what could go wrong. The whole point of risk assessment is to deliver risk intelligence. Security practitioners often express security as a problem. Moreover, phrases like “best practices” or “regulatory requirement” can feel like an imperative to an executive and may elicit an automatic rejection.
On the other hand, when security is framed as a response to a threat, it becomes about the organization improving and protecting itself. This is language that is more comforting to executives, who now can understand the security controls in context of what they can do. Consider these two examples:
Undesirable: Our network is insecure. We do not have strong authentication or intrusion detection systems leaving us vulnerable to attacks. Best practices state we must implement these technologies. We are also required to implement them to meet regulatory requirements.
Preferred: An attack and theft of our confidential data is a very tangible threat to the business. Stronger authentication and intrusion detection technologies would reduce the risk of these threats, as well as meeting important regulatory requirements.
Framing the issue in the context of a defined threat to the business makes the implementation of security controls a dependency of a specific threat. The undesirable example begins with a problem and makes regulations and best practices dependencies of the problem.
This is also an example of “flipping the object,” an extremely valuable communication tactic. That is, turn the object of a conversation into the subject. In the preferred example, the subject is a threat and the object is a solution. This is a much more natural expression of risk.
Step 4 – Get Data, Put it in the Backseat
Executives need to know that when they are looking at charts and graphs, there is real, valid data behind that analysis. As previously discussed, risk evaluations are inherently subjective. Using real security data gives weight to those subjective evaluations.
Vulnerability and penetration testing data are ideal for risk assessments. They can provide a snapshot of the technical security of an organization. Configuration analysis is another valuable data point, specifically reviews of firewalls, routers, switches, and system hardening efforts. A review of these technical controls can provide some extremely valuable insight into the overall security of the organization.
Technical data such as this is not perfect. Vulnerability data can be skewed and configuration reviews biased. To avoid these biases, engage a third-party assessor to conduct technical reviews. Penetration testing should always come from an external, unbiased source. However, risk assessors must possess the skills to interpret this data, or engage people who can.
Technical data on its own does not tell the full story, nor is it what executives want. What it provides is a foundation for the intelligence that executives do want. The best technical evidence is therefore in the background, providing support and confidence.
Step 5 – Simplify Impact and Probability
Impact and probability are key components to risk. They also are difficult for non-security people (such as executives) to fully understand. Both of these measurements demand both a scale and context to make them more meaningful.
Scale is especially important to probability assessments. Without a timeframe, virtually anything is probable. Therefore, all probability must be bounded by a specific time period. Ideally, this should be no more than 12 to 24 months. This gives probability a frame of reference that non-security people can understand. It also aids the analysis process, as the assessor can evaluate a threat within a controlled timeframe.
Impact evaluations have a similar problem. Impact is a compound assessment with a variable range. In other words, it is very complex and can quickly confuse executives. First, there are multiple forms of impact: financial, operational, and reputational are the most common. Second, what constitutes a “high” or “low” impact depends on what is being analyzed.
For example, consider the threat of regulatory non-compliance if systems are not configured correctly. The impact in this case is compounded. There is a financial impact; fines could be levied. There is an operational impact; systems could have to be taken offline to be fixed. There is also a reputational impact; other organizations may not want to do business with a non-compliant entity (which also exacerbates the financial impact). Not all of these impact values are the same.
As you can see, impact can quickly become a fantastically complex evaluation, which will just confuse executive leaders. Therefore, it is best to simplify both probability and impact into overall rankings.
When assessing impact, it is okay to consider all the possible types of impact, but these need to be condensed into a single impact statement. Probability works the same way. There may be many levels and complexities to probability, but these too demand simplification.
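The condensation described above can be sketched in a few lines. The low/medium/high scale, the 24-month horizon, and the "take the worst dimension" rule are illustrative assumptions; an assessor might reasonably choose different scales or a weighted combination.

```python
# Ordered severity scale: later entries are more severe.
LEVELS = ["low", "medium", "high"]

def overall_impact(financial, operational, reputational):
    """Condense multiple impact dimensions into a single impact
    statement by taking the most severe one (an illustrative rule)."""
    return max(financial, operational, reputational, key=LEVELS.index)

def bounded_probability(level, horizon_months=24):
    """Express probability only within a fixed timeframe, so that
    'probable' is never open-ended."""
    return f"{level} probability within the next {horizon_months} months"

# Regulatory non-compliance example: moderate fines, severe downtime,
# some reputational harm condense to a single "high" impact.
print(overall_impact("medium", "high", "low"))  # → high
print(bounded_probability("medium"))            # → medium probability within the next 24 months
```

Whatever rule is used, the point is that leadership sees one impact statement and one bounded probability, not the underlying complexity.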
What does need to be explained is what constitutes “high” or “low” impact and probability. The chart below shows a good example of how to do this.
Step 6 – Embrace Simplicity and Brevity in Reporting
Lastly, good risk intelligence needs to be condensed and simplified. Ideally, risk intelligence should be simplified as much as possible without losing too much resolution.
This is another example of why condensation, simplification, and brevity all work in favor of communicating risk. The more complex risk intelligence is, the more likely it will be ignored. People simply lack the ability to comprehend the vast nuance and complexity of risk. Executives, who are not immersed in the daily details of IT security, are not going to read hundreds of pages of risk analysis and worksheets. What they need is risk reduced down to the basic, core components.
The threat, Malware Infection, is very clearly cross-referenced against the vulnerabilities in the organization. However, these vulnerabilities are extremely simplified. They are a distillation of what the assessor discovered about the organization. Moreover, this chart omits a description of the controls in place for the benefit of brevity. Instead, the chart presents recommended remediation steps. These recommendations are written in actionable, present-tense language.
In this summary, risk is categorized into five focus areas (regulatory, legal, and so on). Each category is assigned an overall risk rating, based on the summation of the threats that comprise it. A description then summarizes the risk. Notice that the description does not have all the answers; those are best left for an Action Plan. However, it also does not focus only on problems. It points out areas where there are good controls.
This type of summary is a good way to open a conversation about risk with executive leadership. It is accessible, written in business language, and definitive.
The key to making risk communication work is simplification. Risk is a very complex concept. It is difficult for anybody to understand, let alone executives. The emotional nature of risk can also cloud judgment, which can lead to bad decisions.
Simplicity and brevity cut right to the issue. The shorter and more succinct risk intelligence is, the more likely executives will not only understand it, but accept it and do something about it.
Schneier, B. (2008, January 18). The Psychology of Security. Retrieved February 12, 2014, from Schneier on Security: https://www.schneier.com/essay-155.html
Stiennon, R. (2012, October 16). Why Risk Management Fails in IT. Retrieved February 10, 2014, from NetworkWorld: http://www.networkworld.com/news/tech/2012/101512-risk-management-263379.html
This article was originally published in eForensics Magazine in March 2014.