4 Rules for a Successful Quantitative Cyber Risk Analysis

One common objection to quantitative risk analysis is that it is harder or less efficient than its qualitative counterpart. While it is true that a quantitative analysis will always be more rigorous than the wet-finger-in-the-air approach, what I have found in becoming a quantitative risk analysis expert and training others for RiskLens is that these notions of difficulty or inefficiency often come from not following best practices.

In order to help you avoid these pitfalls, I have compiled four “rules” that are vital to any successful quantitative cyber risk analysis with FAIR™.

1.  Stick to the scope

A scenario scope defines the specific event (the relevant asset, threat, and effect) and sets the guardrails for the analysis.

We all know that it is common for any project or deliverable to experience scope creep, creating delay, confusion, and the possibility of straying from the original True North of the project. FAIR addresses the risk of scope creep in the first step of the analysis by identifying the "Risk Scenario Statement," which includes all of the scope elements outlined above as well as an optional method or vector.

[Figure: Risk Scenario Elements]

As a best practice, keep the scoped Risk Scenario Statement consistently visible to all participants throughout the analysis: write it on a whiteboard during in-person data gathering sessions, or attach it to every email sent out asking for data. Having the scoped Risk Scenario Statement spelled out creates context for participants and ensures that you stick to the True North of your analysis.
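If you track scenarios in scripts or a shared notebook rather than a dedicated tool, here is a minimal sketch of one way to capture the scoped elements so the statement renders the same way in every email and working session. The class name, field names, and example values are illustrative only, not part of the FAIR standard.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class RiskScenarioStatement:
    """One scoped FAIR scenario: asset, threat, effect, and optional method."""
    asset: str                    # what is at risk
    threat: str                   # who or what acts against the asset
    effect: str                   # confidentiality, integrity, or availability impact
    method: Optional[str] = None  # optional vector, e.g. phishing or stolen credentials

    def statement(self) -> str:
        """Render the one-line scope statement to pin at the top of every
        data-gathering email or whiteboard session."""
        via = f" via {self.method}" if self.method else ""
        return (f"Risk of a {self.effect} event affecting the {self.asset} "
                f"caused by {self.threat}{via}")

# Illustrative example: the database scenario discussed later in this post
scope = RiskScenarioStatement(
    asset="customer database",
    threat="an external malicious actor",
    effect="confidentiality",
    method="compromised credentials",
)
print(scope.statement())
```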

2.   Accuracy is king

It is human nature to want to provide a precise answer when asked a question. While this may be a positive trait, it can also be debilitating if not kept in check. For a simple question such as "How many new subscriptions did we receive last month?", a quick query will provide the answer. But "How vulnerable is our database (asset) to an external actor attempting to infiltrate it?" is a little trickier to answer.

When estimating forward-looking values, there will always be uncertainty, and this question carries a number of unknowns: how capable the threat actor is, what controls are in place around the database, how those controls are configured, and so on. Giving a range of probable outcomes, such as vulnerability falling between 25% and 75%, accurately accounts for those unknowns without going down the rabbit hole of chasing an exact single value. Analysts could spend days or weeks, and experience analysis paralysis, trying to reach a single value that may not even be obtainable and that in the end would more than likely be wrong.

The best practice is to use ranges to account for uncertainty (and keep analysts from pulling their hair out). How to Measure Anything in Cybersecurity Risk by Douglas Hubbard is a great resource for learning how to measure with accuracy vs. precision in quantitative risk analysis.
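To see why a range is workable rather than a cop-out, here is a minimal Monte Carlo sketch in Python. The uniform distributions and dollar figures are invented for illustration (FAIR tooling typically uses calibrated PERT-style estimates rather than uniform ranges), but the idea holds: defensible ranges in produce a defensible range of annualized loss exposure out.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
trials = 10_000

# Illustrative ranges only; real analyses use calibrated estimates.
vulnerability = rng.uniform(0.25, 0.75, trials)        # the 25%-75% range from above
threat_event_freq = rng.uniform(2, 12, trials)          # attempted attacks per year
loss_magnitude = rng.uniform(50_000, 400_000, trials)   # cost per loss event (USD)

# FAIR structure: loss event frequency = threat event frequency x vulnerability,
# and annualized loss = loss event frequency x loss magnitude.
loss_event_freq = threat_event_freq * vulnerability
annual_loss = loss_event_freq * loss_magnitude

print(f"10th percentile: ${np.percentile(annual_loss, 10):,.0f}")
print(f"Median:          ${np.percentile(annual_loss, 50):,.0f}")
print(f"90th percentile: ${np.percentile(annual_loss, 90):,.0f}")
```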

3.   Never leave a data gathering session without an estimate

Time and time again I have seen a common cause of delayed or halted analyses: the analyst lets a data gathering meeting end without determining a range estimate. The goal of any data gathering session is to derive the range for the part of the FAIR model the session is focused on, whether that is vulnerability, response cost, or another input.

Leaving with a final range of values may not be possible after one hour-long meeting with subject matter experts; considerations may arise that need to be fleshed out with further digging. However, it is imperative that the analyst does not let the meeting conclude without a baseline range of probable values. This range may be wide and need refinement, but laying out a baseline range with the group ensures that progress is being made on the analysis and creates a sense of ownership for those in the meeting.

Participants who leave a meeting with an outcome their names are attached to as subject matter experts naturally become more invested and take ownership of any data used, rather than feeling no urgency to provide data to the analyst after the meeting is over.

4.   Clear and concise documentation

Miscommunication is prevalent in our everyday lives, but it does not have to take over our quantitative analyses. One of the benefits of quantitative risk analysis is the rigor and transparency it provides, and that transparency hinges on effective documentation and communication of the estimates provided, the controls considered, the assumptions made, and any supporting material. While communication and documentation seem like simple things, they are two of the hardest aspects of any project, whether you are working with others or just by yourself.

For a risk analysis to be meaningful, it needs to be logical and give the audience confidence in the accuracy and precision of the estimates involved. If you or your team are not conscientious about creating documentation, items will slip through the cracks. In every analysis I perform, I fully use the rationale boxes below every single input, thoroughly stating the who, what, why, and how of each figure, including any equations, raw data pulled in, and citations of the reports or subject matter experts I consulted. Concise, direct documentation clears up confusion and keeps everyone aligned on your analysis.
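If your analysis lives in spreadsheets or scripts rather than a platform with built-in rationale boxes, here is a simple sketch of one way to force that who/what/why/how discipline onto every input. The structure and the example content are illustrative only, not a prescribed RiskLens or FAIR format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DocumentedEstimate:
    """An input range plus the rationale that makes it defensible later."""
    name: str    # which FAIR input, e.g. "Vulnerability"
    low: float
    high: float
    who: str     # subject matter experts consulted
    what: str    # what the range represents
    why: str     # reasoning and assumptions behind the bounds
    how: str     # how the figures were derived (queries, reports, calibration)
    sources: List[str] = field(default_factory=list)  # reports, tickets, data pulls

# Illustrative entry for the vulnerability range used earlier in this post
vuln = DocumentedEstimate(
    name="Vulnerability",
    low=0.25, high=0.75,
    who="Database administration lead; security engineering SME",
    what="Probability an external actor's attempt results in a loss event",
    why="MFA and segmentation in place, but patching cadence is inconsistent",
    how="Calibrated estimation session with the DBA team; pen-test findings reviewed",
    sources=["Most recent penetration test report", "Patch compliance dashboard export"],
)
```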

Related: 

How a Risk Analysis Scope Gets Off Track (and How to Fix It)

Cure Your Risk Analysis Paralysis: Balance Accuracy and Precision

Secrets to Gathering Good Data for a Risk Analysis
