I wear my ‘FAIR™ evangelist’ badge proudly. I have had the opportunity to present quantitative risk analysis to a variety of audiences, including my peers in the IT field, senior leadership in the private and public sectors, and even business and cybersecurity graduate students as a guest lecturer.
Invariably, after every presentation the questions come down to data-sparse environments and short-term practicality. But of course, we should have expected this. COVID has accelerated digital transformation. Business is cyber and cyber is business. Unfortunately, that doesn’t mean the business world has adopted IT principles.
Caleb Juhnke is a contracted senior cyber risk analyst for the United States Department of Agriculture. With over 10 years of experience in Information System Security, Caleb has recently completed FAIR certification and is working to bring FAIR and quantitative analysis to the USDA.
Quite the opposite. Information technology, and cybersecurity in particular, has had to adapt to the business world, and that means providing proof of value: showing the audience that your solution works in their environment. We are past the Golden Age of IT tool acquisition. We have entered a new era, the era of cybersecurity tool exhaustion. Management doesn’t want another sales pitch for the latest silver bullet in risk avoidance. Business leaders and resource managers need solutions now. So, after you present quantitative analysis, be ready to offer proof of value, today.
Have a good answer ready for objections regarding data availability
From experience, do not take the question-and-answer time after the presentation to jump into Bayesian statistics and probability theory. If your presentation was done right, the introduction to Monte Carlo simulations will already have left the audience a little off balance. Jumping back into complicated number crunching should be avoided when possible, especially in an introductory presentation.
As a general rule, when presenting to a first-time audience the magic of quantitative analysis should be wielded as a scalpel, not a hammer. A question about data availability can be a good moment for a quick reference to the proven fields of sparse-data analytics and decision science, but the real value comes from using your understanding of the audience and walking through the FAIR baby steps of data extrapolation: starting with the absurd and testing to 90% confidence.
The best way to do this is to prepare an example that answers a business problem or loss scenario common to, or relevant for, your audience. Most recently I prepared a slide that walked my Midwestern audience through a scenario quantifying the loss associated with a Kansas City area office closing due to inclement weather (snow days).
Using a scenario that was fresh in the minds of the business enabled natural conversation and guided the subsequent data extrapolation. We were able to scope out productivity losses and response costs for snow removal. We also started conversations about secondary losses: reputation and market-share damage relative to regions and competitors not affected by winter weather.
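A back-of-the-envelope version of that snow-day walkthrough can be run in a few lines of Python (or replicated in a spreadsheet). This is a minimal sketch: every range below is a hypothetical placeholder standing in for the 90%-confidence estimates you would elicit from the room, not the figures from the actual session.

```python
import random

random.seed(7)  # reproducible demo

# Hypothetical calibrated 90% ranges elicited from the audience.
SNOW_CLOSURES_PER_YEAR = (1, 6)       # loss event frequency (closures/year)
PRODUCTIVITY_LOSS = (5_000, 40_000)   # $ lost productivity per closure
SNOW_REMOVAL = (500, 3_000)           # $ response cost per closure

def sample(lo, hi):
    """Crude stand-in for a calibrated distribution: uniform between bounds."""
    return random.uniform(lo, hi)

def simulate_year():
    """One Monte Carlo trial: sample closures, then a loss for each closure."""
    closures = round(sample(*SNOW_CLOSURES_PER_YEAR))
    return sum(sample(*PRODUCTIVITY_LOSS) + sample(*SNOW_REMOVAL)
               for _ in range(closures))

trials = sorted(simulate_year() for _ in range(10_000))
print(f"Median annual loss: ${trials[len(trials) // 2]:,.0f}")
print(f"90th percentile:    ${trials[int(len(trials) * 0.9)]:,.0f}")
```

In a real engagement you would swap the uniform sampler for a calibrated distribution (PERT or lognormal are common choices), but even this crude version lets the audience watch their own range estimates turn into a loss exceedance picture.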
By avoiding a one-way lecture on complicated probability theory, the audience stayed engaged and witnessed the power of quantitative analysis in a controlled scenario. There is no one-size-fits-all scenario to answer the question of data availability. The FAIR presenter must do adequate research on the target audience to select universally applicable loss events that provide a good basis for scoping. Be ready to ask impromptu interview questions of audience members about event frequency and loss magnitude. Use your understanding of the FAIR framework to facilitate natural conversation about the loss scenario specific to your audience.
I suspect some of you may object to this suggestion. Some may argue that it’s impossible to explain quantitative analysis without lecturing on probability theory. To those mathematicians and statisticians who balk at the notion that a simple example is best, remember the saying often attributed to Albert Einstein: “If you can’t explain it to a six-year-old, you don’t understand it yourself.”
Be ready to provide specific recommendations to make immediate improvements
Every audience has a realist. It can be a newly minted, hard-charging MBA looking to be heard, or a weathered senior executive trying to cut through the smoke. The FAIR evangelist must be prepared to offer immediately actionable recommendations tailored to the audience’s environment.
I have often found value in retelling the success that Cody Scott, Chief Cyber Risk Officer at NASA, brought to his organization by standardizing terminology in risk discussions. As a first step, I recommend highlighting the value of adopting the FAIR taxonomy and ontology. It is an immediate, no-cost solution that hits the management buzzword of ‘effective communication.’
Secondly, I usually recommend slight but powerful changes to the organizational risk register. Most organizations use a qualitative measurement in their risk register to define frequency. Make the simple recommendation to replace the ambiguous terms ‘low,’ ‘medium,’ and ‘high’ with an annualized frequency (FAIR’s loss event frequency). If an event is only expected, or recorded, to happen once a year, replace ‘low’ with ‘1.’ This actively engages the organization’s risk managers in a form of quantitative analysis and builds a foundation that can be quickly matured. Organizational change is slow, but providing easy-to-implement recommendations starts movement in the right direction.
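That register change is small enough to sketch directly. The scenario names and the qualitative-to-numeric mapping below are hypothetical illustrations (only the ‘low’ → 1 conversion follows the once-a-year example above); in practice the numbers come from interviews with the risk owners, not from a lookup table.

```python
# Hypothetical risk register entries with qualitative frequency labels.
qualitative_register = {
    "Phishing-led credential theft": "high",
    "Office closure (weather)": "medium",
    "Data center flood": "low",
}

# First-pass annualized frequencies (events/year) replacing the labels.
# 'low' -> 1 mirrors the once-a-year example; the rest are placeholders
# to be refined with the organization's risk managers.
FREQUENCY_MAP = {"low": 1.0, "medium": 4.0, "high": 12.0}

quantified_register = {
    scenario: FREQUENCY_MAP[label]
    for scenario, label in qualitative_register.items()
}

for scenario, freq in quantified_register.items():
    print(f"{scenario}: {freq} events/year")
```

Even this crude first pass forces a useful conversation: a risk manager who objects that phishing happens far more than twelve times a year has just started doing quantitative analysis.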
So, what do you do following your FAIR presentation to stakeholders? Be ready to engage in practical examples with immediate proof of value. The initial FAIR presentation is the wrong time to petition for quantitative analysis tools. Be prepared to show how quantitative analysis can be conducted from a spreadsheet. When the opportunity is right, an open discussion can be had about scaling your efforts. As FAIR evangelists, the best thing we can do is make the framework as accessible and approachable as possible; do this by preparing applicable examples of data extrapolation, and by offering recommendations that improve the existing risk management program.