“Thought leadership” is a term loosely applied to almost any marketing material in the cyber risk/cybersecurity world – and then there’s Jack Jones, who has indisputably been out in front, showing the risk and security professions the way up from performative, “qualitative” risk measurement to quantitative, financially literate risk measurement with Factor Analysis of Information Risk (FAIR™) – and, more recently, with the FAIR Controls Analytics Model (FAIR-CAM™), which moves controls management up from compliance with static frameworks to a dynamic, systems approach.
FAIR-CAM is a model that:
>>Categorizes controls by type and function
>>Sets them in relation to each other, clarifying their interplay
>>Accounts for the direct and indirect effect of controls on risk
>>Assigns units of measurement for control performance, enabling a quantitative approach to reliably analyzing the effectiveness of controls and control systems.
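The structure that list describes — controls typed by function, related to one another, and measured in consistent units — can be loosely sketched in code. This is a hypothetical illustration of those ideas only, not the official FAIR-CAM schema; the function names, the `efficacy` field, and the toy aggregation rule are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from enum import Enum

class ControlFunction(Enum):
    """Illustrative functional categories, loosely echoing FAIR-CAM's framing."""
    LOSS_EVENT = "directly prevents, detects, or responds to loss events"
    VARIANCE_MANAGEMENT = "keeps other controls operating as intended"
    DECISION_SUPPORT = "improves the quality of risk decisions"

@dataclass
class Control:
    name: str
    function: ControlFunction
    efficacy: float                                  # 0.0-1.0: a unit of measured performance
    depends_on: list = field(default_factory=list)   # controls this one relies on

def effective_efficacy(control: Control) -> float:
    """Toy aggregation rule (an assumption, not FAIR-CAM's math): a control's
    direct effect is degraded by weaknesses in the controls it depends on,
    capturing the idea of indirect effects on risk."""
    result = control.efficacy
    for dep in control.depends_on:
        result *= effective_efficacy(dep)
    return result

# Example: a loss event control whose performance depends on a variance
# management control (hypothetical names and numbers).
updates = Control("signature updates", ControlFunction.VARIANCE_MANAGEMENT, 0.7)
antivirus = Control("antivirus", ControlFunction.LOSS_EVENT, 0.9, depends_on=[updates])
print(round(effective_efficacy(antivirus), 2))  # 0.63
```

The point of the sketch is the last line: treating the antivirus control in isolation (0.9) overstates its effect, because the control it depends on drags its effective performance down — the "interplay" and "indirect effect" the list above refers to.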
In 2022, Jack delivered more significant messages to the profession in writing and speeches, in a cautionary but ultimately hopeful tone. Here are a few of Jack’s insights:
On the status of cyber risk analysis work
At too many organizations, risk “analysis” is open to anyone who wants to sit at the table and pick a color: red, yellow, or green. “Risk analysis and measurement should be considered a distinct discipline, just as forensics, penetration testing, DevSecOps and others are,” Jack said.
On easy-button risk management
“Our profession has focused on fast and easy risk measurement without a clear understanding of what good measurement looks like or requires,” Jack said. Without good measurement, advancements in risk analytics like automation or machine learning will simply be faster routes to unreliable results.
The three elements of good risk measurement
“1. A clear scope of what’s being measured — e.g., the asset(s) at risk, the relevant threat(s), the type of event (outage, data compromise, fraud, etc.).
2. An analytic model (e.g., FAIR), which identifies the parameters needed to perform the analysis, and how data are used to generate a result.
3. Data, which can (ideally) be empirical data, or simply subject matter expert estimates.
On the surface, these don’t sound too intimidating, but all three need to be done well to get accurate results. And that isn’t as easy as one might imagine.”
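The three elements can be made concrete with a toy simulation. This is an illustrative sketch only: the scope, the SME estimates, and the use of a triangular distribution (as a simple stand-in for the PERT distributions common in quantitative risk tooling) are all assumptions made for the example, not an endorsed FAIR implementation.

```python
import random

random.seed(7)  # reproducible illustration

# Element 1 - scope: ransomware outage of one customer-facing system (hypothetical).
# Element 2 - model: simplified FAIR-style structure - annual loss is the sum of
#             per-event losses over a simulated year.
# Element 3 - data: subject matter expert estimates as min / most likely / max.

def estimate(low, mode, high):
    """Draw from a triangular distribution built from an SME's three-point estimate."""
    return random.triangular(low, high, mode)

def simulate_annual_loss(trials=10_000):
    losses = []
    for _ in range(trials):
        # Loss Event Frequency: expected events per year, turned into a whole
        # number of events (fractional part realized stochastically).
        lef = estimate(0.1, 0.5, 2.0)
        events = int(lef) + (1 if random.random() < lef % 1 else 0)
        # Loss Magnitude: a fresh draw per event, summed over the year.
        losses.append(sum(estimate(50_000, 200_000, 1_000_000) for _ in range(events)))
    return losses

losses = sorted(simulate_annual_loss())
mean_loss = sum(losses) / len(losses)
p90 = losses[int(0.9 * len(losses))]
```

Each of Jack's three elements is visible and separately criticizable here — which is the point: a vague scope, a wrong model, or careless estimates would each silently corrupt `mean_loss` and `p90` while still producing numbers that look reasonable.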
On automating quantitative cyber risk analysis
“We need to get our act together now. One of the first steps is admitting that we haven’t been doing risk measurement well so far, and if we automate what we’ve been doing, what do we get? Wrong answers faster.
“All risk measurement requires making assumptions. An automated solution simply moves those assumptions and biases from the people doing the risk measurement to the automation solution designers, and if automation builds in wrong assumptions, then risk measurements are almost certain to be wrong.”
On controls modeling
“Why is controls modeling problematic? After all, don’t controls boil down to reducing the frequency or magnitude of loss? Yes, but the devil is in the details. If we don’t understand and account for the mechanisms by which controls affect risk, then the analytic results won’t be accurate, and we won’t be able to make good decisions.”
On defensible risk analysis
“We have to do our homework to ensure that our measurement methods stand up to scrutiny. It is easy to come up with numbers that will look reasonable to the uninitiated, but which can’t be defended. And unfortunately, there is no cost to the people who are measuring risk poorly now. The cost is all borne by the decision-makers and stakeholders who rely on those measurements.”
On open standards like FAIR vs proprietary models for risk analysis
“Open standards for complex things like encryption exist for a reason; that reason being it’s very difficult to get it right, and very easy to get it very wrong. And unfortunately, because cybersecurity risk measurement is complex and nuanced, those who aren’t deeply familiar with all of the different ways it can go wrong won’t be able to evaluate a solution to know whether it can be trusted.”
On the future of the risk analysis profession
“There’s no reason for our profession to feel bad about being immature in its approach to risk measurement. Every profession evolves from lower levels of maturity to higher. There’s only cause for shame if we don’t look at this honestly and take the steps to correct it.
“In fact, it’s an opportunity. How often do people in a profession have an opportunity to make tremendous leaps in how that profession functions? It’s exceedingly rare.
“So, it’s a huge opportunity for us but it’s also a huge responsibility. We have to do our homework. We should embrace that and take it really seriously.”