A recent article from the Wall Street Journal, “Cyber Chiefs Calculate Data Breach Costs to Explain Risks to Executives” (subscription required), is a welcome endorsement of cyber risk quantification from the influential business publication.
In under 15 minutes, watch this webinar and learn about the FAIR™ model, the open-source standard for cyber and operational risk quantification, and the FAIR Institute, the international community that’s leading the risk management profession toward business-aligned and cost-effective risk management.
Generally speaking, the major challenge organizations face in adopting FAIR™ is changing to a new way of managing risk. And make no mistake: FAIR requires a different way of thinking about risk and a different way of performing risk management.
Whether the difficulty lies in data gathering, calibrating estimates, or presenting results, problems that come up in FAIR analysis tend to stem from the same source: the lack of a clearly defined risk scenario statement.
The foundation of any good risk analysis is the rationale: the documentation of the thought process driving the ranges you selected for analysis with the FAIR model.
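To make that concrete, here is a minimal sketch of how an analyst might record a range estimate together with its rationale. The structure, field names, and example values are illustrative assumptions, not part of the FAIR standard itself:

```python
from dataclasses import dataclass

@dataclass
class RangeEstimate:
    """One calibrated range for a FAIR factor, with the rationale recorded
    alongside the numbers. All names/values here are hypothetical."""
    factor: str        # e.g. "Loss Event Frequency (events per year)"
    low: float         # lower bound of the calibrated range
    most_likely: float # most likely value within the range
    high: float        # upper bound of the calibrated range
    rationale: str     # the thought process that produced the range

# Example: a hypothetical frequency estimate and its documented reasoning
lef = RangeEstimate(
    factor="Loss Event Frequency (events per year)",
    low=0.1,
    most_likely=0.5,
    high=2.0,
    rationale=(
        "Two comparable incidents observed industry-wide over four years; "
        "no direct hits on our own environment to date."
    ),
)
print(f"{lef.factor}: {lef.low}-{lef.high}, because: {lef.rationale}")
```

Keeping the rationale attached to the range means a reviewer can challenge the reasoning, not just the numbers.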
Microsoft Learn, the software giant’s educational site for developers, architects and IT administrators, now highlights FAIR™ (Factor Analysis of Information Risk) in its tutorials on cloud security.
All summer, we are reading and discussing the FAIR™ book, Measuring and Managing Information Risk by Jack Freund and Jack Jones, the authoritative text on quantitative cyber risk analysis and risk management, with a new discussion guide every two weeks to help FAIR summer book clubs spark conversation.
With the first flight of an American spacecraft carrying NASA astronauts launched from US soil since 2011, a FAIR Institute Member sent us a note pointing out that the NASA Risk Management Handbook shares a lot of the spirit of FAIR™.
It’s a common misconception about quantitative risk analysis that not having “enough” data, or having “bad” data, means bad calibration. That’s not true, in a couple of ways: first, one always has “enough” data to conduct an analysis; second, with calibrated estimation, we’re not dependent on the amount of data we bring to the table.
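The point above can be illustrated with a small simulation: even sparse data, expressed as calibrated ranges rather than point values, supports a useful loss estimate. This is a hedged sketch of a FAIR-style Monte Carlo (annual loss as frequency times magnitude); the triangular distributions and all numeric ranges are illustrative assumptions, not the official FAIR computational model:

```python
import random

def simulate_annual_loss(n_trials=10_000, seed=42):
    """Monte Carlo sketch: draw loss event frequency and loss magnitude
    from calibrated (low, most likely, high) ranges, multiply, and
    summarize the resulting annual loss distribution."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode); ranges are hypothetical
        frequency = rng.triangular(0.1, 2.0, 0.5)            # events/year
        magnitude = rng.triangular(50_000, 500_000, 150_000)  # $ per event
        losses.append(frequency * magnitude)
    losses.sort()
    return {
        "mean": sum(losses) / n_trials,
        "p50": losses[n_trials // 2],
        "p90": losses[int(n_trials * 0.9)],
    }

result = simulate_annual_loss()
```

The output is a distribution (median, 90th percentile, mean) rather than a single number, which is what lets decision-makers see both expected loss and tail exposure from modest inputs.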
DHS/OMB mean well in pushing for a risk-based approach to cybersecurity in the Federal Government, but their requirements fall short of helping agencies effectively prioritize and right-size their cybersecurity investments.