FAIR Institute Blog

Calculating Your Company’s Total Cybersecurity Risk Exposure (Part 1)

[fa icon="calendar"] Apr 21, 2021 10:11:36 AM / by Gideon Knocke

Gideon Knocke

Quantifying individual risk scenarios with quantitative analyses helps you understand your exposure to specific risks. However, building a portfolio of quantified risks to understand and manage a company’s risk landscape comes with additional challenges.

To set up a portfolio, additional factors must be taken into consideration. The risk scenarios should be largely independent of one another to avoid overestimating risk when they are aggregated. A portfolio of risks should also answer how much of a company’s total risk exposure it covers. In other words, if the top 10 risks are aggregated, do they represent 5%, 50% or 95% of the company’s total risk exposure? Answering this question requires an analysis of the company’s total risk exposure.
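The coverage question can be illustrated with a small simulation. In the sketch below, all scenario frequencies, loss ranges and the total-exposure figure are hypothetical placeholders; it aggregates the annualized loss of a few largely independent scenarios and compares the sum against an independently estimated total exposure:

```python
import random

random.seed(7)

def simulate_scenario(freq, loss_min, loss_max, years=10_000):
    """Simulate annualized loss for one scenario: at most one event per
    year (probability `freq`) with a uniform loss magnitude. All
    parameters are illustrative placeholders, not real estimates."""
    totals = []
    for _ in range(years):
        loss = 0.0
        if random.random() < freq:
            loss = random.uniform(loss_min, loss_max)
        totals.append(loss)
    return totals

# Three largely independent top scenarios (hypothetical numbers, USD).
scenarios = [
    (0.10, 1e6, 10e6),    # major data breach
    (0.30, 0.2e6, 2e6),   # ransomware outage
    (0.05, 2e6, 20e6),    # third-party compromise
]

runs = [simulate_scenario(*s) for s in scenarios]
portfolio_ale = sum(sum(r) / len(r) for r in runs)

# Compare against an independently estimated total exposure, e.g. from a
# top-down, insurance- or benchmark-based analysis (placeholder value).
total_exposure_ale = 3.0e6
coverage = portfolio_ale / total_exposure_ale
print(f"Portfolio ALE: ${portfolio_ale:,.0f} -> {coverage:.0%} of total exposure")
```

If the aggregated portfolio covers only a small fraction of the independently derived total, material scenarios are probably missing; if it exceeds the total, scenarios may overlap or be overestimated.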

Another advantage of this analysis is increased confidence. The higher the level of abstraction, the more predictable risk becomes. Comparing your risk portfolio against an analysis of total risk exposure is a welcome cross-check for whether individual risk scenarios have been overestimated.

 

1. A bird’s eye view - choose a generic approach

There are several approaches to calculating a company’s total risk exposure. One simple approach is to take the perspective of a cybersecurity insurer: for the limits and the incidents/assets covered by a policy, the average expected annualized loss should be lower than the premium, assuming the policy is profitable for the insurer.
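As a back-of-the-envelope sketch, this reasoning can be turned into a simple bound. The premium and the loss ratio below are illustrative assumptions, not market figures:

```python
def exposure_bound_from_policy(annual_premium, assumed_loss_ratio=0.6):
    """Rough estimate of the average annualized loss for the risks covered
    by a cyber policy. If the policy is profitable for the insurer,
    expected payouts are roughly premium * loss_ratio; the 60% loss ratio
    here is an illustrative assumption, not an industry figure."""
    return annual_premium * assumed_loss_ratio

# Hypothetical: a $500k annual premium implies roughly $300k expected
# annualized loss for the covered incidents, up to the policy limits.
print(f"${exposure_bound_from_policy(500_000):,.0f}")
```

The estimate only covers what the policy covers, so exclusions and sub-limits narrow its scope.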

A more sophisticated approach to analyzing a company’s risk is to utilize breach databases and other external information that support the estimation of impact and likelihood. In such an analysis, it is important to focus on the primary macroscopic drivers of cybersecurity risk, such as the company’s size and industry. Analyses often put too much weight on the details of the security program or the specifics of the company’s business. This can distort the result, because the reference point of the analysis is companies of similar size and industry, which are likely to have a similar risk exposure.



 


 

2. Utilize and interpret external information

The Cost of a Data Breach Report, published by IBM Security and based on Ponemon Institute research, is a good starting point. According to the 2020 report, the average total cost of a data breach was $3.86M. To adjust this average to the size of a specific company, the number of employees is a good proxy. Based on research by Romanosky, a company of 100,000 employees would expect approximately 22% higher cost than the average given in the breach report.
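In code, the adjustment is simple arithmetic. The 22% uplift is the figure cited above for a 100,000-employee company; factors for other company sizes would have to be read from Romanosky’s research:

```python
AVG_BREACH_COST_2020 = 3.86e6  # average total cost, 2020 Cost of a Data Breach Report

def size_adjusted_cost(avg_cost, size_factor):
    """Scale the benchmark average cost by a company-size factor, where
    size_factor is the fractional uplift (0.22 = +22%) for the company's
    employee count."""
    return avg_cost * (1 + size_factor)

adjusted = size_adjusted_cost(AVG_BREACH_COST_2020, 0.22)
print(f"${adjusted:,.0f}")  # roughly $4.7M for a 100,000-employee company
```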

The research by Romanosky also shows that the more records are affected in a breach, the lower the cost per record. The 2019 edition of the Cost of a Data Breach Report additionally stated that approximately 25,500 records were lost on average across the breaches in the study. Depending on the extent to which a company stores and processes sensitive records, its average number of breached records could differ substantially. Based on Romanosky’s analysis, a breach of just 2,000 records would still be expected to cost approximately half as much as the average breach in the report. This approach is an alternative to using Ponemon’s cost per record as a basis, which would greatly underestimate the impact of small breaches and overestimate the impact of major ones.
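One illustrative way to capture this sublinearity is a power-law cost model fitted to the two data points above: the ~25,500-record average breach and a 2,000-record breach costing roughly half as much. This is a sketch consistent with those two numbers, not Romanosky’s published regression:

```python
import math

N_AVG = 25_500  # average records per breach (2019 Cost of a Data Breach Report)

# Solve cost(n) = base * (n / N_AVG) ** b for the exponent b, using the
# constraint that a 2,000-record breach costs about half the average.
b = math.log(0.5) / math.log(2_000 / N_AVG)  # ~0.27, i.e. strongly sublinear

def breach_cost(records, base_cost=4.7e6):
    """Estimated total breach cost. base_cost is the size-adjusted average
    for the example 100,000-employee company ($3.86M + 22%, per the text)."""
    return base_cost * (records / N_AVG) ** b

for n in (2_000, 25_500, 2_700_000):
    print(f"{n:>9,} records -> ${breach_cost(n):,.0f}")
```

Note how weakly total cost grows with record count: a hundredfold increase in records raises cost by roughly a factor of 3.5 under this fit.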




Lastly, the number of records expected to be lost must be estimated. This can be expressed as a range with a minimum and maximum (a 90% confidence interval) and a most likely number of records breached. A useful data source for this estimate is the VERIS community database. Based on VERIS, a reasonable estimate could be a lower bound of one record, a most likely value of 1,000 records and an upper bound of 2.7M records. With the example numbers given in this article, a breach would range between $200,000 and $20M, with a most likely cost of $4.7M.
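A small Monte Carlo sketch can turn that record-count range into a cost distribution. Sampling on a log scale and the power-law cost exponent are modeling assumptions on my part; the percentiles it reports are illustrative and need not reproduce the point estimates above:

```python
import math
import random

random.seed(42)

# Record-count range from the VERIS-based estimate: minimum 1, most likely
# 1,000, maximum 2.7M records. Because this spans six orders of magnitude,
# we sample the exponent with a triangular distribution on a log10 scale
# (a modeling choice, not prescribed by the source).
LOW, MODE, HIGH = 0.0, 3.0, math.log10(2_700_000)

def sample_records():
    return 10 ** random.triangular(LOW, HIGH, MODE)

def breach_cost(records, base_cost=4.7e6, n_avg=25_500, b=0.27):
    """Sublinear power-law cost model (assumed, calibrated so a
    2,000-record breach costs about half the 25,500-record average)."""
    return base_cost * (records / n_avg) ** b

costs = sorted(breach_cost(sample_records()) for _ in range(50_000))
p5, p50, p95 = (costs[int(len(costs) * q)] for q in (0.05, 0.50, 0.95))
print(f"5th pct: ${p5:,.0f} | median: ${p50:,.0f} | 95th pct: ${p95:,.0f}")
```

The full distribution is more informative than the three point estimates alone, since decisions such as insurance limits hinge on the tail.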

 

3. Be smart about incorporating information

The result can be further adjusted, for example by taking into account cost-modifying factors identified in the data breach report, such as the formation of an incident response team. However, these factors might not be independent. A company that makes extensive use of encryption is more likely to also have a CISO in place, so crediting both factors could overestimate the cost reduction. Various other data sources exist, such as industry-specific reports, loss tables and reports on insurance payouts. It is crucial to understand the limitations of the data used, but the more independent, high-quality data sources are utilized, the higher the accuracy of and confidence in the results.
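A toy comparison shows why double-counting matters. The reduction percentages and the 50% overlap share below are arbitrary illustrative values, not figures from the report:

```python
# Cost-modifying factors as fractional reductions (illustrative values).
# Naively combining all of them assumes independence; if factors overlap
# (e.g. extensive encryption correlates with having a CISO), the combined
# benefit should be discounted.
factors = {"incident response team": 0.10, "encryption": 0.08, "CISO in place": 0.05}

def naive_adjustment(reductions):
    """Treat every factor as independent: multiply the residual fractions."""
    residual = 1.0
    for r in reductions:
        residual *= (1 - r)
    return 1 - residual

def correlated_adjustment(reductions, overlap=0.5):
    """Crude correction: credit the largest factor fully and only a share
    of the rest, to avoid double-counting overlapping controls. The 50%
    overlap share is an arbitrary illustrative assumption."""
    ordered = sorted(reductions, reverse=True)
    return ordered[0] + (1 - overlap) * naive_adjustment(ordered[1:])

naive = naive_adjustment(factors.values())
adjusted = correlated_adjustment(list(factors.values()))
print(f"naive reduction {naive:.1%} vs correlated {adjusted:.1%}")
```

The gap between the two numbers is the amount of cost reduction that would be claimed twice under a naive independence assumption.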

Related: How to Measure Aggregate Risk by Jack Jones

Topics: Member Content

Gideon Knocke

Written by Gideon Knocke

Gideon Knocke is a cybersecurity risk manager at Fresenius, the medical equipment maker and hospital operator, based in Frankfurt, Germany.
