I have had the privilege, or the curse, of working with metrics (depending on which side of the fence you are on) throughout my career. I have tended to lean towards the latter, mainly because the organizations I worked for often focused on whether a metric was a leading or a lagging indicator, a KPI or a KRI, rather than on whether it was actually a useful metric to monitor in the first place. The underlying problem was that they were looking at risk subjectively and misusing metrics, as in “more patching must equal less risk.” Needless to say, it was rather frustrating.
One of the benefits of moving to FAIR (Factor Analysis of Information Risk), with its objective, risk-based analysis, is that it clarifies which metrics are useful to monitor for understanding the impact of cyber events. I’d recommend you start by watching these three sets of metrics on an ongoing basis to set yourself up for more effective risk analyses.
Frequency of Attacks
One of the key factors in a FAIR analysis is the frequency of events occurring, for instance firewall activity or phishing attempts. Since this factor is usually forecast rather than measured directly, it is a good metric to monitor as time progresses.
If observed frequency departs materially from the original forecast range, it is important to determine why. That investigation can help you answer questions like, “Was an additional control added?” or “Has our position in the marketplace changed in a way that is causing us to be attacked more?”
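A minimal sketch of this kind of monitoring: annualize an observed event count and flag it when it falls outside the forecast range. The forecast bounds and counts below are illustrative assumptions, not figures from any real analysis.

```python
# Hypothetical check of observed event frequency against a FAIR forecast range.
# All numbers below (forecast of 20-60 events/year, 9 events in 30 days) are
# made up for illustration.

def annualized_rate(event_count: int, days_observed: int) -> float:
    """Extrapolate an observed event count to an annual rate."""
    return event_count * 365 / days_observed

def outside_forecast(rate: float, low: float, high: float) -> bool:
    """Flag a material change: the observed rate falls outside the forecast range."""
    return not (low <= rate <= high)

# Forecast: 20-60 phishing attempts per year (assumed calibrated estimate).
rate = annualized_rate(event_count=9, days_observed=30)
print(f"observed ~{rate:.0f}/yr, outside forecast: {outside_forecast(rate, 20, 60)}")
# observed ~110/yr, outside forecast: True
```

A flag here is not an answer in itself; it is the prompt to ask the “why” questions above.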
Susceptibility
Keeping an eye on the various controls that may affect your susceptibility to attack, such as access, authentication, or structural integrity (patching, etc.) controls, is a good starting place. All three could play a significant role in a cyber risk analysis, and a change to any one of them can significantly alter your susceptibility (or Vulnerability percentage, in FAIR terminology; for an explanation, see my video 'Vulnerability' in Risk Analysis, Explained in 2 Minutes).
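One common way to express the FAIR Vulnerability percentage is as the fraction of simulated threat-capability draws that exceed the asset's resistance strength. The sketch below assumes that framing, with a uniform capability distribution and resistance-strength values chosen purely for illustration.

```python
# Illustrative Monte Carlo estimate of Vulnerability: the share of threat
# events strong enough to overcome controls. Distributions and resistance
# values are assumptions, not measurements.
import random

def vulnerability(resistance_strength: float, capability_draws: list) -> float:
    """Fraction of threat-capability draws that exceed resistance strength."""
    exceed = sum(1 for c in capability_draws if c > resistance_strength)
    return exceed / len(capability_draws)

random.seed(0)
draws = [random.uniform(0, 100) for _ in range(10_000)]

# A control change (e.g. better patching) raises resistance strength,
# and Vulnerability drops accordingly.
print(f"Vulnerability at RS=70: {vulnerability(70, draws):.0%}")
print(f"Vulnerability at RS=85: {vulnerability(85, draws):.0%}")
```

This is why tracking those control metrics matters: a shift in any of them moves the Vulnerability percentage that feeds the rest of the analysis.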
Loss Magnitude
Keeping an eye on your changing potential losses (or risk exposure) may seem obvious, but for some organizations it is not. Your potential loss magnitude may have changed if, for instance, your time to respond to an event has gone up, you have outsourced your breach response, new regulatory fines have gone into effect, or you have become more dependent on a single application.
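To make the effect of such a change concrete, here is a rough Monte Carlo sketch of annualized loss exposure as event frequency times per-event loss. The frequency and loss ranges are invented for illustration; the point is only that widening the per-event loss range (say, after a new regulatory fine takes effect) visibly shifts the exposure figures you should be watching.

```python
# Toy annualized loss exposure simulation: events per year x loss per event.
# All ranges are hypothetical examples, not calibrated estimates.
import random

def simulate_annual_loss(freq_low, freq_high, loss_low, loss_high, trials=10_000):
    """Return (median, 90th-percentile) simulated annual loss."""
    totals = []
    for _ in range(trials):
        events = random.randint(freq_low, freq_high)
        totals.append(sum(random.uniform(loss_low, loss_high) for _ in range(events)))
    totals.sort()
    return totals[len(totals) // 2], totals[int(len(totals) * 0.9)]

random.seed(1)
before = simulate_annual_loss(2, 10, 50_000, 250_000)

# New regulatory fines raise the plausible per-event loss ceiling.
random.seed(1)
after = simulate_annual_loss(2, 10, 50_000, 500_000)

print(f"median exposure before: ${before[0]:,.0f}, after: ${after[0]:,.0f}")
```

Rerunning a simple model like this when conditions change is one way to notice that your loss magnitude assumptions have drifted.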
Remember, the most important thing that needs to happen before you can even set up metrics to monitor is to identify the loss events that are keeping leadership up at night (or should be). Clear insight into which factors drive the most loss exposure for the organization will guide you to the most meaningful metrics to follow.