A while back I wrote a post called The Dangers of Being a Cubicle Risk Analyst. The premise was that a good risk analyst cannot gather, from within their own four walls, all of the information necessary to run a sound and defensible risk analysis. A good risk analyst ventures out to gather both loss event frequency and loss magnitude data from those in the know throughout the organization.
What I’ve learned since that original post is that risk analysts are comfortable leaving their cubicles to gather frequency and vulnerability information, since that typically takes them just down the hall to colleagues in their own business unit. But many feel real trepidation, or run into real internal resistance, when it comes to gathering loss magnitude data.
As I understand it, the fear is that the SMEs who know the value of a customer, what our churn rate is, how many people take us up on our credit monitoring, how many people would be involved in potential litigation, and so on, would take too long to track down and/or wouldn’t give us the time of day.
First, I’m still encouraging analysts to pull up their socks and get out of their cubicles. If you need some practical hand-holding, read this post from FAIR consultant Tyanna Smith: Secrets to Gathering Good Data for a Risk Analysis. At least get out and see the folks in the Privacy Office, who collect data from all over the organization, as another FAIR specialist, Isaiah McGowan, recommends: Smart Risk Assessment Starts Here.
But in my experience working with analysts who do run into resistance, these are the three other ways they gather loss magnitude data, some of which I am a bigger fan of than others.
Existing Organizational Work
The first way is by leveraging existing pieces of organizational work. What most readily comes to mind are Business Impact Analyses (BIAs), Disaster Recovery Plans, Crisis Management Plans and work conducted by Enterprise Risk Management. Of all of the approaches, this is the one I can get behind the most. It makes complete sense to leverage existing work so that we don’t have to reinvent the wheel.
As someone who’s conducted this type of work in another life, my concern with taking this approach, and blindly accepting the information contained therein, is the amount of rigor that’s gone into the various work products. If your organization has a rigorous Business Continuity and Enterprise Risk Management function, that’s fantastic! In my experience, though, their work more often lacks a rigorous process and is far more subjective, and most likely qualitative, in nature.
Industry Data
The next approach I’ve experienced is leveraging, where they can, pieces of industry data. What most readily comes to mind are the reports by the Big 4, the Ponemon Institute and others.
Now, it’s important to keep in mind that not all industry data is created equal. Some reports are based on sound and repeatable processes and extensive data, while others are rather questionable, and cause wincing when discussed out loud.
The intention of this post is not to discuss whose data is reputable and whose isn’t, but to caution against trusting data without understanding how it was gathered and whether it is sound to use for your industry and organization. Keep in mind that although the forms of loss are consistent (Response, Productivity, Replacement, etc.), how that loss will materialize for your organization depends upon your industry and your organization’s unique characteristics.
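To make the point concrete: whether loss magnitude estimates come from industry reports or from SMEs, FAIR-style quantitative analyses typically express them as calibrated ranges per form of loss and run them through Monte Carlo simulation rather than using single point values. Below is a minimal sketch of that idea; the dollar figures are entirely hypothetical, and the modified-PERT sampling via a scaled Beta distribution is one common convention, not a prescribed FAIR mechanic.

```python
import random

def sample_pert(low, mode, high, lam=4.0):
    """Sample a modified-PERT distribution (scaled Beta) from a
    minimum / most-likely / maximum estimate."""
    if high == low:
        return low
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + random.betavariate(alpha, beta) * (high - low)

# Hypothetical per-form loss magnitude estimates (min, most likely, max), USD.
loss_forms = {
    "response":     (50_000, 120_000, 400_000),
    "productivity": (10_000,  40_000, 150_000),
    "replacement":  ( 5_000,  15_000,  60_000),
}

random.seed(42)
totals = sorted(
    sum(sample_pert(*est) for est in loss_forms.values())
    for _ in range(10_000)
)
p10, p50, p90 = (totals[int(len(totals) * q)] for q in (0.10, 0.50, 0.90))
print(f"10th pct: ${p10:,.0f}  median: ${p50:,.0f}  90th pct: ${p90:,.0f}")
```

The point of the sketch is that ranges, not point guesses, carry the uncertainty through to the result, which is exactly why the quality of the underlying estimates matters so much.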
Risk Analyst Assumptions
The final approach, and the one I believe stands on the shakiest ground, especially as it relates to new quantitative risk analysts, is using their own assumptions as magnitude inputs. For seasoned pros who have completed a series of analyses and had considerable conversations with magnitude SMEs, I can see their inputs serving as a decent proxy. For new risk analysts, though, their input is rarely more than shooting from the hip. Taking this approach is the epitome of a cubicle risk analyst.
I understand the trepidation and the resistance to gathering this data.
The approach is new, and the fact that you’ll have to reach out to colleagues you’ve never talked to before can be daunting. Keep in mind, though, that the goal is not to boil the ocean when it comes to gathering this data. Focus on the data points that are most impactful or important for upcoming analysis work. Maybe reach out to SMEs with whom you have an existing relationship, or who have data that is considered low-hanging fruit.
I think it’s also important to keep in mind why your organization decided to implement FAIR and take a quantitative approach to analyzing its risks. Often it’s to move away from a more subjective and less rigorous approach, out of a desire to speak a common language about risk and produce much more defensible and reliable results.
Ultimately, this is not achieved by quarantining yourself to your cubicle, or even your department. This is achieved by venturing out and gathering data from throughout your organization.
Sign up for FAIR Training and elevate your risk analysis game.