After watching Prashanthi and Tony’s fireside chat at the 2021 FAIR Conference about getting a FAIR program started, I was struck by the simple and insightful themes that they kept repeating. Well, simple on paper, but not always easy to keep in mind when you’re in the thick of a FAIR rollout.
Fireside Chat - How To Get a FAIR Program Off the Ground
Moderator: Rachel Slabotsky, Sr. Manager, Professional Services, RiskLens
Tony Martin-Vegue, Senior Information Security Risk Engineer, Netflix
Prashanthi Koutha, Senior Risk Engineer, Netflix
FAIR Institute members can watch the video of this FAIRCON21 session in the LINK member community. Not a member yet? Join the FAIR Institute now, then sign up for LINK.
We can all get lost in lofty goals for our risk programs, but Prashanthi put it best when she said that they agreed early on to just be better than yesterday. This incremental approach can take the pressure off your team and avoid the “boil the ocean” syndrome that Tony also warned about. Consider focusing your efforts on developing a program that is designed to:
- Be effective, efficient, and adaptable.
- Meet the expectations of stakeholders (such as clients, partners, investors, or regulators)
- Enable the organization to thrive through calculated risk-taking.
Try to approach this program mission with as few preconceived ideas as possible about what a “mature quantitative risk program” should look like. If you’ve improved one aspect of the organization’s decision-making process, then count that day as a success.
Blog post author Evan Wheeler is VP of Risk Management at Fintech firm NVDR, Inc., and a FAIR Institute Advisory Board Member. He has spoken at several FAIR Conferences, taught risk management at the graduate level at UCLA and other universities, and is the author of Security Risk Management (Elsevier).
4 Questions before Launching Your FAIR Program
This session from Prashanthi and Tony contained many gems … these four questions jumped out to me:
1. What are the stakeholders’ strategic objectives?
Start by identifying the problem or pain points that you’re trying to solve before you go looking for issues. Too often we will take a list of “vulnerabilities” and try to find a relevant scenario to match, but really the process works best in the other direction.
This approach will keep you focused on what is relevant, but it is even better to identify risks to the organization’s strategic objectives. You want to identify the areas that could put the organization’s major goals at risk. Explore what your stakeholders value, and tailor your approach to match.
2. What is the decision?
Most likely your first analysis will be driven by a pending investment, project prioritization, or resource planning conflict. Start by understanding the type of decision that needs to be made, then scope the analysis accordingly, and reflect those drivers in how you report the results. Don’t underestimate the importance of the scoping step and customizing the reporting.
3. Which data points support decisions?
You may have many data points coming out of your analysis … which ones are worth highlighting because they support more informed decision-making? Edit and scrutinize based on the value of each data point you want to include in your reporting.
4. So what?
You found a bunch of gaps or weaknesses (maybe you have a risk register), but you still need to challenge yourself about why an issue matters in the context of the business. Don’t think of it as a risk register or a list of things to fix, but rather as a decision register. It is most effective when used as a forecasting tool to help your leaders make more informed strategic decisions.
4 Steps to Build Your FAIR Program
The next steps of your journey to build out a FAIR program translate the learnings from the four questions into actions.
1. Get out from behind your desk
Any form of risk analysis requires you to immerse yourself in the strategy and operations of your organization to understand the context of the risk scenarios. Sitting at your desk and thinking “big thoughts” won’t get the job done. Think of a professional sport as an analogy – you spend a lot more time on the road playing on other fields than at home. The same is true for risk assessments – observe how the organization makes decisions and look for opportunities to answer a key question.
2. Get to know your stakeholders
How do they consume data? How do they make decisions? How can you customize your process and reporting for them? For example, there is no rule that you can only use a single reporting format for all your stakeholders. Learn more about their style and preferences, and tailor your reports accordingly.
3. Gather insights
Think about comparing scenarios, identifying themes across scenarios, or sharing lessons learned from the analysis process that will better inform your stakeholders. For example, your conclusion might be that you can’t reduce the loss event frequency, but you can contain the impact with a quicker response – that might be a useful insight for your decision-makers.
4. Build a narrative
Don’t just present raw data and loss exceedance curves to your senior leadership. The results of your analysis need to be put into the context of the organization and the type of decision being made. Consider whether you are explicitly calling out how the stakeholders can use your analysis data to make a decision. For example, if you give them percentiles or an average, what does that mean and how should it be used? You may find that your audience will more easily relate to a story than interpreting a table of numbers.
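To make the percentile point concrete, here is a minimal sketch of how a loss exceedance curve and decision-ready percentiles might be derived from a Monte Carlo simulation. All of the inputs (event frequency, loss magnitude, thresholds) are hypothetical, illustrative numbers – not figures from the session or from FAIR guidance:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical, illustrative inputs: annual loss event frequency
# ~ Poisson(0.8 events/year); loss magnitude per event ~ lognormal
# with a median around $100k.
N_YEARS = 20_000  # simulated years
event_counts = rng.poisson(lam=0.8, size=N_YEARS)
annual_loss = np.array([
    rng.lognormal(mean=np.log(100_000), sigma=1.0, size=n).sum()
    for n in event_counts
])

def loss_exceedance(losses, thresholds):
    """P(annual loss >= t) for each threshold t."""
    losses = np.asarray(losses)
    return np.array([(losses >= t).mean() for t in thresholds])

thresholds = [100_000, 500_000, 1_000_000]
probs = loss_exceedance(annual_loss, thresholds)

# Decision-ready summary: percentiles, not just a single average.
p50, p90 = np.percentile(annual_loss, [50, 90])
print(f"P(annual loss >= $1M) = {probs[-1]:.1%}")
print(f"Median annual loss ~ ${p50:,.0f}; 90th percentile ~ ${p90:,.0f}")
```

The point of the summary lines is the narrative step: “there is an X% chance our annual loss exceeds $1M” is something a leader can act on, where a raw table of simulated losses is not.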
If you notice that your risk program isn’t getting traction or you’re struggling to justify your approach, try watching this short video, which highlights the key questions and actions to keep you focused on your FAIR journey.
Prashanthi and Tony mentioned several resources that got them started, such as Douglas Hubbard’s How to Measure Anything in Cybersecurity Risk and the de facto book on FAIR, Jack Jones and Jack Freund’s Measuring and Managing Information Risk. They also recommended experimenting with the FAIR-U tool to get experience with the risk analysis process and joining the FAIR Institute and the Society of Information Risk Analysts (SIRA) to learn from communities of cyber risk quantification (CRQ) professionals. This is just scratching the surface of the available resources but should give you plenty to get your program off the ground.
When you’re looking for data for your analysis (both impact and frequency factors), there are three sources to try:
1. Internal incidents
2. External incidents of organizations with either a similar tech environment or in the same industry
3. Expert estimates – present the internal and external data to your SMEs, so they can make adjustments and give you estimates that are “accurate enough”
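One common way to turn those SME-adjusted estimates into analysis inputs is a min / most-likely / max range sampled from a PERT-style distribution. The sketch below shows the idea; the distribution choice, the `pert_samples` helper, and every number in it are illustrative assumptions, not values from the session:

```python
import numpy as np

def pert_samples(low, mode, high, size, rng, lam=4.0):
    """Sample a (modified) PERT distribution: a smooth way to turn an
    SME's min / most-likely / max estimate into a range of values."""
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + rng.beta(alpha, beta, size) * (high - low)

rng = np.random.default_rng(7)
N = 50_000

# Hypothetical calibrated estimates: internal incidents suggested roughly
# one event every two years, industry data widened the range, and SMEs
# settled on these bounds as "accurate enough".
lef = pert_samples(0.1, 0.5, 2.0, N, rng)            # loss events / year
magnitude = pert_samples(50e3, 150e3, 1e6, N, rng)   # dollars per event

# Simple single-event approximation of annualized loss exposure.
annualized_loss = lef * magnitude
print(f"Median annualized loss ~ ${np.median(annualized_loss):,.0f}")
```

Multiplying frequency by magnitude is a deliberate simplification here; a fuller FAIR analysis would simulate discrete events per year, as the literature referenced above describes.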
Prashanthi points out that the risk team is in a great position to gather and aggregate data from many different sources (such as incident responders or threat intelligence analysts) and present the data back in a risk-aware way.
Most importantly, remember to take it slow. Resist the temptation to implement everything you’ve just read in Hubbard’s How to Measure Anything book all at once. Keep your scope focused on a few key challenges or decisions for your organization. Get some wins under your belt, make adjustments, and keep learning.
Also remember to bring the rest of your organization along for the ride – the risk team doesn’t need to do everything themselves. In fact, you’ll be more effective if you socialize the incremental improvements with your stakeholders as you go. Steer clear of any big bang approaches and instead focus on identifying champions outside your team and going after quick wins.
It wasn’t mentioned in this session, but I see the cyber risk quantification (CRQ) movement getting distracted by vilifying risk matrices, ordinal scales, and color-coded risk ratings. I’m happy to see so many more cyber professionals and business leaders getting educated about the flaws of these common shortcuts, but I also think we’re losing sight of what matters.
Let’s be sure that we understand the challenges our organizations are facing, that we’re asking our leaders how they prefer to receive data for decision-making, and that we’re designing our programs and tools to meet those needs. We don’t need to tear down what exists to justify a CRQ approach – its value will be self-evident if we do our jobs well.