Human Nature in Our FAIR Risk Programs: Work With It, Not Against It

As a graduate student exploring the emerging field of Behavioral Economics, the science and art of judgment and decision-making, I have the unique opportunity to regularly draw on the field's foundational concepts and apply them directly to communicate risk more effectively and help build the FAIR™ risk quantification program at Equinix.

The most important thing I have learned thus far: We’re better off working with human nature than against it.


FAIR Institute member Zach Cossairt is Information Risk Program Manager at Equinix and the winner of the 2021 FAIR Business Innovator Award. Zach is working on his Master of Arts in Behavioral Economics at The Chicago School of Professional Psychology. Read a Meet a Member interview with Zach.


One of this emerging field's pioneering minds, Daniel Kahneman, was awarded the Nobel Prize for his revolutionary work formally establishing the psychological foundations of human decision-making (his research partner Amos Tversky would surely have shared the award had he been alive at the time).

The duo's research is distilled in Kahneman's book Thinking, Fast and Slow, which introduces a language for talking about the human mind and describes how two distinct cognitive modes of thought work together when we make judgments and decisions under risk and uncertainty.

Richard Thaler and Cass Sunstein characterize these dual-system operations well in their own influential book Nudge: The Final Edition.

“It is useful to imagine the workings of the brain as consisting of two components or systems. One is fast and intuitive; the other is slow and reflective.”

My goal in this post is to explain how understanding Kahneman and Tversky's research, along with the work of other brilliant contributors to the field of Behavioral Economics, can prove useful in our roles communicating risk to other humans and driving positive change as we develop and mature a risk program.


Try FAIR risk analysis for yourself with the FAIR Institute’s free FAIR-U application.


The Two Cognitive Systems in Risk Management: Where’s the Handoff?

System 1 thinking is fast, automatic, and attributable to what we consider unconscious thought, or intuition. The operating characteristics of this system are similar to those of perception and cue recognition and are not what most of us would interpret as “thinking” at all.

System 2 thinking is deliberate, self-conscious, and controlled.

Here are a few ways that these two systems work together, and where automatic thinking can benefit from backup by the more reflective System 2:

Managing Overconfidence in Intuitive Expertise

When should we be comfortable trusting a self-confident expert who values the skilled intuition they've developed through past experience? Intuitive skill can absolutely be learned, but only in very specific settings.

Skilled intuition develops in stable environments that offer regularity and the opportunity to learn through prolonged practice and continuous feedback. How does this description of the ideal environment for System 1 to thrive compare to the one in which we are asked to forecast risk?

System 1 needs a little help during risk analysis. We operate in a dynamic, almost zero-validity environment, making long-term forecasts of uncertain future events without the opportunity to learn the valid cues that neater, more regular systems would provide. Threat actors are shifty, control variance is real, and interdependencies make things especially difficult, so we should be respectfully skeptical when our experts express confidence in intuitive judgment during risk analysis.

With FAIR, and risk quantification in general, we harness deliberate thinking, applying rigor through decomposition so we can more directly observe components that are easier to measure. This slows things down and lets System 2 take the reins to test our hypotheses with the necessary reflection.
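To make the decomposition concrete, here is a minimal sketch, not the FAIR standard itself or any Equinix model, of how the model's top level, annualized risk as Loss Event Frequency times Loss Magnitude, can be simulated from calibrated 90% ranges. The lognormal distributions and the example numbers are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
N = 100_000  # Monte Carlo trials

def lognormal_from_90ci(low, high, size):
    """Sample a lognormal whose 5th/95th percentiles match a
    calibrated 90% range -- an illustrative modeling choice."""
    mu = (np.log(low) + np.log(high)) / 2
    sigma = (np.log(high) - np.log(low)) / (2 * 1.645)  # z(0.95) ~ 1.645
    return rng.lognormal(mu, sigma, size)

# Hypothetical calibrated estimates for one risk scenario
lef = lognormal_from_90ci(0.1, 2.0, N)          # loss events per year
lm = lognormal_from_90ci(50_000, 500_000, N)    # USD lost per event

annual_loss = lef * lm  # annualized loss exposure per trial

print(f"Mean annualized loss: ${annual_loss.mean():,.0f}")
print(f"90th percentile:      ${np.percentile(annual_loss, 90):,.0f}")
```

The point is less the particular distributions than the discipline: each input becomes an explicit, reviewable range rather than an unexamined gut call.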

It is our job as risk professionals to fill our analyses with as much objective data as possible, improve subjective judgment through calibration, and maintain a well-documented rationale so we can diplomatically manage the doubts of stakeholders who hold strong faith in their “feeling” that the risk they identified was really high.
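Calibration can also be tracked empirically. The sketch below, with invented data and a hypothetical helper function, scores how often an expert's stated 90% intervals actually contain the realized value, a common exercise in calibration training:

```python
def calibration_hit_rate(estimates, actuals):
    """Fraction of (low, high) 90% intervals containing the actual value.

    A well-calibrated estimator should land near 0.90; much higher
    suggests underconfidence, much lower suggests overconfidence.
    """
    hits = sum(low <= actual <= high
               for (low, high), actual in zip(estimates, actuals))
    return hits / len(actuals)

# Hypothetical example: an expert's 90% ranges vs. observed outcomes
ranges = [(2, 10), (0.5, 3), (100, 400), (1, 5)]
observed = [7, 4, 250, 2]
print(f"Hit rate: {calibration_hit_rate(ranges, observed):.0%}")  # 75%
```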

Making Judgments under Risk and Uncertainty

Kahneman and Tversky's Heuristics and Biases research program showed that we adopt mental shortcuts to simplify judgmental tasks that are often too difficult for our boundedly rational minds to solve. These general-purpose heuristics work by substituting an easier question for the one we were actually asked, and that substitution can introduce systematic biases that make our decisions worse.


Learn FAIR quantitative risk analysis in beginner or advanced courses through the FAIR Institute.


People unconsciously adopt the availability heuristic, assessing the likelihood of risks by asking how readily similar events come to mind. Events that occurred recently, are salient or emotionally charged, or come from personal experience will often be judged more frequent than events with objectively higher probabilities.

System 1 retrieves these instances easily from memory, and System 2, which is lazy about expending attention and effort, will often leave these intuitions unchecked. We should do our best to nudge decision-makers back into the realm of true probability to ensure that resources are aligned to prepare for and respond to the most pertinent risks.

Managing Inertia and Resistance to Change from the Status Quo

People will work harder to avoid losses than to achieve gains. Ever seen or read Moneyball? The protagonist, Billy Beane, famously says, “I hate losing more than I even wanna win,” a line that neatly captures how evolution has set us up with a strong withdrawal system that triggers negativity and recoil before positivity and approach.

This loss aversion comes from one of the oldest parts of the brain, which our ancient ancestors used often and well to avoid major threats and give themselves the best odds of living long enough to reproduce. It's entirely a System 1 operation, it's the most important contribution from the field of Behavioral Economics, and it's everywhere in what we do building risk programs and influencing decision-makers.

The unevenness between our motives to avoid losses and to achieve gains produces inertia and status quo bias, two things we are all too familiar with in the field of quantitative risk management. Every day we deal with humans who are stuck in their ways of using colors and adjectives to communicate risk. This way of doing things is their reference point, and any proposed change is evaluated as a concession, and therefore a loss. People. Hate. Losses.

The reluctance to give up what we have, the “Why fix it if it ain't broke?” approach, is deeply rooted, so we should not be surprised that large reforms such as changing a risk culture often fail. Remember when we talked about developing intuitive skill? Prolonged practice at reacting to bad things more quickly than good ones made System 1 a champion at avoiding losses, which means it takes a powerful effort to frame what we are doing in a way that can overcome inertia and the gravitational pull of the status quo.


Network with, learn from your peers at the forefront of risk management. Become a member of the FAIR Institute!


Applying the Insights of Behavioral Economics to Risk Management

This post has shared only some of the major discoveries from the study of judgment and decision-making, the field of Behavioral Economics. I've found practical application for the discipline's concepts by understanding what we can expect from the humans we interact with as we work toward the success of our risk programs.

We should not expect people to follow us into the risk management promised land simply through trust in our good faith and logical reason. That is something we would expect of Mr. Spock of Star Trek, not of an overconfident human averse to change. It's in our best interest to design our risk programs on sound psychological foundations and to work with human nature, not against it.

Related post: Daniel Kahneman’s Book 'Noise' Sounds the Same Alarms about Muddled Decision-Making as the FAIR Movement

Note: The views in this blog post are solely the insights of the author and do not reflect those of his employer.

Learn How FAIR Can Help You Make Better Business Decisions
