As auditors, you often get a bad rap. Given that audit is a compliance-focused profession, one of the many aspects of your job is telling someone that the way they do theirs is wrong, which is not a fun conversation for either party. I could write a whole blog post on this topic alone; in fact, I have. Check it out here.
One thing I was always taught growing up, though, is never to present a problem without a solution. With that in mind, I have outlined three tips for making your job more than compliance. All three are made possible by following the FAIR model for consistent, quantifiable analysis of cyber risk.
1. Trust but Verify
Sound familiar? In my audit days, this was something we heard constantly in reference to our clients. Trust that they are performing the procedure/process/mitigation appropriately, but verify it just in case. This is the same mentality we need to have with ourselves and with others on our teams.
One of the things I was guilty of in auditing was never going beyond the check-box. Something was 'High Risk' because somebody said so, and an audit finding was issued because the box was not appropriately checked. But who decided that box needed to be checked anyway? And what might happen if it isn't?
Regardless of what level you are at in your team, follow the motto: Trust but Verify. Start asking ‘how’ and ‘why’ questions.
- Why is that system/process a high risk?
- How might this control deficiency get exploited?
- Why is this control in place to begin with?
- How sensitive is the information this control is related to?
2. Put It into Perspective
The first and most important lesson I learned when I adopted a FAIR mentality toward cyber risk is that there is far more to "risk" than controls (or the lack thereof). To truly understand the amount of risk a control deficiency poses to an organization, you must first understand the purpose the control serves.
One way to do so is to imagine what scenarios could occur if the control did not operate effectively. For example, if a change management process is missing an approval step, an inappropriate change could be pushed into production and cause an outage of the organization's crown jewel application. Remember: focus only on probable scenarios, not all possible ones.
At the other end of the spectrum, an authentication control failure is meaningless to the organization's bottom line if the asset it protects has a low likelihood of being targeted by malicious actors.
Learn more here: How IT Auditors Evaluate the Effectiveness of Controls with Risk Quantification.
3. Rethink Your Vocabulary
If you have ever been trying to communicate an audit finding or risk level to a client and felt like you were talking to a wall, it could be because you are speaking different languages (or they put you in the secret dungeon room in the bottom of their building they reserve for auditors).
The thing is, in the risk management industry, we have a tendency to throw around words that are ultimately meaningless because there are so many competing definitions for them. Take the very first word of the industry: risk. It goes beyond misusing common words, though. We also use words that have no objective definition at all, like "High", "Medium", and "Low".
The Factor Analysis of Information Risk (FAIR) model is a method of understanding and communicating risk in the most universally understood language there is: economic value – dollars and cents.
By using a common vocabulary and communicating amounts using quantities as opposed to subjective categories, we can begin to have meaningful conversations with our clients as well as with one another across the industry.
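To make the idea of "communicating amounts using quantities" concrete, here is a minimal sketch of a FAIR-style calculation: estimate a range for how often a loss event occurs and a range for what each event costs, then simulate annual loss exposure in dollars. The function name, uniform distributions, and scenario ranges below are illustrative assumptions for this post, not FAIR prescriptions; real FAIR analyses typically use calibrated estimates and distributions such as PERT or lognormal.

```python
import random
import statistics

def simulate_annual_loss_exposure(lef_min, lef_max, loss_min, loss_max,
                                  trials=10_000, seed=42):
    """Monte Carlo sketch of a FAIR-style analysis.

    For each trial, sample a loss event frequency (events/year) and a
    per-event loss magnitude ($), then record the simulated annual loss.
    Returns summary statistics of the resulting loss distribution.
    """
    rng = random.Random(seed)
    annual_losses = []
    for _ in range(trials):
        events = rng.uniform(lef_min, lef_max)            # events per year
        loss_per_event = rng.uniform(loss_min, loss_max)  # $ per event
        annual_losses.append(events * loss_per_event)
    annual_losses.sort()
    return {
        "mean": statistics.mean(annual_losses),
        "median": annual_losses[trials // 2],
        "p90": annual_losses[int(trials * 0.9)],  # 90th percentile loss
    }

# Hypothetical scenario: an inappropriate change slips into production.
# Estimated 0.1-2 outages/year, each costing $50k-$500k.
result = simulate_annual_loss_exposure(0.1, 2.0, 50_000, 500_000)
print(f"Median annual loss exposure: ${result['median']:,.0f}")
```

A number like "median annual loss exposure of a few hundred thousand dollars" gives a client something to weigh against the cost of fixing the control, which "High" never does.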
Read this next: What I Learned Leaving Internal Audit for Risk