Ponemon Report on the True Cost of Compliance -- A Missed Opportunity

The Wall Street Journal recently referenced a research report published by the Ponemon Institute entitled The True Cost of Compliance With Data Protection Regulations.  After reading the report, I’ve come to the conclusion that although the research objective was admirable, it completely missed the target.

Some of my concerns should be pretty obvious to anyone who reads the report with a critical eye.  Others are more subtle, but no less concerning.

Why should you care?  Maybe you have no intention of reading the Ponemon report.  Even so, as our profession matures and adopts more rigorous methods for understanding, measuring, and managing risk, we will undoubtedly see more research studies and reports.  Some will be really well done, and others not so much.

This blog post is in some ways less about the Ponemon report itself, and more about how important it is to read these reports closely and critically.  We owe it to ourselves as professionals to set a reasonably high bar for reporting so that we aren't misled and don't have our time wasted.

My first problem with it…

They use the term “compliance activities” as a label for things like:

  • Policy development
  • Threat intelligence
  • Encryption
  • Communications and training
  • etc…

The problem is, those aren’t necessarily compliance activities.  They may be things that compliance frameworks/regulations call for, but to label them as compliance activities seems a bit myopic.  They are security and risk management activities.

Let’s say I’m the CISO of an organization that isn’t subject to data protection regulations.  How do the costs I incur in protecting my organization differ from the costs incurred by organizations that are subject to such regulations?  My program will still have the costs associated with policy development, threat intelligence, encryption, and such (which the report calls "compliance activities").

The only real difference is that I’m not incurring the costs associated with preparing for and facilitating regulatory audits, reconciling findings with the examiners, sometimes deploying controls called for by regulations that aren't cost-effective, and debating with executives whether a regulatory finding that doesn’t represent significant risk should be prioritized at the same level as (or higher than) deficiencies that weren’t within the scope of the exam but that represent more risk.  Those are the compliance activities and costs that data protection regulations impose.

Some readers may say, “Organizations that aren’t regulated don’t follow good security practices, so good practices are, in fact, compliance activities.”  You’re free to make that assumption, but it isn’t universally true.  Any well-run risk management program will include those elements.  

But let's assume for the moment that this is really just semantics.  If, for example, we replaced the word “compliance” with “risk management” throughout the report, would its conclusions hold?  Unfortunately, the problems with this report run deeper than just nomenclature.

More is better?

The “key takeaway” from the report was that “it pays to invest in compliance” and that if organizations spend more on compliance, it will ultimately be less costly than being in non-compliance with data protection guidelines.  Said more simply: the more you spend on security, the less likely you are to experience adverse events.

Okay, but up to what point?  In other words, what about diminishing returns?  The report completely ignores that question, so should we assume an organization should spend all of its resources on compliance activities?  Of course not, but without an answer to that question I really haven’t gained anything from the report.  Anybody could tell you that by spending more on security you should, on average, experience fewer loss events.
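
To make the diminishing-returns point concrete, here's a minimal, purely hypothetical sketch (the decay curve and every number are my own assumptions, not anything from the report): each additional dollar of spend buys less loss reduction than the one before it, so past some point "spend more" stops paying for itself.

```python
# Hypothetical illustration of diminishing returns on security spend.
# The exponential decay curve and all numbers are invented for this example;
# nothing here comes from the Ponemon report.
import math

baseline_ael = 10_000_000   # assumed annualized expected loss with zero spend ($)
decay = 1 / 2_000_000       # assumed rate at which spend reduces expected loss

def expected_loss(spend):
    """Expected annual loss as a (made-up) decreasing function of security spend."""
    return baseline_ael * math.exp(-decay * spend)

for spend in range(0, 7_000_001, 1_000_000):
    residual = expected_loss(spend)
    total = spend + residual          # what the organization is out either way
    print(f"spend=${spend:>9,}  residual loss=${residual:>12,.0f}  total=${total:>12,.0f}")

# Past roughly $3.2M in this made-up example, total cost starts rising again:
# each extra dollar of spend reduces expected loss by less than a dollar.
# "Spend more" is only good advice up to that point, and the report never identifies it.
```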

But perhaps someone will retort, “You only have to do as much as the regulations call for.”  That line of thinking assumes that regulatory requirements are all that is necessary for an organization’s protection and that those requirements align perfectly with executive management’s risk tolerance.  In my experience, neither of those is usually true.

One plus one equals…?

On page 16 the report claims that “More compliance audits reduce the cost of compliance”.  That one sent my head spinning, so let’s look at it to see if it holds up logically.

According to the report (page 24), audit activities fall under the “Compliance Monitoring” category of compliance costs.  Since audit activities incur cost, more auditing should add to the cost of compliance rather than reduce it.  Unless, of course, other compliance activity costs go down to offset the increase in audit costs.  However, when I look at the other activities that contribute to compliance cost, it isn’t clear to me how any of them would decrease as a result of more audits. 

My confusion increased on page 18, where the report claims that compliance costs (you know, security activities) go down with a higher Security Effectiveness Score (or SES, the security posture rating system developed by the Ponemon Institute).  How can this be, given that the security program components that should drive a good SES score all cost money?

In order for me to believe the report’s claim that better security and/or more auditing reduces compliance costs, it needs to explain the cause and effect — i.e., which other compliance costs go down as a result of more audit activity.  Absent that explanation, one plus one is not adding up to two.
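
Here's the arithmetic I'm stuck on, as a tiny hypothetical sketch (the category names and dollar figures are mine, not the report's): if total compliance cost is simply the sum of its category costs, then adding audit activity can only lower the total if some other category shrinks by more than the audit increase.

```python
# Hypothetical sketch: total compliance cost is the sum of its category costs,
# so more auditing can only reduce the total if another category shrinks by more
# than the added audit cost.  All category names and figures are invented.
cost_before = {
    "compliance_monitoring_audits": 400_000,
    "policy_development": 250_000,
    "training_and_communications": 300_000,
    "data_security_controls": 1_500_000,
}

cost_after = dict(cost_before)
cost_after["compliance_monitoring_audits"] += 200_000   # more audits -> more audit cost

print(f"total before: {sum(cost_before.values()):,}")
print(f"total after:  {sum(cost_after.values()):,}")
# Unless some other line item drops by more than $200,000, the total goes up,
# not down, which is why the report's claim needs a causal explanation.
```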

Costs from non-compliance

Any time I see a report or analysis that includes losses measured in financial terms, I immediately become curious about where the dollar figures came from.  I’ve been doing quantitative risk analysis for a long time, and I’ve learned that there are very few people in the cyber security, IT, and compliance professions who are capable of providing accurate loss data.  It’s like trying to find a business professional who can make good estimates of security effectiveness or threat intelligence.

We hammer this point home when we train people in FAIR — always get impact data from the business.  Furthermore, even when speaking with the right people in the organization, the interviewer (at least) needs to be trained in calibrated estimate methods in order to get reliable data.
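
For readers who haven't seen it in practice, here's a minimal sketch of what a calibrated estimate buys you in a FAIR-style analysis: the business expert provides a calibrated 90% confidence range for loss magnitude (and event frequency), and those ranges, rather than a single point guess, feed a Monte Carlo simulation.  The distribution choice and all of the numbers below are illustrative assumptions on my part.

```python
# Minimal sketch of feeding calibrated range estimates into a Monte Carlo simulation.
# The ranges, the lognormal fit, and the frequency figures are illustrative
# assumptions for this example -- not data from the Ponemon report.
import math
import random
import statistics

# Calibrated 90% confidence range for single-event loss magnitude, from the business:
loss_low, loss_high = 50_000, 2_000_000

# Fit a lognormal so that roughly 90% of its mass falls between loss_low and loss_high.
mu = (math.log(loss_low) + math.log(loss_high)) / 2
sigma = (math.log(loss_high) - math.log(loss_low)) / (2 * 1.645)  # 1.645 ~ z for a 90% interval

# Calibrated range for how many loss events occur per year:
freq_low, freq_high = 1, 4

def simulate_annual_loss(trials=10_000):
    totals = []
    for _ in range(trials):
        events = random.randint(freq_low, freq_high)
        totals.append(sum(random.lognormvariate(mu, sigma) for _ in range(events)))
    return totals

losses = simulate_annual_loss()
print(f"median annual loss: ${statistics.median(losses):,.0f}")
print(f"90th percentile:    ${statistics.quantiles(losses, n=10)[-1]:,.0f}")
```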

I don’t know who did the interviews for this project or whether they were calibrated, but nowhere in the list of participating roles did I see any that appeared qualified to provide impact data.  That’s a huge red flag for me. 

A few other things make me question the accuracy of this report from a “cost of non-compliance” perspective:

  • The report (page 3) claims to evaluate the “full costs associated with an organization’s compliance efforts”, but it ignores at least one significant cost component.  What the report failed to consider is that the money it says organizations should throw at compliance activities is money that could alternatively be used to address other business imperatives, such as growth, operational needs, or other forms of risk.  Economists and most of the business executives I’ve worked with would see these missing opportunity costs as a meaningful hole in the analysis.  (Note: The report does claim to cover a specific category of opportunity cost, but it’s different from what I’m talking about.)
  • Without better clarification, I don’t know how they differentiated “Business Disruption” from “Revenue Loss” in their non-compliance cost activity centers.  This question became even more significant when I saw on page 22 the statement that, “… the inability to deliver services and products causing revenues to decline”, which seems to make it very clear that business disruption and lost revenue are related.  My concern is that, without clear differentiation, they double counted losses in the report.  I’ve seen this mistake many times over the years, and am fearful that it happened in this project as well (a small illustration of how that inflates the total appears after this list).
  • I might have been willing to give the report the benefit of the doubt regarding my point above if it weren’t for the next paragraph of the report.  Specifically, the next paragraph states, “Beyond the economic impact, non-compliance increases the risk of losing valuable information assets such as intellectual property, physical property, and customer data.  Further, non-compliant organizations risk becoming victims of cyber fraud, business disruption, and many other dangers that might cause them to fail.”  Unless I’m mistaken, the effects of those events are fundamentally economic in nature, so I’d be very curious to hear a description of what they consider to be “beyond the economic impact”.  Every time I’ve had a conversation with someone who has tried to describe non-economic impacts, they’ve always pointed to reputation damage.  But the report seems to recognize the economic effect of reputation damage on page 25 under “Lost Revenue”, so I’m further at a loss as to what the non-economic impacts they’re referring to might be.
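
To show what that double counting would look like, here's a small hypothetical illustration (the scenario and every figure are invented by me): if “business disruption” is defined to include the revenue lost while services were down, and “revenue loss” is then reported as a separate category, the same dollars get counted twice.

```python
# Hypothetical illustration of double counting; all numbers are invented.
# Suppose an outage loses $3M of revenue and costs $1M in response and recovery.
revenue_lost_during_outage = 3_000_000
response_and_recovery = 1_000_000

true_total = revenue_lost_during_outage + response_and_recovery   # $4M actually lost

# If "business disruption" is defined to include the lost revenue, and
# "revenue loss" is then reported as a separate category, the same $3M
# shows up in both line items:
business_disruption = revenue_lost_during_outage + response_and_recovery
revenue_loss = revenue_lost_during_outage
reported_total = business_disruption + revenue_loss               # $7M reported

print(f"true total:     ${true_total:,}")
print(f"reported total: ${reported_total:,}")   # inflated by the double-counted $3M
```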

The bottom line is that the report either leaves out information that is necessary in order for us to have much confidence in its treatment of non-compliance costs, or it is simply broken and inaccurate in this dimension of its analysis.  Either way, this places a very large question mark in my mind about the report’s findings.

A gap in the “Gap”

Page 12 of the report discusses the cost of non-compliance, and claims the data provides evidence that organizations do not spend enough on compliance activities and that “… if companies spent more on compliance… they would see a more than commensurate reduction in non-compliance cost.”  It then provides a chart (Figure 8) that shows a gap between the cost of compliance and the cost of non-compliance.  A gap between those two cost types, however, isn't the same thing as "more here equals less there".  In fact, nowhere in the report do I see anything that supports their claim.

In order to support their claim, the report would have to show that non-compliance costs are statistically lower for companies that spend more on compliance.  The title for Figure 10 on page 13 seems to claim such a relationship (Non-compliance cost by percentage of the IT budget), but that’s not what the chart shows.  Figure 10 actually shows some cost (I assume Non-compliance cost?) by headcount.

Even if Figure 10 were the chart it claims to be, I’d have to ask myself whether they’ve accounted for the fact that it isn’t necessarily how much you spend that matters, but whether you spend it on the right things (i.e., prioritize well, treat root causes, choose cost-effective solutions, etc.).  I’ve encountered numerous organizations that spend a lot of money on security activities (“compliance activities”) yet still experience a lot of losses that would be classified as non-compliance cost in this report.
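
For what it's worth, here's a minimal sketch of the kind of evidence that would actually support the claim: per-company pairs of compliance spend (say, as a percentage of IT budget) and non-compliance cost, checked for a negative relationship.  The data points below are fabricated purely to show the shape of the test.

```python
# Minimal sketch of the analysis that would support "more compliance spend,
# less non-compliance cost".  The paired data points are fabricated for
# illustration; only real per-company figures would tell us anything.
from statistics import correlation   # Python 3.10+

compliance_spend_pct_it_budget = [2.0, 3.5, 5.0, 6.5, 8.0, 9.5, 11.0]
non_compliance_cost_millions   = [9.4, 8.1, 7.9, 6.2, 6.8, 4.9, 4.1]

r = correlation(compliance_spend_pct_it_budget, non_compliance_cost_millions)
print(f"correlation: {r:.2f}")   # a strongly negative r would support the claim

# Even then, correlation alone wouldn't settle the question: it says nothing
# about whether the money was spent on the right things.
```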

Sloppiness

I include this section of my review not to pick nits per se, but to encourage future research projects to apply reasonable quality controls.  My confidence in a report's overall credibility can't help but be shaken when I see sloppiness.  Don't get me wrong: mistakes happen in reports.  I've made them, and I bet you have too if you've written this kind of report.  So it isn't a matter of whether mistakes exist in a report; it's the type and volume of them.  If there are enough of them, a report's credibility has to become suspect.

Some of the problems I noticed were production-related.  Others were matters of content quality.  For example, there were places in this report where chart descriptions didn't match the charts.  In some cases this was simply a matter of descriptions and charts on a page being switched, leaving the reader to mentally switch them back.

In other places, the report claims to deliver a piece of intelligence or insight that it never actually discusses.  For example, at the top of page 15 it says it views different compliance regulations in terms of importance and difficulty, but it only covers difficulty and never gets around to “importance”.  For that matter, the information on that page seems entirely out of context with the rest of the report; at the very least, it isn’t factored into the analytics or conclusions in any way.

On page 22 the report claims that “…non-compliant organizations risk becoming victims of…”, which logically suggests that compliant organizations don’t risk becoming victims of those events.  Clearly, organizations that do spend significant money on compliance are still subject to losses.  This is probably just an oversight on the authors’ part rather than an actual belief that compliant organizations can’t be victimized, but it is at the very least an inaccurate and misleading statement.

In the Conclusions section the report makes a classic FUD statement: “…might cause the business to fail.”  First of all, nothing in this report discusses catastrophic events of that nature.  Second, any report claiming to provide analytically driven insights has no business including a statement like that unless it's specifically discussed and supported in the analysis.

In the interest of time (yours and mine), I’m not going to describe the other quality problems I saw in this report.  If you take the time to look, I’m sure you’ll find several that I haven’t discussed. 

Wrapping it up

For the reasons above, I’m afraid I can’t take this report or its findings seriously, which is unfortunate given the importance of the topic.  I’m fearful, however, that people both inside and outside our profession will reference the report to drive decisions, either because they haven’t looked at it closely or because its conclusions can be leveraged to advance an agenda (it happens).

In fairness to Ponemon Institute, it isn’t possible to hit a home run every time you step to the plate (at least I know I don’t).  Perhaps they’ll revisit this topic next year and make the necessary adjustments to knock it out of the park.  Fingers crossed.  In the meantime, if you hear/see someone reference the report, please suggest to them that they reread it with a critical eye.

Jack Jones is the creator of the FAIR model for risk quantification.  Join Jack for live, online discussions -- become a FAIR Institute member, then check the schedule for Workgroup Calls.

Related:

Jack Jones Looks Forward Into 2018 for Cyber and Technology Risk

Learn How FAIR Can Help You Make Better Business Decisions
