In my past life as a CISO, I was often asked what percentage of my employer’s IT budget went toward security. I always answered, "Why should I care?" Without fail, this apparently nonchalant answer evoked a response: "Well, if you don't know that, how can you determine whether your organization is spending enough on security?"
They invariably asked me to explain myself. After all, what self-respecting CISO doesn't benchmark against their peers? Don’t get me wrong, I completely understand the desire to check yourself against your peers, and in some circumstances it’s worthwhile. However, I don’t believe there’s much value in budget comparisons for our profession today, and those comparisons may actually work against a CISO as they try to help their employer manage the business's information-related risk.
To what benefit?
What practical benefit is there to comparing my spending against that of the industry? If my numbers are lower than average, am I going to be able to use that to garner more support? Not in my experience. If I haven’t effectively made my case already for the various security initiatives on my radar, the simple fact that my employer isn’t spending an average amount isn’t likely to carry much weight.
On the other hand, if our numbers are about average, then I may very well be at a disadvantage in requesting additional funding for things that really do need attention. Likewise, if our numbers are high, then there’s a very good chance I’ll need to tighten the belt. Now, if the industry numbers were truly meaningful, then positive or negative budgetary adjustments on my part might be appropriate. But the numbers aren’t meaningful, so comparison stands a better chance of hindering my effectiveness than of aiding it.
Many of the security people I talk to will argue that their organization doesn’t spend enough on security. Consequently, if a significant number of companies are “under-spending” (according to their CISOs), then setting an industry baseline on averages derived from under-spending companies further erodes the usefulness of the metric.
Does leadership care about how much they're spending on security? Sure they do, but only within the context of whether it’s the "right" amount for their organization, which differs from "the same amount as everyone else."
What’s included in the numbers?
As I’ve engaged in surveys and discussions with peers regarding security spend, I’ve seen a high degree of variability in what different organizations consider “security spend.” The simple fact is that organizational structures vary widely and change often. Therefore, a comparative metric built on a data set that can't be normalized very likely leads to flawed conclusions and, consequently, flawed decisions.
Where are we in the curve?
By this I mean the “maturity curve”. In other words, is our security program just starting out, is it well established, or is it somewhere in-between? Keep in mind that the amount and nature of spending on security varies throughout the life-cycle of a security program. Therefore, it isn’t useful to compare organizations that are at different points on the curve. Sure, an argument can be made that by averaging we compensate for these differences, but it still leaves me unable to make a meaningful comparison regarding what my organization spends given its point on the curve.
It’s not just how much we spend, it’s HOW WELL we spend it.
One of my objectives as a CISO is to provide some competitive advantage to my employer by trying to achieve equivalent (or better) risk management at less cost than our competition. Now, I don’t know specifically what the competition is spending (but I do know the supposed “average”!), nor do I necessarily know what they’re spending it on (although I can guess with some degree of confidence because of the focus on “best practices” that seems common here). But I do know that if my target is simply to spend the same amount as everyone else, then I’m not focused on the right thing and I’m not being a responsible steward of my budget.
If I use the “average” security spend in the industry as a target, then I implicitly assume that the average company is doing a good job in managing information risk. I believe a quick glance at today’s risk headlines makes it clear that this isn’t the case. This is really a topic for another blog post, so I won’t dive deeply here. Briefly, I believe our industry is still far too dependent on the shamanistic principles of:
- FUD (Fear, Uncertainty and Doubt) -- scare the non-believers into following our advice. “The thunder-gods will get you.”/”The hackers will get you.” -- not much difference there.
- Best practices -- “The tribe down the river does it this way, grandpa did it this way, so we have to do it this way.” Some best practices are badly dated, others reek of vendor agenda, so there’s no guarantee that a best practice is the right solution for our particular risk issues and corporate risk tolerance. Perhaps worse, blind adherence to best practices violates our responsibility as stewards of our budget to look for cost-effective solutions.
- Gut instinct of the practitioner -- Don’t get me wrong, many security practitioners have developed outstanding instincts. Furthermore, good instincts are a critical component of dealing effectively with almost any aspect of life. The problem is that without applying a dose of critical thinking and analysis to the complex problems we face, we’re -- a) too vulnerable to personal bias, industry myth, and dogma, and b) unable to effectively defend our conclusions and recommendations to our stakeholders.
Perhaps the most significant concern I have about budget benchmarking is that it implies there’s some universally accepted “appropriate amount” of spend. Hogwash. The fact is, every organization has different resources, expenses, risk levels, and risk tolerances from every other, and it’s a fallacy to believe one-size-fits-all. The good news is that our organization’s leaders know what their resources and overall expenses are, and they have an innate sense of what their risk tolerance is (because they’re making risk decisions every day). I believe the challenge has been that we haven’t been doing a great job of providing leadership with useful risk information. Until we can do that, the question of how much we’re spending on security seems almost moot.
If not benchmarking, then what?
The bottom line is that the “right amount” of security spend is unique to each organization. Furthermore, executive management’s opinion is the only one that ought to matter regarding what that amount is. They are the ones who have a clear understanding of the company’s condition, objectives, resources, competing risk issues, and risk tolerance. It’s their job to manage the overall business risk portfolio. Our job is to help them make well-informed decisions regarding our piece of that puzzle by providing a clear, unbiased, and useful picture of their information-related risk and risk mitigation options. Until/unless we do that, then any argument regarding appropriate security spend isn’t terribly useful.
What is your point of view?
But then again, this is just my point of view based on my experience. What has yours been? Have you found ways to leverage industry numbers on funding to gain additional support? I would like to hear from you. Please share your comments below.