Why security metrics aren't helping prevent data loss

Security metrics are supposedly a way for upper management and IT departments to converse intelligently about in-house security programs. Why aren't the metrics working?

Reported data loss due to security breaches is not slowing down in the least, as the graph below vividly points out. What's more, these statistics include only publicly reported breaches. One can only imagine how many security breaches go unreported by organizations wanting to avoid public scrutiny.


And did you notice what happened in 2009? Interesting, isn't it? I'm told there were several reasons for the drop in reported data-loss events. The one championed by most was the introduction of security metrics; it seems security metrics as a tool started gaining real credibility around that time. The SANS Institute paper, Gathering Security Metrics and Reaping the Rewards, released in 2009, mentions:

"Many substantial benefits can be derived from initiating a security metrics program, and there is little reason for delay. At the onset it requires only a meager investment comprised mostly of the time spent planning, gathering data, and producing each report. This makes a security metrics program an intriguing project, especially in economically challenging times when funding can be tricky to secure."

What are security metrics?

Security metrics are often misunderstood and described as a measuring process, which is not the case. Shirley C. Payne, in her SANS Institute paper, A Guide to Security Metrics, explains the difference:

"Measurements provide single-point-in-time views of specific, discrete factors, while metrics are derived by comparing, to a predetermined baseline, two or more measurements taken over time. Measurements are generated by counting; metrics are generated from analysis. In other words, measurements are objective raw data, and metrics are either objective or subjective human interpretations of those data."
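To make Payne's distinction concrete, here is a minimal sketch in Python. The scenario, names, and numbers are all invented for illustration: raw counts are the measurements, and the metric comes from analyzing those counts against a predetermined baseline.

```python
# Minimal sketch of the measurement-vs-metric distinction Payne describes.
# All names and numbers here are invented for illustration.

baseline = 40                        # predetermined baseline: acceptable infections per month
measurements = [52, 47, 41, 38, 35]  # raw monthly counts (measurements, generated by counting)

def deviation_from_baseline(counts, baseline):
    """Metric: percent deviation of the average count from the baseline."""
    average = sum(counts) / len(counts)
    return (average - baseline) / baseline * 100

def trend(counts):
    """Metric: change from the first measurement to the last (negative = improving)."""
    return counts[-1] - counts[0]

print(f"Average vs. baseline: {deviation_from_baseline(measurements, baseline):+.1f}%")
print(f"Trend over the period: {trend(measurements):+d} infections/month")
```

The counting produces the measurements; only the comparison over time and against the baseline produces the metric.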

Next, Shirley describes what would be considered a "useful" metric:

"Truly useful metrics indicate the degree to which security goals, such as data confidentiality, are being met, and they drive actions taken to improve an organization's overall security program."

It was hard to argue with the graph and what the experts were saying: security metrics appeared to be doing their part to reduce data loss.

The number of data-loss events started increasing again

So what happened in 2010 and afterward? Why did the number of data-loss events trend upward each year? Security metrics were supposed to help. Security metrics did one thing for sure: they convinced upper management to spend money on new security programs, as reported by tech media and analysts alike. Gartner, for example:

"While the global economic slowdown has been putting pressure on IT budgets, security is expected to remain a priority through 2016, according to Gartner, Inc. Worldwide spending on security is expected to rise to $60 billion in 2012, up 8.4 percent from $55 billion in 2011. Gartner expects this trajectory to continue, reaching $86 billion in 2016."

We have security metrics in place, companies are spending gobs of money on security, and I don't think companies are reporting security breaches for the fun of it. What's going on?

Once again, there is no simple answer. How could there be? Just think of all the different variables that come into play. But since we're discussing security metrics, let's see if this reporting methodology has anything to do with it.

New report may have the answer

While searching for information on security metrics, I came across the article, Are Security Metrics too Complicated for Management?, by Shelley Boose of Tripwire. In the piece, Shelley mentioned a survey conducted by the Ponemon Institute. Commenting on the survey, Dr. Larry Ponemon, chairman and founder of the institute, said:

"Even though most organizations rely on metrics for operational improvement in IT, more than half of IT professionals appear to be concerned about their ability to use metrics to communicate effectively with senior executives about security."

That was enough for me. I contacted Shelley, asking for more information about the survey; she referred me to this Ponemon report. Of immediate interest to me was the section about security metrics (PDF download). It started off with a bang: "Security metrics — important, but still not effective for communicating risk."

It seems I may have found at least one reason why data-loss events are still on the increase.

Survey methodology

One thing I always appreciate about the Ponemon Institute is its willingness to publicize the methodology used in its surveys, and this one is no exception (PDF download):

"A sampling frame of 24,550 individuals of U.S. organizations and 18,012 of U.K. individuals who work in IT operations, IT security, business operations, compliance/internal audit and enterprise risk management were selected for this survey."


A sampling frame of 42,562 individuals should provide a decent sample. Now let's get to those survey questions.

Survey says

Question 1: How important are metrics in achieving a mature risk-based security management process? (In all of the graphs, blue responses are from the U.S. and red responses are from the U.K.)


A large majority of the respondents did not dispute the importance of security metrics as a key performance indicator.

Question 2: Do you believe that your company's existing metrics are properly aligned with business objectives?


The response to this question may give us a glimpse at why security metrics are not working correctly. Over 50 percent of the respondents either said "no" or were unsure whether security metrics aligned with business objectives. The report ventures a guess as to why: "One potential contributing factor in this disconnect is that security professionals have traditionally viewed metrics as valuable operational performance measurements, while executives tend to evaluate security based on cost. Neither of these approaches is well adapted to communicating the effectiveness of risk-based security programs."

Question 3: Please rate your effectiveness in communicating all relevant facts about the state of security risk to senior executives.


Another hint at why security metrics may not be working: close to 50 percent of the respondents feel they are not effective at communicating the state of security risk to upper management. The next question goes to the heart of the matter, asking why the respondents do not create metrics that are understood. The answers may surprise you.

Question 4: If no or unsure, why? In other words, why don't you create metrics that are well understood by senior executives?


The fact that close to 50 percent of the respondents felt the information provided by security metrics was too technical, or that other departmental issues took precedence, was enough for the report's editors to comment:

"In the same way it's not acceptable for CFOs to say they're too busy to prepare financial reports for the board or senior executive team, in the near future it will not be acceptable for senior IT leaders to be too busy to prepare understandable security reports. Security professionals must find or create metrics that are more broadly understood by business leaders."

Final thoughts

The survey appears to have found the disconnect: business speak versus IT speak. The business metrics executives are familiar with tend to reflect strategic goals, prioritizing cost over less tangible security benefits, whereas security metrics favor operational goals, prioritizing technical improvements over business contingencies.

I'll let the report have the last say:

"Finding meaningful ways to successfully bridge this communication gap is critical to broader adoption of risk-based security programs. The onus for this effort clearly lies with IT security and risk professionals."

