Why security metrics aren't helping prevent data loss

Security metrics are supposedly a way for upper management and IT departments to converse intelligently about in-house security programs. Why aren't the metrics working?

Reported data loss due to security breaches is not slowing down in the least, as the graph below vividly points out. What's more, these statistics include only publicly reported breaches. One can only imagine how many security breaches go unreported by organizations wanting to avoid public scrutiny.


And did you notice what happened during 2009 -- interesting, isn't it? I'm told there were several reasons for the drop in reported data-loss events. The one being championed by most was the introduction of security metrics. It seems security metrics as a tool started gaining real credibility around that time. The SANS Institute paper, Gathering Security Metrics and Reaping the Rewards, released in 2009, notes:

"Many substantial benefits can be derived from initiating a security metrics program, and there is little reason for delay. At the onset it requires only a meager investment comprised mostly of the time spent planning, gathering data, and producing each report. This makes a security metrics program an intriguing project, especially in economically challenging times when funding can be tricky to secure."

What are security metrics?

Security metrics are often misunderstood and mistaken for measurement itself, which is not the case. Shirley C. Payne, in her SANS Institute paper, A Guide to Security Metrics, explains the difference:

"Measurements provide single-point-in-time views of specific, discrete factors, while metrics are derived by comparing, to a predetermined baseline, two or more measurements taken over time. Measurements are generated by counting; metrics are generated from analysis. In other words, measurements are objective raw data, and metrics are either objective or subjective human interpretations of those data."
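Payne's distinction can be illustrated with a minimal sketch. The function and names below are hypothetical, invented for illustration only: raw incident counts stand in for measurements, and the metric is derived by comparing those counts over time against a predetermined baseline.

```python
# Illustrative sketch of Payne's distinction (all names are hypothetical):
# measurements are raw counts taken at points in time; a metric is derived
# by comparing those measurements to a predetermined baseline.

def incident_trend_metric(measurements, baseline):
    """Percent deviation of incident counts from a baseline, per period.

    measurements: list of (period, incident_count) tuples -- the raw data
    baseline: the expected incident count per period
    """
    return {
        period: round(100.0 * (count - baseline) / baseline, 1)
        for period, count in measurements
    }

# Measurements: objective raw data, generated by counting.
counts = [("2012-Q1", 42), ("2012-Q2", 35)]

# Metric: an interpretation of those data relative to a baseline of 40.
trend = incident_trend_metric(counts, baseline=40)
print(trend)  # {'2012-Q1': 5.0, '2012-Q2': -12.5}
```

The counting step alone yields only measurements; it is the comparison against the baseline that turns them into a metric an analyst can act on.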

Next, Shirley describes what would be considered a "useful" metric:

"Truly useful metrics indicate the degree to which security goals, such as data confidentiality, are being met, and they drive actions taken to improve an organization's overall security program."

It's hard to argue with the graph and what experts were saying. Security metrics are doing their part to reduce data loss.

Number of data-loss events started increasing again

So what happened in 2010 and afterward? Why did the number of data-loss events trend upward each year? Security metrics were supposed to help. One thing security metrics did for sure was convince upper management to spend money on new security programs, as reported by tech media and analysts alike -- Gartner, for example:

"While the global economic slowdown has been putting pressure on IT budgets, security is expected to remain a priority through 2016, according to Gartner, Inc. Worldwide spending on security is expected to rise to $60 billion in 2012, up 8.4 percent from $55 billion in 2011. Gartner expects this trajectory to continue, reaching $86 billion in 2016."

We have security metrics in place, companies are spending gobs of money on security, and I don't think companies are reporting security breaches for the fun of it. What's going on?

Once again, there is no simple answer. How could there be? Just think of all the different variables that come into play. But since we're discussing security metrics, let's see if this reporting methodology has anything to do with it.

New report may have the answer

While searching for information on security metrics, I came across the article, Are Security Metrics too Complicated for Management?, by Shelley Boose of Tripwire. In the piece, Shelley mentioned a survey conducted by the Ponemon Institute. Commenting on the survey, Dr. Larry Ponemon, chairman and founder of the Ponemon Institute, said:

"Even though most organizations rely on metrics for operational improvement in IT, more than half of IT professionals appear to be concerned about their ability to use metrics to communicate effectively with senior executives about security."

That was enough for me. I contacted Shelley, asking for more information about the survey; she referred me to this Ponemon report. Of immediate interest to me was the section about security metrics (PDF download). It started off with a bang: "Security metrics -- important, but still not effective for communicating risk."

It seems I may have found at least one reason why data-loss events are still on the increase.

Survey methodology

One thing I always appreciate about the Ponemon Institute is its willingness to publicize the methodology used in its surveys, and this one is no exception (PDF download):

"A sampling frame of 24,550 individuals of U.S. organizations and 18,012 of U.K. individuals who work in IT operations, IT security, business operations, compliance/internal audit and enterprise risk management were selected for this survey."


A sampling frame of 42,562 individuals should provide a decent sample. Now let's get to those survey questions.

Survey says

Question 1: How important are metrics in achieving a mature risk-based security management process? (In all of the graphs, blue responses are from the U.S. and red responses are from the U.K.)


A large majority of the respondents did not dispute the importance of security metrics as a key performance indicator.

Question 2: Do you believe that your company's existing metrics are properly aligned with business objectives?


The response to this question may offer a glimpse of why security metrics are not working as intended. Over 50 percent of the respondents either said no or were unsure whether security metrics aligned with business objectives. The report ventures a guess as to why: "One potential contributing factor in this disconnect is that security professionals have traditionally viewed metrics as valuable operational performance measurements, while executives tend to evaluate security based on cost. Neither of these approaches is well adapted to communicating the effectiveness of risk-based security programs."

Question 3: Please rate your effectiveness in communicating all relevant facts about the state of security risk to senior executives.


Another hint at why security metrics may not be working: close to 50 percent of the respondents feel they are not effective in communicating the state of security risk to upper management. The next question goes to the heart of the matter, asking why the respondents do not create metrics that are understood. The answers may surprise you.

Question 4:  If no or unsure, why? In other words, why don't you create metrics that are well understood by senior executives?


The fact that close to 50 percent of the respondents felt the information provided by security metrics was too technical, or that other departmental issues took precedence, was enough for the report's editors to comment:

"In the same way it's not acceptable for CFOs to say they're too busy to prepare financial reports for the board or senior executive team, in the near future it will not be acceptable for senior IT leaders to be too busy to prepare understandable security reports. Security professionals must find or create metrics that are more broadly understood by business leaders."

Final thoughts

The survey appears to have found the disconnect: business speak versus IT speak. The business metrics executives are familiar with tend to reflect strategic goals, prioritizing cost over less tangible security benefits. Security metrics, by contrast, favor operational goals and prioritize technical improvements over business contingencies.

I'll let the report have the last say:

Finding meaningful ways to successfully bridge this communication gap is critical to broader adoption of risk-based security programs. The onus for this effort clearly lies with IT security and risk professionals.


Information is my field...Writing is my passion...Coupling the two is my mission.


Worst.article.ever. Your journalism is bad and you should feel bad.


I would not expect metrics to help stem data loss.  People and processes do that.  Metrics show the results.


At some points in my career I have had to create various reports for Senior Management.  The report metrics that were asked for didn't seem to fit with my understanding of doing my technical job; that is, the metrics usually existed so Director A could show VP B that X was happening, or to get funds for X.  Generally, if the metrics had lots of pretty colors (usually charts/graphs), they made good metrics for Senior Management.  The technical aspects may be quite different from the metrics that Senior Management wanted to see.

One time I saw a product for network monitoring that seemed to cover everything.  It built an interface in layers: at the top layer, high-level information was presented in graphic form; if you clicked on it, it went to the next layer with more actual numbers and information.  You could keep clicking until you got to the physical adapter on the wire.  Non-techs could easily see the big picture, while techs could drill down to all the details.

It seems to come down to what do you put your attention and awareness on as a Company, Management, Workers, Mission, Process.  If the company is healthy and a whole entity then things will work, if the company is dysfunctional, fractured or unhealthy then things can get a little crazy.

BTW:  I have to agree with the comment about posting with TechRepublic's new design.  I can't post with IE10; I had to switch to Chrome in order to post this.


Does it not depend upon what is being measured to make the analysis relevant to the business?


Management have no chance of understanding any technical report from the IT department.

Metrics (in general) are basically meaningless busy work that managers show to other managers to boast about how much work their department does.

One of my friends works in a Government dept.

He tells me that he now has to spend more time filling in reports about what work he is doing, than he spends doing his actual job!

Michael Kassner



Hello, Alex

I received your email explaining why you feel I misrepresented security metrics. Could I ask you to please reread the article and point out where I stated security metrics were bad? I cannot recall anywhere that I did. You mention that you use security metrics to stop active attacks and prevent breaches; that is great.

The contention of the sources the article quoted is that there is a disconnect between upper management and those, such as yourself, who understand the highly technical version of security metrics. I do not believe I stated anywhere in the article what I personally believe; if I did, please point it out to me.

Also, the sources' contention was that if there isn't a change, the status quo will continue, as will the overwhelming number of security breaches that go uncaught because upper management neither understands nor sees any benefit in the security programs created from what IT professionals have learned from security metrics.


Metrics create distinction and clarify perception versus reality. They highlight options and influence decisions. Decisions directly impact actions and outcomes. Therefore, it is exceptionally simplistic to say metrics only show results.


This is like saying "statistics don't win baseball games, players do." It ignores the fact that the relevant performance statistics correlate well to strong performance.

The irony, of course, is that the data we have presenting this comes in part from Gene Kim. Gene wrote a little program called "Tripwire" -- you may have heard of it.

Michael Kassner


You are correct, but... Metrics convince the powers that mind the finances and make policies to put security in motion.  

Michael Kassner


Thanks, Craig. That was a wonderful comment. Through the years, I've had similar experiences, but having others reaffirm them is appreciated.

Michael Kassner


That is the problem as I see it, Phil. There is no generic set of metrics. I also suspect that the board room and IT department are not discussing this. I get the impression that this will take some bending by both sides to gain understanding of what is needed. 

Michael Kassner


I need your help. Can you explain why we can't alter the metrics so that upper management does understand? 

And is it busy work if it gets upper management to agree to something that they would not have otherwise? 

Michael Kassner


I guess my thought about baseball is that without players there is no game.

As for Tripwire, I gather you realize what organization authorized the report, and the results you are in contention with? 


@Michael Kassner@lehnerus2000 

The site is awful since the reorganisation.
None of my article bookmarks work anymore.

This new commenting system is rubbish.
I just typed a lengthy reply and the site timed out and deleted it!

It looks like I will have to type up my replies using Notepad++.

The site search is rubbish.
It doesn't work properly in IE10 or FF22.
I just did a search for a previous article that covered a facet of this problem.
It was about what developers think of management.
The results claimed that there were 155 pages; however, I am unable to view any results that don't appear on the first page!


"I need your help. Can you explain why we can't alter the metrics so that upper management does understand?"
The old adage about experts sums it up succinctly:
"An expert is someone who knows more and more about less and less, until eventually he knows everything about nothing."

Modern managers are experts.
They are experts in "management theory" and that is all they know.
When reality doesn't match theory, they ignore reality.

I would suggest that the majority of "hero" CEOs, all had hands-on backgrounds in the companies they ran.
That is to say, they knew what the company did and what was required to create the companies products and services (e.g. Bill Gates or Jack Welch).

To make upper management understand security metrics they have to be presented in (or converted to) the following format:
"Executive Bonuses Lost, Fines Levied & Lawsuits filed vs Data Damaged, Stolen and/or Illegally Accessed"

To do this, IT departments need accountants and lawyers who understand IT.

"And is it busy work if it gets upper management to agree to something that they would not have otherwise?"
That would not fit my definition of "busy work".

What I mean is "The Simpsons" definition.
Work that accomplishes nothing, but produces lots of statistics that can be used to "prove" that useful work was done.

For example, how many pieces of legislation the Government has passed in a given time period.
The amount of work done, doesn't prove that the legislation was helpful, required, useful or even valid.

Michael Kassner

@lehnerus2000 @Michael Kassner 

Possibly, but I remember many professors and sources who have explained extremely difficult subjects to me, a person who has no claim on understanding some of the very technical aspects of IT.

I have also asked these same professors and sources for their opinion on this subject, and they all state the topics are difficult, but could be presented in a way that upper management would be able to understand.
