
Focusing measurements and metrics on security

How to measure security

Not so long ago, one of my clients called me in for a process and measurement readjustment project. We started with the three core services—infrastructure, development, and customer service. The measurements we selected covered a range of procedural, operational, and creative factors. The metrics for each measurement emphasized behaviors we hoped would be more in line with the company's goals.

However, change by measurement takes time. While we waited for our first batch of quarterly results, my client asked me to deal with one of his most unconventional IT services. Its providers were well known in the company for their troglodyte-like behavior and their lack of reliable measurements.

This team, the Security Services Organization, worked primarily with the external Web services and the network switches to create security schemes. In theory, they offered data security services to the entire organization. In reality, data, functional, and application security devolved onto other groups.

My client asked me first to define this group's responsibilities, based on various organizational charter documents and on the group's current activities. Then he wanted measurements and metrics to guide the team back into the organization. Given his level of disgust with their behavior, firing the entire team was on the table, but he asked me to exhaust every other avenue before making that recommendation.

The current state of security

The security team, like many such groups at other clients' companies, spent a great deal of time talking about potential security threats. They researched threats on the Web, studied esoteric-seeming materials, and made complex pronouncements from on high. Their presumed level of technical expertise (whether earned or not) gave their statements inordinate weight with the rest of the IT staff.

The security team focused on external threats, firewalls, and occasionally VLAN segmentation. Their stated goals were to prevent external intrusions and internal assaults through the wireless network. They played through elaborate scenarios in the test lab, exploring ever more exotic forms of attack.

At the same time, the vast majority of the company's resources were inadequately secured. Customer support managed most of the data security for the organization. The developers, adrift without clear direction from the experts, gamely worked to plug holes in their applications. The server administrators in infrastructure managed their own patch schedules, sometimes at odds with one another.

The authorization assignment and verification process also needed a great deal of work. Any user could call the help desk and get access to nearly any function. With the conflicting security schemes produced by the developers, no one really knew what "appropriate" security levels might be, or which functions should go to whom.

The latter might be considered a problem with the customer support organization. In part, it was. At the same time, the responsibility for auditing the system, establishing clear access levels, and monitoring the process all logically devolved onto the security organization.

Measuring the security service

I sat down to design measurements and metrics. The team needed a sharp kick in the pants to get moving in the right direction. With that in mind, I intentionally instituted measurements we would have to deemphasize in the long run. The measurement categories I chose were audit compliance, budget, communications, incident response, and process management.

Audit compliance caused the largest number of complaints from all involved parties. We gave the security team a list of all the security areas under their control. Each week, they were to return a list of the actions they had taken to manage those areas. Each month, we asked an independent group, along with representatives from customer service, development, and infrastructure, to verify whether the work had the stated effect. If it did, the team received a pass on this measurement. Each failed item reduced their potential score, and anything over a 20-percent failure rate failed the measurement outright.
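To make the scoring concrete, here's a minimal Python sketch of how a month's results might be tallied. The function, the sample areas, and the data structure are hypothetical illustrations; only the 20-percent threshold comes from the scheme we used.

    def audit_compliance_score(verified_items):
        # verified_items maps each security area to True if the monthly
        # verification found the team's work effective, False otherwise.
        total = len(verified_items)
        failures = sum(1 for ok in verified_items.values() if not ok)
        failure_rate = failures / total if total else 1.0
        # Each failed item reduces the potential score; more than a
        # 20-percent failure rate fails the whole measurement.
        score = max(0.0, 1.0 - failure_rate)
        return score, failure_rate <= 0.20

    areas = {"firewall rules": True, "VLAN segmentation": True,
             "patch verification": False, "access audits": True,
             "wireless monitoring": True}
    print(audit_compliance_score(areas))  # (0.8, True)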

Budget compliance (a perennial favorite) allowed us to measure the amount of money spent vs. the amount allocated for security activity. We also added another metric: the amount of money spent vs. the percentage of high-to-medium risks (rather than threats) addressed. Failing to address high-level risks, or spending money and time on low-level risks and threats while higher-level ones went unaddressed, reduced their budget score.
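As an illustration only, the two ratios might be combined like this; the combining formula and the names are hypothetical, not the actual weighting we used.

    def budget_score(spent, allocated, high_med_addressed, high_med_total):
        # Metric 1: money spent vs. money allocated.
        spend_ratio = spent / allocated
        # Metric 2: money spent vs. the share of high-to-medium risks addressed.
        risk_coverage = high_med_addressed / high_med_total
        # Spending near the full budget while high-level risks go
        # unaddressed drags the score down; full coverage within
        # budget scores 1.0.
        return min(1.0, risk_coverage / max(spend_ratio, 0.01))

    print(budget_score(spent=90_000, allocated=100_000,
                       high_med_addressed=6, high_med_total=10))  # ~0.67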

Including communications as a measurement raised a host of issues. The security team felt they were responsible only for technical solutions. My client felt they were a key communicator in the overall system management process. To move them toward this latter role, we implemented metrics measuring the time spent usefully participating in meetings, the speed in responding to queries, and the clarity of communications regarding specific security queries. We felt these measurements would, over time, create a counterproductive focus on quick rather than meaningful communication. During the first few measurement cycles, however, they might also help focus the team on proper communication techniques.
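One hypothetical way to roll the three communications metrics into a single score, assuming equal weights (the responsiveness curve and every name here are illustrative):

    def communications_score(useful_minutes, meeting_minutes,
                             avg_response_hours, clarity_rating):
        # Share of meeting time spent usefully participating.
        participation = useful_minutes / meeting_minutes
        # Speed in responding to queries: a same-day answer scores near 1.0.
        responsiveness = 1.0 / (1.0 + avg_response_hours / 24.0)
        # clarity_rating: a 0-to-1 peer rating of responses to security queries.
        return (participation + responsiveness + clarity_rating) / 3.0

    print(communications_score(useful_minutes=45, meeting_minutes=60,
                               avg_response_hours=12, clarity_rating=0.8))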

Incident response sampled several metrics: the speed in responding to security breaches, the appropriate auditing of security changes, and the speed in responding to security-related changes concerning the products in the environment. If the team responded quickly, performed their auditing tasks in a reasonable time frame, and worked with the other IT departments, they received good marks. Each time they failed to involve others or delayed their response, their score decreased.
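A sketch of that give-and-take scoring, again with hypothetical names and a simple plus-or-minus point scheme standing in for the real marks:

    def incident_response_score(incidents):
        # Each incident earns a point for a quick response, timely auditing,
        # and involving the other IT departments; each miss costs a point.
        score = 0
        for inc in incidents:
            for mark in ("responded_quickly", "audited_in_time",
                         "involved_other_teams"):
                score += 1 if inc[mark] else -1
        return score

    breaches = [{"responded_quickly": True, "audited_in_time": True,
                 "involved_other_teams": False}]
    print(incident_response_score(breaches))  # 1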

Process management, as a measurement, was intended to drive the team toward establishing and managing a comprehensive security plan. Here's where I disagreed with my client: I felt the plan's creation should be a project, while its maintenance could be a measurement, but he didn't want to divert the team's attention. In the end, we settled on the following metrics: successful audits of the IT security processes, dissemination of those processes, and adaptation of those processes. Activity that fell into any of these three categories increased the team's score in this measurement.

We also disagreed on whether the team should automatically fail if they took no action in the process management category. My client set the base score to an automatic fail unless the team spent time actively working in these areas. I argued that the team's scores in other areas should be weighed as well, since they indirectly indicated process awareness. However, he felt that the threat of failure, and the action that would follow it, was important to his overall goals.
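My client's automatic-fail rule is easy to sketch; the category names come from the metrics above, while the function and its scoring are illustrative:

    CREDITED = {"audit", "dissemination", "adaptation"}

    def process_management_score(activities):
        # The base score is an automatic fail; only logged activity in
        # the three agreed categories lifts it.
        credited = [a for a in activities if a in CREDITED]
        if not credited:
            return 0, False  # no qualifying activity: automatic fail
        return len(credited), True

    print(process_management_score(["audit", "dissemination"]))  # (2, True)
    print(process_management_score([]))                          # (0, False)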

With security, my client and I compromised on long-term measurements in order to achieve short-term goals. Would you have made the same decisions? Would you measure the same things? Why or why not? On what level do you need your security team to deal with threat rather than risk? What other issues might come up?
