Security Metrics at the Grassroots Level
Jun 18, 2008 4:00 AM PT
Want to try an experiment?
Part 1: Get yourself a crowd of willing co-experimenters (about 20 to 30) and tell them that you're going to ask them a trivia question. Tell them you're going to read the question to them and when you count to three, everyone should shout out their answer at the same time. They should all shout out a guess -- even if they have no clue what the answer is. Just yell it out.
Then ask the group some really esoteric question -- one that most people are unlikely to know but that isn't so out there that it's unanswerable. For example, "Who was the last surviving Bounty mutineer on Pitcairn Island when it was discovered by the Topaz in 1808?" Most people won't know that. Some people might have the right answer -- but if so, I guarantee you won't be able to hear it over everyone else shouting their guess when the time comes to answer.
Part 2: Now, ask the crowd the same question again -- but this time tell people to shout out an answer only if they are certain they're right. No guessing this time. Instead, only people that are completely confident should respond.
Guess what? Most likely at least one person in the crowd will know the answer, and this time everyone will hear it. In the first case, knowledge is stifled; in the second, it is shared. The trick is increasing the signal-to-noise ratio.
I learned this experiment from a science teacher with a vested interest in getting me to shut up (OK, so that science teacher was my dad). However, it nevertheless serves as a useful illustration of an important point: Answers are only as valuable as our ability to hear them.
This point is particularly important in information security. It's a "noisy" discipline: lots of people claim to have the right answer (and usually don't), and the same answers won't work for everybody. The challenge is to find the answers that work for you, despite the contradictory things that you might hear from the "peanut gallery" of vendors, consultants, auditors and peers. In short, you need a trusted "voice" that you know you can rely on -- for me, that voice is metrics.
Most of the time, getting good data about the state of your information security program is a difficult exercise. Folks inside the firm -- both in and out of the security organization -- are biased. It's not purposeful. They're not telling you only what you want to hear. It's just that they're too close to the day-to-day operations to give you objective and quantifiable feedback.
Vendors, consultants, auditors and other "outsiders" that you may engage are almost certain to provide feedback about your program, but it's likely to be feedback that's tightly coupled to their particular point of view, expertise and sales agenda. Metrics, on the other hand, are set up for the express purpose of objectivity; the goal is to give you that feedback so that you can analyze your program in a definite, quantifiable and repeatable way.
To many information security practitioners, the idea that metrics have something useful to offer isn't a new concept. We're bombarded by folks in the industry telling us about how great metrics are (in fact, you might even have heard it before from me). We've all been told time and again that we should have metrics -- that they're helpful for our security programs and that they provide value. But despite the numerous people telling us about the value, there's a serious shortage of people with practical advice. Plus, it's a hard topic -- many of us have already tried one or more security metrics initiatives, and (unfortunately) most metrics initiatives fail.
So, accepting for the sake of argument that metrics are useful, and accepting that they can provide that trusted "voice" giving us information about the state of our programs, the challenge then for those of us in the field is to put rubber to the road and actually make them happen. It seems overwhelming. In fact, setting up a sophisticated and mature metrics program is overwhelming -- but setting up a small, grassroots, "guerrilla" metrics program isn't. Neither is growing it into something bigger.
When it comes to putting a security metrics initiative into practice, probably the biggest success factor has to do with whether you start by going "whole hog" or whether you go for the "snowball" approach.
It's like going to the gym. If you're not used to exercise, and you go to the gym for two hours of heavy lifting and an hour of intensive cardio, you're very unlikely to go back to the gym tomorrow to continue that same routine (that is, once you're able to get up). However, if you go to the gym that first time and start with a small, manageable workout, you'll find your strength and endurance slowly building up over time. In a few months, you'll be pumping iron with the best of them.
The same is true of metrics. If you start by trying to measure and record everything under the sun right from the get-go, you're likely to fail -- it's just too much to take on all at once. However, if you start small -- formalizing the collection of data that you already have -- you'll find that this simple approach can be a pretty good jump start on a metrics program. For example, if you have an IDS (intrusion detection system) or a vulnerability assessment tool already deployed, start by recording the results from that tool, and put processes in place for reviewing the data before you move on to gathering other types of information. Later, once you're comfortable with that, integrate other data sources that you might have already: data from a ticketing or work-tracking system, data from your antivirus or anti-spyware tools, output from other security tools already deployed, and so on.
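To make that concrete, here's a minimal sketch of what "formalizing the data you already have" might look like: reducing a raw vulnerability-scan export to a small, repeatable metric (finding counts per severity, stamped with a collection date). The export format and field names here are illustrative assumptions, not tied to any particular scanner.

```python
import csv
import io
from collections import Counter
from datetime import date

# Hypothetical CSV export from a vulnerability scanner -- the columns
# are illustrative, not any specific product's format.
SCAN_EXPORT = """host,severity,finding
10.0.0.5,high,Outdated OpenSSL
10.0.0.5,low,Banner disclosure
10.0.0.9,medium,Weak cipher suite
10.0.0.12,high,Default credentials
"""

def summarize_scan(export_text, as_of):
    """Reduce raw scan rows to a repeatable metric:
    finding counts per severity for one collection period."""
    rows = csv.DictReader(io.StringIO(export_text))
    counts = Counter(row["severity"] for row in rows)
    return {"date": as_of.isoformat(), "counts": dict(counts)}

snapshot = summarize_scan(SCAN_EXPORT, date(2008, 6, 18))
print(snapshot)  # one data point in a trend you can review each period
```

Recorded on a regular schedule, snapshots like this give you exactly the kind of definite, quantifiable, repeatable feedback described earlier -- without buying anything new.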
You (and your management) will see the benefits that metrics can bring, and you can build a framework that can be reused once you start to collect different types of information from other places in the firm. As your framework gets more sophisticated, you can ask security staff within the organization to start sharing their status with you in ways that can be "consumed" by the metrics framework. It doesn't have to be complicated -- for example, ask the operations folks to share information about the systems they manage (like firewall systems) and the processes they follow (like incident response) in a format that is compatible with the framework. Then, once you have a few of those, you can reach outside the security organization to "partner" organizations like internal counsel, human resources or audit to see if they can fold information from their processes into your framework.
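The "consumable format" above can be as simple as agreeing on a handful of required fields. This sketch assumes a hypothetical record shape -- (source, metric, value, period) -- so that submissions from firewall ops, incident response, HR or audit can all be merged without the framework caring where they came from.

```python
# Minimal, hypothetical "consumable" record format for a grassroots
# metrics framework: every contributing team reports metrics as flat
# records with the same required fields.
REQUIRED_FIELDS = {"source", "metric", "value", "period"}

def accept_record(record, store):
    """Validate a team's submission and fold it into the shared store,
    grouped by the contributing source."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing fields: {sorted(missing)}")
    store.setdefault(record["source"], []).append(record)

store = {}
accept_record({"source": "firewall-ops", "metric": "rule_changes",
               "value": 14, "period": "2008-Q2"}, store)
accept_record({"source": "incident-response", "metric": "incidents_closed",
               "value": 7, "period": "2008-Q2"}, store)
```

The design choice worth copying is the validation step: rejecting malformed submissions at the door keeps every source comparable, which is what lets you add a new contributing team each quarter without reworking the framework.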
Starting small by building around data you already have has the side benefit of not requiring much in the way of budget to make your metrics "program" come to life. This has advantages for firms that have already tried the "whole hog" approach to metrics with less than stellar results. In fact, if you make it your goal to just add one source of data to your metrics framework every quarter (again, using data you already have), you'll find that in about a year or so you'll be way ahead of where most programs are. What do you have to lose?
Ed Moyle is currently a manager with CTG's information security solutions practice, providing strategy, consulting and solutions to clients worldwide, as well as a founding partner of Security Curve. His extensive background in computer security includes experience in forensics, application penetration testing, information security audit and secure solutions development.