Right after the collapse of the dot-com bubble, I bought a used Netra T1 (a 1 rack-unit Sun box). I got it for a song on eBay from a hosting provider that was going under, and for the price, how could I pass it up? Sure, it was unnecessary — but it was so cheap! I had to have it.
At the time that I bought it, I wasn’t entirely sure what I was going to do with it. But boy, was I excited. Predictably, when it did arrive, it quickly turned into an expensive paperweight. That’s not to say that I didn’t toy around with it for a while; but over time, it got less and less use. After a few months went by, it settled into its current role as hardcore basement dust-gatherer.
Sound familiar? It should. Everybody does something like this: For me, it’s used computer equipment; for some people, it’s designer shoes; for others, it’s decorative tchotchkes. We all buy stuff that we think we’re going to want — sometimes we’re right and we wind up using it, sometimes we’re wrong and it turns out we didn’t really want it after all.
Fine for Us, Risky for Our Businesses
If you’re wondering why I’m bringing this up, it’s for a very specific reason: Down this road lies risk for our businesses. And our job, at the end of the day, is to manage risk. How does that risk come about? First, because it’s not just individuals that buy stuff they don’t need. And second, when it comes to certain types of purchases, buying something that sits in a closet is worse — from a risk perspective — than not buying it at all.
Just like individuals, companies buy all sorts of stuff they don’t need. It happens all the time. Our firms buy software, hardware and services that wind up not getting much use. Like us, companies make decisions about what they think they want, only to find out later that they didn’t really want what they bought after all. Sometimes it’s because they don’t have resources to manage a piece of software, sometimes they just don’t have resources to deploy it in the first place; sometimes, priorities change and leave the purchase in the wake.
Just like the Netra, the purchase winds up gathering dust in a closet somewhere.
The second point is that once we’ve bought something, there’s tremendous pressure to keep it. When we buy something we’re not going to use, human nature says keep it around.
Why did the Netra stay in the basement for more than five years after I bought it? Because throwing it away means admitting that I wasted that money. Who wants to do that? And what’s the worst case? If we’re not using it, no harm, no foul, right? Not always, which is exactly my point.
The Difference Between an Oversight and Negligence
If you make a purchase that supports a security control, the potential downside of not using it is worse than just not getting what you paid for. Instead, imagine for a moment having to explain your decision if a worst-case scenario were to happen. It’s useful to illustrate through example, so let’s pick a commonly occurring “shelfware” purchase: the network intrusion detection system (IDS).
Say, for the sake of argument, that you buy an IDS, and you find it to be unmanageable in your environment (this happens quite often). An IDS can generate a lot of alerts — so many that organizations often find the volume way too much for them to handle. The temptation when this happens is to turn off the IDS, or at least to disable alerts — since your operations folks don’t have the bandwidth to respond, why continue to bombard them? So (reasonably) some organizations make the decision to turn the system off.
Now, what happens if there’s a compromise that the IDS would have flagged? If you hadn’t purchased the IDS in the first place, the worst someone can say in hindsight is that you probably should have had one — that it was an oversight on your part to not deploy a control that would have addressed the issue.
If, on the other hand, you have an IDS that is deliberately disabled, what then? The cold light of hindsight might reflect negatively on your decision to disable it. Accurate or not, someone could claim your decision was negligent. I don’t agree they’d be right, but you have to admit it puts you on the defensive.
By purchasing the control, you implicitly recognize the problem. And once you recognize the problem, you’d better do something — either document why you don’t think the risk justifies the investment or deploy some control to address the risk. Having a solution sitting on the shelf and choosing not to deploy it is dangerous ground — no matter how justified you were in putting it there in the first place.
Use It or Not: Stay Off the Middle Ground
So the challenge to us is twofold — we need to first refrain from introducing new shelfware to the environment, and we need to locate any shelfware we currently have and do something with it. For purchases that you haven’t made yet, the solution is easy: Try (and try hard) before you buy. If you think you can’t manage something, don’t buy it. Be conservative. Recognizing that there’s no option to let a security-related purchase sit makes it much easier to cull the less wise purchases before they happen.
But what’s harder is addressing the shelfware that you already have. In those cases, you either need to deploy it or dump it. There can be no middle ground.
Now, I’m not trying to scare you. Deployment in this case doesn’t mean that you need to go “whole hog” right off the mark. In the IDS example, for instance, you don’t need to hook the IDS up to every nook and cranny of the environment on day one. In fact, you probably shouldn’t. But you should put together some kind of deployment plan — either you phase it in in a way that’s manageable or you document why you can’t use it and you end the support/maintenance contract.
If you choose to phase it in, it doesn’t really matter what your timetable is so long as you have one. Maybe your plan is that you pilot the IDS in the quality assurance lab for a year. Maybe the IDS doesn’t see the whole environment for years and years after that.
At the end of the day, it doesn’t really matter what you do so long as you’re doing something and not leaving it to rot in some closet. If, in hindsight, someone says you should have deployed it faster? Well, at least they can’t say you knew about a possible problem and chose to take no action.
Ed Moyle is currently a manager with CTG’s information security solutions practice, providing strategy, consulting and solutions to clients worldwide, as well as a founding partner of Security Curve. His extensive background in computer security includes experience in forensics, application penetration testing, information security audit and secure solutions development.