In case you live on the moon, what happened last week was that a small amount of Microsoft source code was leaked to the Web. Granted, small is relative. The leaked code consisted of more lines than I’ve ever written in my life, but early measurements had it at about 15 percent of Windows 2000.
Evidently, the leak originated with a vendor that uses the code to help run applications written for Windows on Linux and Unix. Microsoft eventually confirmed the leak, but that hasn’t stopped the barrage of commentary from security experts, journalists, analysts and, amazingly enough, the open-source community, which waxed eloquent on why the exposure of Microsoft code to the Web was a disaster for the company.
EDITOR’S NOTE (February 20, 2004): Rob Enderle is responding to comments about this column in the talkback forum.
Now, don’t get me wrong. Like the President of the United States, Microsoft has a huge credibility problem, which means that every time almost anything happens to the company, folks come out of the woodwork to allege nefarious motives and predict catastrophic outcomes. They are almost never right, but it does make for interesting reading.
Silver Lining in the Leak
As it turns out, this leak might be a cloud with a silver lining for Microsoft. Much like the effect of the U.S. presidential primaries on the party in office, a source-code leak generates a huge amount of discussion about why source code on the Internet is a bad thing. And because the origin of these negative comments is largely the open-source community — or folks who appear aligned with the open-source community — the source-code leak is having the interesting side effect of causing people to question the security of open source in general.
This has to be one of the least intelligent moves I’ve ever seen from an advocacy group — and I’ve seen some whoppers.
Remember that the open-source community uses the thousands-of-monkeys method to ensure security. This method hearkens back to the old thought experiment in which a thousand monkeys, given all eternity and endless typewriter ribbon, would eventually type out the complete works of Shakespeare.
The open-source community argues that with thousands of eyes looking at the code, the code is much more robust and the security of the resulting products is near absolute. Any CIO or CFO who hasn’t heard that this is the method of the open-source community will probably be reaching for the heart-attack pills about now.
The Impact of Sarbanes-Oxley
The open-source advocates have been able to maintain the thousand-monkey argument largely because of the widely held opinion that open-source software benefits from lots of volunteers and is therefore more secure than proprietary closed-source software. But Enron, and particularly Sarbanes-Oxley, have turned this notion on its head with a vengeance. I’ve been getting e-mail from CIOs indicating that they are increasingly aware that open-source software might not pass security audits designed to comply with Sarbanes-Oxley.
That is because, in an audit, you have to be able to certify every part of an application. If there is even a chance that someone who has not been properly qualified touched a financial application or the platform on which that application resides, IT will fail the audit. When that happens, corporate boards are motivated to take draconian measures to protect their own assets.
Until the Microsoft problem surfaced, IT had time to think through this issue and look for ways to mitigate it, because the audit was clearly going to focus more on physical controls than electronic ones, under the assumption that physical controls posed the greater exposure, and because the initial staffing for any audit function is financial, not IT-based.
However, the increased awareness this issue has generated should cause some IT teams to reassess their adoption of open-source software. The reason this idea came up, beyond the fact that some of Microsoft’s source code made it onto the Internet last week, is that I had lunch with a friend of mine who is in the executive search business. She specializes in CFOs and other financial executives. Right now, apparently, the hottest job on the market is Audit Manager.
Audit Managers in Demand
This lunch, which happened around the time that the code leak became public, caused me to take a stroll down memory lane and recall how we set up IT audits and what we looked at. When an audit happened, you had to document every place code came from and every place it went.
You had to ensure that no one who wasn’t approved at the proper level touched anything that impacted a critical piece of corporate IP or had even a glancing relationship with financial reporting. And you had to make sure there was no obvious collusion going on that violated the separation of duties controls that existed to protect the company.
I would have had a field day with open-source software, where patches are often received from or discussed with outside entities who could, in fact, be working for foreign governments or competitors, where collaboration could easily be reinterpreted as collusion, and where the very mention of thousands of people looking at a product would result in a front-page comment in an unsatisfactory audit.
Internal Audit Practices
Not only was I a field audit manager, but I’ve spent a lot of time over the last several years teaching IT organizations how to survive internal IT audits. Few, it seems, have actually had this experience, and many don’t understand it or its related risks.
An internal audit’s goal is to find problems. Therefore, internal auditors tend to be incredibly harsh in their reviews and have no bias toward open source or Microsoft, or Apple, for that matter. Their authority comes directly from the audit board, and their findings can get nearly any employee fired on the spot if they indicate the employee significantly violated a critical policy. Even if that violation was unintentional, termination could still be the outcome.
So, in the face of the Microsoft code leak, I have to think the old saying that people in glass houses shouldn’t throw stones applies here very well. My sense is that these stones, tossed by the open-source community, will be coming back like boomerangs with booster rockets.
I predict that in the near future, a large number of folks relying on open-source software will suddenly discover that while auditors can be funny people, when it comes to source-code leaks (and open-source code is, in effect, an entire code base freely available to everyone), they have no sense of humor whatsoever.
Rob Enderle, a TechNewsWorld columnist, is the Principal Analyst for the Enderle Group, a company founded on the concept of providing a unique perspective on personal technology products and trends.