IT Security and Software Development
Mar 26, 2004 6:30 AM PT
Let's do some arithmetic. Multiply the number of different hardware platforms in current use by the number of operating systems that have a reasonably large user base. Subtract the systems that simply won't work together. Multiply the result by the number of applications, servers and databases used in business, academia and on the Internet.
A silly exercise, sure. But the point is that even leaving aside software components that will never be used together at the same time and place, there are almost countless software combinations that must interoperate securely and do so in an environment plagued by more denial-of-service (DoS) attacks, viruses, worms and other malware every day.
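Just for illustration, here is that back-of-the-envelope exercise with invented placeholder counts (none of these numbers are real; the point is scale, not precision):

```python
# The arithmetic above with made-up counts -- purely illustrative.
hardware_platforms = 12
operating_systems = 8
incompatible_pairs = 20   # combinations that simply won't work together
applications = 5000       # apps, servers and databases in common use

combinations = (hardware_platforms * operating_systems - incompatible_pairs) * applications
print(combinations)  # 380000 combinations that must interoperate securely
```

Even with these modest guesses, the result runs into the hundreds of thousands, which is the article's point: no vendor can test them all.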
In fact, the proliferation of both software and hardware means compatibility and security issues must be dealt with at several levels. First, will the technologies work together reliably under all possible test configurations? A company building an application for internal use probably will have those parameters under control. But what about commercially available products or open-source software?
Every Possible Configuration
Multimedia developers have known about this compatibility issue for years. As multimedia producer and author Tay Vaughan put it, "It is very difficult for even a well-equipped developer to test every possible configuration of computer, software and third-party add-on boards."
That used to apply just to internal bugs or design flaws, and there were third-party testing centers to help find problems. But now multimedia has gone to the Net, and some widely distributed multimedia players have been found to contain potential malware delivery flaws as well.
Vaughan's caveat has been echoed in a wider context by Aberdeen Group vice president Jim Hurley, who recently told TechNewsWorld: "It's almost impossible for one supplier to test all of the outcomes of how their products can be hacked. It's just too exhaustive."
If that's true, what is to be done? Are we to be condemned to ever lower security expectations?
Part of the answer might lie in the way software is produced. David Quinn, a British IT consultant formerly with British Telecom who now works on contract banking and telecommunications projects, suggests that part of the problem lies in the large teams that work on major applications and operating systems.
"You try to set standards and 'middle bits' that everything talks to [in order to] try and cut down the diversity," he said. "But you're never going to completely cut it down. You can see that when you use Windows, where the same thing happens differently in different places." The bigger part of the problem, he added, isn't -- as is sometimes alleged -- sloppy code, but errors in the system design concepts themselves.
That disjointed and unstandardized development environment is unlikely to change, he suggests, and the reason is business imperatives.
"What militates against consistency is the necessity of meeting deadlines," he told TechNewsWorld. "People lose the purity of it and just 'go for it.' My contention is that this problem will always be there. Managers will always press for the cut corner. However, I think that in the area of security particularly, it's probably a bit better, because they know they're sensitive there."
Security Silver Bullet
We can always hope for the "silver bullet" -- the security solution that will radically transform the security landscape -- but we're unlikely to see it anytime soon. However, that doesn't mean there's nothing we can do.
"Configuration management is particularly important," D.K. Matai, executive chairman of mi2g Intelligence Unit, a UK-based security analysis firm, told TechNewsWorld. "Ninety percent of the time, successful hacker attacks take place because configuration management has been incorrect."
Whatever the known vulnerabilities are within a particular operating system, server or third-party application, Matai explained, "the appropriate patches ought to be applied, and the default configurations and services which are running on a particular system ought to be shut off if they are not needed."
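Matai's advice amounts to auditing each machine against a known-good baseline: only approved services running, all required patches applied. A minimal sketch of that idea in Python — the service names, patch identifiers and baseline here are hypothetical examples, not any real product's configuration:

```python
# Configuration-audit sketch: compare what a machine is actually
# running against a known-good baseline. All names are invented.

APPROVED_SERVICES = {"sshd", "httpd"}      # baseline: only these may run
REQUIRED_PATCHES = {"KB-101", "KB-205"}    # patches that must be applied

def audit(running_services, applied_patches):
    """Return (services to shut off, patches still missing)."""
    extra = set(running_services) - APPROVED_SERVICES
    missing = REQUIRED_PATCHES - set(applied_patches)
    return extra, missing

if __name__ == "__main__":
    # A host running two unneeded default services, one patch behind:
    extra, missing = audit(
        running_services={"sshd", "httpd", "telnetd", "ftpd"},
        applied_patches={"KB-101"},
    )
    print("shut off:", sorted(extra))   # the unapproved default services
    print("apply:", sorted(missing))    # the patch not yet installed
```

Real configuration-management tools are far more elaborate, but the principle is the same: the baseline, not the attacker, defines what should be running.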
But Matai also suggests that more draconian approaches may be required.
"I think we are going to see three things in the future," he said. "One is a migration toward higher levels of authentication. This will include biometric authentication coupled with a smartcard or a PIN key that a user carries. And a random list of passwords which are changed more regularly than they currently are.
"Two: The complex data which belongs to a particular user currently resides, by and large, on his computer. In the years ahead, that data is likely to migrate further upstream, and the data vaulting is going to be guaranteed by an organization akin to a bank that will provide high-value data custody services.
"At the third level, it is likely that some type of a 'driving license' regime may be implemented whereby governments and countries worldwide begin to recognize the computer as a weapon. And they will therefore either restrict the level of capability that a general computer has when sold in the marketplace; or require the users of those computers to demonstrate their capability to remain more vigilant in the event that their machine gets sabotaged."
No Dramatic Changes Needed
Not everyone foresees such dramatic changes, however. As with most things, it comes down to a compromise between goals and resources. David Schatsky, senior vice president of research at Jupiter, sees "enlightened enterprises" approaching the problem with what he calls a "multipronged, procedural solution."
"The most sophisticated companies," he told TechNewsWorld, "and the ones that have most at risk, have been developing pretty sound processes over time. Financial institutions, brokerage firms and so forth that manage money or high-value intellectual property are investing substantial sums in these processes, and they're roughly staying ahead of the game. The problem is that not every company has those resources to invest, or the maturity of the processes to get at those levels of security."
It's a conundrum, as Ben Franklin must have realized when he wrote the following line in Poor Richard's Almanack: "He that's secure is not safe."