Interoperability is fast becoming a key watchword in business computing circles. Open source products continue to gain enterprise acceptance. With that acceptance comes an increase in users who demand that data produced with one application work with data produced by another application — or even on another operating system.
Open source advocates want maximum interoperability, which allows them to use whatever software they choose. For instance, a company using an open source business application expects the files it creates and works with to be compatible with Microsoft’s market-dominating products, and vice versa.
Interoperability, however, could expose computer systems to increased security risks. Does that mean, then, that open source users have to choose interoperability over security? Will accessing data produced with a Microsoft application automatically expose users of non-Microsoft products to the same vulnerabilities that plague Redmond’s wares?
“This is a decades-long debate. More transparency can breed security risks. This condition can be valid in some cases,” Dirk Morris, CTO of security provider firm Untangle, told LinuxInsider.
Interoperability doesn’t necessarily doom users to security problems. The level of risk depends greatly on the products used on both sides of the application and platform combination.
“Usually in a closed community, such as a single platform, it is easier for people to unknowingly spread a virus. If the environment is mixed, this becomes less critical,” Fred Pinkett, vice president of product management for Core Security, told LinuxInsider.
Using a well-patched version of a program and doing adequate penetration testing often play bigger roles in terms of how exposed one is to vulnerabilities, he suggested.
Weighing the Odds
That’s a point often echoed by security experts and product developers alike. Security and interoperability are not a one-size-fits-all situation.
“By itself, interoperability is not more or less secure. The problem is when customers want to run different applications together,” Dominic Sartorio, president of The Open Solutions Alliance, told LinuxInsider.
If the open source community were to set more common standards, it could lessen concerns about interoperability issues. In the meantime, code writers have to pay more attention to security in individual applications, he suggested.
The ‘More Eyes’ Debate
Critics of open source software may point to its wide-open, no-secrets nature and call the model inherently less secure.
The counter-argument preaches that open source applications are more secure because the open code lets more eyes look for problems.
“It is easier for the bad guys to make a problem with open source. But it is also easier for the good guys to see it,” said Pinkett.
But, he readily admits, discussion of this issue rapidly erodes into a religious argument. It winds up amounting to little more than one’s personal preference as to which view of interoperability risks holds more credence, he said.
The prevailing view is that the impact of interoperability on security depends on how users mix and match their data and applications. No one scenario can determine absolute safety or high risk.
“It is not a matter of open source versus closed source. It is more a matter of what you do with it,” Pinkett said.
For instance, it’s a foregone conclusion that widely used applications have more exposure to exploits, according to Sartorio. The security rating of any program is based on how many vulnerabilities it has, regardless of whether it is open or closed.
“Open source is no more and no less susceptible to software security vulnerabilities than closed source. More eyeballs doesn’t necessarily imply any secure code reviews. There is no guarantee, just as there is no guarantee in the closed source case,” Reed Auglier, operations director at Security Innovation, told LinuxInsider.
In both cases, a secure software development model is required to prevent malicious code introduction, he said.
Data shared among applications and programs running on different operating systems will continue to coexist. That’s just the direction the industry is headed. Precautions are needed, however, to ensure the security risks are minimal.
“It is when people start applying bubble gum and baling wire to make things work together that things get worse,” warned Sartorio.
A major security issue with shared data and programs involves the front end. The trouble comes when people wire together separate components without a single log-in interface at the front end, Sartorio cautioned.
“One solution is a common user log-in that covers all mixed components. Security management is more effective when there is only one user name and password,” he explained.
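Sartorio’s point can be sketched in miniature: components wired together behind one front end can all honor the same signed log-in token, so there is a single credential to manage rather than one per component. The names, the shared-secret scheme, and the HMAC signing below are illustrative assumptions for this sketch, not any vendor’s implementation.

```python
import hashlib
import hmac

# Hypothetical secret shared by every component behind the single
# log-in front end (an assumption for illustration only -- real
# deployments would use a proper key-management or SSO system).
SHARED_SECRET = b"example-secret"

def issue_token(username: str) -> str:
    """Sign the user name once at log-in; every component trusts this token."""
    signature = hmac.new(SHARED_SECRET, username.encode(), hashlib.sha256).hexdigest()
    return f"{username}:{signature}"

def verify_token(token: str) -> bool:
    """Any component -- open source or closed -- runs the same check."""
    username, _, signature = token.partition(":")
    expected = hmac.new(SHARED_SECRET, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

token = issue_token("alice")
print(verify_token(token))           # True: valid token accepted everywhere
print(verify_token("alice:forged"))  # False: a forged token is rejected
```

The design choice mirrors the quote: with one user name and one token format, security management happens in one place instead of being re-implemented, inconsistently, in each wired-together component.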
Enterprises involved with interoperable applications and data should perform an exhaustive risk analysis as part of their due diligence for any software they install. This includes both open source and closed source applications, according to Auglier.
He also urged users to avoid the “everybody is using it” trap. Part of risk management is deciding when and where to apply the resources available, he explained. Often, such risk analysis tends to get skipped whenever the “many others are using this software so we are all in the same boat” argument comes up, he said.
The situation does not change with in-house software. Even when companies develop their own closed source software through outsourcing, there are security risks that must be considered.
“In all cases, secure development processes should be used, and the user of the software must have insight into the management of the secure development processes so that they can buy off on it,” he said.