Symantec Source Code Scattered to the Winds
Hackers have posted the source code for two Symantec security products, claiming they obtained the information from systems belonging to Indian military intelligence. The products affected are four and five years old, Symantec said. "If the source code from products released in the past three or four years was compromised, I'd be pretty concerned," said security consultant Randy Abrams.
Jan 9, 2012 6:00 AM PT
Source code for two security applications from Symantec has been stolen and posted on the Web. The hackers claiming responsibility, who call themselves the "Lords of Dharmaraja," say they obtained code for the Norton Antivirus application.
However, it appears they actually obtained code for two older enterprise products, Symantec Endpoint Protection 11 (SEP 11) and Symantec AntiVirus Corporate Edition (SAV) 10.2, Symantec spokesperson Cris Paden told TechNewsWorld.
The code is "four and five years old," and SEP 11 has since evolved into SEP 12.0 and 12.1, while SAV 10.2 has been discontinued, although it's still being serviced by Symantec, Paden said.
"Presently, we have no indication that the code disclosure impacts the functionality or security of Symantec's solutions," Paden remarked. "Furthermore, there are no indications that customer information has been impacted or exposed at this time."
The hackers are from "a local chapter of Anonymous" in India, Paden claimed.
What Was Stolen, What Was Lost
Initially, there was some confusion as to what had happened.
The thieves' initial announcement, made last Wednesday, involved only documentation from 1999 describing how Norton Antivirus worked, not the application's source code as claimed, Symantec found.
On Thursday, the hackers announced they possessed additional information, and this time Symantec found they did have actual code, for SEP 11 and SAV 10.2. The hackers claimed they obtained the code after cracking servers belonging to Indian military intelligence.
"We are still gathering information on the details and are not in a position to provide specifics on the third party involved," Paden said, adding that Symantec's own network was not accessed.
Fallout From the Hack
How safe are users in the aftermath of the hack?
"I wouldn't worry too much about the stuff from 1999, but if the source code from products released in the past three or four years was compromised, I'd be pretty concerned," Randy Abrams, an independent security consultant, told TechNewsWorld.
On the other hand, "Given how fast threats have been changing over the last few years, even a product a year old may be dramatically different from the shipping product," Rob Enderle, principal analyst at the Enderle Group, pointed out.
Time for Show and Tell
Businesses may need to inspect their partners' IT security standards and networks more closely in the future.
"It's not enough to ensure you follow best practices," Mike Lloyd, chief technology officer at RedSeal Networks, told TechNewsWorld. "In an interconnected world, you have to worry about the security of other organizations."
Running checks on business partners' security setups and networks "has been the standard advice since the big IBM trial in the late '80s," Enderle pointed out.
IBM had built a so-called impenetrable system and hired a hacker to test it. "He broke in within a few hours by going in through a trusted partner," Enderle told TechNewsWorld.
How About a Baseline?
However, companies may not be able to audit their business partners' systems as thoroughly as they'd like to.
"Your business partners and strategic customers may be friendly, but they are not going to expose specifics to you about how well they protect themselves," RedSeal's Lloyd pointed out.
Further, the legality of checking a partner's network security "depends upon a number of factors, including the physical location of both parties, consent arrangements, and the actual testing being done," Abrams stated. "When it comes to dealing with a government, sometimes you play by their rules or you don't play in the country at all."
One candidate for such a baseline is a STIG, or security technical implementation guide, a methodology for the standardized, secure installation and maintenance of computer software and hardware. The term was coined by the Defense Information Systems Agency (DISA).
"Anyone who faces risk due to assets in someone else's control needs to establish a yardstick that the outside entity can use to show they have taken due care," Lloyd said. The yardstick needs to be quantifiable objectively, must maintain some privacy for the organization being studied, and "must actually measure security posture, not just busy-ness," Lloyd concluded.
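As a loose illustration of Lloyd's "yardstick" idea, a compliance check can be expressed as a set of named pass/fail rules scored against a host's configuration, yielding an objective number without exposing the raw settings. The rule names and thresholds below are purely hypothetical and are not drawn from any actual STIG:

```python
# Hypothetical sketch: scoring a host against a baseline of
# STIG-style rules. Each rule is a named check that either
# passes or fails; only the tally is shared with outsiders.

baseline = {
    "password_min_length": lambda cfg: cfg.get("password_min_length", 0) >= 12,
    "ssh_root_login_disabled": lambda cfg: cfg.get("ssh_root_login") == "no",
    "audit_logging_enabled": lambda cfg: cfg.get("audit_logging") is True,
}

def score(cfg):
    """Return (passed, total): a quantifiable result that can be
    reported to a partner without revealing the configuration itself."""
    results = [check(cfg) for check in baseline.values()]
    return sum(results), len(results)

host = {"password_min_length": 14, "ssh_root_login": "yes", "audit_logging": True}
passed, total = score(host)
print(f"{passed}/{total} checks passed")  # 2/3 checks passed
```

A summary score like this satisfies two of Lloyd's criteria at once: it is objectively quantifiable, and it preserves some privacy for the organization being measured.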