Cybersecurity and privacy threats aren’t confined to the tech world. They’ve cast their pall on the world in general. Computer viruses, malware and data leaks have become commonplace, personal privacy has become a bad joke, and cyberwar looms like a virtual mushroom cloud.
What sometimes gets lost in the gloom are the many ways security professionals have been working to shore up cyberdefenses and rebuild some semblance of personal privacy. ECT News Network’s roundtable of technology insiders recently discussed some of the progress in cybersecurity and privacy protection, and while they may not have settled upon any overarching solutions, they did identify several rays of hope.
Taking part in the conversation were Laura DiDio, principal at ITIC; Rob Enderle, principal analyst at the Enderle Group; Ed Moyle, partner at SecurityCurve; Denis Pombriant, managing principal at the Beagle Research Group; and Jonathan Terrasi, a tech journalist who focuses on computer security, encryption, open source, politics and current affairs.
Advances in deep learning and other technologies offer some hope for identifying and eradicating cybersecurity risks before they can do irreparable damage, but most of the remedies our panel mentioned involve human behavior.
The main building blocks for shoring up the cybersecurity walls are increasing competence, adjusting priorities, working together, establishing accountability and taking government action, they said.
Closing the Skills Gap
Security professionals should look inward for the best opportunity to make sweeping improvements, suggested Moyle, who advocates establishing a license to practice, similar to a medical license.
“This is controversial and arguably wouldn’t help the skill shortage,” he acknowledged. “That said, a good 70 percent of those in the profession should be doing something else due to a fundamental lack of skill and/or willingness to stay current.”
Though she didn’t specifically call for a licensing requirement, DiDio said that “corporations need to get the appropriate level of security training for their IT and security administrators, do vulnerability testing at least once a year, and stay up-to-date on all software and patches.”
Training probably shouldn’t be limited to building a more qualified army of cybersecurity professionals, however.
The one thing that could have the most dramatic positive impact on cybersecurity overall is end user training, according to Enderle.
“Users are still the most likely cause of a breach,” he pointed out.
“At the end of the day, end users themselves constitute the biggest threat and undermine security more than the hackers,” DiDio agreed.
“Companies need to provide security awareness training for their end users to make them aware of the latest email phishing scams, CEO fraud, malware, ransomware, and viruses that are making the rounds,” she said. “You have to change the attitudes and the mindsets of people so that they think before they click on a potentially bad link.”
Few would argue against education and training — yet many companies are not making serious investments for their staffs or their end users. There’s still a tendency to acknowledge the problem in a general way but to lag on taking specific defensive measures, which can be a strain on resources.
Security training languishes on many back burners because organizations — or individuals — haven’t experienced the repercussions of a cyberattack firsthand. Or they have been victimized, but they don’t yet know it.
“People have to stop thinking, ‘This won’t happen to me.’ You cannot practice security in 20/20 hindsight,” warned DiDio.
“The scariest thing is that most organizations and individuals have no clue that they’ve been hacked until disaster strikes — for example, the hacker is demanding a ransom and the organization is locked out of its servers. Or the individual user’s personal information has been compromised and data is lost, stolen or destroyed,” she said.
Moyle drew a sharp analogy between cyberhealth and personal health.
“It’s like asking someone the best way to avoid heart disease. There’s an answer to this that’s not rocket science. People don’t want to hear it though,” he said.
The answer, of course, is to focus on “diet, exercise, not smoking or drinking, keeping stress low, minimizing caffeine, etc.,” Moyle continued.
“They already know this. It’s the execution part where people fall down because they consciously elect to do otherwise,” he said. “This isn’t an indictment by the way — people evaluate and decide that the threat isn’t worth adaptation or change to their lifestyle. For the record, I do it too.”
This is where a deeper awareness of the consequences of inaction comes into play.
“When I choose to make unhealthy lifestyle choices, I impact my own health — increasing my susceptibility to heart disease, for example — but it really doesn’t impact anybody else,” Moyle noted.
“That’s true in security too, to the extent that I’m making decisions that impact me alone — for example, the security of my personal computer and data. The problem that we run into is that the tradeoff in many cases is sacrifice on the part of one party that benefits the security of another,” he pointed out.
“Ed’s characterization of the problem as patients not following their security prescription is spot-on,” agreed Terrasi.
“Security and convenience exist in a trade-off relationship, meaning that true security will make the day-to-day operation of a company more complicated,” he said.
“If I’m a company with an online presence, the security decisions I make impact me to a degree, but even in the worst case — say, a large-scale breach — the long-term consequences aren’t terribly severe to me. They’re terrible for someone else,” Moyle said.
To illustrate his point, Moyle pointed to the TJX and Sony attacks a few years back, which appeared to severely damage the companies. A study on the effects of those attacks found “a negative and statistically significant impact of data breaches on a company’s market value on the announcement day for the breach. The cumulative effect increases in magnitudes over the day following the breach announcement, but then decreases and loses statistical significance.”
In other words, from the companies’ perspectives, the attacks amounted to “short term major drama but not that big a deal for them over the long term,” Moyle said. “The impact is instead to the customers, who tend to forgive the company but have their overall risk increased — in some cases, significantly — over a much longer time horizon.”
“Effective information security habits are known, so why are companies not putting them into practice? The answer is that the alternatives aren’t as profitable,” said Terrasi.
“The cost in lost profit and class-action settlements is simply less than the cost of security measures to not get hacked in the first place. Economics calls this ‘externalizing costs,’ and it is the same dynamic that leads companies to pollute. It is easier to dump chemicals in a river than carefully transport them to a disposal site.”
Get Out the Baseball Bat
If companies are willing to accept tradeoffs that hurt others but do little damage to their own interests, what then?
“The remedy to companies’ willful refusal to implement adequate information security is the same one we used to remedy their refusal to dispose of toxic waste, which is government regulation,” said Terrasi.
“Once the fines get higher than the cost of being hacked, companies will very quickly find ways to not get hacked,” he added.
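The economic logic Terrasi describes can be made concrete with a back-of-the-envelope expected-cost comparison. All figures below are hypothetical, chosen only to illustrate how fines shift the calculation:

```python
def expected_breach_cost(p_breach: float, breach_loss: float, fine: float) -> float:
    """Expected annual cost of skimping on security: the probability of a
    breach times the direct loss (settlements, cleanup) plus any regulatory fine."""
    return p_breach * (breach_loss + fine)

security_budget = 2_000_000   # hypothetical annual cost of adequate security
p, loss = 0.10, 5_000_000     # hypothetical breach odds and direct loss

# Without meaningful fines, skimping looks "rational": 500K expected loss vs. a 2M budget.
print(expected_breach_cost(p, loss, fine=0) < security_budget)           # True
# A large fine pushes the expected cost (3M) above the security budget.
print(expected_breach_cost(p, loss, fine=25_000_000) < security_budget)  # False
```

The point is not the specific numbers but the structure: until the fine term makes the expected cost of a breach exceed the cost of prevention, the externalized-cost incentive remains.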
“We don’t see too many companies that can get away with knowingly providing products that kill their customers — and when they do, what happens to them? Knowing the consequences for such an action, they take steps to avoid it,” said Moyle.
“For example, every bottle of aspirin sold nowadays has a tamper-evident seal. Why? Because someone tampered with some in the 80s and people freaked out enough that the providers of that prioritized the countermeasure enough that it can’t happen again,” he recalled.
“If customers responded like that to a breach — one where the company in question either should or did know better — I can guarantee you we’d have fewer security problems now,” Moyle added.
“I’d add that we need to move from just defense to a far more aggressive offense,” said Enderle.
“If we can focus our own attack efforts on those doing the attacks and more aggressively hunt down the attackers and destroy the economics of the malware industry, we could have a sustained change,” he suggested.
“We also should communicate that every piece of hardware, software, email you buy or create is a security choice, and hold people accountable for the related damage if they choose badly,” Enderle added. “We likely could make real progress.”
Call In the Feds
It may be that user education, professional training, and meaningful accountability won’t be enough to make a significant positive impact on cybersecurity without government intervention.
The government needs to take the threat seriously and fund an adequate response, Enderle said.
“There are some scary scenarios that have been shared that indicate a significant hit on the grid alone that lasted 60 days would kill 75 percent of us. The government isn’t taking this seriously enough,” he emphasized.
“We need an international treaty along the lines of the Geneva Conventions on War or the Kellogg-Briand Pact outlawing war as an instrument of foreign policy,” suggested Pombriant.
“Technology alone will always fail us. We need the majority of the planet to not feel disadvantaged by not participating in a cyberwar. Hence the need for a treaty — and the sooner the better,” he said.
“There is no one best thing, no shortcut or silver bullet that can improve cybersecurity dramatically. It’s got to be a concerted effort undertaken by end users, corporate enterprises, vendors and regulators working in concert,” said DiDio.
Developers have a role to play too, noted Terrasi.
“The two hacks that rippled the furthest outside of information security circles and made a splash in the public consciousness — Spectre/Meltdown and WannaCry — were both illustrative of some of the most significant challenges facing security professionals right now,” he pointed out.
“Both attacks show how severe vulnerabilities can accrue generationally if the proper care isn’t taken when software is initially designed and when software developers are too hasty to turn their attention to the next project,” Terrasi said.
“This is especially true with Spectre/Meltdown, where CPU speedup tricks were found to be easily gamed to undermine everything operating above the hardware level — which is basically everything,” he continued.
“Tech professionals have a handle on their tier in the layers of abstraction — app developers understand app bugs, kernel developers understand kernel bugs — but they have still not devised a reliable model for how to mitigate knock-on effects to the layers beyond those in which they operate,” Terrasi pointed out.
“The WannaCry attack demonstrates a similar dynamic, but between upstream and downstream instead of abstraction layers,” he observed.
“It’s on Microsoft, for example, to work with customers who may have a legitimate reason for using Windows XP, such as hospitals using medical devices that languished in the regulatory approval process as XP sped toward its end-of-life, and not leave them high and dry,” Terrasi argued.
“The industry will not move to the next level until each player learns to better address the needs of partners and seek their input,” he said.
“The fact is, we live in an interconnected society and that makes security ever more challenging,” noted DiDio. “Everyone from the CEO down to the end user has to take cybersecurity seriously.”
Privacy is dead, our panelists seemed to agree. Where their opinions diverged, to an extent, was in whether it might be reincarnated or whether its loss actually matters very much.
Whether they perceived the goal as dealing with the new privacy normal or making a heroic effort to reverse the tide, each of our panelists offered concrete suggestions for addressing the situation. Among them are building public awareness, pushing for regulation, and implementing new technological solutions.
Getting a Grip
“Restoring privacy once it has been compromised is like trying to restore a herd of cows once they have been eaten. I think the privacy horse has left the barn, moved to Mexico, and burned his ID,” quipped Enderle.
“It’s too late and it’s unnecessary,” said Pombriant.
“We don’t need privacy per se — we need conventions about how to ethically use the data that’s available. Right now, we live in the Wild West of technology, and we need civilizing influences,” he added.
“The better goal would be to ensure that information isn’t misused at this point,” Enderle agreed.
That requires “making people far more aware than they are about the related risks,” he added.
“The only thing that would work is for people to get sick of having their privacy compromised. This will drive regulation, which will in turn force companies to respect privacy of data over the commercial utility of the data,” Moyle said.
“You have to start by placing limitations on selling consumer information and that is very, very difficult. For starters, I’d like to see the government curtail robocalls!” said DiDio.
“In spite of noble efforts by some forward-looking states, there is no meaningful regulation at the federal level on how to protect privacy,” Terrasi pointed out.
“This is not primarily intentional or in service of surveillance, but more attributed to lack of expertise and lack of initiative,” he said.
“Congress has long since abolished its scientific and technological advisory body, and when they last had a chance to revive it, they voted not to,” Terrasi pointed out.
“Technology moves too fast for many consumers and users. Congress is an increment slower than them, and even when they do take it upon themselves to contemplate action, they are not in a position to make informed legislation, leaving them to either abandon the legislation or let industry lobbyists write it,” he explained.
Turning to Tech
“If we want to return the agency of one’s own digital privacy back to the users, the industry has to be willing to rethink some protocols and standards, and government at a high level has to keep up with technical advances and solicit outside, nonpartisan expertise,” Terrasi said.
Privacy-compromising architecture is entrenched from a technical perspective, he noted.
“IP addresses are geolocatable, DNS lookups aren’t encrypted — though Google and Mozilla are working to change this. Metadata lives in headers that can’t be encrypted, and cellphones trust any tower they can connect through and ping them constantly,” Terrasi pointed out.
“This is all the case not because any of the designers were negligent, per se, but because they never anticipated today’s use cases,” he explained.
“The Internet, the Web, email, and cellular communication all had meager initial adoption by mostly technical experts whose needs these technologies met without the need for much sophistication. In a time when websites were static and didn’t deal in commerce, what need would an engineer think there would be for encryption?” Terrasi asked.
“When Amazon and online banking come along, encryption becomes life or death for those services,” he continued.
“The growth of the Internet and other digital technologies has hinged in part on how well fixes can be bolted on after-the-fact, but after a point, these don’t cut it anymore. You can fill potholes at first, but eventually you have to tear up the asphalt and repave the street. We’re at the repaving stage for protocols that inherently respect privacy,” Terrasi said.
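Terrasi’s point that DNS lookups travel in cleartext is easy to verify: a standard DNS query carries the hostname as readable bytes on the wire. The sketch below builds a minimal query in the RFC 1035 wire format (no network access needed) and shows the hostname sitting in the packet in plaintext:

```python
import struct

def build_dns_query(hostname: str) -> bytes:
    """Build a minimal DNS A-record query in RFC 1035 wire format."""
    header = struct.pack(">HHHHHH",
                         0x1234,  # transaction ID (arbitrary)
                         0x0100,  # flags: standard query, recursion desired
                         1, 0, 0, 0)  # 1 question; no answer/authority/additional records
    # QNAME: each label prefixed by its length byte, terminated by a zero byte
    qname = b"".join(bytes([len(label)]) + label.encode("ascii")
                     for label in hostname.split("."))
    question = qname + b"\x00" + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("example.com")
# The hostname is readable in the cleartext packet: anyone on the network
# path (an ISP, a hotspot operator) can see which sites you look up.
print(b"example" in packet)  # True
```

Encrypted alternatives such as DNS over HTTPS — the effort Terrasi attributes to Google and Mozilla — wrap exactly this kind of packet inside a TLS connection so the hostname is no longer visible in transit.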
“We encrypt very little of the data we use, and I can see a time coming when encryption is standard. Oracle already provides encryption in its cloud, and that’s something we need to get after,” suggested Pombriant.
“I’d add we need to stop doing stupid things like trying to force vendors into creating and supplying law enforcement with encryption keys. These organizations’ own security isn’t absolute, and a stolen key would be potentially more catastrophic than most of the crimes they are trying to mitigate,” Enderle pointed out.
“The problem isn’t so much that data isn’t encrypted. The majority of Web traffic is now encrypted, which wasn’t true only a couple of years ago,” Terrasi noted.
The problem is that “it isn’t kept safe once it has reached whoever collects it. Most of the time, data is encrypted when it moves over the wire, but architecturally data has to be decrypted to be used, and it is used all the time,” he said.
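One partial mitigation for the at-rest problem Terrasi describes is to avoid storing readable data at all when it only ever needs to be matched, not read back. A common pattern is to keep a keyed hash of an identifier instead of the identifier itself. The sketch below uses Python’s standard library; the key handling is deliberately simplified, and the key value is hypothetical:

```python
import hashlib
import hmac

SERVER_KEY = b"hypothetical-secret-key"  # in practice, held in a secrets manager

def pseudonymize(identifier: str) -> str:
    """Store a keyed hash instead of the raw identifier. The stored value can
    still be matched against future lookups, but a leak of the database alone
    does not reveal the original data."""
    return hmac.new(SERVER_KEY, identifier.encode(), hashlib.sha256).hexdigest()

stored = pseudonymize("alice@example.com")
# Matching a later lookup works without the address ever being stored.
print(pseudonymize("alice@example.com") == stored)  # True
print("alice" in stored)                            # False: nothing readable leaks
```

This only helps for data that is compared rather than displayed or processed, which is precisely Terrasi’s architectural point: data that must be *used* must, at some layer, be decrypted.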
“The privacy of most people living today is irrevocably lost, but that doesn’t mean we shouldn’t try to build new architectures and protocols to protect the privacy of those who come after us — or of our future selves, as we change over time and old data about us becomes stale,” Terrasi argued.
“If we impose requirements that force companies to be transparent with users about what data they collect, and build new Internet software infrastructure — think protocols — that build in more privacy by default,” he suggested, “we can set ourselves on a better footing going forward.”