World With New Limits: The Coming End of Moore's Law
Here in the tech community -- as in so many others -- declaring the birth or death of an era is a tried-and-true path to social fame. For that reason, proclamations to that effect are pretty dang common.
Those of us here in the Linux community are pretty accustomed to such announcements by now -- just witness the never-ending "year of Linux desktop" and "death of desktop Linux" rotation that seems to besiege us year after year.
Yet there's no denying that the bearer of such messages makes a difference. It's one thing to hear the "death of desktop Linux" proclaimed by a longtime Windows fan -- and, of course, to see it defended in turn by Linux Girl, your tireless protector of all that is FOSS.
It's quite another thing, however, to hear the end of an era anticipated by none other than our very own Linus Torvalds.
"On the five- to 10-year timeframe scale, I'm very interested to see how the industry actually reacts to the fact that soon we will come against some physical limits," Torvalds reportedly said at LinuxCon recently. "People used to be talking about having thousands of cores on one die because it keeps shrinking, and those people clearly have no idea about physics because we won't be shrinking for much longer."
In fact, the upcoming decade is going to bring difficult changes, he added.
"That's going to affect us in kernel land because we are the layer between hardware and software," he explained. "What happens when hardware doesn't improve and magically make us faster?"
'The New Frontier of Computing'
Torvalds' musings came as nothing short of a bucket of cold water on more than a few Linux fans, who had been contentedly sleeping off their LinuxCon revelries down at the blogosphere's Broken Windows Lounge.
Once awake, however, they had plenty to say -- not that it's ever otherwise.
"When the question is one of how many transistors you can fit on the head of a pin, there are limits, so at some point we will not be able to fit more onto a given die size," Google+ blogger Kevin O'Brien told Linux Girl, for example. "To my mind that means we move away from brute force approaches to computing.
"There is a huge amount of space not yet explored on how to program more intelligently and get software to be more useful without requiring ever-increasing transistors and CPU cycles," O'Brien explained. "That is the new frontier of computing, and let's face it, a 32-core CPU is not going to help you read your email any faster."
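O'Brien's point that smarter programming can outrun brute force is easy to illustrate with a stand-in example (not from the article): the same hardware computes the same answer vastly faster when the algorithm stops redoing work. Memoizing a naive recursive Fibonacci is the classic case:

```python
# A stand-in illustration of "programming more intelligently":
# the brute-force recursion redoes the same subproblems millions
# of times, while the memoized version computes each value once.
from functools import lru_cache

def fib_brute(n):
    # Exponential-time recursion: ~1.3 million calls for n=30.
    return n if n < 2 else fib_brute(n - 1) + fib_brute(n - 2)

@lru_cache(maxsize=None)
def fib_smart(n):
    # Same logic, but each fib_smart(k) is computed only once.
    return n if n < 2 else fib_smart(n - 1) + fib_smart(n - 2)

print(fib_smart(30))  # 832040
```

No extra transistors or CPU cycles required; the speedup comes entirely from the software side, which is O'Brien's "new frontier."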
'It Could Work in Linux's Favor'
Indeed, "Moore's Law can't last forever," agreed Linux Rants blogger Mike Stone. "It's just not possible. There will come a time when the physics involved just become impossible."
Stone is "not particularly worried about it, though," he told Linux Girl. "It very well could work in Linux's favor. Linux has always been resilient and adaptive -- more so than any other OS I can think of.
"The tech industry has been lazy and has depended on Moore's Law, but once it's gone they're going to have to come up with new and different ways to make their products better and faster," Stone concluded. "I'm sorry to say that I don't know how they're going to do that at this point, but I have all the confidence in the world that they will."
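Stone's physics argument can be made concrete with a back-of-the-envelope calculation. The numbers below are illustrative assumptions (a 22 nm process node as the starting point, roughly 0.2 nm spacing between silicon atoms), not figures from the article:

```python
# Back-of-the-envelope sketch of why feature shrinking must stop:
# halving feature size every ~2 years reaches atomic scale within
# a handful of doublings. All numbers are illustrative assumptions.

feature_nm = 22.0       # assumed process node circa 2013
year = 2013
SILICON_ATOM_NM = 0.2   # rough spacing between silicon atoms

while feature_nm > SILICON_ATOM_NM:
    feature_nm /= 2     # one Moore's Law shrink
    year += 2

print(year)  # the decade when features would hit atomic scale
```

However approximate the inputs, the conclusion is robust: only a handful of doublings separate current processes from individual atoms, which is exactly the five-to-ten-year horizon Torvalds pointed to.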
'We Will Barely Notice'
It will happen eventually, "but I suspect that when it does, we will barely notice the difference," suggested consultant and Slashdot blogger Gerhard Mack. "We already have enough CPU power for most daily tasks, and most of what we consider 'slowness' tends to be time spent transferring data either from the local drive or the network.
"Linux will do fine, although I suspect we will see fewer hardware changes needing driver updates," Mack added.
"Nothing can grow forever," Google+ blogger Gonzalo Velasco C. pointed out. "Chip space (square nanometers) is also finite. So, I guess things are going to be the way they always have been."
"At one point RISC processors were 'dead,'" he pointed out. "Now, with ARM, they are very much alive and kicking very nicely!
"Perhaps bigger RISC processors (like ARM for desktops) might come to life," concluded Gonzalo Velasco C., "powering the PCs in a new era. Who knows? Perhaps not Moore, but Isaac Asimov or Jules Verne ;-)"
'Maturity Over Time'
"It is true that integrated circuits have miniaturized to the point where we are nearing real, inherent limits and can't go much smaller," Travers explained. "However, what will this mean for Linux? Not much.
"The fact of the matter is that we are seeing lower-end chips being developed for all sorts of tasks, and so the computing market is diversifying quite a bit," he pointed out. "An end to Moore's Law will bring about maturity over time and with it stability. I don't see how that would be a bad thing."
The end of Moore's Law would "spell much larger problems for proprietary software vendors who couldn't count on new hardware to sell software upgrades (which has been current practice for a long time)," Travers added. "A mature hardware market would really favor open source software at all levels. Not only would incremental improvements be more important than ever, but so would the fact that one isn't, essentially, paying for the privilege of having someone else tell you what you can't do with the software."
'That Will Be Good Enough'
To wit: "Moore's Law has been a boon to Wintel, helping to keep costs going down and performance going up to hide the bloat," blogger Robert Pogson concurred. "GNU/Linux has always been completely configurable by the user to permit adapting to any hardware. I've never had any trouble getting GNU/Linux to run on old hardware unless the hardware was more than 10 years old."
In the future, "Torvalds is essentially saying that all hardware will perform like last decade's hardware," Pogson added. "That will be good enough, and where anyone needs more power, they can use multiple cores or multiple sockets. It's all good.
"GNU/Linux is a networked OS," he concluded, "and one computer or a million of them can work together to get the job done."
'ARM Doesn't Scale Very Well'
Linux developers will "have to do just as MSFT did and stop counting on ever bigger chips allowing them to increase bloat," Slashdot blogger hairyfeet said. "Personally, I'd say that is a VERY good thing.
"I mean, look at how much faster x86 chips are now compared to, say, 2002," hairyfeet explained. "Does your x86 system FEEL faster than the one you had in 2002? Not really, and the reason for that is how much bloat the devs have been adding to the system."
However, what is going to "bring a screeching halt to mobile advancement is NOT Moore's Law," he asserted. Rather, "it's the simple fact that ARM doesn't scale very well at all, and the batteries haven't kept up. You look at the benches, and the latest and greatest ARM chips have BARELY passed the nearly 9-year-old Pentium 4, and even then only on certain benchmarks."
In fact, "that comparison gets MUCH worse when you look at more modern chips," hairyfeet concluded.
'Tech Life Will Go On'
It doesn't actually matter whether Moore's Law comes to an end or not -- "at some point, more power simply becomes irrelevant," opined Robin Lim, a lawyer and blogger on Mobile Raptor. "How many people do you know who really must have Intel's latest Core i7, or even an i5?
"We are still in a smartphone and tablet craze, so people are now again very cognizant of processing power," Lim explained. "In time, smartphones and tablets will be like laptops: You will pick one that suits you and be less and less interested in the latest and greatest."
It won't, however, "have any particular effect on Linux development," Lim suggested. "Linux developers will build for whatever hardware is available. Tech life will go on."
Meanwhile, "what we should be happy about is that one day all this technology will be as ubiquitous as running water and electricity," he concluded. "Almost everyone will have it. The world is already a much smaller place, information is more pervasive, and the level of convenience all this has brought to our lives cannot be quantified."