TechNewsWorld.com

Study: Data Center Power Usage Exploding

By Keith Regan, E-Commerce Times, ECT News Network
Feb 15, 2007 3:01 PM PT

Data centers in the United States draw about 5 million kilowatts of power, an amount equal to the power consumption of the entire state of Mississippi, according to a report released Thursday.


The findings are a warning to decision makers to increase the efficiency of servers and microprocessors, maintains chipmaker Advanced Micro Devices, which commissioned the study. Corporate Vice President Randy Allen highlighted the research results in a keynote address delivered at the LinuxWorld OpenSolutions conference in New York City.

The study is the first to accurately calculate the amount of energy data centers consume each year, Allen said. Keeping them up and running requires the full annual output of five 1,000-megawatt power plants -- each the size of a major nuclear or coal-fired plant -- he noted.
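
The equivalence is easy to cross-check: five 1,000-megawatt plants running at full output for a year produce roughly the 45 billion kilowatt-hours the study attributes to U.S. data centers. A minimal sketch of the arithmetic (the 100 percent capacity factor is a simplifying assumption, not something the study states):

```python
# Cross-check the five-power-plants comparison against the study's
# annual energy figure, assuming the plants run at full output all year.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

plants = 5
capacity_kw_per_plant = 1_000_000  # 1,000 MW = 1 million kW

annual_output_kwh = plants * capacity_kw_per_plant * HOURS_PER_YEAR
print(f"{annual_output_kwh / 1e9:.1f} billion kWh")  # ~43.8 billion kWh
```

That 43.8 billion kWh lands within a few percent of the study's 45 billion kWh figure; since real plants run below full capacity, the five-plant comparison is an approximation.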

"We have long known that data centers worldwide consume a significant amount of energy," Allen acknowledged. Still, the findings are "a wake-up call not just for the information technology industry, but also for global business, government and policy leaders."

Unchecked Energy Demand

Data centers and related infrastructure consumed about 1.2 percent of total U.S. electricity in 2005, or about the same amount as all of the color televisions in use across the country, the report states.
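
That 1.2 percent share, taken together with the 45 billion kilowatt-hour consumption figure reported later in the article, implies a total U.S. electricity demand of roughly 3.75 trillion kWh for 2005 -- a quick derivation (the total is computed here, not quoted from the study):

```python
# Implied total U.S. electricity demand from the study's share figure.
datacenter_kwh = 45e9   # U.S. data centers, 2005 (from the study)
share = 0.012           # data centers' 1.2 percent share of U.S. demand

total_us_kwh = datacenter_kwh / share
print(f"{total_us_kwh / 1e12:.2f} trillion kWh")  # 3.75 trillion kWh
```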

"This study demonstrates that unchecked demand for data center energy use can constrain growth and present real business challenges," Allen added.

The good news comes in the form of next-generation servers that consume less energy and produce less heat, requiring less energy for air conditioning to keep data centers cool and functional, Allen argued.

AMD has been on the cutting edge of building microprocessors for less energy-hungry servers. However, nearly every major technology company now promotes the power efficiency of its equipment.

Software vendors, too, have recognized the potential benefits of going "green." There are products, for instance, that can turn off computers when they are not being used.

Quantifying the Problem

In 2005, total data center electricity consumption was 45 billion kilowatt-hours, or about US$2.7 billion worth of electricity, according to the study, conducted by Stanford University professor and Lawrence Berkeley National Laboratory scientist Jonathan Koomey. Globally, the bill came to $7.2 billion.
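
The two U.S. numbers together imply the average electricity price the study used, which can be recovered directly:

```python
# Implied average electricity price from the study's U.S. figures.
us_consumption_kwh = 45e9   # 45 billion kWh in 2005
us_cost_usd = 2.7e9         # US$2.7 billion

price_per_kwh = us_cost_usd / us_consumption_kwh
print(f"${price_per_kwh:.2f}/kWh")  # $0.06/kWh
```

A rate of about 6 cents per kilowatt-hour is roughly consistent with U.S. industrial electricity prices of that era.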

A number of factors, including server sales reports from IDC and published power-demand ratings of hardware used in the data centers, formed the foundation for the research.

The demand for power by data centers has doubled since 2000 due to an explosion of Internet use and other factors, the study notes.

Some 5.6 million servers were in use in U.S. data centers in 2003; by 2005, that number had grown to 10.3 million. Most of the increase came from relatively inexpensive, low-end servers, many of which host Web services such as VoIP (Voice over Internet Protocol) and file- and video-sharing sites.
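
Those counts, and the doubling of power demand since 2000, translate into steep compound growth rates. The rates below are derived from the study's figures, not quoted from the report:

```python
# Growth rates implied by the study's figures.
servers_2003 = 5.6e6
servers_2005 = 10.3e6

server_cagr = (servers_2005 / servers_2003) ** (1 / 2) - 1
print(f"servers: {server_cagr:.1%}/yr")  # ~35.6% per year

# Power demand that "doubled since 2000" implies this rate over 2000-2005:
power_cagr = 2 ** (1 / 5) - 1
print(f"power: {power_cagr:.1%}/yr")  # ~14.9% per year
```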

The high-tech industry should not only support, but also take the lead on work being done by the Environmental Protection Agency (EPA) and the Department of Energy (DOE), in order to help identify ways to cut energy consumption related to information technology, Allen said.

In addition, he called for an annual report on data center energy efficiency that measures progress toward higher efficiency and provides measurable standards for businesses.

AMD plans to share more of the study at a workshop organized by the EPA and Lawrence Berkeley Labs on Friday, Feb. 16, at the Santa Clara (Calif.) Convention Center.

Good News, Bad News

While the report is a sobering reminder of the energy demanded by an economy increasingly reliant on the Internet for shopping, news and entertainment, the sheer size of the energy demand of servers also represents opportunities.

Green energy investments are up significantly -- many of the venture capitalists who backed dot-com startups a decade ago are now funding early-stage firms developing alternative energy technology and more power-efficient computer hardware.

AMD has clearly decided that power efficiency is an area in which it can compete effectively against Intel and other chipmakers, Mercury Research President Dean McCarron told the E-Commerce Times.

"With the gaps in pure performance narrowing and energy prices rising, power consumption has the potential to become a key differentiator going forward," he said.

Many data centers are not equipped to handle the glut of new servers that have been installed in recent years, Gartner analyst Michael Bell said.

Within two years, half of all data centers will lack the power and cooling capacity to handle their installed servers, according to a Gartner survey conducted last year. That alone will likely force many companies to invest in significant upgrades.

