Reducing the Massive Energy Appetite of Data Centers

Scientists and engineers are exploring many strategies to cut the energy that powers modern computing.
Image credit: Tony Webster via flickr (CC BY 2.0)

Charles Q. Choi, Contributor

(Inside Science) -- The data centers of Google, Facebook and other major online giants consume vast amounts of electricity. Now scientists are devising new chips and other strategies to help data centers save energy.

According to the U.S. Department of Energy, U.S. data centers, which are giant warehouses packed with computer servers, accounted for roughly one-fiftieth of total U.S. electricity use in 2014, equivalent to the energy consumption of 6.4 million average U.S. homes. As demand for online services grows, data center energy consumption is also expected to rise.

Roughly 40 percent of the energy that data centers consume is used to cool their computers. One tactic Google is pursuing to cool its data centers is to pump seawater into them, while Microsoft is experimenting with building data centers underwater.

Another strategy that researchers are investigating to reduce data center energy consumption is to make them more efficient. For instance, Massachusetts Institute of Technology computer scientist Arvind and his colleagues now find that changes to the "caches" that data centers use to store the results of common queries can slash cache power consumption by a factor of 25. "My guess is that these caches take about 10 percent of the energy resources of data centers," Arvind said.

Data center caches generally store data using dynamic random-access memory (DRAM) microchips, which are fast but expensive and energy-hungry. Instead, Arvind and his colleagues' new strategy, known as BlueCache, relies on flash memory, the kind used in flash drives, which consumes about 5 percent as much energy, costs about 10 percent as much, and has about 100 times the storage density. They detailed their findings Aug. 29 at the International Conference on Very Large Databases in Munich.

Flash is much slower than DRAM. However, the scientists noted that flash is still far faster than human reaction times, so they reasoned that incorporating it could help data centers save energy "while incurring delays that people find acceptable," Arvind said.

BlueCache employs a number of techniques to make its flash-based caches competitive with the current DRAM workhorses. For instance, it adds a small amount of DRAM to each cache so the cache can quickly recognize when data needed for a common query has not yet been imported. Also, instead of relying on software to read, write and delete data, each cache uses specialized hardware circuits for these operations, increasing speed and lowering power consumption. Moreover, BlueCache bundles queries to the caches together to maximize the efficiency of communication.
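The general idea behind the first two of those techniques can be illustrated with a toy model. The sketch below is not BlueCache's actual design (which is implemented in hardware): it simply models a slow, dense "flash" store paired with a small, fast "DRAM" index that answers misses without touching flash, plus a batched lookup that amortizes flash accesses across several queries. All names and the simulated latency are illustrative assumptions.

```python
import time

class FlashBackedCache:
    """Toy model of a flash-backed key-value cache with a small DRAM index.

    The "flash" store is a dict plus an artificial delay; the "DRAM"
    index records which keys are present, so a miss is answered
    immediately without paying the flash latency.
    """
    FLASH_DELAY_S = 0.0001  # simulated flash access latency (illustrative)

    def __init__(self):
        self._flash = {}          # simulated flash storage: slow but dense
        self._dram_index = set()  # small DRAM index: which keys exist

    def put(self, key, value):
        self._flash[key] = value
        self._dram_index.add(key)

    def get(self, key):
        if key not in self._dram_index:  # fast DRAM check: miss costs nothing
            return None
        time.sleep(self.FLASH_DELAY_S)   # pay flash latency only on a hit
        return self._flash[key]

    def get_batch(self, keys):
        """Bundle several queries so one flash visit serves the whole batch."""
        present = [k for k in keys if k in self._dram_index]
        time.sleep(self.FLASH_DELAY_S)   # one amortized flash access
        return {k: self._flash[k] for k in present}
```

For example, `cache.get_batch(["user:1", "user:2"])` returns only the keys actually stored, after a single simulated flash access rather than one per key.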

Another strategy that electrical engineer David Wentzlaff at Princeton University and his colleagues are pursuing is based on how data centers often help many users carry out similar tasks, such as checking email or browsing the web. The new microchip architecture they are developing, called Piton, is designed to recognize when users are carrying out similar computations and execute identical operations consecutively so they flow one after another. Doing so can increase data center energy efficiency by roughly 20 percent, the researchers said.
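A rough software analogy for that scheduling idea: gather incoming tasks, group them by the operation they perform, and run each group back-to-back, so identical code paths execute consecutively (on real hardware this improves instruction-cache and branch-predictor locality). This is a minimal sketch of the general principle, not Piton's actual hardware interface; the task format and function names are assumptions.

```python
from collections import defaultdict

def run_grouped(tasks):
    """Run tasks so that identical operations execute consecutively.

    Each task is an (operation_name, function, argument) tuple --
    an illustrative structure, not Piton's real interface.
    """
    groups = defaultdict(list)
    for op_name, fn, arg in tasks:
        groups[op_name].append((fn, arg))

    results = []
    for op_name, batch in groups.items():
        for fn, arg in batch:  # same operation, executed back-to-back
            results.append((op_name, fn(arg)))
    return results
```

Given the tasks `[("square", f, 2), ("negate", g, 3), ("square", f, 4)]`, both "square" tasks run consecutively before the "negate" task, even though they arrived interleaved.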

Each Piton chip also controls the access that competing programs have to memory in a way that can independently yield an 18 percent performance boost. Moreover, placing the cores or processors of a chip physically closer to the data they need can independently increase efficiency by 29 percent when applied to a 1,024-core chip.

Currently, Wentzlaff and his colleagues have packed 25 cores onto a chip measuring 6 by 6 millimeters. "We've played video games on it," Wentzlaff said. He added that they could scale the system up to connect thousands of chips together into a system with millions of cores. They have also made Piton's design openly available online via OpenPiton.

One way data centers might conceivably become more environmentally friendly is to rely on electricity from renewable sources instead of fossil fuels. However, previous research by computer engineers Michael Zink and David Irwin at the University of Massachusetts at Amherst and their colleagues found that data centers actually consume more electricity if they rely solely on solar or wind power. This is because the variable nature of these energy sources means it could take longer to run a program, "and running longer means consuming more energy," Zink said.
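Zink's point follows from the fact that energy is power multiplied by time: a job that runs at lower power but for much longer can still consume more energy overall. The numbers below are purely illustrative assumptions, not figures from the study.

```python
# Illustrative numbers only -- not measurements from the UMass study.
grid_power_w = 300.0       # server running at full speed on steady grid power
grid_time_s = 100.0        # time to finish the job

renewable_power_w = 200.0  # server throttled under variable renewable supply
renewable_time_s = 180.0   # the same job takes longer when power fluctuates

# Energy (joules) = power (watts) x time (seconds)
grid_energy_j = grid_power_w * grid_time_s              # 30,000 J
renewable_energy_j = renewable_power_w * renewable_time_s  # 36,000 J

# Despite the lower power draw, the longer runtime costs more energy.
assert renewable_energy_j > grid_energy_j
```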

Still, there are ways to optimize data center use to account for these fluctuations, Irwin said. "Computation is really flexible -- the workload of one data center can be sent to another data center where the sun is shining," he said. "We see renewables as a big part of data center energy consumption going forward."

When it comes to reducing data center energy consumption, "there are many, many strategies that people are pursuing," Arvind said. "I don't see any as a silver bullet, but when you accumulate them all over five or 10 years, their results look amazing."


Author Bio & Story Archive

Charles Q. Choi is a science reporter who has written for Scientific American, The New York Times, Wired, Science, Nature, and National Geographic News, among others.