Tuesday, March 04, 2008

Data centres - Cool it!


Mar 4th 2008
From Economist.com

The data centres that power the internet demand a lot of power. Time, then, to make them more efficient



AS ONE industry falls, another rises. The banks of the Columbia River in Oregon used to be lined with aluminium smelters. Now they are starting to house what might, for want of a better phrase, be called data smelters. The largest has been installed by Google in a city called The Dalles. Microsoft and Yahoo! are not far behind. Google's plant consumes as much power as a town of 200,000 people. And that is why it is there in the first place. The cheap hydroelectricity provided by the Columbia River, which once split apart aluminium oxide in order to supply the world with soft-drinks cans and milk-bottle tops, is now being used to shuffle and store masses of information. Computing is an energy-intensive industry. And the world's biggest internet companies are huge energy consumers—so big that they are contemplating some serious re-engineering in order to curb their demand.

The traditional way of building data centres such as Google's is to link clusters of off-the-shelf server computers together in racks. Hundreds, even thousands, of such servers can be combined to achieve the sort of arithmetical horsepower more usually associated with a supercomputer. But the servers all require energy, of course, and so do the electronic links that enable them to work together. On top of that, once the energy has been used it emerges as heat. The advanced cooling systems required to get rid of this heat demand the consumption of more power still.

All of which is expensive. Though the price of computer hardware continues to plunge, the price of energy has been increasing. The result is that the lifetime cost of running a server now greatly outstrips the cost of buying it. A number of researchers are therefore looking for ways to operate big computer centres like the one at The Dalles more efficiently.
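As a rough, purely illustrative sketch of that arithmetic (the server price, power draw, cooling overhead, tariff and lifespan below are assumed figures, not numbers from the article):

```python
# Back-of-the-envelope sketch: all figures below are assumptions for
# illustration, not numbers from the article.
SERVER_PRICE_USD = 2500       # assumed purchase price of one server
IT_POWER_KW = 0.4             # assumed draw of the server itself
PUE = 2.0                     # assumed overhead for cooling and power delivery
TARIFF_USD_PER_KWH = 0.12     # assumed electricity price
LIFETIME_YEARS = 5            # assumed service life

hours = LIFETIME_YEARS * 365 * 24
energy_cost = IT_POWER_KW * PUE * hours * TARIFF_USD_PER_KWH

print(f"Lifetime energy cost: ${energy_cost:,.0f}")
print(f"Purchase price:       ${SERVER_PRICE_USD:,}")
print(f"Running / buying:     {energy_cost / SERVER_PRICE_USD:.1f}x")
```

Even with these modest assumptions the electricity bill overtakes the sticker price; dearer power or a hotter data centre only widens the gap.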

Horst Simon, a computer scientist at the Lawrence Berkeley National Laboratory in California, is working on something called the “climate computer”. The servers in data centres contain what are known as multicore processors, which pack a small number of powerful processing cores onto a single chip. Often these processors are more powerful than is strictly necessary for the sorts of jobs that data centres do. The climate computer would instead use arrays of less powerful, and hence less power-hungry, processors. The difference is staggering. Dr Simon and his colleagues think they can build a system that consumes a hundredth of the power of an existing data centre without too much loss of computational oomph.
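A toy comparison makes the logic concrete. The core counts, wattages and throughput figures below are assumptions chosen for illustration, not Dr Simon's actual design, and they show only the direction of the saving rather than the hundredfold figure his team claims:

```python
# Illustrative sketch, not Dr Simon's actual design: the assumed figures
# below show the direction of the trade-off, not the hundredfold saving
# the researchers claim.

def cluster(cores, watts_per_core, ops_per_core):
    """Return total power (W) and aggregate throughput (arbitrary units)."""
    return cores * watts_per_core, cores * ops_per_core

# A conventional cluster of a few powerful cores (assumed numbers).
fast_power, fast_ops = cluster(cores=1_000, watts_per_core=100, ops_per_core=10)

# An array of many low-power cores (assumed numbers); this assumes the
# workload parallelises cleanly across them.
slow_power, slow_ops = cluster(cores=20_000, watts_per_core=0.5, ops_per_core=0.5)

print(f"Fast cores: {fast_ops:,.0f} units at {fast_power / 1000:.0f} kW")
print(f"Slow cores: {slow_ops:,.0f} units at {slow_power / 1000:.0f} kW")
```

With these made-up numbers the slow-core array matches the fast cluster's throughput at a tenth of the power; the trick works only for jobs that split neatly across many cores, which most data-centre workloads do.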

Jonathan Appavoo, Volkmar Uhlig and Amos Waterland, three researchers at IBM, are taking a different approach. They have been tinkering with the architecture of the company's Blue Gene supercomputers. Blue Genes already use chips that consume much less power than do the microprocessors typically employed in data centres. They also use a specialised, energy-efficient communications system to handle the traffic between machines. This bespoke approach, IBM reckons, adds up to a machine that demands less energy per task, is more reliable, and occupies a lot less space than clusters bolted together from mass-produced servers.

Blue Gene computers are designed to excel at specialised, demanding tasks (they got their name because they were originally tested on some knotty problems in genomics). However, Dr Appavoo, Dr Uhlig and Dr Waterland think that, with a little tinkering, a network of Blue Genes could easily handle the software used by internet firms. Project Kittyhawk, as it is called, would link several thousand Blue Gene computers together to form a system with more than 60m processing cores and room to store 32 petabytes of data. A petabyte is a million gigabytes—and Kittyhawk, if it were built, would in theory be able to host the entire internet by itself without breaking sweat. What that would do to Oregon's economy remains to be seen.
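The scale is easier to grasp with the article's own figures; the per-core arithmetic below is simply derived from them:

```python
# Quick arithmetic using the figures quoted above.
cores = 60_000_000             # "more than 60m processing cores"
petabytes = 32                 # "32 petabytes"
gb_per_petabyte = 1_000_000    # "a petabyte is a million gigabytes"

total_gb = petabytes * gb_per_petabyte
print(f"Total storage:    {total_gb:,} GB")            # 32,000,000 GB
print(f"Storage per core: {total_gb / cores:.2f} GB")  # roughly 0.53 GB
```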
