Powering PCs is becoming a global concern

26 February 2006, Taipei Times, Jack Schofield

If you want to start a rumor, how about that Google is going to build its own nuclear power station? The logic is easy. Larry Page, the company's co-founder, reportedly sees "running out of power" as the biggest potential threat to Google, and the electricity needed to run its "server farms" -- tens of thousands of power-hungry computers storing billions of Internet pages -- could soon cost more than the hardware. Partly this is because Google is based in California, where the state solved its 2001 energy crisis by borrowing US$10 billion to buy electricity at massively inflated prices. But the rest of us are heading in the same direction.

We live in a world where the use of chip-based computers and consumer electronics devices is increasing rapidly, while supplies of oil and natural gas are diminishing perhaps even more rapidly. Worse, the threat of global warming means we should now be decreasing our energy use, like the Japanese, not increasing it. And although each individual PC or peripheral may not use much electricity, when you have a billion of them, it adds up.

Powering down

Sadly, it's impossible to say how much power a PC uses without measuring it, because of variables such as the type of motherboard, the speed of the chip and the power of the graphics card. (A fast graphics card can use more power than the processor.) PC power supplies are rated from about 150W to about 650W, although a machine rarely draws its rated maximum except during brief peak loads. However, PCs use much less power when idling, and the US Energy Star program -- which PC manufacturers have been following since 1992 -- is aiming to get idle power consumption below 50W-60W.
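To put those figures in perspective, here is a rough back-of-envelope sketch converting a steady draw into kilowatt-hours over a year; the wattages and the always-on assumption are illustrative, not measurements:

```python
# Back-of-envelope estimate of annual PC energy use.
# The wattages and the always-on assumption are illustrative, not measured.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(watts, hours_on=HOURS_PER_YEAR):
    """Convert a steady power draw in watts to kilowatt-hours per year."""
    return watts * hours_on / 1000.0

idle_kwh = annual_kwh(55)    # near the Energy Star idle target of 50W-60W
busy_kwh = annual_kwh(300)   # assumed draw for a loaded desktop with a fast graphics card

print(f"Idling all year:   {idle_kwh:,.0f} kWh")
print(f"Flat out all year: {busy_kwh:,.0f} kWh")
```

On these assumptions, a machine idling near the Energy Star target uses a few hundred kilowatt-hours a year, while one running flat out uses several times as much, which is why the power-saving settings described below are worth switching on.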

The simplest approach is to use the PC's power-saving software to turn the screen and hard drive off and then suspend the whole system after a specified time.

The situation is improving thanks to market trends towards flat screens and the use of portables rather than desktop computers. LCDs use much less power than traditional monitors, and by design, most notebooks use less power than most desktops. At the extremes, the 1GHz Pentium M Ultra Low Voltage chip uses only 5W whereas Intel's hottest chip for gaming, the 3.73GHz Pentium 4 Extreme, can consume up to 115W.

However, Intel has done a U-turn on its processor design goals, which should help. The Pentium design drove up clock speeds (and power consumption) to build the fastest chips. In 2002, Intel executives still assured me that "gigahertz is far from over" and looked forward to a 4.7GHz Pentium codenamed Tejas. Last year, however, they announced a new mantra: "performance per Watt."

Alistair Kemp, a spokesman for Intel, says the company has now developed "a new microarchitecture that will be coming out in the second half of this year." New chips will, he says, "reduce average power use quite substantially." The reduction will be from about 95W, for a fast Pentium 4, to about 65W.

Performance per Watt is also important for the arrays of servers in corporate data centers. Luiz Andre Barroso, principal engineer at Google, has already warned that "the possibility of computer equipment power consumption spiraling out of control could have serious consequences for the affordability of computing, not to mention the overall health of the planet."

In an article called "The Price of Performance" in the professional journal ACM Queue, Barroso expressed concern that the cost of the electricity needed to run computers could overtake the cost of buying the hardware in the first place.

Running costs are exacerbated because companies generally try to utilize their servers as heavily as possible 24/7, or 8,760 hours a year. Faster processors generate more heat, so computer rooms require extra cooling. This uses more electricity, costing more money.
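To see how that arithmetic plays out, here is a hypothetical calculation; the server wattage, cooling overhead and electricity tariff are assumptions chosen for illustration, not figures from Barroso's article:

```python
# Hypothetical annual running cost of one always-on server.
# The wattage, cooling overhead and tariff are assumptions for illustration.

HOURS_PER_YEAR = 8760        # 24 hours a day, 365 days a year
server_watts = 250           # assumed average draw of one server
cooling_overhead = 0.8       # assume 0.8W of cooling for every 1W of computing
price_per_kwh = 0.10         # assumed tariff, in US dollars

total_watts = server_watts * (1 + cooling_overhead)
kwh_per_year = total_watts * HOURS_PER_YEAR / 1000.0
annual_cost = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:,.0f} kWh a year, roughly ${annual_cost:,.0f} per server")
```

On these assumptions the electricity bill over a three- or four-year service life approaches the purchase price of a budget server.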

Of course, there's nothing new about any of this. The late Seymour Cray, the world's greatest supercomputer designer, spent a lot of his time on plumbing. In 1985, he resorted to pumping a non-conducting liquid called Fluorinert over the Cray 2's electronics to cool them. When mainframes ruled the world, IBM packed its mainframe chips in ceramic Thermal Conduction Modules with chilled water flowing through pipes to conduct away the heat. Some of today's high-performance games PCs use similar techniques, and it's still an option for servers -- but no one wants to go back to plumbing.

Multiprocessors

Barroso suggests that multiprocessor chips are "the best [and perhaps only] chance to avoid the dire future envisioned above". What has changed recently is that multicore processors -- with more than one processing element on a single die -- have finally entered the mainstream, and many PC manufacturers are now shipping systems with Intel Core Duo processors that deliver more performance per watt than their forebears.
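One way to see why multiple slower cores can help: the dynamic power of a CMOS chip scales roughly with clock frequency times the square of supply voltage, and a lower clock usually permits a lower voltage. The sketch below illustrates the point with deliberately round, assumed numbers; it does not model any particular Intel part:

```python
# Rough illustration of why two slower cores can beat one fast core on
# performance per watt. Dynamic CMOS power scales roughly as P ~ C * V^2 * f;
# the clock and voltage scaling factors here are assumptions, not real chip data.

def relative_power(freq_scale, volt_scale):
    """Dynamic power relative to a baseline core (capacitance assumed constant)."""
    return freq_scale * volt_scale ** 2

single_fast = relative_power(1.0, 1.0)        # one core at full clock and voltage
dual_slower = 2 * relative_power(0.7, 0.85)   # two cores at 70% clock, 85% voltage

# Assume throughput scales with the total clock cycles available.
perf_single = 1.0
perf_dual = 2 * 0.7

print(f"single core: performance per watt = {perf_single / single_fast:.2f}")
print(f"dual core:   performance per watt = {perf_dual / dual_slower:.2f}")
```

On those assumed numbers the dual-core design delivers roughly 40 percent more throughput for about the same power, which is exactly the "performance per Watt" Intel is now chasing.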

Indeed, the idea has even reached the home market. Microsoft's Xbox 360 games console has a processor with three IBM PowerPC cores, each of which can run two programming threads: the result is a single chip that can work like six. Sony's forthcoming PlayStation 3 will use an IBM Cell chip with multiple processing elements.

Not even low-power multicore chips will solve the power consumption problem permanently, but they should at least buy us a few years' breathing space.

And if people factor the cost of power consumption (and cooling, where required) into their computer purchasing decisions, for both commercial and ecological reasons, this will put pressure on the manufacturers to do even better in the future.