It’s often said that a smartphone contains all the computing power NASA used to put people on the moon. But while today’s computer chips are incredibly capable, they’re still not as energy efficient as they could be. The more powerful chips get, the hotter they run, and the more energy it takes to cool them so their electronic components don’t fry.

As reported by the BBC, IBM is looking for a solution. One idea is to cool computers the way the human body cools its brain: with a fluid, i.e. blood. IBM’s Patrick Ruch and Bruno Michel built a proof-of-concept computer chip containing tiny channels that would circulate a fluid past electronic components, cooling them down. They think an electrolyte similar to what goes inside a battery would work best for the “electronic blood.” As it passed through the channels, the fluid would not only dissipate heat but also deliver energy to the chips: as it traveled through ever-smaller channels, it would pass electrodes that pick up electrons from the fluid and use them to create a current.
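
To get a rough sense of why a circulating liquid is attractive here, the standard heat-transfer relation says the heat a coolant carries off equals its mass flow times its specific heat times how much it warms up. The sketch below is a back-of-envelope illustration only, using assumed numbers rather than anything from IBM’s prototype:

```python
# Back-of-envelope estimate of how much heat a flowing liquid can carry away.
# All numbers here are illustrative assumptions, not IBM's figures: a water-like
# electrolyte, a flow of one gram per second, and a 10-degree temperature rise.

flow_rate_kg_per_s = 0.001        # assumed coolant mass flow (1 g/s)
specific_heat_j_per_kg_k = 4180   # specific heat of a water-like fluid (J per kg per K)
temp_rise_k = 10                  # assumed warm-up as the fluid crosses the chip (K)

# Standard relation: heat carried = mass flow x specific heat x temperature rise
heat_removed_w = flow_rate_kg_per_s * specific_heat_j_per_kg_k * temp_rise_k
print(f"Heat carried away: about {heat_removed_w:.0f} watts")   # roughly 42 W
```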

The ability to remove heat this way would allow scientists to pack more processing power into a smaller space, or even fatten otherwise flat computer chips into block-like structures. Right now that can’t be done, because chips rely on circulating air to stay cool, and stacking processors on top of each other traps too much heat.

The “electronic blood” cooling system could save on energy costs, too. Google, for example, spends millions of dollars on air conditioning to keep its data centers cool; those data centers use enough energy per year to power 200,000 homes.
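
For a rough sense of scale, that 200,000-home figure can be converted to kilowatt-hours; the per-household number below is an assumed ballpark for average annual usage, not something from the article:

```python
# Rough conversion of "enough energy to power 200,000 homes" into kilowatt-hours.
# The per-home figure is an assumed ballpark average, not a number from the article.
homes = 200_000
kwh_per_home_per_year = 11_000   # assumed annual household consumption (kWh)

total_kwh_per_year = homes * kwh_per_home_per_year
print(f"About {total_kwh_per_year / 1e9:.1f} billion kWh per year")   # ~2.2 billion kWh
```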

Ruch and Michel’s cooling system still has a way to go. There’s the matter of choosing a workable electrolyte and of fabricating chips with such tiny channels in them.

But more than that, the project is part of an effort to get computers to work the way a brain does. Brains are pretty efficient — most of the energy a human brain uses goes to processing information. Only a small fraction of the energy in the brain gets turned into heat. That isn’t true of computers.

To get an idea of the difference, think of the competition between IBM’s Watson and the humans who played Jeopardy. Each brain used only about 20 watts, while Watson ate up 84 kilowatts. If we’re going to fit computers that think like humans into spaces smaller than a semi-trailer, it only makes sense that they should cool themselves as efficiently as our brains do.
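
The gap is easy to quantify from those two numbers alone; the snippet below just does that arithmetic with the article’s approximate figures:

```python
# Compare the approximate power figures cited above.
human_brain_w = 20          # a human brain runs on roughly 20 watts
watson_w = 84_000           # Watson reportedly drew about 84 kilowatts

ratio = watson_w / human_brain_w
print(f"Watson used roughly {ratio:,.0f} times the power of one human brain")   # ~4,200x
```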

Have something to add to this story? Share it in the comments.
