
Where does the power go in a processor?

Minjin

Platinum Member
Jan 18, 2003
2,208
1
81
I've been thinking about this recently and I realized that even with a decent grasp of physics, I don't understand where the power is going. What kind of energy conversion is happening when we power a processor? Let's assume that it's 100% efficient, meaning that there is no heat. So what work is being done? Is it purely switching losses? And by that I mean switching transistors on and off. If so, are we moving something or are we creating a potential?
 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
Well, the power is being used to open and close gates, or logic circuits. These circuits use an on/off state. When they are off, that means there is a voltage threshold keeping the gate closed. This is caused by resistance. When the applied voltage surpasses the threshold, current flows through the gate. The resistance burns off the power fed into it until there is enough to open the gate.

Resistors can be used in different ways. They can limit current or stop electricity from flowing, or they can stabilize a voltage if used together. Resistance causes a consumption of power, and the side effect is the production of heat. When heat is generated inside the CPU, that is electricity (power) being consumed.

Even the power cables coming into our houses consume some power. This is why we transmit AC power at such high voltages: it limits the loss of power in the wires. If the wires in your house dissipate too much power, it can cause your house to catch fire.
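To make the transmission point concrete, here is a toy I^2*R calculation in Python (the wire resistance and load are made-up numbers, purely for illustration): delivering the same power at a higher voltage needs less current, and the line loss scales with the square of the current.

# Why high-voltage transmission wastes less power: a toy I^2*R example.
# Load and wire resistance are assumed, illustrative numbers only.
P_load = 10_000.0        # power delivered to homes, watts (assumed)
R_line = 1.0             # resistance of the transmission wire, ohms (assumed)

for V in (240.0, 24_000.0):           # low vs. high transmission voltage
    I = P_load / V                    # current needed to carry the same power
    P_lost = I**2 * R_line            # power burned in the wire as heat
    print(f"{V:>8.0f} V -> {I:6.2f} A -> {P_lost:8.2f} W lost in the line")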

You may want to take a class in electrical theory or computer logic design. I don't claim to fully understand how all electrical circuits work. I do like to read up on solar electricity.

www.homepower.com
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
A CPU depends on resistance. The gates that make up the components of the CPU rely on resistance to switch state.

There can be no such thing as a 100% efficient (as in superconducting, with zero resistance) CPU. That's like asking how fast a bird can fly in space. There is no air for the bird's wings to do anything with.

The only benefit of having high efficiency (or low resistance) would be a CPU that used very little power.

The power that is used is turned into heat by the resistance of the gates (which is fundamental to CPU operation).
 

Minjin

Platinum Member
Jan 18, 2003
2,208
1
81
Originally posted by: miniMUNCH
Originally posted by: Throckmorton
All of it turns into heat
Almost.
If we're truly converting most of it to heat, the crazy part of my brain says that we should make all electric heaters out of CPUs. That way we would be able to get more work out of our energy. I'm imagining distributed computing where the colder the average temperature, the more computing power is housed there.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
1) Heat
2) Switching Power
3) Leakage Power

You are basically moving electrons from rail to rail. One moment you're charging a capacitor by connecting it to power through a resistive transistor. The next moment you're discharging the capacitor by connecting it to ground through a different resistive transistor. So you're slowly moving charge from power to ground.
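A back-of-envelope sketch of what that rail-to-rail charge shuffling adds up to, using the usual P = alpha * C * V^2 * f relation for switching power plus a constant leakage term. All the numbers below are assumed round figures, not specs for any real chip:

# Switching (dynamic) power: energy to charge/discharge node capacitance each cycle.
alpha = 0.2        # activity factor: fraction of capacitance switched per cycle (assumed)
C_eff = 50e-9      # total effective switched capacitance, farads (assumed)
V = 1.2            # supply voltage, volts (assumed)
f = 3e9            # clock frequency, hertz (assumed)
P_switch = alpha * C_eff * V**2 * f
print(f"Switching power: {P_switch:.0f} W")     # about 43 W with these numbers

# Leakage power: current that flows even when nothing is switching.
I_leak = 10.0      # total leakage current, amps (assumed)
P_leak = I_leak * V
print(f"Leakage power:  {P_leak:.0f} W")        # about 12 W

# Both terms end up as heat; there's nowhere else for the energy to go.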
 

Onund

Senior member
Jul 19, 2007
287
0
0
Originally posted by: Minjin
Originally posted by: miniMUNCH
Originally posted by: Throckmorton
All of it turns into heat
Almost.
If we're truly converting most of it to heat, the crazy part of my brain says that we should make all electric heaters out of CPUs. That way we would be able to get more work out of our energy. I'm imagining distributed computing where the colder the average temperature, the more computing power is housed there.

Would the crazy part of your brain tell you to buy a $500 heater instead of the $10 one? Or, thinking of the 1800W heater I saw, that would be 18 100W quad cores...

On a related note, I've been thinking of ways to use my computer's heat more efficiently. My current plan is to duct it to the basement for the summer and leave it as-is during the winter. Actually, I could probably make a decent food dehydrator out of the heat. This requires some case modding, though...
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Originally posted by: Minjin
If we're truly converting most of it to heat, the crazy part of my brain says that we should make all electric heaters out of CPUs. That way we would be able to get more work out of our energy. I'm imagining distributed computing where the colder the average temperature, the more computing power is housed there.

That's just the part of your brain that misses the fact that electricity, by the time it arrives at your wall outlet, has already gone through all of its inefficiencies - generation from primary power, long-distance transmission, and probably several stages of transformation.

From there, electric heaters are all 100% efficient. Overall, though, burning fuel in a faraway power plant and using an electric heater is much, much less efficient than using a combustion oven right there.
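Rough numbers for the chain Peter is describing (all assumed ballpark figures, just to show the shape of the comparison):

# Fuel -> heat via the grid and a resistive heater, vs. burning the fuel locally.
plant_eff = 0.35      # thermal power plant: fuel energy -> electricity (assumed)
grid_eff = 0.93       # transmission + distribution, roughly 7% lost (assumed)
heater_eff = 1.00     # resistive heater: all electricity becomes heat in the room

electric_chain = plant_eff * grid_eff * heater_eff
furnace_eff = 0.90    # decent local gas furnace, heat delivered per unit fuel (assumed)

print(f"Via grid + electric heater: {electric_chain:.0%} of the fuel energy")   # ~33%
print(f"Via local combustion:       {furnace_eff:.0%} of the fuel energy")      # ~90%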
 

NanoStuff

Banned
Mar 23, 2006
2,981
1
0
Originally posted by: Peter
Originally posted by: Minjin
If we're truly converting most of it to heat, the crazy part of my brain says that we should make all electric heaters out of CPUs. That way we would be able to get more work out of our energy. I'm imagining distributed computing where the colder the average temperature, the more computing power is housed there.

That's just the part of your brain that misses the fact that electricity, by the time it arrives at your wall outlet, has already gone through all of its inefficiencies - generation from primary power, long-distance transmission, and probably several stages of transformation.

From there, electric heaters are all 100% efficient. Overall, though, burning fuel in a faraway power plant and using an electric heater is much, much less efficient than using a combustion oven right there.

Combustion ovens don't simulate protein folding, stupid. Companies would buy processing power from you, and that would subsidize your heating costs. You get heat, they get calculations. It's a better alternative than heat with no other beneficial side effect from the expended energy. In fact, computing installations have been set up in the Arctic to act as heating for research facilities, and in return the cold Arctic air eliminates massive energy expenditure on cooling the computers. It's a very symbiotic system.
 

Nathelion

Senior member
Jan 30, 2006
697
1
0
As to the cost, you wouldn't have to have the newest and best out there - I bet if you made huge numbers of small dies on, say, an older, cheap process and chained them together in some way, you could get quite decent economy out of it. I'd think routing a network cable to every heating element you have would be a problem, though.
Also, you'd need to "chain the cores together" in such a way that:
1) they can take quite a bit of punishment, such as kids at play throwing their schoolbooks all over them,
2) they degrade gracefully as cores break,
3) they have the bandwidth to shuttle the data back and forth, preferably without an external controller allocating the work,
4) they still have enough computing power to actually matter, and
5) they last a long time and can take extreme temperature swings when people don't need their heater and put it in the attic.
Solve those problems, and you may just have something.
 

Nathelion

Senior member
Jan 30, 2006
697
1
0
Or, you're European and you pay lots of taxes on that gas.

Realistically, CPU heating elements will never work. But I imagine data centers could make a couple of bucks by supplying district heating.
 

irishScott

Lifer
Oct 10, 2006
21,562
3
0
I have an old P3 running Rosetta@home 24/7. There's a noticeable difference in the room temp when it's off. Only a degree or two (F), but it's there.
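That's roughly what a simple steady-state estimate predicts. A sketch with assumed numbers (an old P3 box drawing on the order of 30 W extra, and a made-up heat-loss coefficient for the room):

# Steady state: the extra CPU heat balances the extra heat leaking out of the room.
cpu_power = 30.0     # extra power drawn running Rosetta@home 24/7, watts (assumed)
room_UA = 40.0       # room heat-loss coefficient, watts per kelvin to outdoors (assumed)

delta_C = cpu_power / room_UA     # temperature rise above what it would otherwise be
delta_F = delta_C * 9 / 5
print(f"Roughly {delta_C:.1f} C ({delta_F:.1f} F) warmer")   # ~0.8 C, ~1.4 F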
 

Minjin

Platinum Member
Jan 18, 2003
2,208
1
81
Oh, I'm not saying it's something that we could do right now. But the more I think about it, the more I think it's almost inevitable as we work to maximize efficiency in everything we do.

Your computer would be your central heater. It would be sized such that in the winter, run hard enough, it would provide all of your heat. Extra processing power would be sold, perhaps to be used by those in the other hemisphere. In the summer, you'd idle down and use processing located elsewhere. We'd all use thin clients of some sort, and everything would be plugged into The Network(tm)...
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
I don't share the same view of the future. I see processing power increasing at lower and lower operating power (and thus lower heating ability) at a continually decreasing cost. There would be no need to sell extra processing power, because nobody would need it when faced with an abundance of cheap, energy-efficient, fast processors.

In that respect, your electricity cost would be better spent on getting the most efficient heater.

Specialization is where it's at.