The full truth revealed: GTX480 is one power sucking monster --Fudzilla

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
5 degrees Fahrenheit... OMG! Call the fire department. :D
That's practically nothing.

You serious? For a device it isn't much, but for the ambient temperature a human feels, going from 70F to 71F is noticeable when you're just trying to stay comfortable in your home. 70F to 75F would be uncomfortably warmer. It's a small number relative to all the numbers, but you should be pretty sensitive to temperature changes if you aren't a vegetable. I think my computer room is about 3 degrees warmer than the rest of the house, and I have to open the door sometimes to get it comfortable.

70 might be a very cool day, 75 a nice day, 80 is pretty warm, 90 is hot. At least for the climate I'm used to :) Dunno how people live in 100 degree weather..
 
Last edited:

slayernine

Senior member
Jul 23, 2007
894
0
71
slayernine.com
We're talking 150 watts of heat difference between the two. Most floor heaters are 1000-1500 watts, and they won't really heat up a room, just a small area (under your desk). A fully loaded computer is maybe 400 watts.

Does your room temperature skyrocket when you turn on a few light bulbs? Those generate heat :rolleyes:


It's in your head that your 5870 is heating up your room significantly.

No, the 5870 actually is heating up your room. Put your hand behind your computer; what's that you feel? Hot air. A computer runs all day long outputting heat, while a heater runs in short bursts topping up the room's heat. Thus a computer adds a hell of a lot of heat to the room. Add in a dedicated video card and it's a lot more.
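
As a rough sanity check on the duty-cycle argument, here is a back-of-the-envelope comparison of total heat dumped into the room per day; the 8-hour gaming load and 2-hour heater runtime are purely illustrative assumptions, not measurements:

```python
# Rough comparison of daily heat dumped into a room.
# All figures are illustrative assumptions, not measurements.
PC_DRAW_W = 400       # fully loaded gaming PC (figure from the post above)
PC_HOURS = 8          # assumed hours of heavy use per day
HEATER_DRAW_W = 1500  # typical plug-in space heater
HEATER_HOURS = 2      # assumed total burst runtime per day

pc_heat_kwh = PC_DRAW_W * PC_HOURS / 1000
heater_heat_kwh = HEATER_DRAW_W * HEATER_HOURS / 1000

print(f"PC heat into the room:     {pc_heat_kwh:.1f} kWh/day")
print(f"Heater heat into the room: {heater_heat_kwh:.1f} kWh/day")
```

With those assumed duty cycles the daily totals land in the same ballpark, which is the point about a computer that runs all day.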
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
We're talking 150 watts of heat difference between the two. Most floor heaters are 1000-1500 watts, and they won't really heat up a room, just a small area (under your desk). A fully loaded computer is maybe 400 watts.

Does your room temperature skyrocket when you turn on a few light bulbs? Those generate heat :rolleyes:


It's in your head that your 5870 is heating up your room significantly.

+1....LOL. Good post!
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
A 60w light bulb can increase the temperature in a large room by 5 degrees Fahrenheit.
Wait, what? That seems somewhat unlikely, let's run the math.

The specific heat capacity of dry air in a room is about 1.006 J/g per degree Kelvin at room temperature. The density of the air is about 1.204 kg/cubic metre.

Assuming a "large room" has dimensions of 5m x 5m x 2m, that's 50 cubic metres of air. This air weighs in at 50*1.204 = 60.2 kg, and we want to increase the temperature of the air by about 2.8 K (5F). It will take 1.006 * 60200 * 2.8 ~= 169 kJ to do so.

1 Joule = 1 watt-second. A 60 W incandescent light bulb only radiates about 90% of its output as heat, so if you turn it on for a second you generate a mere 60 joules of energy, only 54 of which is thermal energy. If you leave it on for about 52 minutes, and thermodynamics otherwise decides to go for its lunch break, then yes, you can heat a large room with a 60 W bulb. In reality, that won't happen, for the reasons EarthwormJim cited.

In contrast, a human body radiates about 100 J of heat each second. So if you observe a temperature increase when you walk into the room and turn on the 60W light, about two-thirds of that increase is because of you!
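
For anyone who wants to re-run the arithmetic, here is the same calculation as a small Python sketch (same assumed 5 m x 5 m x 2 m room, heat losses ignored):

```python
# Same back-of-the-envelope calculation as above, losses ignored.
C_P_AIR = 1.006        # J/(g*K), specific heat of dry air near room temperature
RHO_AIR = 1.204        # kg/m^3, density of dry air
ROOM_M3 = 5 * 5 * 2    # assumed "large room": 50 cubic metres
DELTA_K = 2.8          # ~5 F expressed in kelvin

air_mass_g = ROOM_M3 * RHO_AIR * 1000       # ~60,200 g of air
energy_j = C_P_AIR * air_mass_g * DELTA_K   # ~169 kJ needed

bulb_thermal_w = 60 * 0.90                  # ~54 W of the bulb's draw ends up as heat
minutes = energy_j / bulb_thermal_w / 60    # ~52 minutes

print(f"Energy needed: {energy_j / 1000:.1f} kJ")
print(f"Time with one 60 W bulb: {minutes:.0f} min")
```

It comes out to roughly 52 minutes for the bulb alone, matching the figure above.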
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Wait, what? That seems somewhat unlikely, let's run the math.

The specific heat capacity of dry air in a room is about 1.006 J/g per degree Kelvin at room temperature. The density of the air is about 1.204 kg/cubic metre.

Assuming a "large room" has dimensions of 5m x 5m x 2m, that's 50 cubic metres of air. This air weighs in at 50*1.204 = 60.2 kg, and we want to increase the temperature of the air by about 2.8 K (5F). It will take 1.006 * 60200 * 2.8 ~= 169 kJ to do so.

1 Joule = 1 watt-second. A 60 W incandescent light bulb only radiates about 90% of its output as heat, so if you turn it on for a second you generate a mere 60 joules of energy, only 54 of which is thermal energy. If you leave it on for about 52 minutes, and thermodynamics otherwise decides to go for its lunch break, then yes, you can heat a large room with a 60 W bulb. In reality, that won't happen, for the reasons EarthwormJim cited.

In contrast, a human body radiates about 100 J of heat each second. So if you observe a temperature increase when you walk into the room and turn on the 60W light, about two-thirds of that increase is because of you!

I liked it back when I could heat my house in winter with 4 light bulbs :(
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
No, the 5870 actually is heating up your room. Put your hand behind your computer; what's that you feel? Hot air. A computer runs all day long outputting heat, while a heater runs in short bursts topping up the room's heat. Thus a computer adds a hell of a lot of heat to the room. Add in a dedicated video card and it's a lot more.

Your room radiates heat all day too, to other rooms, to furniture, to the floors, to the ground and to the outside air.

I didn't say it doesn't heat up your room, I said it wasn't significant.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
I liked it back when I could heat my house in winter with 4 light bulbs :(

You can! You'll just have to live in a very small house.

I actually can't find any figures for how much energy the GTX480 outputs as heat, so if anybody has that figure it'd be interesting. But assuming a worst-case scenario of 100%, that maxes out at what, 250W or so? Enough to heat a small confined space under a desk, certainly, but it would do very little to the temperature of the room.
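
One hedged way to ballpark the steady-state effect is delta_T ~ P / (U * A), treating the room as a lumped thermal mass that loses heat through its envelope; the U-value and loss area below are illustrative guesses, not measured figures:

```python
# Very rough steady-state estimate: delta_T ~ P / (U * A).
# U and A are illustrative guesses for an ordinary room, not measurements.
GPU_HEAT_W = 250     # assumed worst case from the post above
U_W_PER_M2K = 2.5    # assumed overall heat-loss coefficient of the room envelope
LOSS_AREA_M2 = 80    # assumed wall + ceiling + window area losing heat

delta_t_k = GPU_HEAT_W / (U_W_PER_M2K * LOSS_AREA_M2)
print(f"Steady-state rise over the surroundings: ~{delta_t_k:.1f} K (~{delta_t_k * 1.8:.1f} F)")
```

With those guesses the GPU buys you only a degree or two at equilibrium, in line with the "very little" above; a tiny, well-sealed room would fare worse.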
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
You can! You'll just have to live in a very small house.

I actually can't find any figures for how much energy the GTX480 outputs as heat, so if anybody has that figure it'd be interesting. But assuming a worst-case scenario of 100%, that maxes out at what, 250W or so? Enough to heat a small confined space under a desk, certainly, but it would do very little to the temperature of the room.

FYI, it's very close to 100%
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Wait, what? That seems somewhat unlikely, let's run the math.

The specific heat capacity of dry air in a room is about 1.006 J/g per degree Kelvin at room temperature. The density of the air is about 1.204 kg/cubic metre.

Assuming a "large room" has dimensions of 5m x 5m x 2m, that's 50 cubic metres of air. This air weighs in at 50*1.204 = 60.2 kg, and we want to increase the temperature of the air by about 2.8 K (5F). It will take 1.006 * 60200 * 2.8 ~= 169 kJ to do so.

1 Joule = 1 watt-second. A 60 W incandescent light bulb only radiates about 90% of its output as heat, so if you turn it on for a second you generate a mere 60 joules of energy, only 54 of which is thermal energy. If you leave it on for about 52 minutes, and thermodynamics otherwise decides to go for its lunch break, then yes, you can heat a large room with a 60 W bulb. In reality, that won't happen, for the reasons EarthwormJim cited.

In contrast, a human body radiates about 100 J of heat each second. So if you observe a temperature increase when you walk into the room and turn on the 60W light, about two-thirds of that increase is because of you!


LOL, you must be a teacher!...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Forget about heating up your room. It's like putting a 100W light bulb in your case and turning it on (the difference between a 5870 and a 480). Anyone here familiar with the "Easy Bake Oven"?
 

jbh545

Member
Jun 10, 2008
45
0
0
Gaming in general does heat up a room during an extended session. It's definitely noticeable during an all-night BFBC2 marathon. However, the difference between GTX 480 SLI and 5870+5970 trifire isn't that much.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
In summer months, when the heat has nowhere to go, I wouldn't be surprised if you see 5F more in a room with a GTX480 than in a room without it.
 

Ares202

Senior member
Jun 3, 2007
331
0
71
How efficient is the GTX 480?

10% efficiency would mean 1/10 of the energy is turned into pixel pushing, whereas 90% is turned into heat. Surely it's more efficient than a standard light bulb? As the TDP is around 280 watts, I'd expect it to be around 55-60% efficient, making it output around 115 watts of heat.

If your room is already 75F and it increases to 85F you'll really feel it, that's for sure, but it's complex to calculate the actual effect due to the number of variables (room size, surrounding room temps, outside air temps, room insulation, etc.).
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
How efficient is the GTX 480?

10% efficiency would mean 1/10 of the energy is turned into pixel pushing, whereas 90% is turned into heat. Surely it's more efficient than a standard light bulb? As the TDP is around 280 watts, I'd expect it to be around 55-60% efficient, making it output around 115 watts of heat.

If your room is already 75F and it increases to 85F you'll really feel it, that's for sure, but it's complex to calculate the actual effect due to the number of variables (room size, surrounding room temps, outside air temps, room insulation, etc.).

No, "pixel-pushing" is a colloquial term... No work is being done. Well, theres ~5-10w used by the fan but thats usually ignored. So essentially 100% of the power used goes directly to heat, which means up to 300w depending on what you're doing. Make no mistake, if a card uses XX power, it creates XX heat
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
5 degrees Fahrenheit... OMG! Call the fire department. :D
That's practically nothing.

I remember English isn't your first language, so you probably use Celsius, but 5F can be the difference between "I think I'll put on a sweater" and "It's kinda warm in here".

Although for 60W to heat a room up by 5F, it would have to be a fairly small room (like a bedroom or office) with the door shut.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
How efficient is the GTX 480?
That is the big question; I wish I could find some useful numbers. What is the typical thermal efficiency of a GPU? I.e., if we were actually trying to use it to heat a room, how much of its electrical input would be converted into heat?

TuxDave said the thermal efficiency is near 100% and he might well be correct, but that seems off to me simply because it's in the same realm as electric heating elements (and electrical fires). Elements are specifically designed to be thermally efficient; we want them to output heat. Computing devices of all sorts and brands are specifically designed to output as little waste heat as possible; we want them to be electrically efficient and thermally inefficient. Maybe that's just the best we can do at this point, and we'd lower it if we possibly could... what do I know about creating GPUs. :D

Unfortunately, with respect to GPUs, efficiency frequently seems to be confused with output power, or with conserving input power. (Possibly because the efficiency nears 100%?) But is the GTX480 really more thermally efficient (less electrically efficient) than other GPUs? Maybe, but I suspect not significantly. It draws more input power and unsurprisingly dissipates more heat -- likely at a very similar efficiency to other GPUs.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
That is the big question; I wish I could find some useful numbers. What is the typical thermal efficiency of a GPU? I.e., if we were actually trying to use it to heat a room, how much of its electrical input would be converted into heat?

TuxDave said the thermal efficiency is near 100% and he might well be correct, but that seems off to me simply because it's in the same realm as electric heating elements (and electrical fires). Elements are specifically designed to be thermally efficient; we want them to output heat. Computing devices of all sorts and brands are specifically designed to output as little waste heat as possible; we want them to be electrically efficient and thermally inefficient. Maybe that's just the best we can do at this point, and we'd lower it if we possibly could... what do I know about creating GPUs. :D

Unfortunately, with respect to GPUs, efficiency frequently seems to be confused with output power, or with conserving input power. (Possibly because the efficiency nears 100%?) But is the GTX480 really more thermally efficient (less electrically efficient) than other GPUs? Maybe, but I suspect not significantly. It draws more input power and unsurprisingly dissipates more heat -- likely at a very similar efficiency to other GPUs.

No, they are 0% thermally efficient (100% was the right idea in terms of absolutes, but the wrong end of the scale), because they do zero work.

You don't get a "thermally efficient" GPU by limiting heat output; you do it by getting more "digital work" (flipping bits) done per watt.

Ignoring fans (which convert most of their power to work done) and optical/magnetic drives (which convert some to work done), because they are a small minority of the power drawn by a computer, the computer as a whole is 0% efficient, meaning it converts all of its energy to heat in the end.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
No, they are 0% thermally efficient (100% was the right idea in terms of absolutes, but the wrong end of the scale), because they do zero work.
Pretend you actually want heat for a moment, and you'll see that can't be correct. ;)

0% thermal efficiency means a device outputs none of its energy as heat. (My god, it's all being wasted on silly things like graphics!) I think you mean to say they are electrically inefficient, because most of their output is wasted (output as heat, which in fact we don't want).
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Pretend you actually want heat for a moment, and you'll see that can't be correct. ;)

0% thermal efficiency means a device outputs none of its energy as heat. (My god, it's all being wasted on silly things like graphics!) I think you mean to say they are electrically inefficient, because most of their output is wasted (output as heat, which in fact we don't want).

Thermal efficiency is the relation between the energy stored within the "fuel" (electricity here) and the work done.

If a system takes 100 widgets worth of "fuel", does 25 widgets worth of work, and creates 75 widgets of heat, it is 25% thermally efficient. If a system takes 100 widgets of "fuel", does 0 widgets of work, and outputs 100 widgets of heat, it is 0% thermally efficient.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
Thermal efficiency is the relation between the energy stored within the "fuel" (electricity here) and the work done.
Sorry for wandering OT a bit folks, and emphasis mine.

If you remove the word "thermal" from the statement above, it is correct for efficiency in general. However, efficiency is a relative term based on the energy output you consider to be work and the energy output you consider to be waste. Your understanding of "thermal efficiency" is therefore reversed, likely due to mainstream marketing's incorrect usage. Thermally efficient devices are those that are good at outputting heat energy.

When you're talking about thermal efficiency, heat is the work being done. It isn't waste. For example, gas furnaces are 90% thermally efficient. Electric elements are 100% thermally efficient. It has been asserted that GPUs are close to 100% thermally efficient.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Sorry for wandering OT a bit folks, and emphasis mine.

If you remove the word "thermal" from the statement above, it is correct for efficiency in general. However, efficiency is a relative term based on the energy output you consider to be work and the energy output you consider to be waste. Your understanding of "thermal efficiency" is therefore reversed, likely due to mainstream marketing's incorrect usage. Thermally efficient devices are those that are good at outputting heat energy.

When you're talking about thermal efficiency, heat is the work being done. It isn't waste. For example, gas furnaces are 90% thermally efficient. Electric elements are 100% thermally efficient. It has been asserted that GPUs are close to 100% thermally efficient.

......


Heaters and their thermal efficiency are not relatable to GPUs. A heater's intent is to create heat, and it is very good at that. But that concept cannot be carried over to other systems.

A car engine converts roughly 25% of the energy stored in its fuel to mechanical work, and ~75% to heat. It is ~25% thermally efficient.

Thermal efficiency is the ratio of the usable work output of a system to the energy input. Since GPUs do no usable work, they are 0% thermally efficient, because heat generation is not the purpose of the system.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
I hope we don't go down this path again (we've had this argument before and I had to ask Idontcare to come explain it since some people wouldn't believe me lol)...computer components like GPUs and CPUs dissipate essentially all the power they use as heat.
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
Heaters and their thermal efficiency are not relatable to GPUs. A heater's intent is to create heat, and it is very good at that. But that concept cannot be carried over to other systems.

A car engine converts roughly 25% of the energy stored in its fuel to mechanical work, and ~75% to heat. It is ~25% thermally efficient.

Thermal efficiency is the ratio of the usable work output of a system to the energy input. Since GPUs do no usable work, they are 0% thermally efficient, because heat generation is not the purpose of the system.
The same laws of thermodynamics apply to all systems, so I'm not sure why you think we can't relate the two. Who are we to tell people what to do with their car engines? :D

I'm not trying to be obtuse here, really. The only issue I'm having is that in your concluding statement you keep adding the word "thermally" in front of "efficient" when you should not. It's fine to take some shorthand and simply say the car engine is 25% efficient, because it is 25% mechanically efficient and that is a car engine's primary purpose -- outputting mechanical energy.

But if you wanted to use the car engine to cook breakfast on the engine block, it is clearly 75% efficient at producing heat! Heat suddenly is the work being done, and the engine is really quite good at converting fuel into heat. Thus, car engines do not become thermally inefficient simply because we elect to focus on their ability to move cars instead of their ability to produce heat. The percents don't change. (It's also incorrect to claim the egg-cooking car engine is suddenly 0% mechanically efficient...unless you get eggs in the engine or something.)
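
A minimal sketch of that point, using the illustrative 25%/75% split already discussed: the same energy budget gives a different "efficiency" figure depending purely on which output stream you choose to count as useful.

```python
# Same engine, two notions of "useful output" (illustrative 25/75 split from above).
def efficiency(useful_output_j: float, energy_input_j: float) -> float:
    """Efficiency relative to whichever output we decide to call useful."""
    return useful_output_j / energy_input_j

fuel_input_j = 100.0
mechanical_output_j = 25.0  # energy that moves the car
heat_output_j = 75.0        # energy that warms the block (and the breakfast)

print(f"As a motor:  {efficiency(mechanical_output_j, fuel_input_j):.0%}")
print(f"As a heater: {efficiency(heat_output_j, fuel_input_j):.0%}")
```

Same energy split, two different figures; only the label on the output changes.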

Anyway, if someone can point me in the direction of a test showing the electrical input power of a GPU roughly equaling the heat output, I'd appreciate a PM. People are getting tired of this rather argumentative tangent so I'm done posting on the subject.