The full truth revealed: GTX480 is one power sucking monster --Fudzilla


yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
The same laws of thermodynamics apply to all systems, so I'm not sure why you think we can't relate the two. Who are we to tell people what to do with their car engines? :D

I'm not trying to be obtuse here, really. The only issue I'm having is that in your concluding statement you keep adding the word "thermally" in front of "efficient" when you should not. It's fine to take some shorthand and simply say the car engine is 25% efficient, because it is 25% mechanically efficient and that is a car engine's primary purpose -- outputting mechanical energy.

But if you wanted to use the car engine to cook breakfast on the engine block, it is clearly 75% efficient at producing heat! Heat suddenly is the work being done, and the engine is really quite good at converting fuel into heat. Thus, car engines do not become thermally inefficient simply because we elect to focus on their ability to move cars instead of their ability to produce heat. The percents don't change. (It's also incorrect to claim the egg-cooking car engine is suddenly 0% mechanically efficient...unless you get eggs in the engine or something.)

Anyway, if someone can point me in the direction of a test showing the electrical input power of a GPU roughly equaling the heat output, I'd appreciate a PM. People are getting tired of this rather argumentative tangent so I'm done posting on the subject.

You're reading too much into your elementary understanding of the word "thermal" and not enough into the strict definition of thermal efficiency. To be completely honest, I don't know how to explain it to you any better, other than to just say "sorry, that's not how it works."

Thermal efficiency is a relative term. The thermal efficiency of a car engine doing mechanical work, as intended, is ~25%. The thermal efficiency of a car engine running in your garage to heat it up in the winter because it's unheated is closer to 75%. But those are (in terms of TE) two distinct and separate systems: one, a mechanical engine, and two, a heater. Just because it is technically the same object doesn't mean you can take its TE when it's used as a heater and apply it when it's used as a mechanical engine.

Now, if you were to rig up a car to have the engine drive the car and also gather some of the waste heat produced by the engine to run a steam engine, the TE of the system would theoretically shoot up, because you're using the engine as a heat engine (~75% TE) and a mechanical engine (~25% TE) at the same time, and getting a lot more work done with the same amount of input (gasoline).
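A rough back-of-envelope version of that combined-use idea, using the ~25%/75% split from above. The 20% figure for how much of the waste heat a bottoming steam cycle could turn back into work is purely an assumed number for illustration:

```python
# Illustrative only: the 25%/75% split is from the post above; the 20% bottoming-cycle
# efficiency is an assumed number, not a measured one.
fuel_in = 100.0                            # energy in the gasoline, arbitrary units

mech_eff = 0.25                            # ~25% of fuel energy becomes shaft work
waste_heat = fuel_in * (1.0 - mech_eff)    # the other ~75% leaves as heat

bottoming_eff = 0.20                       # assumed: steam cycle recovers 20% of that heat as work
extra_work = waste_heat * bottoming_eff

total_work = fuel_in * mech_eff + extra_work
print(f"combined efficiency: {total_work / fuel_in:.0%}")   # -> 40%
```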


I can understand how GPUs are confusing, though. Once again excluding the fan for simplicity, they do no usable work from their input. We are also ignoring the pushing around of a few trillion electrons and the switching of nm-scale transistors, as those are negligible.

If a system does no usable work for its intended purpose from its input energy, it is 0% TE for that task. That's why a GPU used as a GPU is 0% TE. When you use a GPU as a heater, it is nearing 100% TE, and that's where the confusion is coming from.


And it's pretty ridiculous to ask for a test showing what percentage of the input power to a GPU is converted into heat; it would be nigh-impossible to do. To isolate a GPU from the other components completely in an airtight chamber and accurately measure how rapidly it heats up (from which you can extract the amount of heat being dumped into the known volume) would be quite infeasible.
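For what it's worth, the arithmetic behind such a calorimetry test is simple even if the isolation is the hard part. A minimal sketch with made-up chamber numbers, ignoring the heat soaked up by the card, heatsink, and chamber walls (which is exactly what makes it infeasible in practice):

```python
# Back-of-envelope calorimetry: every number here is an assumption for illustration.
AIR_DENSITY = 1.2            # kg/m^3 at roughly room conditions
AIR_SPECIFIC_HEAT = 1005.0   # J/(kg*K)

chamber_volume = 0.05        # m^3, assumed sealed, insulated chamber around the card
temp_rise = 10.0             # K of air temperature rise measured over the interval
interval = 2.0               # seconds

air_mass = AIR_DENSITY * chamber_volume                    # ~0.06 kg of air
heat_absorbed = air_mass * AIR_SPECIFIC_HEAT * temp_rise   # joules taken up by the air
print(f"implied heat output: {heat_absorbed / interval:.0f} W")   # ~300 W
```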
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Hmm, I'd like to relay what I read in Wikipedia's thermal efficiency and electrical efficiency articles, but I think it best to let you guys read them yourselves. I think yh125d is right here: thermal efficiency seems to be a very specific term that takes into account whether you want heat or are dealing with a heat engine, as opposed to other efficiencies, which are named for whatever output you want to be efficient at.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Anyway, if someone can point me in the direction of a test showing the electrical input power of a GPU roughly equaling the heat output, I'd appreciate a PM. People are getting tired of this rather argumentative tangent so I'm done posting on the subject.

There is no mechanical work being done, so essentially all the power is being converted to heat. If I had seen that kind of test I would share the details. Probably only CPU/GPU manufacturers do those kinds of tests... IF they are done at all.
 

epidemis

Senior member
Jun 6, 2007
794
0
0
Oh noes it uses a lot of power, whatever shall we do. Oh nothing cause it does not matter.

Think of the bleeping polarbear!!!111 and algore

There is no mechanical work being done, so essentially all the power is being converted to heat. If I had seen that kind of test I would share the details. Probably only CPU/GPU manufacturers do those kinds of tests... IF they are done at all.

Omg, fail@thermodynamics, it doesnt matter if work is being done or not! laflz
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Negative, and it only becomes really noticeable with games that really load up the GPU. STO is well documented to *really* heat up gfx cards from both ATI and Nvidia more so than most other games. Really gets the fan going too. Pretty much the worst thing I've come across save Furmark obviously.

Agreed. The heat generated by a computer can be uncomfortable in a small room. It's not noticeable if you just play for 30 minutes, but you'll start to feel the warmth and stuffy air after maybe 2 hours. It's not just the area under the desk; the whole room gets warm and very stuffy. Opening the windows makes it worse during summer. It's definitely not a feeling from the stress of gaming (seriously, gaming is not stressful), because it's something I can clearly feel when I walk out of the room, and besides, my wife confirms it.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Think of the bleeping polarbear!!!111 and algore



Omg, fail@thermodynamics, it doesnt matter if work is being done or not! laflz
Yes, it does matter. Quite a bit, in fact. Energy doesn't actually get "consumed", it simply gets converted. In the case of a light bulb, it gets converted to light and heat. In the case of a ceiling fan, almost all of the energy used is converted to mechanical energy to move the air, while a little is converted to heat within the motor itself. In the case of a video card, almost all of the energy gets converted to heat, while a little is used by the cooling fan.

A GTX 480 is rated at somewhere between 250W and 300W (approximately). The fan is a 1.8A Delta. At 100% fan speed, that equals around 22 watts of power. Now, I highly doubt anyone will be running their fan at 100% all the time, so let's say 40%. That gives us roughly 9 watts used by the fan. All of the rest of the power gets converted to heat by the GPU.

When people say that a GTX 480 under load is using 250W to 300W, approximately 240W to 290W of that is being converted directly to heat. Other than the fan, there is nothing on a video card that can convert electricity to anything other than heat (no light, no radiation, no sound, etc).

So think about how much heat a 250W or 300W light bulb would give off and you have a fairly good idea of how much heat a GTX 480 gives off under load. It's not an inconsiderable amount and will eventually heat up a small enclosed room. Even with my HD4890 and Q6600 I find myself having to open a window after a couple-hour session of gaming.
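A quick sketch of the arithmetic in this post. The 12V rail and the linear scaling of fan power with the 40% speed figure are this post's simplifications (a reply further down argues real fan draw is even lower), and the 275W board power is just the middle of the quoted range:

```python
# Inputs follow the post above: 1.8 A fan rating, assumed 12 V rail, assumed 40% fan speed.
board_power = 275.0     # W, middle of the quoted 250-300 W range
fan_current = 1.8       # A, Delta fan rating at full speed
fan_voltage = 12.0      # V, assumed supply rail
fan_fraction = 0.40     # assumed typical fan speed, scaled linearly as in the post

fan_power = fan_current * fan_voltage * fan_fraction   # ~9 W
heat_output = board_power - fan_power                  # everything else ends up as heat
print(f"fan: ~{fan_power:.0f} W, heat: ~{heat_output:.0f} W")
```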
 

pmv

Lifer
May 30, 2008
15,142
10,043
136
When people say that a GTX 480 under load is using 250W to 300W, approximately 240W to 290W of that is being converted directly to heat. Other than the fan, there is nothing on a video card that can convert electricity to anything other than heat (no light, no radiation, no sound, etc).

True, but still too conservative - even if the electricity were being converted to something else (light or sound etc) that would likely end up as heat. Pretty much everything ends up as heat. And most of it will be heat in the room. Unless it gave off gamma or x-rays! Those would end up as heat a long way away, I guess. Though if your video card is giving off gamma rays you probably have other things to worry about than heat.
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
Think of the bleeping polarbear!!!111 and algore



Omg, fail@thermodynamics, it doesnt matter if work is being done or not! laflz

Well that's an odd response, and also wrong.

True, but still too conservative - even if the electricity were being converted to something else (light or sound etc) that would likely end up as heat. Pretty much everything ends up as heat. And most of it will be heat in the room. Unless it gave off gamma or x-rays! Those would end up as heat a long way away, I guess. Though if your video card is giving off gamma rays you probably have other things to worry about than heat.


It's common to look at everything within a closed system. Engineers really don't care what eventually happens to energy at the end of time...
 

Havoc Ebonlore

Junior Member
May 21, 2008
24
0
0
My computer heats my room up enough as it is, and I have an 8800GTS 512. Sry, but heatwave fail is still heatwave fail. Game over, do not insert coin to continue.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I just read on Engadget that 3-way SLI for the 480s requires a 1200W PSU

That's just nuts.

Seriously, Nvidia should have waited for another respin or lowered the specs (and price) so power, heat, and noise weren't so bad. To me it looks like power draw goes up exponentially with temperature as well. It's like Nvidia is selling overclocked cards with the GTX 480.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Seriously, Nvidia should have waited for another respin or lowered the specs (and price) so power, heat, and noise weren't so bad. To me it looks like power draw goes up exponentially with temperature as well. It's like Nvidia is selling overclocked cards with the GTX 480.

It's because they are using high-leakage chips, hence you can expect a lot of leakage loss, i.e. higher than normal power consumption. It's another reason why they overclock so well too. Noise isn't too much of a problem, and this can be seen in various threads (in different forums) with users that own these cards. Noise isn't an issue for most of them because it's not as loud as the people who don't own the cards make it out to be.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
It's because they are using high-leakage chips, hence you can expect a lot of leakage loss, i.e. higher than normal power consumption. It's another reason why they overclock so well too. Noise isn't too much of a problem, and this can be seen in various threads (in different forums) with users that own these cards. Noise isn't an issue for most of them because it's not as loud as the people who don't own the cards make it out to be.

Didn't know leakage made a card OC well. I would have gone for available bandwidth.

Does anyone well known on this forum own the damn card? I thought they all went to the PR guys.

Many well-regarded reviewers have said noise and heat are an issue...
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I just read on Engadget that 3-way SLI for the 480s requires a 1200W PSU

That's just nuts.

If you are willing to cough up $1500 for the best-performing combination of cards you can get... a 1200W PSU isn't exactly nuts to you.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Didn't know leakage made a card OC well. I would have gone for available bandwidth.

Does anyone well known on this forum own the damn card? I thought they all went to the PR guys.

Many well-regarded reviewers have said noise and heat are an issue...


Remember the TWKR chip from AMD made for overclocking? That CPU was a high-leakage version of the Phenom II.

Look at the nvnews.net, XtremeSystems, HardOCP, and EVGA forums, etc.; even here at AT there are people who own these cards. What amazes me is the number of forum users buying SLI configurations.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
If you are willing to cough up $1500 for the best-performing combination of cards you can get... a 1200W PSU isn't exactly nuts to you.

True. The upfront cost is high, but selling off the cards later on makes it viable. Though Tri-480 is still a waste of money IMO.
 

Schmide

Diamond Member
Mar 7, 2002
5,755
1,048
126
Let's disregard chemical energy (car engines) and plasmas, since those are a whole different class of efficiencies.

In terms of conservation of energy and electricity, you basically have 3 factors that cause energy to move from one state to another.

A) Electromagnetism - (e.g. a fan) current flowing through coils in a magnetic field gets converted into mechanical work.

B) Incandescence (Electroluminescence) - as the current passes through the medium it is excited enough for light photons to escape.

C) Resistance - as the current passes through the medium it is only excited enough for heat photons to escape.

B & C are actually related, as light and heat are both types of radiation; it is really a matter of how much excitation there is.

So basically A+B+C=energy used. You can pretty much discount A and B so you're left with C.

I guess if a chip uses enough energy it could glow and produce a small amount of B. There are some power-conditioning parts that use magnetic fields to adjust the flow of electrons, so you could make a case for A (VRM squeal?).
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The fan is a 1.8A Delta. At 100% fan speed, that equals around 22 watts of power. Now, I highly doubt anyone will be running their fan at 100% all the time, so let's say 40%. That gives us roughly 9 watts used by the fan. All of the rest of the power gets converted to heat by the GPU.

It doesn't work like that. The fan may consume 22W when ramping up to 100% fan speed, but once it gets there the power consumption will go down, because the motor doesn't require 1.8A once it's reached the desired speed. This is just the nature of a DC motor. So the power consumption of fans is quite negligible, unless 1~4W is important to you (it could be even less).
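One way to put numbers on that: a common rule of thumb for fans is that power draw scales roughly with the cube of speed (the cube law is my addition here, not something stated above), so a fan running well below full speed only draws a few watts:

```python
# Fan affinity rule of thumb: power ~ speed^3. The 21.6 W full-speed figure assumes
# the 1.8 A rating on a 12 V rail; the cube law is an approximation, not a measurement.
full_speed_power = 1.8 * 12.0   # ~21.6 W at 100% speed

for speed in (1.0, 0.6, 0.4):
    power = full_speed_power * speed ** 3
    print(f"{speed:.0%} speed -> ~{power:.1f} W")
# 100% -> 21.6 W, 60% -> ~4.7 W, 40% -> ~1.4 W, in line with the 1-4 W figure above
```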
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
How do some of you handle it when the sun comes up? Seriously, people, I just got finished running 480 SLI in a Unigine loop for 12 hours in an 8x12 room in my basement with the windows closed. Spent most of the day yesterday at the hospital visiting someone. Got home at 8pm. Went downstairs to check on the rig. Still running. I noticed the room was a touch warmer than the adjacent room, but I assure you all, I did not burst into flames walking into that room. Did not feel compelled to open the windows before suffocation.
Stop blowing this out of proportion, folks. It's getting really ridiculous now. Talking light bulbs and such nonsense. I can easily tell you that even a single GTX470/480/5850/5870, whatever, will increase ambient temps in a room over long periods of time.
But I'm talking like 2-3 degrees. That's with hours and hours on end of gaming.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
My computer heats my room up enough as it is, and I have an 8800GTS 512. Sry, but heatwave fail is still heatwave fail. Game over, do not insert coin to continue.

Bullcrap. I have 2 computers in the room and two 24-inch monitors. One has a GTX 260 Core 216 and the other an 8800 GTS, and it's never that hot in here. Perhaps stop putting wood in the fireplace.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
How do some of you handle it when the sun comes up?
The sun doesn't sound like a jet engine. Yeah, some of the power arguments are ridiculous, but all that extra power translates into higher noise levels. This can be objectively proven and cannot be debated away:

http://www.behardware.com/articles/747/page1.html

At load, a single GTX480 is louder than either a 5970 or a 5870 crossfire. Its load noise levels are not comparable to a single 5870 by any stretch of the imagination. We can see objective measurements that back these claims.

Those that try to brush away this fact with comments like "you can hear both cards under load" are simply being disingenuous.

GTX480 SLI is off the charts, being described as "really too noisy in load and disturbing at idle". The objective measurements again back these claims.

If I got the cards for free then I might be more tolerant of their noise levels, but if I’m paying for them, it's unacceptable. A GTX480 is not faster than a 5970 or a 5870 CF, so why is it louder?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
How do some of you handle it when the sun comes up? Seriously, people, I just got finished running 480 SLI in a Unigine loop for 12 hours in an 8x12 room in my basement with the windows closed. Spent most of the day yesterday at the hospital visiting someone. Got home at 8pm. Went downstairs to check on the rig. Still running. I noticed the room was a touch warmer than the adjacent room, but I assure you all, I did not burst into flames walking into that room. Did not feel compelled to open the windows before suffocation.
Stop blowing this out of proportion, folks. It's getting really ridiculous now. Talking light bulbs and such nonsense. I can easily tell you that even a single GTX470/480/5850/5870, whatever, will increase ambient temps in a room over long periods of time.
But I'm talking like 2-3 degrees. That's with hours and hours on end of gaming.
It's so nice that you can easily take into account all different sizes of rooms, varying thicknesses of insulation in the walls, ceiling heights, room locations, etc., and come to the inescapable conclusion that everybody is "blowing this out of proportion" and that "It's getting really ridiculous now".

I noticed that you said you have your setup in your basement. Tell me, does it usually seem cooler in your basement than upstairs? Isn't it likely that the concrete blocks and cement floor act as heatsinks and transfer ambient heat in your basement directly to the colder underground soil? You don't think that could possibly have anything to do with your relatively stable room temps despite dumping an extra 700W of heat into the room? (BTW, 700W is approaching space heater levels of heat production)

http://www.heater-store.com/reflective_heater_soleus_ms_09_46_prd1.htm
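To put a rough number on why 700W matters, here's a crude upper-bound estimate of how fast that much heat would warm the air in an 8x12 room if none of it escaped. In reality the walls, floor, furniture, and ventilation soak up almost all of it (which is exactly the basement-vs-upstairs point), so the real rise is far smaller; the ceiling height and air properties below are assumed:

```python
# Crude upper bound: 700 W dumped into the air of an 8 ft x 12 ft room with no heat loss.
# Room height, air properties, and the no-loss assumption are all illustrative.
FT_TO_M = 0.3048
AIR_DENSITY = 1.2            # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0   # J/(kg*K)

room_volume = (8 * FT_TO_M) * (12 * FT_TO_M) * (8 * FT_TO_M)   # ~22 m^3
air_mass = AIR_DENSITY * room_volume                            # ~26 kg of air

heat_input = 700.0                                              # W from the PC
warming_rate = heat_input / (air_mass * AIR_SPECIFIC_HEAT)      # K per second
print(f"~{warming_rate * 3600:.0f} C/hour if the room lost no heat at all")
```

The number comes out huge precisely because air alone holds very little heat; how fast a real room warms up depends almost entirely on how well its walls and ventilation carry that heat away.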

Not everybody games in their basement. My gaming room is roughly 8x12 as well, but it is upstairs. I usually have the ceiling fan on when gaming, and yet I still find myself having to open a window after gaming for a couple hours because it begins to get uncomfortably warm. I probably wouldn't have to open a window if I were able to leave the door open, but my wife is a light sleeper, so I keep the door closed so I don't wake her.

Try to realize that not everybody has an identical setup and that while you may not have heat issues when gaming, others actually might.