Rumor: GTX 480 TDP is actually 250w


Schmide

Diamond Member
Mar 7, 2002
5,788
1,093
126
As a guy who runs 2 GTX 280s on a 750w PSU, I can tell you that TDP does not equal "normal use."

These are basic definitions.

And you're skewing them. TDP is a figure they use to design the cooling system around, and it has little to do with actual power usage at various load levels.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
And you're skewing them. TDP is a figure they use to design the cooling system around, and it has little to do with actual power usage at various load levels.

The cooling system has to be able to dissipate the heat at any load.
 

Schmide

Diamond Member
Mar 7, 2002
5,788
1,093
126
The cooling system has to be able to dissipate the heat at any load.

If that were the case, why do all modern chips have a throttle feature? In layman's terms: a system has to maintain a balance between the work being done and the environment; if there is too much work, or the environment is crappy, heat can exceed what the cooling was designed for.
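Rough sketch of what I mean, in Python. The constants and the control logic are invented purely for illustration; this is not any vendor's actual throttle algorithm.

```python
# Toy model of thermal throttling. All numbers are made up for illustration.
TEMP_LIMIT_C = 95.0       # hypothetical throttle threshold
AMBIENT_C = 30.0          # the "environment" in the post above
COOLING_W_PER_C = 2.0     # watts the cooler sheds per degree above ambient

def throttle_step(power_w, temp_c, clock_mhz):
    """One control step: if heat in exceeds heat out, temperature rises;
    once the limit is crossed, the chip drops its clock (throttles)."""
    heat_out_w = COOLING_W_PER_C * (temp_c - AMBIENT_C)
    temp_c += 0.05 * (power_w - heat_out_w)   # crude thermal response
    if temp_c > TEMP_LIMIT_C:
        clock_mhz *= 0.9                      # back the clock off 10%
        power_w *= 0.9                        # less work -> less heat
    return power_w, temp_c, clock_mhz

# A load heavier than the cooler was designed for eventually forces a throttle.
power, temp, clock = 200.0, 60.0, 700.0
for _ in range(200):
    power, temp, clock = throttle_step(power, temp, clock)
print(f"settled around {power:.0f} W, {temp:.0f} C, {clock:.0f} MHz")
```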
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
Sorry, you are incorrect.

http://en.wikipedia.org/wiki/Thermal_design_power

"The thermal design power (TDP), sometimes called thermal design point, represents the maximum amount of power the cooling system in a computer is required to dissipate."

No, it is correct. You should read what you linked. TDP is not the absolute maximum power draw, which is partially why systems are designed to throttle back to prevent overheating.

TDP is a specification for OEMs: to be in spec, they are required to dissipate, at maximum, that amount of heat under reasonable/typical loads. However, a chip can put out more heat than the TDP; LinX is one example on CPUs, as is FurMark on GPUs.

It would not be realistic to require OEMs to design their computers around the absolute maximum theoretical power output of a component.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
No, it is correct. You should read what you linked. TDP is not the absolute maximum power draw, which is partially why systems are designed to throttle back to prevent overheating.

TDP is a specification for OEMs: to be in spec, they are required to dissipate, at maximum, that amount of heat under reasonable/typical loads. However, a chip can put out more heat than the TDP; LinX is one example on CPUs, as is FurMark on GPUs.

It would not be realistic to require OEMs to design their computers around the absolute maximum theoretical power output of a component.
I don't know if this matters, but my 65 W E8500, even at 3.8 GHz, only pulls around 40-45 W loaded during a CPU stress test. My overclocked GTX 260 most certainly doesn't use anywhere near its TDP, even in FurMark. It's been my experience that components never get close to their TDP. Maybe only with some extreme volts, putting a component within an inch of its life, would the actual consumption match or exceed its TDP.
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
I don't know if this matters, but my 65 W E8500, even at 3.8 GHz, only pulls around 40-45 W loaded during a CPU stress test. My overclocked GTX 260 most certainly doesn't use anywhere near its TDP, even in FurMark. It's been my experience that components never get close to their TDP. Maybe only with some extreme volts, putting a component within an inch of its life, would the actual consumption match or exceed its TDP.

It really just depends on how it is tested and what load the manufacturer decides to use.

TDP is also assigned based on processor families, and not necessarily on individual processors. So it can be overestimated or underestimated.

RV770 cards will exceed their TDP spec in FurMark. Their VRM circuitry was designed around that TDP and can fail if it is exceeded.

How are you measuring that wattage, btw? If it's from software on your motherboard, I wouldn't necessarily trust it. Un-overclocked, my Core i7 says it draws ~100 W in software. When I overclocked it to 4 GHz, the software reported that it was drawing 40 W at full load...
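For a sanity check on software readings: the usual first-order approximation for dynamic CPU power is P ~ C x V^2 x f, so power should go up with an overclock, not down. Quick sketch below; the stock clock and both voltages are assumed round numbers for illustration, not measurements of any particular chip.

```python
# Back-of-the-envelope check using the standard dynamic-power approximation
# P ~ C * V^2 * f. The stock clock/voltages below are assumptions, not data.

def scaled_power(p_stock_w, f_stock_ghz, v_stock, f_oc_ghz, v_oc):
    """Estimate overclocked power from stock power via P ~ V^2 * f."""
    return p_stock_w * (f_oc_ghz / f_stock_ghz) * (v_oc / v_stock) ** 2

# e.g. a Core i7 at an assumed 2.66 GHz / 1.20 V reading ~100 W under load,
# pushed to 4.0 GHz at an assumed 1.35 V:
print(round(scaled_power(100, 2.66, 1.20, 4.0, 1.35)))  # ~190 W, nowhere near 40 W
```

If the overclocked reading comes back lower than stock, the software or the sensor is almost certainly wrong.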
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
However, a chip can put out more heat than the TDP; LinX is one example on CPUs, as is FurMark on GPUs.

Ya, the problem is that FurMark, OCCT, LinX, and Prime95/Orthos have nothing to do with real-world usage.

I used to test all my overclocked CPUs with Prime95 and Orthos to monitor the maximum CPU temperatures at load. However, I've never encountered a single application that taxed my CPU as much, so now I use them strictly for stability testing and not temperature testing. Instead, I use the most intensive programs that actually run on my system to test maximum temperatures, because that testing 100% applies to how I will use the system. At least for CPUs, those temps generally come within +/-5°C of the stress-testing results. For GPUs, though, using stress-testing programs to gauge maximum power usage is a complete waste of time.

Look at this:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_3.html#sect0

5970 - Crysis Warhead - 240 Watts
5970 - TDP - 294 Watts
5970 - OCCT: GPU - 355 Watts

GTX295 - Crysis Warhead - 312 Watts
GTX295 - TDP - 289 Watts
GTX295 - OCCT: GPU - 400 Watts

OCCT puts a completely unrealistic load on all of the components of the graphics card. There is little point in testing the "maximum" power consumption of a videocard, since you'll never encounter it in your intended use. Also, based on the results above, arguing over TDP numbers is like car buyers arguing over manufacturer MPG specs instead of real-world fuel consumption.

According to these results, a GTX 295 is expected to run at a similar load in games as a 5970, yet it uses at least 70 Watts more power!
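Quick arithmetic on those xbitlabs figures, just to show how far the game loads sit from the TDPs (sketch only, numbers copied from the table above):

```python
# Figures from the xbitlabs table quoted above (watts).
cards = {
    "HD 5970": {"game": 240, "tdp": 294, "occt": 355},
    "GTX 295": {"game": 312, "tdp": 289, "occt": 400},
}

for name, w in cards.items():
    print(f"{name}: game load is {w['game'] - w['tdp']:+d} W vs TDP, "
          f"OCCT is {w['occt'] - w['tdp']:+d} W vs TDP")

# Gap between the two cards under an actual game load:
print("game-load gap:", cards["GTX 295"]["game"] - cards["HD 5970"]["game"], "W")
```

The 5970 sits 54 W under its TDP in Crysis Warhead while the GTX 295 sits 23 W over its own, and the game-load gap between the two cards works out to 72 W.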

Without real world power consumption figures, it's impossible to say how much power GTX480 will consume compared to 5870.
 