[H]ard Does New Heat / Noise Test (real world) for 480


ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Somehow i knew you'd ask that next .. here's a C&P:

NVIDIA's GeForce general manager, Drew Henry, published a post on the NVIDIA blog responding to community comments regarding the GTX 480's heat and power consumption:


http://www.tgdaily.com/hardware-fea...-gtx-480-designed-to-run-at-high-temperatures

I give Drew Henry props for giving a reasonable explanation. However, one thing I would caution on is the statement that it can take a lot of heat. While that may be true of the GPU itself, how do we know the other components can handle the load? I have had two failed GTX 280's now and I am very leery about the longevity of these high end cards. In both cases, it doesn't appear that the GPU itself was the cause of failure, but the memory and memory controller... And this isn't in reference to nVidia only... I worry about ATI high end cards as well.

Let me state this - Thank God for the EVGA warranty... I needed it.
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
I thought this was clear since the first day GF100 paper-launched (rhetorical question mark).
It's also quite clear that Nvidia changed its definition of TDP.

For clarification:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html
[chart: TechPowerUp maximum power consumption results]



http://en.wikipedia.org/wiki/GeForce_200_Series

Card - Official TDP (W) - TPU max (W):
  • GTX 280 - 236 - 249
  • GTX 295 - 289 - 320
  • GTX 480 - 250 - 320
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
So the next few big questions are:
1) How does ATI define TDP for their boards?
It must be something similar. They go over spec on FurMark too; in fact this is the whole reason they have VRM thermal protection.
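
To make the mechanism concrete, here's a minimal sketch of what that VRM thermal protection amounts to; the trip temperature, clock values, and the two helper functions are all hypothetical placeholders for illustration, not ATI's actual firmware logic:

```python
# Toy model of VRM thermal protection. The limit, clocks, and the two
# helper functions are hypothetical stand-ins; real cards do this in
# firmware/driver logic, not Python.

VRM_LIMIT_C = 120          # assumed thermal trip point in Celsius
NORMAL_CLOCK_MHZ = 850     # assumed 3D clock
THROTTLED_CLOCK_MHZ = 425  # assumed fallback clock

def protect_vrm(read_vrm_temp_c, set_core_clock_mhz):
    """Throttle the core clock whenever the VRMs run past their limit."""
    if read_vrm_temp_c() >= VRM_LIMIT_C:
        # Cutting the clock cuts switching power, which pulls the VRMs
        # back inside their safe operating area even under FurMark.
        set_core_clock_mhz(THROTTLED_CLOCK_MHZ)
    else:
        set_core_clock_mhz(NORMAL_CLOCK_MHZ)
```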
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Seriously Apoppin, you didn't have to gulp up that quote again :D I'd rather we have a discussion about it and not just take any producer statement as fact.

Look at the GeForce 8xxx series: two years in, and half the GPUs broke because of bad solder joints. I have two laptops, one with an 8400 and one with an 8600, dying on me because of this issue. Of course it's a heat/power consumption issue.

And the more important question is still: how does ATI define TDP? Since you run your own site, I'd appreciate it if you could get a response from ATI on the same question.

And lastly, if "TDP" is such a flexible term/envelope/word.. why is it such an issue for GPU producers to meet the PCIe TDP requirements, or OEM requirements? And how do they manage if NVIDIA's, ATI's, and Intel's definitions of TDP are all different...?

It answered part of your question, didn't it? i did what i said i would do and we now know that Nvidia's measurement of TDP is averaged "over time and in real world apps"

i told this forum i would ask Nvidia for *their* explanation of how they measured TDP and i believe this is the first time anyone got an answer from them - ever. You got their response and now it is open for discussion. i am just the messenger.
:)

i am writing about it this weekend. Look for it; i'm doing research now and i'll try to answer your questions.
--i am really behind and i have 3 articles backed up atm. :(
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Evidently Nvidia doesn’t release *peak* board power specs publicly. Clearly it is possible to go over 250W with tests like FurMark. But now we know that "maximum board power = 250W" is measured over time in real world gaming.

Hmmm.... :\ Am i missing something? i've read reviews that show the GTX 480 drawing more than 250W in real world gaming.

The explanation from nV clashes with power requirements and numbers from other reviews and cards.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Hmmm.... :\ Am i missing something? i've read reviews that show the GTX 480 drawing more than 250W in real world gaming.

The explanation from nV clashes with power requirements and numbers from other reviews and cards.

It is an *average* .. Nvidia does not count or report the peak wattage when they compute the GTX 480's max board TDP like the tech sites do - and the tech sites do not report how long the peak actually lasted in games; was it an instantaneous draw?

In other words, Nvidia's "max" TDP is an "average maximum" which is averaged over the entire time that the application(s) runs.

i can explain it; that doesn't mean i agree with it. As an analogy, it is somewhat like when a speaker manufacturer specs his speaker from 20 Hz to 20 kHz :p
- we all know what that means
(i hope)
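
A minimal sketch of the two numbers being conflated, using made-up one-second power samples from a hypothetical gaming run (the values are illustrative only):

```python
# Made-up one-second board power samples (watts) from a hypothetical
# gaming run; the point is the method, not the numbers.
samples = [198, 235, 310, 242, 221, 305, 187, 243]

average_over_run = sum(samples) / len(samples)  # Nvidia-style "max board power"
instantaneous_peak = max(samples)               # what the tech sites report

print(f"average over the run: {average_over_run:.0f} W")  # ~243 W, under 250 W
print(f"instantaneous peak:   {instantaneous_peak} W")    # 310 W, over 250 W
```

The same trace comes in under 250 W by Nvidia's averaging method and over 250 W by the tech sites' peak method, which is exactly the discrepancy being argued about in this thread.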
 
Last edited:

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
In other words, Nvidia's "max" TDP is an "average maximum" which is averaged over the entire time that the application(s) runs.

(i hope)

That's the definition of an oxymoron. Something can be either an average or a maximum; it can't be both. :\

Not directed at you apoppin, I know you're just passing along what they told you.
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
i told this forum i would ask Nvidia for *their* explanation of how they measured TDP and i believe this is the first time anyone got an answer from them - ever.

Hm.. no. Some reviewers were surprised by the official TDP back on launch day and asked Nvidia. Here's an example:
http://www.geeks3d.com/20100328/geforce-gtx-480-tortured-by-furmark-300w-and-earplugs-required/
Damien asked NVIDIA why the power consumption is greater than the official 250W TDP, and NVIDIA replied that the TDP is the max power consumption during a gaming session and is not the maximal power consumption of the card…
Thus I'm puzzled that this is news to some. I'm sure this was mentioned several times on AT as well.


As to the question of how ATI measures TDP: I've no idea, but it's usually FurMark draw × 0.9, like Nvidia in the past (e.g., a 320 W FurMark reading would work out to roughly a 288 W TDP).
 

pmv

Lifer
May 30, 2008
15,142
10,043
136
Thats the definition of an oxymoron. Something can be either an average or maximum, it cant be both. :\

Not directed at you apoppin I know your just passing along what they told you.

Don't think it's an oxymoron. You can have an average maximum and also a maximum average (not the same thing, obviously).

If something oscillates wildly over a period of time then it will have a series of local maxima. If you take the average of those maxima, you have something you could plausibly call an average maximum.
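
A quick sketch of the distinction on a made-up oscillating power trace (the values and window size are arbitrary):

```python
# Made-up oscillating power trace (watts); window size is arbitrary.
trace = [200, 310, 190, 295, 210, 320, 180, 300]

# "Average maximum": the mean of the local maxima (the spikes).
local_maxima = [trace[i] for i in range(1, len(trace) - 1)
                if trace[i - 1] < trace[i] > trace[i + 1]]
average_maximum = sum(local_maxima) / len(local_maxima)

# "Maximum average": the largest rolling mean over 3-sample windows.
window = 3
rolling_means = [sum(trace[i:i + window]) / window
                 for i in range(len(trace) - window + 1)]
maximum_average = max(rolling_means)

print(f"average maximum: {average_maximum:.0f} W")  # mean of 310, 295, 320 -> 308 W
print(f"maximum average: {maximum_average:.0f} W")  # best 3-sample mean    -> 275 W
```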
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What it comes down to is that it's more bullshit marketing spin by NVIDIA. There's no regulation on how/where/what to report for TDP; it's just a number to consider when making a heatsink. The unfortunate side effect is that people who don't understand the concept think it means something more, and NVIDIA is using that to put the GTX 4x0 series in a better light. In the end, look at real world testing (not just FurMark, but actual games) and you'll get your real power usage numbers.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
The fact is, the PCIe spec doesn't give a shit how nVidia chooses to describe TDP. If power consumption goes over that spec, the card is running outside of the system's normal operating parameters.

Note: the same applies to ATi if they ever exceed the TDP of PCIe.
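
For concreteness, here's a small sketch of that budget check. The connector limits (75 W slot, 75 W 6-pin, 150 W 8-pin) come from the PCIe spec, the one-6-pin-plus-one-8-pin layout is the GTX 480 reference design, and the 320 W figure is TPU's maximum from the table upthread:

```python
# Power budget check against the PCIe spec's connector limits
# (75 W slot, 75 W 6-pin, 150 W 8-pin). The GTX 480 reference card
# uses one 6-pin plus one 8-pin connector.

PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_budget_w(six_pin: int, eight_pin: int) -> int:
    """Total power the spec allows the card to draw through its inputs."""
    return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

budget = board_budget_w(six_pin=1, eight_pin=1)  # 300 W for the GTX 480
measured = 320  # TPU's maximum reading from the table upthread

print(f"budget {budget} W, measured {measured} W, over spec: {measured > budget}")
```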
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
The fact is, the PCIe spec doesn't give a shit how nVidia chooses to describe TDP. If power consumption goes over that spec, the card is running outside of the system's normal operating parameters.

What effect does that have on the mobo?