[H]ard Does New Heat / Noise Test (real world) for 480


Paratus

Lifer
Jun 4, 2004
17,759
16,108
146
Apoppin,

What I'm interested in is hearing how a guy who had an overclocked P4 AGP rig into 05 (06?) and used to argue that it wasn't time to upgrade to PCIe (like I used to ;) ) now runs a tech website and has a bleeding edge rig.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Apoppin,

What I'm interested in is hearing how a guy who had an overclocked P4 AGP rig into 05 (06?) and used to argue that it wasn't time to upgrade to PCIe (like I used to ;) ) now runs a tech website and has a bleeding edge rig.

You want my biography?
:D

You are talking ancient history in tech terms :p
- lessee, *years ago* i had a P4 EE @ 3.74 GHz

i guess it all started when i got that 2900XT, benched it for the forum .. then i got a 2900 Pro and OC'd it for 2900XT CF (FrankenFire) .. then an 8800 GTX ..

... and one thing leads to another

.. INXS
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Apoppin,

What I'm interested in is hearing how a guy who had an overclocked P4 AGP rig into 05 (06?) and used to argue that it wasn't time to upgrade to PCIe (like I used to ;) ) now runs a tech website and has a bleeding edge rig.
Personally, I'm a bit surprised to find him posting here at all. I do pop over to ABT infrequently to check out the reviews/forums and happened to notice what he posted less than a week ago:

ATF video is made up mostly of rude trash-talking AI who don't have any capability to debate, so they ridicule. They respect no one and i will not post there for that reason.

There is not even moderation there. The entire site reflects the giant ego of its owner and the Senior mods love to abuse the membership.

http://alienbabeltech.com/abt/viewtopic.php?p=35953#p35953

To be honest, I took offense to that. ATF is much better now that many of the verbally abusive members have been banned and, ironically, gone to ABT. After browsing a few threads, it quickly becomes evident to anyone that ABT is the one made up of those 'mostly rude trash-talking members' he's complaining about, not those of us here.

Ironically, even his own post on ABT contained significant 'ridicule' and 'trash talk':

"The entire site reflects the giant ego of its owner and the Senior mods love to abuse the membership".

Pot, kettle, etc...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Personally, I'm a bit surprised to find him posting here at all. I do pop over to ABT infrequently to check out the reviews/forums and happened to notice what he posted less than a week ago:
. . .
To be honest, I took offense to that. ATF is much better now that many of the verbally abusive members have been banned and, ironically, gone to ABT. After browsing a few threads, it quickly becomes evident to anyone that ABT is the one made up of those 'mostly rude trash-talking members' he's complaining about, not those of us here.
It was very difficult to post *anything* here without being subject to ridicule and trash talking that are poorly disguised personal attacks. i have said the same thing over here - over and over again - and that IS the reason i left ATF video.

i don't need to post here at all. However, there is still a "core" of polite, knowledgeable and helpful tech enthusiasts here. That is why i came back - to see if anything had changed. The way some of you have been attacking Keysplayr and calling him a liar is simply disgusting and offensive.

And you are quite right; it IS somewhat better here now than when i left before the forum transition many months ago - things change, nothing is static. You have quoted my comments about the "old" ATF video.

However, it sure looks like you are still stirring everyone up just like the old Creig - i guess my comments still apply to you; you can remain offended. :p
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Things change; move on. And a hint to several of you who like to quote each other's "less well written" or "ill written" posts: stop it, it makes you look worse. And you know better!
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I am just not sure why everybody in this forum has become so eco-minded recently. Last time I checked, this was an enthusiast forum. If somebody has a case and PSU that can support it and doesn't mind the extra power draw or the possible added noise, then why is the GTX 480 a failure again?

Because of the competition. With the 4870X2 and GTX 295, nobody said a peep about the harshness or power suck (both were similar to the 480) because nothing else offered comparable performance at the time.

Today we have the HD 5970, which offers much better performance at the same power use and noise, and the HD 5870, which is indistinguishable in performance outside of benchmark results yet much more livable with.

I wonder how many of the Fermis will throttle to below 5870 performance after they get a bit of dust buildup or TIM aging. Or survive multiple 20C -> 110C cycles per day for more than a few months. Or push other components to failure (IR photographs of a running 100+C Fermi show the case metal below it at 60C). None of this will be reflected in reviews, nor is it an issue for enthusiasts demanding the best of the best of the best of the best, but it should be a consideration for the slightly more sane.

That's why many of us have written the GTX 480 off as a failure. It would have been a great product at a good price if it was available and had no competition. Unfortunately, it has competition -- which makes it a very much not-so-great product at a meh price. Just like the 2900XT.
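To put the dust/throttling concern in concrete terms, here is a toy model. Every number in it is a hypothetical illustration (NVIDIA has not published the GTX 480's actual throttle policy); the ~105C trip point is the one reviewers reported, as mentioned later in this thread:

```python
# Toy model: dust raises the cooler's effective thermal resistance until
# the die crosses the throttle point. Numbers are illustrative guesses.
THROTTLE_TEMP_C = 105   # reviewers observed throttling around this temperature
AMBIENT_C = 25

def steady_state_temp(power_w: float, r_theta_c_per_w: float) -> float:
    """Steady-state die temperature for a given dissipation and cooler state."""
    return AMBIENT_C + power_w * r_theta_c_per_w

# Clean cooler vs. one whose thermal resistance has crept up with dust.
for label, r_theta in [("clean", 0.30), ("dusty", 0.36)]:
    temp = steady_state_temp(250, r_theta)
    action = "throttle clocks" if temp > THROTTLE_TEMP_C else "run full speed"
    print(f"{label}: {temp:.0f} C -> {action}")
```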
 

Paratus

Lifer
Jun 4, 2004
17,759
16,108
146
Well, I appreciate the effort that Apoppin, BFG and, yes, Keys too have gone through to provide more information on these cards. I hope they continue to do so.

Rest assured, the dogmatic fanboys on both sides are easy to tell apart from folks who simply have a preference.
 

lopri

Elite Member
Jul 27, 2002
13,329
709
126
Nvidia's 250W TDP refers to the card's power usage "in gaming" and not the card's total possible draw.
1) Where can I verify this?
2) Were NV's past TDPs measured the same way?

I have my thoughts, but the above questions are factually important, IMO.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Some people have theorized that the 250 watt TDP is for the GPU only. Once you add in the large, high-speed fan and the GDDR5, you get the power *draw* observed in some games and FurMark. So technically the GPU is dissipating 250 watts max, but the whole card is using and radiating more.
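A back-of-the-envelope version of that theory; the 250 W figure is NVIDIA's, while every other component figure below is an assumed guess for illustration only:

```python
# If the 250 W TDP covered only the GPU die, the rest of the board would
# stack on top of it. Component estimates are guesses, not NVIDIA specs.
gpu_w = 250         # NVIDIA's stated TDP, read here as GPU-only
gddr5_w = 25        # guess: twelve GDDR5 chips
fan_w = 10          # guess: the blower at full speed
vrm_loss_w = 35     # guess: conversion losses at roughly 88% VRM efficiency

board_w = gpu_w + gddr5_w + fan_w + vrm_loss_w
print(f"whole-board draw: {board_w} W")  # ~320 W, FurMark-class readings
```

Under those assumptions the arithmetic lands right in the range reviewers measured, which is why the theory sounds attractive.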
 

lopri

Elite Member
Jul 27, 2002
13,329
709
126
That could be a possible explanation but not a plausible one, IMO. GTX 480's TDP is specified by NV as below.

[attached: NVIDIA spec chart listing "250W TDP Maximum Board Power"]
What's curious is whether it's NV's official position that the TDP is measured 'in gaming', and whether that was the case for past video cards from NV. That may well be so, but it's sort of disconcerting given what we've learned about Fermi's GPGPU prowess.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I don't think NV has officially stated "in gaming"; as you know, that phrase comes from reviewers contacting NV about observed power draw being waaay higher than the stated maximum. I don't think we'll see an official statement outside of a courtroom.

At least one site has observed a max GPU temperature higher than 105C as well -- even when the card was throttling to ~50% performance.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
1) Where can I verify this?
2) Were NV's past TDPs measured the same way?

I have my thoughts, but the above questions are factually important, IMO.

i don't remember where i saw/heard this. i am right now looking at the Nvidia chart and it says "250W TDP Maximum Board Power"

Now i am unsure, so let me shoot this question off to Nvidia and see what answer i get back.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Are we even talking about the same cards? There were a bunch of versions of the old G80 and the new G92 out at that time. There was the G80 8800 GTS 640 with 112 SPs, which beat the old GTX. i also remember that the new G92 512MB version had a difficult time beating the old GTX at higher resolutions as it was constrained by its 256-bit bus.

What i post on my site in my reviews is double-checked; what i post from memory here is not held to the same standard, nor should it be.

otoh, you don't seem to have much clue about anything :p

http://benchmarkreviews.com/index.p...sk=view&id=137&Itemid=1&limit=1&limitstart=10
Given our posting history and the numerous times I've already corrected you, I'll disagree. I'm talking about the G80-based GTS 640; if I was describing the GTS 640 112, I would have said just that. It's the same nomenclature as the GTX 260 216 and has been in place for a while. Oh, and since you already posted benchmarks of the 8800GT (a good bit slower than the 8800GTS 512) handily beating out the 8800GTS 640MB and coming close to the performance of an 8800GTX, thank you for participating in your own disparagement. Let's continue.
i think you are right and i was confusing the G80 GTX with the GTS
- as i recall (now) the 8800 GTX (which i also still have) was faster than the G92 512 MB 256-bit GTS at the highest resolutions. That is the one i believe i was thinking about as more of a side grade than an upgrade.
Wow, what a surprise, wrong again. Who knew? :rolleyes: The other interesting thing is that if you actually knew hardware and used it, you'd have figured out that the G92 cores were amazing overclockers. I went from an 8800GTS 640 @ 621/1512/2000MHz to an 8800GTS 512 running at 800/2000/2230MHz. The 8800GTS 512 also came with a free copy of Crysis, and I could finally play through the game with everything on "Very High", and it only cost me about $75 after selling my 8800GTS 640. What a dumb move indeed. :rolleyes:
It was very difficult to post *anything* here without being subject to ridicule and trash talking that are poorly disguised personal attacks. i have said the same thing over here - over and over again - and that IS the reason i left ATF video.

i don't need to post here at all. However, there is still a "core" of polite, knowledgeable and helpful tech enthusiasts here. That is why i came back - to see if anything had changed. The way some of you have been attacking Keysplayr and calling him a liar is simply disgusting and offensive.

And you are quite right; it IS somewhat better here now than when i left before the forum transition many months ago - things change, nothing is static. You have quoted my comments about the "old" ATF video.

However, it sure looks like you are still stirring everyone up just like the old Creig - i guess my comments still apply to you; you can remain offended. :p
Yeah wow, what a shame you can't just run your mouth without anyone calling you on it. A lot of decent, helpful members have left and been replaced by shills and ignorant fanboys that for some unknown reason get hardware handouts and think that makes them legitimate. Newsflash: it doesn't, and your only saving grace is that, unlike some of you, I don't live at my computer 24/7.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Originally Posted by apoppin - Posted: Wed Mar 31, 2010 12:36 pm
ATF video is made up mostly of rude trash-talking AI who don't have any capability to debate, so they ridicule. They respect no one and i will not post there for that reason.

There is not even moderation there. The entire site reflects the giant ego of its owner and the Senior mods love to abuse the membership.
Ouch.... :whistle:
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
i don't remember where i saw/heard this. i am right now looking at the Nvidia chart and it says "250W TDP Maximum Board Power"

Now i am unsure, so let me shoot this question off to Nvidia and see what answer i get back.
Xbit recently did a test looking at Crysis load and OCCT for ATI and NV cards, so comparing that to existing card TDPs could indicate how they were specced.

http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_3.html#sect0

Now in chart form!
nvtdp.png

TDP is taken from http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_200_Series
The numbers for the 9800, 9600 and GTX 260 are a little questionable due to multiple TDP listings on Wikipedia and me being lazy; I picked the lowest, so those cards might not be representative.

Also, is the GTX 285 TDP really 204W, as Wikipedia says...? Seems dubious when the 275's TDP is higher.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Personally, I'm a bit surprised to find him posting here at all. I do pop over to ABT infrequently to check out the reviews/forums and happened to notice what he posted less than a week ago:

ATF video is made up mostly of rude trash-talking AI who don't have any capability to debate, so they ridicule. They respect no one and i will not post there for that reason.

There is not even moderation there. The entire site reflects the giant ego of its owner and the Senior mods love to abuse the membership.
http://alienbabeltech.com/abt/viewtopic.php?p=35953#p35953

To be honest, I took offense to that. ATF is much better now that many of the verbally abusive members have been banned and, ironically, gone to ABT. After browsing a few threads, it quickly becomes evident to anyone that ABT is the one made up of those 'mostly rude trash-talking members' he's complaining about, not those of us here.

Ironically, even his own post on ABT contained significant 'ridicule' and 'trash talk':

"The entire site reflects the giant ego of its owner and the Senior mods love to abuse the membership".

Pot, kettle, etc...

Creig, you're making it real easy to see what needs to be cleaned up here at AT.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i sent off our question to Nvidia and they asked me for details.
---i linked to:
http://www.pcgameshardware.com/aid,...-Fermi-performance-benchmarks/Reviews/?page=2

When running Grid the GTX 470 easily tops the HD 5870 with 3.4 Sone but has a lower power consumption than the Radeon. On heavy workload we recorded a maximal power consumption of 231 watt for the GTX 470 - more than the TDP of 215 watt. Even in Grid the GTX 480 exceeds its TDP by 16 watt and when running Furmark it even requires more than 300 watt. The loudness in 3D mode is, with 7 up to almost 12 Sone, extremely high.


http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html
and this one measures 320 W - the highest
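Putting the figures quoted in this thread side by side (only the numbers cited above from PCGH and TechPowerUp are used; nothing else is assumed):

```python
# Stated TDP vs. measured draw, using only figures quoted in this thread.
cards = {
    # name: (stated TDP in watts, measured draw in watts)
    "GTX 470 (Grid, PCGH)":   (215, 231),
    "GTX 480 (Grid, PCGH)":   (250, 266),
    "GTX 480 (FurMark, TPU)": (250, 320),
}

for name, (tdp_w, measured_w) in cards.items():
    over_pct = 100 * (measured_w - tdp_w) / tdp_w
    print(f"{name}: {measured_w} W vs {tdp_w} W TDP (+{over_pct:.0f}%)")
```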
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Creig, you're making it real easy to see what needs to be cleaned up here at AT.
You're certainly entitled to your opinion, Keys. I just happened to be browsing over at ABT, saw his comment and then came back to this thread and was shocked to see him posting in it. I would have quoted Kyle or any other competing website owner who had done the same. And if you'll notice, I did not respond to apoppin's last comments, but instead dropped the matter.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You're certainly entitled to your opinion, Keys. I just happened to be browsing over at ABT, saw his comment and then came back to this thread and was shocked to see him posting in it. I would have quoted Kyle or any other competing website owner who had done the same. And if you'll notice, I did not respond to apoppin's last comments, but instead dropped the matter.
Yeah, you can't pick on the NVIDIA fan club, that's just not cool man. :rolleyes:
 

GlacierFreeze

Golden Member
May 23, 2005
1,125
1
0
Unimpressed with apoppin's review.

And lol @ heat feature and talking smack about this forum then posting on it. Oh well.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I got an answer from Nvidia today about their TDP spec of "250 W Maximum Board Power"

Here is my question to Nvidia:
I have a question for you that has appeared on several forums. "250W TDP Maximum Board Power" is given in the specs but we see the GTX 480 draws more than that in many reviews. Is this an "in game" specification - or how do we address the apparent discrepancy?

Check these out please:

http://www.pcgameshardware.com/aid,...-Fermi-performance-benchmarks/Reviews/?page=2

"On heavy workload we recorded a maximal power consumption of 231 watt for the GTX 470 - more than the TDP of 215 watt. Even in Grid the GTX 480 exceeds its TDP by 16 watt and when running Furmark it even requires more than 300 watt."

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html
-This one measures 320 W - the highest

Basically the reply was that TDP is a measure of maximum power draw over time in real world applications. It does not represent the maximum power draw in extreme cases such as FurMark.

i was advised to check out:
http://en.wikipedia.org/wiki/Thermal_design_power

The thermal design power (TDP), sometimes called thermal design point, represents the maximum amount of power the cooling system in a computer is required to dissipate. For example, a laptop's CPU cooling system may be designed for a 20 watt TDP, which means that it can dissipate up to 20 watts of heat without exceeding the maximum junction temperature for the computer chip. . . . The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum power that it would draw when running real applications. . .

Since safety margins and the definition of what constitutes a real application vary between manufacturers, TDP values between different manufacturers cannot be accurately compared. While a processor with a TDP of 100 W will almost certainly use more power at full load than a processor with a 10 W TDP, it may or may not use more power than a processor from a different manufacturer that has a 90 W TDP.

Evidently Nvidia doesn’t release *peak* board power specs publicly. Clearly it is possible to go over 250W with tests like FurMark. But now we know that "maximum board power = 250W" is measured over time in real world gaming.
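As an illustration of that definition, here is a minimal sketch of the difference between a sustained average and an instantaneous peak; the per-second trace values are invented for illustration, and only the 250 W figure is NVIDIA's:

```python
# Hypothetical per-second board-power trace during a gaming session.
trace_w = [210, 235, 250, 240, 226, 248, 252, 238, 231, 245]

avg_w = sum(trace_w) / len(trace_w)
peak_w = max(trace_w)
print(f"sustained average: {avg_w:.0f} W")  # stays under the 250 W TDP
print(f"instantaneous peak: {peak_w} W")    # brief excursions above it occur
```

By that reading, a FurMark-style power virus simply sits outside the set of "real applications" the 250 W figure is averaged over.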
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
So the next few big questions are:
1) How does ATI define TDP for their boards?

2) Does the 480 drawing more power than its "TDP" (say you don't look at all applications, just heavy GPU-use scenarios like Metro 2033) mean anything for the product's longevity/stability?

3) What does the PCIe spec say about TDP, since that has obviously been made a point of? (See the sketch below.)
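On question 3, one point of reference: the PCIe spec doesn't define TDP at all; it caps how much power a card may draw from the slot and from each auxiliary connector. A minimal sketch for the GTX 480's 6-pin + 8-pin layout (the connector limits are from the PCIe specification; the 320 W figure is TechPowerUp's, quoted earlier):

```python
# Power-delivery ceilings defined by the PCI Express specification.
slot_w = 75         # x16 slot, through the edge connector
six_pin_w = 75      # one 6-pin PEG connector
eight_pin_w = 150   # one 8-pin PEG connector

# The GTX 480 carries one 6-pin and one 8-pin connector.
spec_ceiling_w = slot_w + six_pin_w + eight_pin_w
print(f"spec ceiling: {spec_ceiling_w} W")  # 300 W
print(f"TPU FurMark reading: 320 W, i.e. {320 - spec_ceiling_w} W over the ceiling")
```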
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Somehow i knew you'd ask that next .. here's a C&P:

NVIDIA GeForce GM Drew Henry published a post on the NVIDIA blog responding to community comments regarding the GTX 480's heat and power consumption:

"When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature so there is no effect on quality or longevity."

http://www.tgdaily.com/hardware-fea...-gtx-480-designed-to-run-at-high-temperatures
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
I thought this was clear since the first day GF100 paper-launched rhetoricalquestionmark.
It's also quite clear that Nvidia changed its definition of TDP.

Nice effort though.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Seriously Apoppin, you didn't have to gulp up that quote again :D I'd rather we have a discussion about it and not just take any producer statement as fact.

Look at the GeForce 8xxx series: two years in and half the GPUs broke because of solder failures. I have two laptops, one with an 8400 and one with an 8600, dying on me because of this issue. Of course it's a heat/power consumption issue.

And the more important question still stands: how does ATI define TDP? Since you run your own site, I'd appreciate it if you could get a response from ATI on the same question.

And lastly, if "TDP" is such a flexible term/envelope/word, why is it such an issue for GPU producers to meet the PCIe power requirements, or OEM requirements, whatever they are? And how do they manage if NVIDIA's, ATI's and Intel's definitions of TDP are all different...?