AT fermi review is up!


SHAQ

Senior member
Aug 5, 2002
738
0
76
I believe Nvidia underrated the TDP of the 480. It uses 140 more watts than a 5870 and 55 more watts than a 470. I'm pretty sure the 5870 isn't a 110 watt TDP card.

Edit: 5870 is 188 watts. Also the 480 uses more juice than a 295 which is rated at 289 watts I believe. Fishy...

Edit 2: Real TDP seems to be between 280-285 watts. LOL
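A quick back-of-the-envelope sketch of that reasoning, using only figures quoted in this thread (the 140 W delta is a wall reading from reviews, not an official spec, so PSU losses inflate it somewhat):

```python
# Rough check on the 250 W TDP claim, using the figures quoted in this post.
# These are wall readings, not card-side measurements, so the true card figure
# is somewhat lower once PSU losses are taken out.
hd5870_board_power = 188   # watts, AMD's published figure for the HD 5870
measured_delta = 140       # watts, reported extra draw of the GTX 480 at the wall

implied_gtx480 = hd5870_board_power + measured_delta
print(f"Implied GTX 480 draw: ~{implied_gtx480} W vs. the official 250 W TDP")
# -> ~328 W at the wall, which is why estimates in the 280-328 W range keep coming up
```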
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Power is indeed a problem, but I don't see why everyone is so uptight about noise and heat. Concluding that the card is loud from some video is just pure nonsense. Videos are normally only good for hearing the tonal characteristics (whether they're high pitched, smooth, etc.).

But for actual noise measurement they aren't a good indicator, especially on an open test bed, since most people have cases which further reduce noise. AT's readings put the noise around a GTX275/285/HD4890. I know people who own these cards and they have never complained about noise. I guess some did for the HD4890, since it had more of a whiney tone to its noise, but that didn't stop people from buying that card.

People are complaining that the cards are reaching ~90C, but this isn't something new. I've had my HD4850 reach those temps till I changed its cooler. The initial GTX280/260s were hot. 8800GTXs were so hot that people started strapping 120mm fans on the stock heatsink. HD4870s were hot and had VRMs reach over 100C during furmark.

I think most people are overreacting because of the disappointment, which is quite normal, but wait 2-3 weeks until all the smoke/dust has settled. Kind of like the initial impact of the HD5850. By then, I think the GTX470 will look better than what most people make it out to be atm. As for the GTX480, well, I just can't think of many viable reasons to even consider this card.


Pretty much every review has come to the conclusion that the card is hot and loud as hell. Sure, the 4850 was hot, BUT it was quiet, so all you had to do was turn the fan up a bit. Also, the original GTX260 does NOT run hot at all. I have had one running well overclocked for 16 months and it rarely gets much over 70C during gaming, idling at around 39-40C in the winter and 43-44C in the summer. Not to mention it is basically inaudible even when running furmark.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Surely they would have made that known, wouldn't they? I mean, if they knew it was loud enough to warrant a change, it seems like a disclaimer for all those reviewers would have been included so people would know the final product isn't a dustbuster.
Well considering until about 2 weeks ago nVidia’s board partners didn’t even know the final clocks or TDP, you tell me.

Sure, the 4850 was hot, BUT it was quiet, so all you had to do was turn the fan up a bit.
Not only that, but simply comparing temperatures doesn't really work, for the simple reason that a 4850 @ 90C has less than half the heat to dissipate of a GTX480 @ 90C (115W TDP vs. 250W TDP). The 4850's single-slot cooler is perfect evidence of this.
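To make the "same temperature, more than twice the heat" point concrete, here is a minimal thermal-resistance sketch; the TDP figures are the ones quoted above, and the 25C ambient is purely an illustrative assumption:

```python
# Same reported die temperature, very different heat loads.
# TDPs are the figures quoted above; the 25 C ambient is an assumption.
ambient_c = 25.0
die_c = 90.0
delta_t = die_c - ambient_c  # 65 C of thermal headroom in both cases

for name, watts in (("HD 4850", 115), ("GTX 480", 250)):
    r_theta = delta_t / watts  # required cooler thermal resistance, C per watt
    print(f"{name}: cooler must beat {r_theta:.2f} C/W to hold {die_c:.0f} C")
# -> ~0.57 C/W for the 4850 vs ~0.26 C/W for the GTX 480: the 480 has over
#    twice the heat to move at the same temperature, hence the beefier cooler.
```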
 
Last edited:

Schmide

Diamond Member
Mar 7, 2002
5,728
1,019
126
I believe Nvidia underrated the TDP of the 480. It uses 140 more watts than a 5870 and 55 more watts than a 470. I'm pretty sure the 5870 isn't a 110 watt TDP card.

Edit: 5870 is 188 watts. Also the 480 uses more juice than a 295 which is rated at 289 watts I believe. Fishy...

Edit 2: Real TDP seems to be between 303-328 watts. LOL

Do realize that until xbitlabs hooks up its cool little measuring board, all those estimates are at the wall. Figure 300 W * 80% efficiency = 240 W of actual heat.
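As a minimal sketch of that wall-vs-card distinction (the 80% PSU efficiency is a typical ballpark assumption, not a measured value):

```python
# Wall reading vs. power actually dissipated inside the case.
# The 80% PSU efficiency is a rough assumption, not a measurement.
def card_side_watts(wall_watts, psu_efficiency=0.80):
    """Approximate DC power delivered to (and dissipated by) the components."""
    return wall_watts * psu_efficiency

print(f"{card_side_watts(300):.0f} W of actual heat from a 300 W wall reading")
# -> 240 W, matching the rough figure above
```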

Edit: Unless you're already fudging the numbers for that?
 
Last edited:

SHAQ

Senior member
Aug 5, 2002
738
0
76
I lowered it to 280-285 based on the TDPs of the 295 and 5850. I don't think the 480 can actually be 250 watts based on the reviews I have seen so far.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
I'm with SHAQ on this: nV's TDP is bull. Simple math tells you that.

What I'm wondering is how much an extra 125W costs for four hours a day over a year? May destroy the value proposition as well, unless you live in the dorms.
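For a rough answer (electricity rates vary widely; the $/kWh values below are purely illustrative assumptions):

```python
# Rough yearly cost of an extra 125 W drawn 4 hours a day.
# The electricity rates are illustrative; substitute your own utility rate.
extra_watts = 125
hours_per_day = 4
kwh_per_year = extra_watts * hours_per_day * 365 / 1000  # = 182.5 kWh

for rate in (0.10, 0.15, 0.20):  # dollars per kWh
    print(f"At ${rate:.2f}/kWh: about ${kwh_per_year * rate:.2f} per year")
# -> roughly $18 to $37 per year depending on the local rate
```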
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Pretty much every review has come to the conclusion that the card is hot and loud as hell. Sure, the 4850 was hot, BUT it was quiet. Also, the original GTX260 does NOT run hot at all. I have had one running well overclocked for 16 months and it rarely gets much over 70C during gaming, idling at around 39C in the winter and 44C in the summer. Not to mention it is basically inaudible even when running furmark.

My GTX260 216 can go above 80C running furmark. These temperatures aren't what people would consider cold. You guys make it out like it's the end of the world. If the heat caused the PCB to melt or something, then sure, it would be a problem. I work with semiconductor devices, and most of these are rated to run normally at 90C.

Where does it say that it was loud as hell? All reviews I've read point to the GTX480/470 being quiet at idle, and somewhat loud at max load (less so for the GTX470), which isn't anything new. Loud as hell? AT did say that the SLI setup is loud, but this doesn't mean single cards are as loud.

If you've had the chance to hear the FX5800 Ultra in person, you will know what "loud" really means. Come to think of it, I wonder what 2 FX5800 Ultras would have been like if they were capable of SLI back then.
 

Schmide

Diamond Member
Mar 7, 2002
5,728
1,019
126
It looks like PCGH has actual values that are card-only and in line with xbitlabs.

If we set aside Furmark/OCCT as out of the ordinary, they come up with a 266W or so figure.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
It does really well in some games and seems future-proof; too bad the whole package is lacking, what with the temps and ESPECIALLY the noise, which would be really disturbing.

All in all, if I had a previous-gen card like a 48xx or GTX280, I'd skip this generation.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
My GTX260 216 can go above 80C running furmark. These temperatures aren't what people would consider cold. You guys make it out like it's the end of the world. If the heat caused the PCB to melt or something, then sure, it would be a problem. I work with semiconductor devices, and most of these are rated to run normally at 90C.

Where does it say that it was loud as hell? All reviews I've read point to the GTX480/470 being quiet at idle, and somewhat loud at max load (less so for the GTX470), which isn't anything new. Loud as hell? AT did say that the SLI setup is loud, but this doesn't mean single cards are as loud.

If you've had the chance to hear the FX5800 Ultra in person, you will know what "loud" really means. Come to think of it, I wonder what 2 FX5800 Ultras would have been like if they were capable of SLI back then.
Well, is your GTX260 LOUD when hitting 80C in furmark? Of course not. The GTX480 was hitting over 90C in some reviews during gaming AND being loud at the same time. One or the other is understandable, but being very loud while still running very high temps is not acceptable for many people, including myself.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Looks like this release of fermi is a waste for me, as I am cheap and like quiet. I do think fermi has legs and hopefully nvidia will get their shit together.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Power is indeed a problem, but I don't see why everyone is so uptight about noise and heat. Concluding that the card is loud from some video is just pure nonsense. Videos are normally only good for hearing the tonal characteristics (whether they're high pitched, smooth, etc.).

But for actual noise measurement they aren't a good indicator, especially on an open test bed, since most people have cases which further reduce noise. AT's readings put the noise around a GTX275/285/HD4890. I know people who own these cards and they have never complained about noise. I guess some did for the HD4890, since it had more of a whiney tone to its noise, but that didn't stop people from buying that card.

People are complaining that the cards are reaching ~90C, but this isn't something new. I've had my HD4850 reach those temps till I changed its cooler. The initial GTX280/260s were hot. 8800GTXs were so hot that people started strapping 120mm fans on the stock heatsink. HD4870s were hot and had VRMs reach over 100C during furmark.

I think most people are overreacting because of the disappointment, which is quite normal, but wait 2-3 weeks until all the smoke/dust has settled. Kind of like the initial impact of the HD5850. By then, I think the GTX470 will look better than what most people make it out to be atm. As for the GTX480, well, I just can't think of many viable reasons to even consider this card.

We all have different thresholds. IMO if a fan is spinning over 1000 RPM it's too loud and annoying. These appear to get into Delta 4000 RPM territory, i.e. leaf blowers. Every reviewer has commented on insanely high noise levels, so I think it will be an issue for consumers.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It looks like PCGH has actual values that are card-only and in line with xbitlabs.

If we set aside Furmark/OCCT as out of the ordinary, they come up with a 266W or so figure.

Hm, if you don't take into account the power consumed by the 1.8A Delta fan, the memory modules and whatnot, it does indeed stay below 250W.

Does the TDP only refer to the GPU's maximum thermal design power, or to the board as a whole?
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Pretty much every review has come to the conclusion that the card is hot and loud as hell. Sure, the 4850 was hot, BUT it was quiet, so all you had to do was turn the fan up a bit. Also, the original GTX260 does NOT run hot at all. I have had one running well overclocked for 16 months and it rarely gets much over 70C during gaming, idling at around 39-40C in the winter and 43-44C in the summer. Not to mention it is basically inaudible even when running furmark.

I think the only real way to enjoy a GTX470/480 is to water-cool it; of course, this will be a more expensive and painstaking task for the uber-leet enthusiasts... I wonder how much water cooling would add to the cost of a GTX480? Not to mention a bigger PSU.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Power is indeed a problem, but I don't see why everyone is so uptight about noise and heat. Concluding that the card is loud from some video is just pure nonsense. Videos are normally only good for hearing the tonal characteristics (whether they're high pitched, smooth, etc.).

But for actual noise measurement they aren't a good indicator, especially on an open test bed, since most people have cases which further reduce noise. AT's readings put the noise around a GTX275/285/HD4890. I know people who own these cards and they have never complained about noise. I guess some did for the HD4890, since it had more of a whiney tone to its noise, but that didn't stop people from buying that card.

People are complaining that the cards are reaching ~90C, but this isn't something new. I've had my HD4850 reach those temps till I changed its cooler. The initial GTX280/260s were hot. 8800GTXs were so hot that people started strapping 120mm fans on the stock heatsink. HD4870s were hot and had VRMs reach over 100C during furmark.

I think most people are overreacting because of the disappointment, which is quite normal, but wait 2-3 weeks until all the smoke/dust has settled. Kind of like the initial impact of the HD5850. By then, I think the GTX470 will look better than what most people make it out to be atm. As for the GTX480, well, I just can't think of many viable reasons to even consider this card.

I think it is the opposite. Many people (at least enthusiasts with robust computers) already have good PSUs, so not everyone is shafted by higher power usage. Noise, however, is a big concern. Why do you think there is a shitload of new aftermarket VGA coolers coming to market? Someone with 7 fans might not care, but for the rest of us who visit SPCR from time to time, it is a grave concern.

I hated my X1950 GT for its loudness back when I had it and ditched it soon after.

happy medium, you are lucky to be insensitive to noise. I, too, had that same logic of muffling noise with speakers back in the day, but there is no going back after spending a few years with quiet computing/gaming. It's not at all about not being able to hear sound due to VGA noise while gaming (that would be just crazy); it's really about how annoying it gets and how much fatigue your ears must sustain over time. Maybe you should try a quieter computer for a change...
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
Hm, if you don't take into account the power consumed by the 1.8A Delta fan, the memory modules and whatnot, it does indeed stay below 250W.

Does the TDP only refer to the GPU's maximum thermal design power, or to the board as a whole?

It should be the whole board AFAIK.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Thanks Slowspyder.

Really though... Don't you guys have speakers or headphones? I never play my games without sound... do you? No way can I hear my video card, even at 100% (and it's loud), when I'm playing a game.

And to me, while surfing the web and at idle, the GTX 480 is just as silent as any other card.

So who cares about noise at load?! You won't hear it anyway!
Speakers will not mask loud fan noise unless they’re cranked up to the point of being deafening. Headphones are better, but you can still hear loud fan noise when gaming. I have no trouble hearing my i5 750’s stock fan when gaming on headphones because it whines like a little bitch. It bothers me so much that I’m getting an Arctic Cooler to replace it.

[H]’s recordings clearly show the GTX480 is far louder than the 5870 at load, and they also state the videos accurately portray what they hear in real life. That part isn’t under debate. While it’s true that their open test bed has more noise than a closed system, the fact remains that the GTX480 will still be loud in a closed system, and it’ll be far louder than a 5870.

Legion Hardware showed 110W less system consumption from the 5870 compared to the GTX480. That extra 110W is coming entirely from the GTX480 and is going straight to its cooler to dissipate, so of course it’s going to be much louder.

Argue about performance if you like, but it’s undeniable that this card is an absolute furnace. It has a higher TDP than even the 2900XT which was notorious for being hot and loud.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Stolen from Guru3d's forums.

[image: 2zrzvr6.jpg]
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
FYI: 64 dB (GTX480) vs. 59 dB (5870) is more than double the sound intensity, since every 3 dB roughly doubles it.
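Rough decibel arithmetic for those readings (both rules of thumb below are approximations, not figures from the reviews):

```python
# What a 5 dB gap means, roughly.
# Sound intensity scales as 10^(dB/10); perceived loudness is commonly
# approximated as doubling every ~10 dB. Both are rules of thumb.
delta_db = 64 - 59   # GTX 480 vs. HD 5870 readings quoted above

intensity_ratio = 10 ** (delta_db / 10)   # ~3.16x the acoustic power
loudness_ratio = 2 ** (delta_db / 10)     # ~1.41x perceived loudness

print(f"Intensity: {intensity_ratio:.2f}x, perceived loudness: ~{loudness_ratio:.2f}x")
```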
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Speakers will not mask loud fan noise unless they’re cranked up to the point of being deafening. Headphones are better, but you can still hear loud fan noise when gaming. I have no trouble hearing my i5 750’s stock fan when gaming on headphones because it whines like a little bitch. It bothers me so much that I’m getting an Arctic Cooler to replace it.

[H]’s recordings clearly show the GTX480 is far louder than the 5870 at load, and they also state the videos accurately portray what they hear in real life. That part isn’t under debate. While it’s true that their open test bed has more noise than a closed system, the fact remains that the GTX480 will still be loud in a closed system, and it’ll be far louder than a 5870.

Legion Hardware showed 110W less system consumption from the 5870 compared to the GTX480. That extra 110W is coming entirely from the GTX480 and is going straight to its cooler to dissipate, so of course it’s going to be much louder.

Argue about performance if you like, but it’s undeniable that this card is an absolute furnace. It has a higher TDP than even the 2900XT which was notorious for being hot and loud.

Dunno... I have an array of some nice headphones (HD580/HD650/AD2000/K701/D7000/MS1), only one of them being a closed design. Many, if not most, higher-grade phones are open, which does little to nothing to muffle ambient noise...
 

Schmide

Diamond Member
Mar 7, 2002
5,728
1,019
126
Hm, if you don't take into account the power consumed by the 1.8A Delta fan, the memory modules and whatnot, it does indeed stay below 250W.

A Delta fan, though rated in the 1.5-3A range, only draws that during spin-up. So if it does pull 20-40W for short periods, it probably drops to 10W or less after that. Although, ironically, that's probably still enough to be considered part of the budget.
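A quick sanity check on those figures (assuming a 12 V fan supply, which is typical but an assumption here; rated current is a worst-case figure, not steady-state draw):

```python
# Rated fan current to worst-case power, assuming a 12 V supply.
# Steady-state draw after spin-up is normally well below this.
FAN_VOLTAGE = 12.0  # volts, typical for a GPU blower fan

for rated_amps in (1.5, 1.8, 3.0):
    print(f"{rated_amps:.1f} A rated -> up to ~{FAN_VOLTAGE * rated_amps:.0f} W")
# -> ~18 W, ~22 W and ~36 W worst case, in line with the 20-40 W ballpark above
```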

Does the TDP only refer to the GPU's maximum thermal design power, or to the board as a whole?

Basically how much heat it has to dissipate. You have to factor the RAM, VRMs, support chips, etc. into this as well.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Whatever dude, Guru is a great site; that's total bullshit.

Here are some Metro 2033 benches since AT didn't post them.

The 5870 and 5850 are getting their ass kicked in this MODERN game.
Click the resolution tab to see performance at 1900x1080; it gets worse!!!!

http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=13

And in the Unigine 2.0 benchmark the 5850/5870 get their ass kicked again.
This is another look at the future.

http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=16



That's not exactly what I'd call playable even on the GTX480. Awesome, so nVidia has technology that will be great to build on for the future, but buying the card now for games like Metro 2033 means I'm just going to get crappy frame rates with a card I paid $500 for, versus waiting for an even faster card (GTX 560? HD 6850?) that costs only ~$300 by the time those future games I want to play are released.

I don't know about you, but what I might do with my card in the future doesn't weigh in much on my decision-making around card purchases; it's mostly about the games I play now, or at most plan to play and already have an idea of what kind of power they need.

There aren't many enthusiasts who would enjoy playing the DX10 games used in these benches on an 8800GTX, but I'm sure many convinced themselves that's part of why they spent $600+ on the card.