Fermi is out. Thumbs up or down? You decide with the poll!


Are you Happy with Fermi?

  • Oh yeah! Gonna grab one or possibly more asap!

  • Pretty happy but expecting more. Might buy one.

  • Not at all. Looking for a 5XXX card

  • Not at all. Sticking with my last-gen card (48XX / 2XX series)



Ramon Zarat

Junior Member
Mar 28, 2010
21
0
0
Nvidia = Master of unethical corporate practices.

I'm not trolling. Think what you want; it's a free world. My comment simply reflects what Nvidia has said and done over the last couple of years, and more recently with the whole Fermi project fiasco. ATI is far from spotless, but it's not even remotely close to how twisted Nvidia has become.

That being said, the 480 is a gigantic, abysmal, epic >FAIL< of cosmological proportions. 95C and 312W of power consumption under load for *ONLY* a 10-15% improvement on average over the 5870? All that despite *TWICE* the transistors and *50%* more RAM, at a price $100 higher? ARE YOU FREAKIN' KIDDING ME??? How embarrassing for Nvidia to have made us all wait over 6 months for *THIS*, especially when the 5970 is still beating it badly while running cooler and quieter.

The 480 fails on all fronts: price, power, heat, and noise vs. performance. No value-add either. Let me rephrase that: it has no value, period. PhysX is great on paper but a joke in reality; it's only supported in about 20 games, of which 17 are very bad. Both PhysX and CUDA are closed, proprietary standards and will soon be replaced by the -open- OpenCL platform anyway. No Dolby TrueHD audio over HDMI, no triple-display support.

The only thing going for it is its great folding / GPGPU capability. But can you picture running this thing at full load 24/7 folding, sucking up all the electricity in your house and overheating your case and components in the process? How about two of them in SLI, requiring a monster 1.2kW, $350+ power supply?? Your honor, I rest my case...

A little 5XXX series refresh / price adjustment will put this 480 abomination out of its misery, and the HD 6000 series coming in Q3 2010 will make it just a bad dream everyone will want to forget.

I didn't like Nvidia very much before. But with all the lies, trickery, manipulation, re-badging, the wood-screwed, sawed-off fake board, and everything else in between, I've learned to hate them. Profoundly.

Feel free to buy yourself a nice and shiny overpriced, underperforming, power-hungry and simply uncompetitive 480 or 470 if you like. It's your choice. But never forget that you also vote with your wallet. Please, just for once, put your fanatical green ego to the side and don't support immoral corporate behavior. You'll feel good about it and won't get screwed buying something that doesn't deserve your hard-earned cash.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Feel free to buy yourself a nice and shiny overpriced, underperforming, power-hungry and simply uncompetitive 480 or 470 if you like. It's your choice. But never forget that you also vote with your wallet. Please, just for once, put your fanatical green ego to the side and don't support immoral corporate behavior. You'll feel good about it and won't get screwed buying something that doesn't deserve your hard-earned cash.

Will do, and can't wait for my 470.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
No matter how crappy or expensive something is, someone will always buy it.

What was the name of that crappy Apple console system? Nobody bought that. The thing is, as long as it falls somewhere close to the mark, people will buy it. They might pay the same for something 20% slower, but they won't for something 35% slower, for example.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
It's Hot!!!











Seriously, the last truly great card from nV was G80/G92. I suspect that once they figure out how to stop the leakage and get it under 200W with all 512 cores, Fermi will kick ass too.

Until then, 2 x 5850 is where it's at for high-end performance: 30-50% faster, quieter, and less power for the same price.
 

jpeyton

Moderator in SFF, Notebooks, Pre-Built/Barebones
Moderator
Aug 23, 2003
25,375
142
116
Wow, over 88% say thumbs down.

Semi-Accurate can go ahead and change their name to Mostly Accurate.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Fermi is interesting from a design perspective. If I had the money to completely redo my system for it (more airflow, bigger power supply), I might build a system around one, but my current system couldn't even power it.

The 40nm generation of cards was late, so I think the wait for the 32nm or 28nm gen of cards isn't that long. I can stick with my G92 part until then and see what's out. Northern Islands versus Fermi 2?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
GF100 is a powerful chip, IMO. It needs more time for drivers tailored to its new design. They added so many compute features that it created a major change to its basic blueprint for graphics processing. Fermi is pulling ahead in this synthetic benchmark... but it's purely GPGPU.
[image: GPGPU synthetic benchmark chart]
 

Ramon Zarat

Junior Member
Mar 28, 2010
21
0
0
Will do, and can't wait for my 470.


Genx87, I have no choice but to respect your *impressive* 24,734 posts over the last 8 years here at Anand's forum. That's 3,091 posts a year, or 8.47 posts *every* single day for nearly a decade. That's a lot of posts, my friend...

You surely like to share your opinion around here. Would you care to elaborate on why the 470 is appealing to you?

Could it be because Nvidia has better OpenGL drivers, which they do? If so, which games you play use OpenGL? (BTW, ATI's drivers are now OpenGL 4.0 compliant...) Is it because Nvidia is generally perceived to have far better drivers overall, which was certainly true... ages ago? Maybe you were burned in the past by an ATI product? Could it be because you are using some specialized vertical application that requires an Nvidia product?

Why would you spend $350 on a paper-launch item you won't get for weeks?
http://www.newegg.com/Product/Produc...gtx470&x=0&y=0

Get a 5850 for $70 less and 90% of the 470's performance, or spend $70 more on a 5870 for a much cooler, quieter, more power-efficient and overall better-performing product! Did I mention the 5850 and 5870 are in stock, now?


I'm just trying to understand the logic behind your position. Unless this is a purely emotional decision on your part?
 

Ramon Zarat

Junior Member
Mar 28, 2010
21
0
0
GF100 is a powerful chip, IMO. It needs more time for drivers tailored to its new design. They added so many compute features that it created a major change to its basic blueprint for graphics processing. Fermi is pulling ahead in this synthetic benchmark... but it's purely GPGPU.
[image: GPGPU synthetic benchmark chart]


Best Fermi slide EVER!!! LMAO!

Yes, I fully agree. Fermi is a GPGPU chip with a gaming appendage bolted on, like an afterthought. Where I don't agree with you is about the drivers. The Fermi driver team has had working silicon for MONTHS; Fermi A1 taped out in October, if my memory serves me well. The drivers you have now should in fact be mature, with SOME optimization still possible. Unless the Fermi architecture is so alien that even the mighty Nvidia powerhouse could not tame it within the last 6 months. Given their abundant experience and over-the-top raw manpower, that would surprise me a lot.

Facing all the other delays, the pressure from the market, partners and shareholders, and more importantly the fact that this 480 is a cut-down 480-SP, downclocked version of what it was supposed to be, I don't think Nvidia management would have tolerated anything but perfectly optimized drivers at launch. Nvidia just couldn't afford the luxury of surprising us all with a magic 30%-improvement driver release down the road. It won't happen this time.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
When the AT review said it probably wasn't worth the wait, and noted that even the 470 is hotter and louder than the 5870 while providing only 90-95% of the performance on average, that sealed Fermi's fate for me.

There was another thread here, I think the one about the Fermi fan noise being debunked then "re-"bunked, where quotes from a few other reviewers I do trust were gathered, stating as fact that the card is indeed loud, not just as an academic measurement but in real-world usage. Reading that after reading the full AT review felt like yet another nail in the coffin, shutting me off from considering Fermi.

I understand other people may feel differently and will pick up these Fermi cards excitedly anyway. To each his own; they have their own reasons that they aren't honor-bound to explain to my satisfaction, and I respect them despite disagreeing strongly.

I did my best to stay out of the ATi/nVidia war in the forums, especially the Fermi-centered threads, while nVidia hadn't launched. Now that they have launched and the cards are on the table, and nobody has to go on rumors, I have to say that when AT thinks it's probably not worth the wait, comparing a brand-new launch with a product line that is half a year old, I am disappointed.

Inevitably, nVidia will get a handle on their 40nm process and will have Fermi-refresh parts that are awesome, and then we will see what the landscape looks like. It is not a total loss, as by now nVidia has learned (I am sure) how to deal with the bruises they got from 40nm. If not Q4 of 2010, then early Q1 2011 maybe; I expect we should have a much more exciting battle by then, and hopefully video card prices will be much lower.

For now, with the current prices and since I don't quite need the power, I am excusing myself from the GPU wars and will refrain from buying Cypress or Fermi. I'm sticking to a lowly 4770 for my 1280x1024 max-res monitor. I will upgrade both the monitor and the video card when "Q4 of 2010, early Q1 2011 maybe" comes and the GPU wars grant us better cards from both camps.
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
I'm willing to bet that Fermi's sales will be bad; on the bright side, their supply issues need not be issues any more, since not many will buy them.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
I'm a little confused about something, though. TDP is still 250 watts for the 480, right?
TDP should equal the maximum power used by the card, should it not?
Yet according to Anandtech's review, total system power with a 480 is 421W in Crysis and 479W in FurMark. I'm not sure how much power the rest of the system is using, but it must certainly be less than the 160-226W drawn by the system with a 5750.

Therefore, assuming the system draws something like 160W without a video card while running FurMark, doesn't that mean the 480 is drawing something like 319W? If so, how can NVidia say it has a TDP of 250W? Even in Crysis it's about 261W.
The 5970 has a TDP of 300W, I believe, but using that same 160W baseline puts it at 305W in FurMark and only 218W in Crysis.

Does that mean NVidia is lying to us, or are they using a much lighter workload to define their maximum TDP?
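The arithmetic in that post can be sketched directly. Note this is only an estimate: the 160W no-GPU baseline is the poster's assumption, the wattages are the total-system figures quoted from the review, and the 5970 system totals are back-computed from the ~305W/~218W card estimates above.

```python
# Rough card-only power estimate from total-system measurements.
BASELINE_W = 160  # assumed draw of the system minus the video card

system_total_w = {
    ("GTX 480", "Crysis"): 421,
    ("GTX 480", "FurMark"): 479,
    ("HD 5970", "Crysis"): 378,   # implied by the ~218W card estimate
    ("HD 5970", "FurMark"): 465,  # implied by the ~305W card estimate
}

for (card, load), total in sorted(system_total_w.items()):
    card_w = total - BASELINE_W
    print(f"{card} / {load}: ~{card_w} W at the card")
# The GTX 480 under FurMark works out to ~319 W against a 250 W rated TDP.
```

One caveat worth keeping in mind: these are wall-socket numbers, so PSU conversion losses inflate them; the real DC draw at the card would be somewhat lower than these estimates.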
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Nvidia claims that Fermi's TDP is 250W. Actual benchmarks show it using up to 320W.

You have to consider that the fact it already draws more power than the max PCIe spec allows is going to limit overclocking; even watercooled, you are going to hit a power-draw wall real fast.
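As a rough sketch of that spec ceiling: the per-source limits below are the standard PCIe CEM figures, and the 320W peak is the measured figure cited in this thread, not an official number.

```python
# In-spec PCIe power budget for a card with one 6-pin and one 8-pin
# auxiliary connector, which is the GTX 480's configuration.
SLOT_W = 75    # x16 slot limit per the PCIe CEM spec
PIN6_W = 75    # 6-pin auxiliary connector
PIN8_W = 150   # 8-pin auxiliary connector

in_spec_budget = SLOT_W + PIN6_W + PIN8_W  # 300 W total
measured_peak = 320                        # W, worst-case figure cited above

print(f"budget {in_spec_budget} W, peak {measured_peak} W, "
      f"over by {measured_peak - in_spec_budget} W")
```

With the stock connectors already oversubscribed at the worst case, any overclock has to push the same three sources further past their ratings, which is the wall the post describes.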
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Nvidia claims that Fermi's TDP is 250W. Actual benchmarks show it using up to 320W.

I know there's no standard for measuring TDP, but that ought to qualify as false advertising. TDP is supposed to be a maximum value that the product should almost never reach in normal usage, but it looks like the 480 is a good deal past that under normal use and way past it in FurMark.

Lol, I think I remember Toyota or someone else saying that what Charlie reported as Fermi's TDP wasn't bad since the real numbers would be lower, but this seems to be the exact opposite scenario.
 

Ramon Zarat

Junior Member
Mar 28, 2010
21
0
0
I'm a little confused about something, though. TDP is still 250 watts for the 480, right?
TDP should equal the maximum power used by the card, should it not?
Yet according to Anandtech's review, total system power with a 480 is 421W in Crysis and 479W in FurMark. I'm not sure how much power the rest of the system is using, but it must certainly be less than the 160-226W drawn by the system with a 5750.

Therefore, assuming the system draws something like 160W without a video card while running FurMark, doesn't that mean the 480 is drawing something like 319W? If so, how can NVidia say it has a TDP of 250W? Even in Crysis it's about 261W.
The 5970 has a TDP of 300W, I believe, but using that same 160W baseline puts it at 305W in FurMark and only 218W in Crysis.

Does that mean NVidia is lying to us, or are they using a much lighter workload to define their maximum TDP?


Of course Nvidia lied to us all about this, as about many other things.

They have a bad case of narcissistic god complex; have you never noticed? They have demonstrated on multiple occasions that they think they can say and do anything, assuming their audience (us) is a bunch of clueless idiots. They are obviously mistaken; this thread is indisputable proof of it.

More seriously (as if I was joking...), Nvidia's real problem is with large GPGPU deployments. 320W vs. 250W when you are installing hundreds of Tesla boards is a HUGE deal breaker. And when you are talking about multi-million-dollar deals, they will do their best to hide the truth about Tesla's real power consumption. As if they could... Poor Nvidia.

The thing is, this has many implications. The long-term power bill is the obvious one. Cooling a data center full of Fermi boards exceeding their thermal envelope by 30% also adds a LOT to the implementation cost of the solution.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
320W vs. 250W when you are installing hundreds of Tesla boards is a HUGE deal breaker.

Yeah, the three parts that can best the performance per watt of Tesla in the HPC space really make them look bad-

1) Pipe Dream- this one isn't as cost-effective as number three, but it only consumes 1 watt of power

2) Wet Dream- the cheapest solution, but at 5 watts it is a bit more power-hungry than the top spot

3) Unobtanius- uses a massive 10 watts of power while being only three times the performance of Pipe Dream and not quite as fast as Wet Dream.

Of course, Fermi at number 4 has a shocking lead over the next closest competitor, being some 300% more effective in the very limited situations where the 5870 can be used- tasks that don't require ECC or full IEEE FP64 support- and for anything outside of that its performance per watt is closer to 1000% better than anything else. Still, it is so far removed from the Pipe Dream/Wet Dream/Unobtanius level that it really should just be laughed at by anyone serious :)
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126

Snip pointless troll.

You surely like to share your opinion around here. Would you care to elaborate on why the 470 is appealing to you?

Could it be because Nvidia has better OpenGL drivers, which they do? If so, which games you play use OpenGL? (BTW, ATI's drivers are now OpenGL 4.0 compliant...) Is it because Nvidia is generally perceived to have far better drivers overall, which was certainly true... ages ago? Maybe you were burned in the past by an ATI product? Could it be because you are using some specialized vertical application that requires an Nvidia product?

Why would you spend $350 on a paper-launch item you won't get for weeks?
http://www.newegg.com/Product/Produc...gtx470&x=0&y=0

Get a 5850 for $70 less and 90% of the 470's performance, or spend $70 more on a 5870 for a much cooler, quieter, more power-efficient and overall better-performing product! Did I mention the 5850 and 5870 are in stock, now?


I'm just trying to understand the logic behind your position. Unless this is a purely emotional decision on your part?

I like the 470 because it has a nice price/performance ratio for me. Nvidia also handles TVs better than ATI in my experience, and my new rig, if I build it, will be used on a 67" DLP. I went ATI last generation with a 4850; I want to try the other side of the world now. The time frame isn't a big deal for me either, since I have to gather the parts for a new rig anyway.
 

pmv

Lifer
May 30, 2008
13,049
7,976
136
@RamonZarat

As far as I'm aware, nvidia hasn't done anything as morally questionable as what Citibank did in the Enron affair or Union Carbide's behaviour over Bhopal, so can't we keep a sense of proportion? So they are a bit pushy in their tactics in the video card market; it's hardly selling chemical weapons to Saddam, is it?

And I think it's too early to say the 480 is a complete failure. It's not a very good card in itself, true, but given the constraints I think nVidia's engineers perhaps did a good job of getting it to work at all. They succeeded in removing just enough so it won't burn a hole through the bottom of your case and go on to contaminate the water table on its way to China, while still keeping it just ahead of the 5870, so from their point of view it's "job done". Pity about the price tag, but I bet that's the minimum they can get away with too.

And surely it can't be judged entirely on its own merits; there's the question of where it leads, of what comes next. Vista wasn't exactly a great success (though software is more fixable post-release than hardware), but it did seem to get certain fundamental changes out of the way, leaving room for the problems to be cleaned up later. Time will tell whether the 480 is a Vista or an Edsel. Obviously I wouldn't actually buy one myself (I can't even afford a 5870), but at least they got this one out the door and out of the way.

I feel a bit sorry for them. Designing GPUs appears to be quite difficult - who'd have thought it?
 

Ramon Zarat

Junior Member
Mar 28, 2010
21
0
0
Yeah, the three parts that can best the performance per watt of Tesla in the HPC space really make them look bad-

1) Pipe Dream- this one isn't as cost-effective as number three, but it only consumes 1 watt of power

2) Wet Dream- the cheapest solution, but at 5 watts it is a bit more power-hungry than the top spot

3) Unobtanius- uses a massive 10 watts of power while being only three times the performance of Pipe Dream and not quite as fast as Wet Dream.

Of course, Fermi at number 4 has a shocking lead over the next closest competitor, being some 300% more effective in the very limited situations where the 5870 can be used- tasks that don't require ECC or full IEEE FP64 support- and for anything outside of that its performance per watt is closer to 1000% better than anything else. Still, it is so far removed from the Pipe Dream/Wet Dream/Unobtanius level that it really should just be laughed at by anyone serious :)


Very funny... Still, it was Nvidia who "Pipe Dreamed" the power-draw/heat-to-performance ratio and told the world + dog about projections they obviously never met.

Fermi is *currently* the meanest piece of hardware available for GPGPU applications; no one will ever contest that. The C++ compiler, ECC memory and great double-precision floating-point performance are impressive and all, but that's beside the point, isn't it???

The point is, they are falling quite short of being as impressive as they *said* they would be. Fermi draws a lot more power than they said it would, and with only 480 SPs and lower clocks, it performs a lot worse than they said it would. That can seriously f*ck up a budgetary forecast and business plan when it's going to cost your potential customers a lot more to do a lot less, wouldn't you say???

When you are trying to convince military / government / university / science-lab grade customers of your seriousness and come to the table to close a 7-8 figure deal, 6 months late, with numbers that completely miss your own targets, you make a fool of yourself.

The fact is, Nvidia won't be alone in this market segment for long. Fortunately for them, they will win a couple of projects here and there because they are the only option in town... for now. Enough to save their sorry ass in the long run? Only time will tell.

That was my point...