GTX580 reviews thread


dust

Golden Member
Oct 13, 2008
1,328
2
71
I like how the 41 dB for the 580 is rated better than the 41 dB for the 4870/50. That, and the 10.7 drivers, which are probably the newest ones to have CrossFire broken. TPU really seems to have tried to make the 580 shine. I wonder how many idiots will "upgrade" from a 480 for just 15% extra performance.

Well, given the new price of the 5970, the card looks better than ever thanks to 580.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I like how the 41 dB for the 580 is rated better than the 41 dB for the 4870/50. That, and the 10.7 drivers, which are probably the newest ones to have CrossFire broken. TPU really seems to have tried to make the 580 shine. I wonder how many idiots will "upgrade" from a 480 for just 15% extra performance.

Well, given the new price of the 5970, the card looks better than ever thanks to 580.

I'll go on record giving TPU the benefit of the doubt that using 10.7 drivers is not being done maliciously. Before that W1zzard was using 10.3 and got called out for not using the latest drivers. He reasoned that he didn't really have time to retest all of the cards in every game and benchmark he runs every time one of the 2 companies updates drivers. So, to get everyone off his back he updated his drivers. The current drivers at the time? 10.7. It was bad luck for AMD and the 5970, but no conspiracy.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Yea, it seems the GTX 580 sucks so bad they lowered the price of the 5970. Go figure. :)
It only held its price, uncontested, for a year. Finally NVIDIA released something that's almost competitive with AMD's old tech, and it's "only competitive" because they're selling it more cheaply.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I'll go on record giving TPU the benefit of the doubt that using 10.7 drivers is not being done maliciously. Before that W1zzard was using 10.3 and got called out for not using the latest drivers. He reasoned that he didn't really have time to retest all of the cards in every game and benchmark he runs every time one of the 2 companies updates drivers. So, to get everyone off his back he updated his drivers. The current drivers at the time? 10.7. It was bad luck for AMD and the 5970, but no conspiracy.
I wouldn't blame TPU, but it certainly detracts from the value of the review.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
They already had the fastest single GPU and only made it faster, on the same architecture, with less power.

If the reports about them already having 28nm Kepler samples are true, then maybe they weren't lying when they said the GF100 problems would not slow down any other archs.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I wouldn't blame TPU, but it certainly detracts from the value of the review.

I would blame AMD for having terrible CrossFire drivers for months and months, not the reviewer. This still wouldn't affect single-GPU performance, however.

I am sure we will get a wider range of reviews from great reviewers like Anand, where there are no doubts.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
For the first 5 months it was almost unavailable, but then no one wanted one for $700 anyway.

Keep on dreaming - AMD was smart not to flood the market when it was able to make more profit on full-speed 5870 chips, but it was continuously available for ~11 months (it was quick to sell out during the Christmas season, of course).

With crossfire drivers a mess half the time, who could blame them?

Nonsense. I've seen way more broken drivers from Nvidia in the past 2 years, single GPU or SLI, it doesn't matter.
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
Yea it seems the gtx580 sucks so bad, they lowered the price of the 5970. Go figure. :)

The 5970 was an a$$ deal at $600 and up, I didn't even care about it. Whether the price came down because of the 580 launch or the new 69xx I don't really care; the card is now good for the money. Remember when posters such as yourself were touting GTX 460 SLI performance versus the 5970 given the price ratio? That's no longer the case.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I would blame AMD for having terrible CrossFire drivers for months and months, not the reviewer. This still wouldn't affect single-GPU performance, however.
So in your eyes it's OK for a reviewer to use outdated drivers? I bet if he did the same for the GTX 580 you'd be grabbing your pitchfork and torch :rolleyes:.
For the first 5 months it was almost unavailable, but then no one wanted one for $700 anyway. With crossfire drivers a mess half the time, who could blame them?
Make sure you change the subject every time you're wrong. Kind of like SLI vs. single GPU, amirite?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The 5970 was an a$$ deal at $600 and up, I didn't even care about it. Whether the price came down because of the 580 launch or the new 69xx I don't really care; the card is now good for the money. Remember when posters such as yourself were touting GTX 460 SLI performance versus the 5970 given the price ratio? That's no longer the case.

I also remember it was because ATI CrossFire drivers were once again screwed up and made the $650+ 5970 lose to two GTX 460s for $400 in that review.

Just for kicks, what do you think of the GTX 460 in SLI now that it's $350 vs. the $500 5970? The 5970 still doesn't look that great. :hmm:
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I've seen way more broken drivers from Nvidia in the past 2 years, single or SLI, does not matter.

SLI worked better than Xfire for much of this year for the 58xx series vs 4xx. For some reason 57xx scaled well on Xfire despite the 58xx's woes.

I am not surprised, given that SLI is the ONLY way NV has to get Surround, so they have more of an incentive to keep it working and up to date. Even so, they had broken Civ V SLI scaling and such recently, as an example, so both companies have had issues. All the more reason to avoid multi-GPU if possible.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Nonsense. I've seen way more broken drivers from Nvidia in the past 2 years, single or SLI, does not matter.

The reason we are even talking about this is because of the broken 10.7s, right?
Everybody knows crossfire drivers have sucked for the better part of this year.

It seems you're never around for the AMD bad news, but are always here when Nvidia makes a new release.
You know what that kind of poster is called? Well, I'm tired of getting vacations for calling you AMD markexxers/fan$oy$ out.
You make it obvious for the average reader.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I applaud you for your honesty. I just found a whole new respect for you.

If that wasn't a backhanded compliment, I dunno what is... it implies that I was dishonest before? Wtf? Gee, thanks. I've always hated multi-GPU setups because of having to worry about the proper mobo, enough PSU wattage, microstuttering, and generally escalated levels of power/heat/noise. But when I heard that NV was going to do Surround via SLI, I knew that they would take better care of SLI than AMD did with Xfire (which is not necessary for Eyefinity).
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
I also remember it was because ATI CrossFire drivers were once again screwed up and made the $650+ 5970 lose to two GTX 460s for $400 in that review.

Just for kicks, what do you think of the GTX 460 in SLI now that it's $350 vs. the $500 5970? The 5970 still doesn't look that great. :hmm:

Sorry, I should have made it clearer that I DON'T LIVE IN THE US. You'd be surprised how much of a bad deal the 460s can be outside your yard. The 5970 wasn't selling at all at the high prices, and I expect the retailers here to cut the price accordingly. They have no reason to do that for the 460s regardless of the newest price cuts; for them, that just equals more profit.

Also, not everyone has an i5/i7 setup, and the motherboards before that don't have both capabilities (CF/SLI); my case is an example.

Sure, I'd take the 460s anytime over the 5970 IF the prices you stated applied here as well, even if I had to upgrade the entire system. Hell, I'd even go for the 580 if the prices were equal to US ones. The Radeon cards seem to have lower prices compared to Nvidia ones here.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
SLI worked better than Xfire for much of this year for the 58xx series vs 4xx. For some reason 57xx scaled well on Xfire despite the 58xx's woes.

I had a 4850 X2 2GB before my 5870 and it was an amazing card - pretty much beating everything sans the 4870 X2 during its tenure. Never really had any problems.

The machine next to me ran a 4850 or 4870 in CF for almost a year too, and I only saw one problem, this summer, when certain games were messed up completely with CF enabled.

Several 5770 CF configs are running here on this forum, and they never reported that many problems.

On the contrary, NV kept regressing in its drivers when it came to CUDA or certain features on the GTX 480 (i.e. it crashes when you overload the GPU, vs. politely telling you it will shut down your app when you overload a Quadro, etc.).

 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Well, in that case I guess that makes the whole AMD 5000 series crossfire drivers perfect because your 4850 X2 ran well. That's a great point.

He at least had hands-on experience with crossfire... do you have any first-hand experience with CFX or SLI?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
So when is the NDA on the GTX580 supposed to expire?

I thought it should be around now? Or is it the 10th?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Well, now I hope reviewers stop testing Furmark power consumption. I thought this number was useless before, but now it most definitely is. Give us power numbers from real-life gaming situations, please and thank you.

And as a request to AT: I would like to see average power consumption under a gaming situation along with maximum figures.

I hope not, I like seeing Furmark numbers as a complete worst case scenario.