HardOCP 6990+6970 CF vs 580 Tri Sli


Skurge

Diamond Member
Aug 17, 2009
Resolution. As it goes up, Cayman gets faster relative to Fermi.

Now, there is/was an issue with bandwidth through the CrossFire connection that seems to bottleneck CrossFire as you add more cards. This might be avoided/limited by going 6990+6970 (only two cards) instead of three or more discrete cards, and it might be why this particular combo competes so well against the nVidia solutions. We've seen tri/quad CrossFire not be as dominant in comparisons against tri/quad SLI. If so, and I'm only speculating, then it's not an issue of architecture, simply of hardware implementation.

There's an article @ KitGuru that talks about the next-gen CrossFire being "the fastest yet" by improving the interface between the cards. Maybe the best is yet to come?

http://www.kitguru.net/components/graphic-cards/dragan/3rd-generation-crossfire-will-be-fastest-yet/

No matter. For now, the 6990+6970 TriFire setup is the dominant setup in its class. This latest article at [H] leaves no doubt about that.

Is there any evidence for that, or is KitGuru just making this up?
 

dac7nco

Senior member
Jun 7, 2009
I grabbed a pair of Palit 3GB GTX580s to replace my EVGA reference cards, after talking to Vega (http://hardforum.com/member.php?u=93212) @ HardOCP. I don't game at 3x(2560x1600) but 3x(1920x1200), and I don't liquid cool. I also run ECC memory, which limits my CPU overclocks. The cost was about the same as a 6990+6970, and, to what may prove to be my dismay, I do prefer the green team.

I had the sudden feeling last night that nV, being the compute monsters they are, were only severely VRAM limited. The reason for this thought was a guy at work who installed standard nV drivers for his Quadro, which then turned it into (basically) a GTX470 with 6GB of memory, while loading a huge map in ARMA-II.

I'll post benchmarks (and pics) using Civ-5, Metro and BF2 @ 3600x1920, as they seem to be the most popular benchmarks (I only play Civ-5).

Daimon
 

Skurge

Diamond Member
Aug 17, 2009
I grabbed a pair of Palit 3GB GTX580s to replace my EVGA reference cards, after talking to Vega (http://hardforum.com/member.php?u=93212) @ HardOCP. I don't game at 3x(2560x1600) but 3x(1920x1200), and I don't liquid cool. I also run ECC memory, which limits my CPU overclocks. The cost was about the same as a 6990+6970, and, to what may prove to be my dismay, I do prefer the green team.

I had the sudden feeling last night that nV, being the compute monsters they are, were only severely VRAM limited. The reason for this thought was a guy at work who installed standard nV drivers for his Quadro, which then turned it into (basically) a GTX470 with 6GB of memory, while loading a huge map in ARMA-II.

I'll post benchmarks (and pics) using Civ-5, Metro and BF2 @ 3600x1920, as they seem to be the most popular benchmarks (I only play Civ-5).

Daimon

Don't forget pics of the system too.

So you use portrait mode?
 

dac7nco

Senior member
Jun 7, 2009
Don't forget pics of the system too.

So you use portrait mode?

A: System pics will be immediate; I use a Lian-Li 2120X, which has lots of room.
B: I use portrait mode, yes, on three 26" 16x10 NECs.
C: I use (exclusively) Aerocool Shark 120 & 140mm blue fans, which may not be to everyone's liking, but oh well.
D: NewEgg is fast, so it won't be long.

Daimon

I didn't put that icon of that fucking horror face there on purpose - I tried to type the letter "D", with a colon afterward.
 

OCGuy

Lifer
Jul 12, 2000
Why test those games? They're not GPU intensive; two of them are console ports, in fact. Hardly anyone gives a flying toss about GPU PhysX and the 15 games in the world that use it, 80% of which are utter trash.

They picked the most intensive games around, and the 580 Tri-SLI got taken to the cleaners with its $500-more-expensive price tag.

It's an excellent review because it shows a buyer considering spending $1000 on 580 dual SLI that the same money buys faster performance from 6990+6970 Tri-Fire. And the icing on the cake is that the $1000 6990+6970 Tri-Fire is faster than the $1500 580 Tri-SLI setup as well.

There is no argument here. If the best one can muster is turning on PhysX in one game, or resorting to benches of a console port run on drivers that didn't support CrossFire, they may as well not come to the party.

What % of people are using these card setups? And of those, what % are using the multi-monitor resolution?

The reviews are fun to look at, but mean nothing to pretty much anyone.
 

Skurge

Diamond Member
Aug 17, 2009
What % of people are using these card setups? And of those, what % are using the multi-monitor resolution?

The reviews are fun to look at, but mean nothing to pretty much anyone.

So we should just ignore it? I think it's pretty important if someone spends $1500 and then finds out there's a cheaper, faster setup around. Even if it only matters to 10 people, that's 10 people helped.

It matters to the above poster and to people like Grooveriding who would buy as many cards as they need. There are a lot of people with dual 580 setups who could have had three 6970s for the same price but with better performance. So I think it's relevant to quite a few people.
 

Matrices

Golden Member
Aug 9, 2003
What % of people are using these card setups? And of those, what % are using the multi-monitor resolution?

The reviews are fun to look at, but mean nothing to pretty much anyone.

Wrong. You can rationalize all day, but the high end holds symbolic and psychological importance for market share in the overall "gamer's" market.

Nvidia's failure to produce video cards with more than 1.5GB of VRAM this gen is an embarrassment for a company that has always prided itself on producing premium products at a premium price. Especially in the 1GB and 1.28GB configurations, what you've got now is hardware that is overpriced compared to AMD's and hampered in multi-monitor configurations or with supersampling AA modes. Even 3D uses more VRAM than 2D, which is why I wish AMD had its own first-party 3D solution instead of the convoluted crap they have now.

Also, 3x 1080p is actually not *that* expensive these days, and thousands of gamers are moving toward Eyefinity/Surround, so yes, it is important to those people - aka people on this forum and other similar forums.
 

Lonyo

Lifer
Aug 10, 2002
What % of people are using these card setups? And of those, what % are using the multi-monitor resolution?

The reviews are fun to look at, but mean nothing to pretty much anyone.

For the $500 you save getting tri-CrossFire, you can upgrade a single 1920x1200 monitor to a triple-monitor setup (if you get a couple of cheap extra monitors), AND it's faster at that resolution.
 

notty22

Diamond Member
Jan 1, 2010
Wrong. You can rationalize all day, but the high end holds symbolic and psychological importance for market share in the overall "gamer's" market.

Nvidia's failure to produce video cards with more than 1.5GB of VRAM this gen is an embarrassment for a company that has always prided itself on producing premium products at a premium price. Especially in the 1GB and 1.28GB configurations, what you've got now is hardware that is overpriced compared to AMD's and hampered in multi-monitor configurations or with supersampling AA modes. Even 3D uses more VRAM than 2D, which is why I wish AMD had its own first-party 3D solution instead of the convoluted crap they have now.

Also, 3x 1080p is actually not *that* expensive these days, and thousands of gamers are moving toward Eyefinity/Surround, so yes, it is important to those people - aka people on this forum and other similar forums.
Embarrassment, LOL.
Lots of drama. You're ignoring that you can buy a 3GB 580 if that is the intended mountain to climb.
The 580 @ 1.5GB is enough for 2560x1600 30-inch monitors; some people decide to get two or three of these cards for added fps, or for 3D.
Multi-monitor is more of a niche than 3D, something AMD cards are way behind in; the iZ3D option for AMD 3D does not support CrossFire.

It's good for gamers that either brand has its advantages.

edit: The issue is not that 3x 1080p monitors don't cost that much. I'm never going to upgrade to that setup; it's just not a desirable option for some, who would consider a 30-inch monitor ideal - which does cost a lot. There are people with dual GTX 580/570-type power pushing single monitors.
 

Skurge

Diamond Member
Aug 17, 2009
Embarrassment, LOL.
Lots of drama. You're ignoring that you can buy a 3GB 580 if that is the intended mountain to climb.
The 580 @ 1.5GB is enough for 2560x1600 30-inch monitors; some people decide to get two or three of these cards for added fps, or for 3D.
Multi-monitor is more of a niche than 3D, something AMD cards are way behind in; the iZ3D option for AMD 3D does not support CrossFire.

It's good for gamers that either brand has its advantages.

edit: The issue is not that 3x 1080p monitors don't cost that much. I'm never going to upgrade to that setup; it's just not a desirable option for some, who would consider a 30-inch monitor ideal - which does cost a lot. There are people with dual GTX 580/570-type power pushing single monitors.

I'm willing to wager two 6990s would be faster than or equal to four 3GB 580s at 2x 2560x1600.

But your other points about AMD and nvidia having different advantages are noted and agreed with.
 

dac7nco

Senior member
Jun 7, 2009
There are people with dual GTX 580/570-type power pushing single monitors.

On 1080p monitors, I doubt it. A single 580 or 6970 would crush that resolution; my HTPC runs a stock-clocked GTX460 in a single slot and is respectable @ 1080p in most games at high settings, using a down-clocked quad-core 775. Dual GTX570s would drive a 30" monitor quite well, much less 1920x1200.

I used two GTX580s until recently to drive three 1920x1200 monitors in Civ-5, which DID underperform, I'll give you.

The high-resolution gaming award, IMO, goes to dual 6990s with a 580 in tow. CrossFire scaling being what it is across their bridges, you won't see the same performance by taking four 6970s and ganging them up nV-style.

Daimon
 

load81

Member
Jan 21, 2011
I tried staying quiet, but lol: a configuration that almost no one plays at gets so much attention because AMD wins, but a case where the 570 stomps a more expensive 6970 in an overclock test that is far more useful to a lot more people gets ignored: http://www.insidehw.com/Reviews/Gra...-HD-6970-Two-DirectCU-Cards-Head-to-head.html

And very lol at the guy saying Nvidia failed for not adding more than 1.5GB of VRAM. That would have been useless to almost everyone. 2GB saved AMD because of noobs who think it's necessary for 1920x1200 and below. It would have been very ugly for AMD if they hadn't added 2GB of VRAM; they would have lost on every front - you know, at resolutions people actually play at.

My worst mistake was being loyal to ATI for the last 8 years. I'm sorry, but their cards just feel castrated at the important resolutions.
 

RussianSensation

Elite Member
Sep 5, 2003
My worst mistake was being loyal to ATI for the last 8 years. I'm sorry, but their cards just feel castrated at the important resolutions.

Surprising, considering GeForce 3 > Radeon 8500, GeForce 6800GT > X850 Pro, and the GeForce 8xxx series walked all over the HD 29xx/38xx. So how did you manage to only purchase ATI cards all this time? hehe

Funnily enough, a GTX580 is barely faster than the 6970 despite being $100+ more expensive. Let's not forget that for 2-3 solid months consumers had the option of purchasing an HD6950 2GB for $230-250 and unlocking it. I'll gladly take 10% lower performance compared to an overclocked 570 and put that $100 toward Kepler/HD7000, when the performance increase will be 50-75%+. :thumbsup:

I do agree with you that for the mainstream gamer, comparing at 1920x1080/1200 is more valuable; but we have read 20+ reviews of cards like the GTX570/6970/GTX580, etc.

This comparison is a niche one, but that doesn't make it invalid. As it stands, NV doesn't have anything to compete on the high end: 570s don't have enough VRAM, and 580s are too expensive and can't beat HD6990+6970 or even HD6950 Tri-Fire (unless 3D gaming is a priority).

Either way, don't take it too personally. Once HD7000 and Kepler arrive, this 'refresh' generation will be forgotten rather quickly. What won't be forgotten is the $100-200 saved by going with 6950/70 cards over the laughably overpriced 580, which will be just as obsolete 12 months from now as the 15% slower 6970 cards. :D
 

cusideabelincoln

Diamond Member
Aug 3, 2008
I tried staying quiet but lol a configuration that almost no one plays at gets so much attention because AMD wins but a case where the 570 stomps a more expensive 6970 in an overclock test that is far more useful to a lot more people gets ignored http://www.insidehw.com/Reviews/Grap...d-to-head.html
HardOCP is more popular and established than that website, which I've never even heard of, so it's not surprising that their articles get made into topics of discussion. If you think your article should get attention, then create a new thread. So it's head-scratching how you come to the conclusion that people are ignoring something (this article) they most likely didn't even know existed.

And very lol at the guy saying Nvidia failed for not adding more than 1.5GB of VRAM. That would have been useless to almost everyone. 2GB saved AMD because of noobs who think it's necessary for 1920x1200 and below. It would have been very ugly for AMD if they hadn't added 2GB of VRAM; they would have lost on every front - you know, at resolutions people actually play at. My worst mistake was being loyal to ATI for the last 8 years. I'm sorry, but their cards just feel castrated at the important resolutions.
AMD put 2GB on these cards so they wouldn't flounder at the more demanding resolutions. You know, the resolutions that actually put these powerful cards through the most work. Resolutions below 1080p can get by with 1GB of VRAM (for now), but they can also get by with slower GPUs.
 

AdamK47

Lifer
Oct 9, 1999
[Attached benchmark screenshots: P15749, P59483, BM, DMC4, LP2A, RE5, Unigine Heaven (Extreme and Normal), Unigine Tropics, and CoP - all run on the 980X + GTX 580 SLI system below]



Intel Core i7 980X @ 4200MHz
Asus P6X58D Premium @ 21 x 200MHz
12GB Corsair XMS3 @ 2000 DDR
Three GTX 580 in Tri-SLI @ 850/2200
120GB Vertex 3 SSD
50GB Vertex 2 SSD
Two 3TB Deskstar 7K3000 in RAID-0 - 6TB
1TB SpinPoint F1
LG 32" 32LD450 LCD
Samsung 22X SH-S223L DVD-RW
Asus Xonar Essence STX sound card
Antec Twelve Hundred case
Corsair H70 cooling
Corsair AX1200 power supply
 

dac7nco

Senior member
Jun 7, 2009
Surprising, considering GeForce 3 > Radeon 8500, GeForce 6800GT > X850 Pro, and the GeForce 8xxx series walked all over the HD 29xx/38xx. So how did you manage to only purchase ATI cards all this time? hehe

Either way, don't take it too personally. Once HD7000 and Kepler arrive, this 'refresh' generation will be forgotten rather quickly. What won't be forgotten is the $100-200 saved by going with 6950/70 cards over the laughably overpriced 580, which will be just as obsolete 12 months from now as the 15% slower 6970 cards. :D

I'll step in again and disagree there: if most people game at 1080p, then the field is pretty even. One of the things I've noticed is that what eats VRAM is AA/MSAA, etc. At very high resolutions, nV and AMD are pretty equal if you don't involve mind-bending settings like [H] does. If you triple that, then you get into the issue of CF/SLI scaling, which is another horse entirely. Single-card CF scales badly over 3+ cards, while 2-card (6990+6970) CF is the shiznit.

In my opinion nV screwed themselves by neutering the compute oomph that was GF100; error correction is something sorely lacking in desktop computing, and IMO is why there hasn't been a Quadro refresh.

I'm doing my own experiment with 2x 3GB GTX580s; I don't have the cash, time, or expertise of an Aigomorla [A] or Vega [H] to H2O some craziness.

Daimon

Edit: @ AdamK47: I'll kill your i7 on Handbrake! (but not on FPS gaming, at all)
 

Skurge

Diamond Member
Aug 17, 2009
Surprising, considering GeForce 3 > Radeon 8500, GeForce 6800GT > X850 Pro, and the GeForce 8xxx series walked all over the HD 29xx/38xx. So how did you manage to only purchase ATI cards all this time? hehe

It's what fanboys do.

I have to give him a hand on finding such a poorly overclocked 6970 on old drivers. Looks more like he's promoting his own card there.

If we are going to mention cards that have nothing to do with the thread, you should mention this review too, where the cheaper 6950 1GB stomps all over the 560: http://techgage.com/article/amd_hd_6950_1gb_vs_nvidia_gtx_560_ti_overclocking/2

I'm sure more people buy 6950s and 560s than 570s and 6970s, making it more relevant according to your logic.

I'd like to know what exactly you mean by "castrated at the important resolutions." What are these "important resolutions"? 1680x1050, where the 570 is faster than a 6970? Or 1920x1080, where it's slower? Or 2560x1600, where the 570 is only as fast as a 6950 1GB?

That also blows your "if AMD had 1GB of RAM they would lose everywhere" theory out of the water.
 

dac7nco

Senior member
Jun 7, 2009
I had to lookup what Handbrake is.

You're kidding me. I've spent the last fifteen years of my life transcoding video; you have an OC'd 980X and you don't know what Handbrake is? I'm not explaining it to you; it's why I bought Tylersburg in the first place.

Sigh...
 

OCGuy

Lifer
Jul 12, 2000
Wrong. You can rationalize all day, but the high end holds symbolic and psychological importance for market share in the overall "gamer's" market.

Uh... call me old-fashioned, but what the extreme niche top end is doing on 3 monitors is not how I make a purchase, unless I have that setup.

I look at my budget, my resolution, and pick the best card(s) based on multiple reviews on the games I play or plan on playing.


If you make your card choice based on what $1500 worth of product can do on $1k worth of monitors, I guess that is your decision...
 

Dark Shroud

Golden Member
Mar 26, 2010
De-activated. So these cards are not for sale anymore?

That has a few meanings at Newegg; for starters, they don't appear in the searches. From my experience it usually means they're sold out and not sure when they'll get more in. Those 3GB GTX 580s are in very limited supply stateside, so they probably don't want them appearing in their site's search results.