2x6970 XFire, 2xGTX570 SLI, or 1 GTX 580 (later SLI)?

Rangoon

Member
Apr 19, 2008
48
0
0
I'm having a tough time deciding between these options. In the past I had used ATI cards, but several years ago switched to nVidia and have used them since. In the early days of multi-GPU setups, I saw too many videos of microstutter and knew that would drive me crazy.

I get the impression that XFire and SLI have come a long way, with better drivers and more comprehensive support for newer titles. I am still wary, but less so. I used to just discard multi-GPU benchmark numbers because of microstutter; now I look at those numbers and give them some credence.

Especially looking to the near future, I notice games like Call of Pripyat in the Anandtech reviews and see the benefit of 2x6970s.

I'm mainly going to be recording games at max settings and 1920x1200 resolution. I also will be water cooling the GPUs, so overclocking is part of the equation.

How is the image quality between these options? I know I like what I get from my GTX 285. Plus, PhysX. Is there a good argument to switch to AMD this time around and go with an XFire build? Or would it be better to start with a single GTX 580 and then add a second one when prices drop?

I have heard a few people on the forums praise their 2xGTX 570 SLI configurations, too, which would save a little money and seems to perform rather well.

Also, there is the strategy of flashing a couple of 6950s to 6970s. I'm not sure this is a better route than just buying a couple of 6970s, but I would consider it as well.

Any advice?
Thanks!
 

lau808

Senior member
Jun 25, 2011
217
0
71
The only advice I can give: if you go with 6950s and try to flash them to 6970s, get 6950s with a dual BIOS switch. If the flash fails, you can go back to the stock 6950 BIOS and try flashing the other BIOS again, as opposed to bricking the card.
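For anyone attempting this, the usual sequence with ATIFlash (the DOS flashing tool most 6950 unlock guides used at the time) looks roughly like the sketch below. The adapter number and file names are placeholders, and the exact switches are from memory, so confirm them against the tool's own built-in help before flashing anything:

```
REM list the installed adapters and note which number the 6950 is
atiflash -i

REM save the card's current BIOS from adapter 0 as a backup first
atiflash -s 0 backup.rom

REM program the modified/6970 BIOS to adapter 0
atiflash -p 0 6970mod.rom
```

On the dual BIOS cards, you typically flash with the switch in the second position, so the first position still holds an untouched factory BIOS to boot from if anything goes wrong.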
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Okay, several things:


  1. The time to buy such a setup is not optimal. The new generation will be out soon (hopefully, and hopefully from both IHVs). Considering it will again take 2-2.5 years for a significant speed bump, I would advise against premature decisions and wait.
  2. Microstuttering can be minimized by using a frame limiter or by staying above a certain framerate that varies from game to game and scene to scene (a rough sketch of the frame limiter idea follows this list).
  3. I would advise against 570 SLI on grounds of too little VRAM. But in general I would prefer SLI over CF any day, because AMD drops the ball too often with this. Latest examples: Skyrim, Batman: Arkham City (DX9, see HardOCP for this). Also, you have many more possibilities to improve image quality on a Geforce setup than on AMD cards (see 4.). If you need more fps, you can still turn all the goodies off.
  4. Geforces support SGSSAA in every API (DX9-DX11 + OpenGL), AA-bits for improved compatibility, ambient occlusion (not for every game, though), better AF, PhysX, 3DVision and downsampling (a brute-force AA method that always works no matter whether a game supports AA or not). The latter can be tricky and might not work with your screen, but I would still call it an asset.
  5. Buying a high-end card now and adding a second one later is rarely a good idea, because by then new cards are available with more features, no multi-GPU woes and less power consumption. Of course, you can overlook these shortcomings if you get a really good deal.
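To make point 2 concrete, here is a minimal sketch of what a frame limiter does. It's Python with made-up numbers, purely illustrative; real limiters hook the game's frame presentation rather than running a loop like this, but the pacing principle is the same:

```python
import time

TARGET_FPS = 58                  # cap slightly below what the cards can sustain
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame at the cap (~17 ms)

def render_frame():
    """Hypothetical stand-in for the game's actual rendering work."""
    time.sleep(0.005)  # pretend this frame took 5 ms on the GPU

next_deadline = time.perf_counter()
for _ in range(300):
    render_frame()
    # Instead of presenting each frame the instant it's finished (which
    # with AFR produces alternating short/long gaps, i.e. microstutter),
    # wait for the next evenly spaced deadline so frames reach the
    # screen ~17 ms apart.
    next_deadline += FRAME_BUDGET
    delay = next_deadline - time.perf_counter()
    if delay > 0:
        time.sleep(delay)
```

The trade is a slightly lower average framerate in exchange for an even frame cadence, which is exactly why a limiter tames microstutter on SLI/CF.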
 
Last edited:

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
The only advice I can give: if you go with 6950s and try to flash them to 6970s, get 6950s with a dual BIOS switch. If the flash fails, you can go back to the stock 6950 BIOS and try flashing the other BIOS again, as opposed to bricking the card.

Dual bios switch 6950s are getting harder and harder to find. Plus unlocking is not as common as it used to be.

OP, I was in your boat as well. I avoided dual-card setups because I feared the microstutter. It still exists, but how much it shows up varies a lot depending on the frame rate and the game.

To minimize microstutter I suggest a 120Hz monitor and sacrificing IQ with the goal of keeping your frame rates closer to 120. The higher the frame rate, the lower the variance in frame times, and thus the less microstutter. "But wait, I'm buying two cards so I don't have to make such sacrifices!" Well, with two cards you have to sacrifice less to get higher frame rates ;)
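To put rough numbers on that (illustrative values, not measurements): what you perceive is the absolute gap between consecutive frame times, and the same relative imbalance between the two GPUs produces a much smaller gap at high framerates:

$$\text{40 fps average: } \bar{t} = \tfrac{1000}{40} = 25\ \text{ms, e.g. delivered as } 10/40/10/40\ \text{ms (a 30 ms swing)}$$

$$\text{120 fps average: } \bar{t} = \tfrac{1000}{120} \approx 8.3\ \text{ms; the same 1:4 split is} \approx 3.3/13.3\ \text{ms (a 10 ms swing)}$$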

To answer your original question: for the last few major game releases, AMD has really dropped the ball on getting CrossFire support for new titles on day one. Just look at Skyrim. nVidia has done a much better job of getting drivers released very quickly to support SLI in new games. As such, you might want to consider nVidia cards, since they seem to make the dual-card experience less painful.

The other side of the coin is that you need to consider what resolution you are running at and how much VRAM you will need. One of the biggest weaknesses of nVidia GPUs is the lack of VRAM on their cards vs. AMD's. Games such as BF3 will use 1.5GB+ of VRAM at the highest detail levels. The GTX 570 comes with 1.28GB of VRAM, and the 580 with 1.5GB. Going forward, I believe the effective life span of the 570/580 will be shorter than the 2GB 6970's, since we are already pushing the VRAM limits of the 570/580. People go back and forth on this, but it is still something to consider.

So AMD cards have the VRAM for longevity and future games, but nasty driver support... What may be a realistic option is checking for a used GTX 580 here on the forums and picking that up to use till the next gen of cards comes out. Do some research on the rumors and see if you are okay with waiting. A GTX 580 is a beast of a card and will do very well at everything but the highest detail levels.

IF MONEY IS NO OBJECT and you need something now, get two of these:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130655
GTX 580 with 3GB of VRAM. Probably the best card you could put in SLI.

Keep in mind with your water cooling you will likely need to be looking at reference cards and be very careful about block compatibility.

Hope this helps.
 
Last edited:

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
It's a shame that eVGA only makes the 2.5GB GTX 570s in non-reference designs. Those would be a wicked setup under water.
 

Rangoon

Member
Apr 19, 2008
48
0
0
To answer your original question: for the last few major game releases, AMD has really dropped the ball on getting CrossFire support for new titles on day one. Just look at Skyrim. nVidia has done a much better job of getting drivers released very quickly to support SLI in new games. As such, you might want to consider nVidia cards, since they seem to make the dual-card experience less painful.

That's what my understanding was as well, and one reason I am still leaning toward nVidia for now. I just don't want to become fixated and ignore what is clearly becoming better competition from AMD.

The other side of the coin is that you need to consider what resolution you are running at and how much VRAM you will need. One of the biggest weaknesses of nVidia GPUs is the lack of VRAM on their cards vs. AMD's. Games such as BF3 will use 1.5GB+ of VRAM at the highest detail levels. The GTX 570 comes with 1.28GB of VRAM, and the 580 with 1.5GB. Going forward, I believe the effective life span of the 570/580 will be shorter than the 2GB 6970's, since we are already pushing the VRAM limits of the 570/580. People go back and forth on this, but it is still something to consider.

Right, and my consideration of the 570s was only for the 2.5GB VRAM versions. Unfortunately, they seem to be harder to find at the moment. I totally agree about the need for more VRAM, especially going forward.

So AMD cards have the VRAM for longevity and future games, but nasty driver support... What may be a realistic option is checking for a used GTX 580 here on the forums and picking that up to use till the next gen of cards comes out. Do some research on the rumors and see if you are okay with waiting. A GTX 580 is a beast of a card and will do very well at everything but the highest detail levels.

You bring up another tough question, about waiting for Kepler, which seems just around the corner. I am considering getting something that will tide me over until then. Once the next gen comes out and matures a little, I could move the (now) new card to my secondary computer and pick up a (then) new card for the primary.

But in that scenario, I'm torn between something like a standard GTX 580 and a 3GB version. I like the idea of the 3GB version, but it's not really necessary in my secondary computer, and it costs a lot more. I could choose it easily if it were going to be used for a long time. But if I plan to replace it within six months, that's a harder choice.

IF MONEY IS NO OBJECT and you need something now, get two of these:
http://www.newegg.com/Product/Produc...82E16814130655
GTX 580 with 3GB of VRAM. Probably the best card you could put in SLI.

Indeed... Unfortunately, money is still a barrier.

Keep in mind with your water cooling you will likely need to be looking at reference cards and be very careful about block compatibility.

This is something I was going to be looking into next. Thanks for bringing that up.


A water cooled gtx580 @ 900+ core is just awesome.

:)

It's a shame that eVGA only makes the 2.5GB GTX 570s in non-reference designs. Those would be a wicked setup under water.

Thanks for mentioning that. I was going to take a look at that if I could find one in stock, but I really do want to go with water cooling this time.
 

Rangoon

Member
Apr 19, 2008
48
0
0
  1. The time to buy such a setup is not optimal. The new generation will be out soon (hopefully, and hopefully from both IHVs). Considering it will again take 2-2.5 years for a significant speed bump, I would advise against premature decisions and wait.
  2. Microstuttering can be minimized by using a frame limiter or by staying above a certain framerate that varies from game to game and scene to scene.
  3. I would advise against 570 SLI on grounds of too little VRAM. But in general I would prefer SLI over CF any day, because AMD drops the ball too often with this. Latest examples: Skyrim, Batman: Arkham City (DX9, see HardOCP for this). Also, you have many more possibilities to improve image quality on a Geforce setup than on AMD cards (see 4.). If you need more fps, you can still turn all the goodies off.
  4. Geforces support SGSSAA in every API (DX9-DX11 + OpenGL), AA-bits for improved compatibility, ambient occlusion (not for every game, though), better AF, PhysX, 3DVision and downsampling (a brute-force AA method that always works no matter whether a game supports AA or not). The latter can be tricky and might not work with your screen, but I would still call it an asset.
  5. Buying a high-end card now and adding a second one later is rarely a good idea, because by then new cards are available with more features, no multi-GPU woes and less power consumption. Of course, you can overlook these shortcomings if you get a really good deal.


  1. I definitely hear you on this. As I mentioned in my reply above, this still brings up a tough choice, because I do need something now and want it to perform well. I have a way of shifting around whatever I get now (or selling it), but will need to figure out the best strategy.

2. Would you say it's a very reasonable choice (SLI/XFire) these days? I would be happier with 30 FPS and no microstuttering at max settings than microstuttering at 60 FPS. To get 120 consistently seems a stretch. But maybe it's not?

3. Thanks for the info!

4. Good point, and one I have adhered to thus far.

Just get one card & my vote goes to the MSI gtx580 Lightning XE 3GB

Looks great! Very much considering the 3GB version.

Keep in mind with your water cooling you will likely need to be looking at reference cards and be very careful about block compatibility.

Speaking of this, I know you are a water-cooling kind of guy (I read your long thread about that). Do you know whether, when Kepler comes out, blocks for those cards will be available immediately? Or is there a delay? Days? Weeks? Months? And roughly how much does one GPU block cost? I guess I would be getting a full-cover block.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
:snip:
Dual bios switch 6950s are getting harder and harder to find. Plus unlocking is not as common as it used to be.
Dual BIOS 6950s aren't hard to find at all. Sapphire has multiple models, and they come with an unlocked BIOS in the 2nd position from the factory.

That said, I'm not too sure how much luck you'll have finding full coverage waterblocks for anything but reference models. This is true for both AMD and nVidia. You'll likely have to get universal GPU waterblocks and use individual heatsinks for the PCB components. If I were going to do it myself, I would get cards with waterblocks pre-installed.

HD 6970
GTX 580 3GB
HD 6990

As others have said, I would go with the 3GB version of the GTX 580, but it's pretty pricey. Also, it's not a reference design, and it might be difficult to find an aftermarket waterblock, although EVGA gets them from someone and might sell them at retail. I'm not sure whether buying separate waterblocks would save you any coin. The HD 6990 is readily available in reference models, unlike the others, and you could get an aftermarket waterblock pretty easily. Probably more easily than you can find the watercooled model.

I wouldn't let drivers decide for me. Modern games are pretty quirky, and both companies are equally good (or bad) overall. Don't be surprised when a TWIMTBP (nVidia-sponsored) game is buggy on AMD when first released; nVidia likes to make it difficult for AMD owners.

PS I wouldn't even consider a GTX 570. YMMV. :)
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
2. Would you say it's a very reasonable choice (SLI/XFire) these days? I would be happier with 30 FPS and no microstuttering at max settings than microstuttering at 60 FPS. To get 120 consistently seems a stretch. But maybe it's not?

Well, at 60 fps you get little to no microstuttering. Even 60 consistently is a stretch; look at Skyrim with its hunger for CPU power. I would say it is very reasonable. If you use the right settings, you almost double performance.

If you aim for 60, you're not afflicted by microstuttering anyway. If you aim for max image quality, you would be in the 30s or 40s with a single card, too, and with a limiter that is smooth on SLI/CF as well. So either way you win.

Concerning watercooling, it took EK about 4 weeks until their coolers for the GTX 480 were listed here in Germany. The EVGA Hydrocopper was listed one day after launch, using a Swiftech watercooler (although those are quite flow-restrictive, so they wouldn't be my first choice, depending on the rest of the watercooling setup). I guess 4-8 weeks from card launch to actual availability of watercoolers is realistic.

Considering that, the wait for Kepler could be quite long indeed, and for the HD 7000 series at least 3 months as well, assuming a January release. It's a difficult choice. You could get two 2.5GB GTX 570s in SLI and overclock them a bit; then you'd have basically GTX 580 3GB SLI performance at a cheaper price.
 
Last edited:

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
:snip:
Dual BIOS 6950s aren't hard to find at all. Sapphire has multiple models, and they come with an unlocked BIOS in the 2nd position from the factory.

I stand corrected. However, reference 6950s are hard to come by and for water cooling he would need a reference card.

To answer the OPs question about watercooling block availability. I'm not an expert by any measure, which is why my water cooling thread is 8 pages long and my computer loop has been revised about 3 times ^_^

What I have seen is that waterblocks are available pretty quickly after new cards are released.
 

Rangoon

Member
Apr 19, 2008
48
0
0
This may sound like a stupid question, but what exactly qualifies a card as "reference"? Does it mean it's not overclocked in any way (by the manufacturer)? No physical additions like extra RAM or altered heat sinks? Does the product description generally make this explicit?

I have purchased so many cards over the years, from the cheapest to the most expensive. I have purchased OC'd models, some with alterations, and some which I assumed would be considered reference. But until now I've never had to care whether a card was actually considered "reference" or not, so I haven't paid much attention to that detail.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
The PCB layout, the placement of capacitors, resistors... Obviously there has to be a standard; otherwise watercoolers won't fit, because components are in the way and/or the contact areas for the VRMs, VRAM and GPU are in the wrong place.
 

Rangoon

Member
Apr 19, 2008
48
0
0
Okay, that makes sense, but how do you know for sure that a card uses the reference design?

For example, I just picked a 580 randomly here:
http://www.newegg.com/Product/Produc...82E16814125379

Is this reference? The term is not mentioned anywhere, but I assume it is.

And this one:
http://www.newegg.com/Product/Produc...82E16814130655

It has 3 GB VRAM, but is it reference? I would guess not, since they have to put that extra memory some place...

I see no mention on either card whether they are or aren't.

Could I find a total-coverage water block for that 3GB card? Or would I have to buy the $700+ version which comes with the Swiftech block already installed?
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No, neither of those is reference. Although you might well be able to find a full-cover waterblock for the EVGA one; they sell a 3GB card with one pre-installed.

As time goes by it gets harder and harder to find reference designs. What you tend to be left with, a year after release, is mostly custom-designed PCBs, either to make the card cheaper, which is often the case, or better than the original reference design.

Edit: OK mate, just found this. EK makes blocks, etc. for the 3GB card.
 
Last edited:

Rangoon

Member
Apr 19, 2008
48
0
0
No, neither of those is reference. Although you might well be able to find a full-cover waterblock for the EVGA one; they sell a 3GB card with one pre-installed.

As time goes by it gets harder and harder to find reference designs. What you tend to be left with, a year after release, is mostly custom-designed PCBs, either to make the card cheaper, which is often the case, or better than the original reference design.

Edit: OK mate, just found this. EK makes blocks, etc. for the 3GB card.

I noticed that on the cooling configurator they only break it down to 03G-P3-1584 but don't tack on the AR, ER, etc. I assume, then, that they have a block for anything with that code attached?

However, the second card I linked to in my post above is: 03G-P3-1584-AR

It's the 3GB version, and you say it's not reference. I'm a bit confused by that.


Whoa! And once again, I see the 3GB 03G-P3-1584-ER (ER, not AR) listed as a "Reference" board.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I noticed that on the cooling configurator they only break it down to 03G-P3-1584 but don't tack on the AR, ER, etc. I assume, then, that they have a block for anything with that code attached?

However, the second card I linked to in my post above is: 03G-P3-1584-AR

It's the 3GB version, and you say it's not reference. I'm a bit confused by that.



Whoa! And once again, I see the 3GB 03G-P3-1584-ER (ER, not AR) listed as a "Reference" board.

The "reference" 580 has 1.5Gb vram. That's why I said that the 3Gb board isn't a reference design. Apparently there are some non reference cards you can get waterblocks for. In the end you need to do a bit of research. I believe the suffix on nVidia cards just denotes warranty coverage, but I'm not certain. You need to research that as well. All I did was Google EK waterblocks to find that chart.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
There are cards with 3GB that are identical to the reference design. They just have memory chips with double the density.
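As a quick sanity check on that (using the standard GTX 580 memory layout: a 384-bit bus fed by twelve 32-bit GDDR5 chips), the capacity difference is purely chip density:

$$12 \times 1\,\text{Gbit} = 1536\,\text{MB} \qquad\qquad 12 \times 2\,\text{Gbit} = 3072\,\text{MB}$$

so a 3GB card can keep the reference PCB and component placement, which is why some of them still take reference waterblocks.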
 

Pyro908

Junior Member
Nov 30, 2011
1
0
0
:snip:

The other side of the coin is that you need to consider what resolution you are running at and how much VRAM you will need. One of the biggest weaknesses of nVidia GPUs is the lack of VRAM on their cards vs. AMD's. Games such as BF3 will use 1.5GB+ of VRAM at the highest detail levels. The GTX 570 comes with 1.28GB of VRAM, and the 580 with 1.5GB. Going forward, I believe the effective life span of the 570/580 will be shorter than the 2GB 6970's, since we are already pushing the VRAM limits of the 570/580. People go back and forth on this, but it is still something to consider.

VRAM is a small problem that many get fussed up about. Unless you have a dual-monitor or surround setup, VRAM matters very little. Yes, Battlefield 3 can take up 1.5GB of VRAM, but has any 1.28GB GTX 570 failed to play the game on ultra with playable frame rates? Say you get a GTX 580 3GB: by the time a game comes out that uses all that VRAM, the GPU itself will be outdated and simply won't be able to process fast enough, and maybe there will be a successor card with half the VRAM but twice the performance. I'm not saying VRAM doesn't matter; it's just that by the time the GTX 570 becomes too slow, it will most likely be the GPU architecture rather than the VRAM that holds it back. And by that time it's time to upgrade anyway.
 
Last edited:

mattdallastx

Member
Nov 30, 2011
78
0
0
580 and SLI later. =)

____________
COOLER MASTER HAF X RC-942-KKN1 Full Tower Computer Case
CORSAIR H70 Core High Performance Liquid CPU Cooler
Intel Core i7-980X Extreme Edition Gulftown 3.46GHz LGA 1366 Six-Core Desktop Processor
CORSAIR DOMINATOR GT 24GB (6 x 4GB) 240-Pin DDR3 SDRAM DDR3 2000 (PC3 16000) memory
CORSAIR Pro Series Gold AX1200 watt power supply SLI Certified 80 PLUS GOLD Certified
ASUS Rampage III Extreme LGA 1366 Intel X58 SATA 6Gb/s USB 3.0 ATX Intel Motherboard
LG Black Blu-ray Drive
(2)EVGA SuperClocked 015-P3-1582-TR GeForce GTX 580 (Fermi) 1536MB in SLI
Crucial 256GB SSD SATA III 6Gb/sec
Western Digital Caviar Black 1.5TB SATA III 6Gb/sec
(2) Western Digital Caviar Blue 500GB - striped RAID
Microsoft Windows 7 Pro