SLI has gone to crap.


alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Fury X is coming, one card to rule them all

lmao, only if high-end FreeSync monitors actually launch instead of being vaporware for six months. Since CF sucks, the Fury X is only compelling if there is a comparable FreeSync monitor available to deal with the fact that your fps is lower, because you can't run dual cards consistently by going AMD.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
Crossfire AND SLI have failed to live up to their promises. Maybe some day, but it is so far from a seamless, set-it-and-forget-it experience that it's not even funny.
 

dave1029

Member
May 11, 2015
94
1
0
Crossfire AND SLI have failed to live up to their promises. Maybe some day, but it is so far from a seamless, set-it-and-forget-it experience that it's not even funny.
DX12 hopefully will be our savior. From what I *understand*, DX12 removes the need for developers to code for SLI/Crossfire, as DX12 will seamlessly utilize both GPUs through your Nvidia/AMD drivers. Or so it has been claimed. Now, does this mean older games can be patched for DX12 multi-GPU support if you have Win 10, or will it only be new games? Only time will tell.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
I'm really hoping DX12 can indeed do this; imagine multi-GPU just working no matter the game. Sounds too good to be true, honestly, but here's hoping.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
You won't buy a $500 video card that is mildly faster unless Nvidia neglects drivers for their two-year-old GPUs! What is Nvidia supposed to do? They're a business!

On a side note, I have 780s and haven't seen many SLI issues.
 

dave1029

Member
May 11, 2015
94
1
0
You won't buy a $500 video card that is mildly faster unless Nvidia neglects drivers for their two-year-old GPUs! What is Nvidia supposed to do? They're a business!

On a side note, I have 780s and haven't seen many SLI issues.
Well, AMD opted not to release a card at all.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
DX12 is reported to support multi-GPU configurations natively, so devs won't have to optimize for them.

DX12 hopefully will be our savior. From what I *understand*, DX12 removes the need for developers to code for SLI/Crossfire, as DX12 will seamlessly utilize both GPUs through your Nvidia/AMD drivers. Or so it has been claimed. Now, does this mean older games can be patched for DX12 multi-GPU support if you have Win 10, or will it only be new games? Only time will tell.


You got it backwards.

In DX12, devs can write explicit multi-adapter code instead of having to rely on after-the-fact AFR Crossfire or SLI in the driver. I hope this takes off. As it stands, AMD/NV add profiles after the fact to do AFR and other techniques, and each of these techniques has a myriad of undesirable side effects (AFR = longer frame queue depths = more perceived latency, inconsistent frame delivery compared to single-GPU, etc.).
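To make "explicit multi-adapter" concrete, here is a minimal sketch of the app-side setup using the public DXGI/D3D12 calls (error handling trimmed): the game enumerates the adapters itself and creates one device per GPU, instead of the driver hiding the second card behind an AFR profile.

```cpp
// Minimal sketch: DX12 explicit multi-adapter, where the app (not the
// driver) owns each GPU. Public DXGI/D3D12 APIs; error handling trimmed.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
        // From here the app decides how to split work between devices
        // (AFR, split-frame, or asymmetric jobs like physics on GPU 1)
        // rather than hoping a driver profile exists for the game.
    }
    return devices;
}
```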
 
Last edited:

BonzaiDuck

Lifer
Jun 30, 2004
15,726
1,455
126
Let me comment that programs like GeForce Experience seem focused on accommodating single-card users. Also, there is a hitch to installing Nvidia drivers for 2x SLI or greater.

When installing the drivers, turn SLI off. After the drivers are installed -- and ALWAYS, ALWAYS do a "Clean Install" -- and after all the scheduled reboots have completed, turn SLI back on.

Some symptoms associated with an SLI-enabled driver installation include excessive power consumption at idle. That is, the full-bore P0 state may drop back to P2, but it will not drop to P8 when the graphics cards should be running at idle speed.
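For anyone who wants to verify that symptom directly rather than eyeball clock readouts, the current performance state can be read with NVML, the monitoring library that ships with the Nvidia driver. A hedged sketch with the standard NVML calls; a healthy card sitting at the desktop should report P8.

```cpp
// Sketch: read each GPU's current performance state via NVML.
// Link against nvml; a card idling at the desktop should report P8.
#include <cstdio>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS)
        return 1;

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);
    for (unsigned int i = 0; i < count; ++i)
    {
        nvmlDevice_t dev;
        nvmlPstates_t pstate;
        if (nvmlDeviceGetHandleByIndex(i, &dev) == NVML_SUCCESS &&
            nvmlDeviceGetPerformanceState(dev, &pstate) == NVML_SUCCESS)
            std::printf("GPU %u: P%d\n", i, (int)pstate); // P8 expected at idle
    }
    nvmlShutdown();
    return 0;
}
```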

It is also possible to overclock SLI configurations excessively. A single card may overclock to a certain core and memory speed; dual cards might mean you settle for something slightly less than single-card speeds. And you might need a little extra voltage.
 

Brahmzy

Senior member
Jul 27, 2004
584
28
91
Never have issues running SLI. Always worked great for me. And to be honest, do we have a choice? I've always had cutting-edge resolution displays (2560x1600 like 9 years ago, 4K now) - show me a single card that can rock 4K with any sort of high settings - it doesn't exist. Even a Titan Pascal won't be able to pull off 4K with all the goodies cranked. This is how it goes, folks.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
God bless DX12. VRAM stacking is going to be great.

Great if it happens, but it probably won't. It's not as simple as you think it is; there are actually huge technical hurdles to overcome for that to happen. And you still need driver support for multi-GPU.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I like my SLI most of the time. I'd rather have it than not. It works really well most of the time.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Great if it happens, but it probably won't. It's not as simple as you think it is; there are actually huge technical hurdles to overcome for that to happen. And you still need driver support for multi-GPU.

The huge technical hurdle was having APIs low-level enough to let you explicitly control memory allocation among multiple devices, and that is now solved in all three main APIs.

The remaining hurdle is more economic than technical: whether developers will see enough ROI in actually implementing multi-adapter code that can utilize disparate pools of RAM. Technically speaking, it's now just like allocating RAM on the CPU side, since the low-level APIs let developers see and manage that sort of thing.
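As a rough illustration of that "just like allocating RAM" point, here is a hedged DX12 sketch: the app places a buffer in one specific GPU's local memory, which is exactly the explicit control that makes separate (non-mirrored) VRAM pools possible. Real code would also need cross-adapter copies to move data between the pools.

```cpp
// Sketch: explicitly allocating a buffer in ONE GPU's local VRAM in DX12.
// Called once per device, this gives two separate pools instead of the
// mirrored copies that driver-managed AFR requires.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> CreateGpuLocalBuffer(ID3D12Device* device, UINT64 sizeInBytes)
{
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_DEFAULT; // GPU-local memory (VRAM)

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = sizeInBytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR; // required for buffers

    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heapProps, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));
    return buffer;
}
```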
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
My 2 x 9800GT single-slot cards worked great in SLI for the games I played with them, such as CoD4:MW, Batman: Arkham Asylum and StarCraft 2: WoL, from 2007 to 2010.

My only complaint was the amount of heat they generated, which was substantial.

However, now with VR and DirectX 12, my new Windows 10 build has 2 x Radeon R9 285 (Tonga) running in bridgeless CrossFire. With all of the Frostbite engine games utilizing Mantle, I think AMD has the better option, with DX12, Mantle AND Vulkan support.
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
I did SLI once a long time ago, and have since avoided it and CF like the plague. It's a good option for some people, but not for me. My gaming time is limited these days and I want my games to just work with minimum effort (all of them, not just the AAA titles that get the driver teams' attention), without messing with settings and frequent driver upgrades. I would rather play at lower resolutions on a single card than spend time optimizing SLI/CF on a higher resolution.
 

HexiumVII

Senior member
Dec 11, 2005
661
7
81
SLI has never been ideal. With VR, though, at one GPU per eye, we will probably get rid of all the inherent faults of SLI and get nearly double the perf!
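A toy sketch of why that division of labor is attractive (the Gpu type and helper here are hypothetical, not a real API): both GPUs work on the same frame, one eye each, so nothing is queued a frame ahead the way AFR requires, and the two workloads are naturally symmetric.

```cpp
// Toy model of "one GPU per eye" VR rendering. The Gpu type and RenderEye
// helper are hypothetical; the point is the work division, not the API.
#include <cstdio>
#include <thread>

struct Gpu { int id; };

void RenderEye(Gpu gpu, const char* eye, int frame)
{
    std::printf("frame %d: GPU %d renders %s eye\n", frame, gpu.id, eye);
}

int main()
{
    Gpu left{0}, right{1};
    for (int frame = 0; frame < 3; ++frame)
    {
        // Both GPUs render the SAME frame in parallel, one eye each --
        // no queued-ahead frames, so no AFR latency or pacing problems.
        std::thread l(RenderEye, left, "left", frame);
        std::thread r(RenderEye, right, "right", frame);
        l.join();
        r.join();
        // composite the two eye images and present to the HMD here
    }
}
```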
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Crossfire AND SLI have failed to live up to their promises. Maybe some day, but it is so far from a seamless, set-it-and-forget-it experience that it's not even funny.

Oddly, the best experience I have had with multi-GPU setups was with 3dfx cards. Had an SLI Voodoo2 setup, and probably my favorite card I've ever owned, a Voodoo 5500, which I used for years. Maybe history has shaded my glasses rose-colored, but I don't remember the Voodoo 5500 needing game profiles or driver optimizations to run games. It just worked. The implementation 3dfx used, with each card rendering half the scanlines on screen, is different from current implementations, but it seems odd that nearly 20 years later the technology has gone backwards rather than forwards.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
Oddly, the best experience I have had with multi-GPU setups was with 3dfx cards. Had an SLI Voodoo2 setup, and probably my favorite card I've ever owned, a Voodoo 5500, which I used for years. Maybe history has shaded my glasses rose-colored, but I don't remember the Voodoo 5500 needing game profiles or driver optimizations to run games. It just worked. The implementation 3dfx used, with each card rendering half the scanlines on screen, is different from current implementations, but it seems odd that nearly 20 years later the technology has gone backwards rather than forwards.

That solution ate too much fps from what I remember, so to compete they chose quantity over quality. This is not 100% solid info; it's been so long since I looked at it that I don't remember, so if anybody has more info please do provide it, and if what I'm saying is wrong please do correct me. :wub:

Edit: Nvidia bought 3dfx, so they are sitting on the tech.
 

Mako88

Member
Jan 4, 2009
129
0
0
SLI rocks, as does 4K. [/thread]

Spoken like a fellow Samsung 48" 4K owner... totally agree, Brahmzy. Would never go back to sub-4K gaming, just incredible.

Also, I have never had much of a problem with SLI/CF over the years, including alllllll the way back to the original Fury MAXX. Wish I had kept that card; it would be fun to trot it out with the new Fury X now on the scene, heh.
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
That solution ate too much fps from what I remember, so to compete they chose quantity over quality. This is not 100% solid info; it's been so long since I looked at it that I don't remember, so if anybody has more info please do provide it, and if what I'm saying is wrong please do correct me. :wub:

Edit: Nvidia bought 3dfx, so they are sitting on the tech.

I think SLI still supports SFR mode, which is similar to the alternating-lines approach of 3dfx and doesn't have the timing and microstutter problems of AFR. I used it a lot over AFR back when I ran SLI. However, its raw performance is lower than AFR's, and the companies seem to have moved away from it. It worked well on the Voodoo2 since video cards were much simpler back then and only did texturing and AA, not geometry or shaders.
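For anyone who hasn't seen the two modes side by side, the trade-off is easy to show in a toy model (hypothetical types, not a real API): AFR pipelines whole frames across the GPUs, while SFR has both GPUs cooperate on each frame, much like 3dfx's interleaved scanlines.

```cpp
// Toy model contrasting AFR and SFR scheduling (hypothetical types).
#include <cstdio>

struct Gpu { int id; };

void Render(Gpu gpu, const char* region, int frame)
{
    std::printf("frame %d: GPU %d renders %s\n", frame, gpu.id, region);
}

int main()
{
    Gpu gpus[2] = {{0}, {1}};

    // AFR: whole frames alternate between GPUs. Near-2x throughput, but the
    // app must queue frames ahead -> more perceived latency and microstutter.
    for (int f = 0; f < 4; ++f)
        Render(gpus[f % 2], "full frame", f);

    // SFR: both GPUs split the SAME frame (3dfx interleaved alternating
    // scanlines instead of halves). No extra queued frames, but geometry and
    // shader work is largely duplicated, so raw scaling is worse on modern
    // pipelines.
    for (int f = 0; f < 4; ++f)
    {
        Render(gpus[0], "top half", f);
        Render(gpus[1], "bottom half", f);
    }
}
```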