Is SLI/Crossfire support dying?


Snafuh

Member
Mar 16, 2015
115
0
16
You should do some serious research before dropping inaccurate statements like that.
...
This works in theory but not in the real world.
The first few games are not relevant anymore. A change in the engine can completely break SLI/CF. But I didn't know Ryse and Evolve had proper multi-GPU support. I actually know a dev on Ryse who was continually complaining about SLI/CF, and I'm glad I don't have to work with it myself.

In the real world there are always money/time constraints. An SLI/CF implementation costs time/money, and optimizing a game without access to the previous frame costs more time/money. It's about opportunity cost.
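To make the previous-frame point concrete: under alternate-frame rendering (the usual SLI/CF mode) each GPU renders every other frame, so any pass that samples last frame's output (TAA, motion blur, temporal reprojection) needs that data copied over from the other GPU before the new frame can start. A rough sketch of that dependency, using made-up types rather than any real engine's API:

Code:
#include <cstdio>

struct RenderTarget { int frameIndex; int ownerGpu; };

// Hypothetical cost: shipping a frame's worth of data to the other GPU.
void copyAcrossGpus(const RenderTarget& src, int dstGpu) {
    std::printf("frame %d: transfer previous frame from GPU %d to GPU %d (stall)\n",
                src.frameIndex, src.ownerGpu, dstGpu);
}

int main() {
    const int gpuCount = 2;                 // AFR: GPUs alternate frames
    RenderTarget previous{-1, -1};

    for (int frame = 0; frame < 4; ++frame) {
        int gpu = frame % gpuCount;         // GPU that renders this frame

        // A temporal pass wants last frame's output, which the *other* GPU owns.
        if (previous.frameIndex >= 0 && previous.ownerGpu != gpu)
            copyAcrossGpus(previous, gpu);  // inter-GPU dependency = lost scaling

        std::printf("frame %d: rendered on GPU %d\n", frame, gpu);
        previous = {frame, gpu};
    }
    return 0;
}

The AFR-friendly alternative is to drop or approximate the temporal input for multi-GPU configs, and maintaining that extra path is exactly the time/money in question.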
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I don't think support is dying at all. I think SLI/CF is necessary for certain resolutions and refresh rates. Also, without SLI/CF, the recent explosion of monitor tech might have taken a lot longer, or might not have happened at all. People with single GPUs don't tend to get excited about playing new games at 1440p/144Hz or gaming at 4K. The reason is simple: they can't game like that with a single GPU.
 
Feb 19, 2009
10,457
10
76
nVidia just isn't pushing multi GPU right now.

It didn't occur to me but your point is very spot on. Anyone remember when Tahiti had poor frame times? 7970 CF faster than 680 SLI but more frametime variance. CF/SLI was all the rage then.

Nowadays CF 390/X > SLI 970/980 and even CF Fury X > SLI 980Ti... the mantra has certainly changed, "Multi-GPU is niche!" or "Only 300K PC gamers even use Multi-GPU"...

We see GameWorks developers just outright say CF/SLI is too much effort/hassle, so they won't do it, while the good devs continue to push on, improving visuals and giving excellent CF/SLI scaling from day 1.

ps. Enthusiasts have always been niche. Nothing has changed. Back then it was 1600p, you needed multi-GPU. Today it's multi-monitor or 4K, that's the new enthusiast playground, and they have always been niche. But these folks buy fancy setups to play fancy AAA games in all their glory. What happens when these AAA titles ship without CF/SLI? No buy.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
It didn't occur to me but your point is very spot on. Anyone remember when Tahiti had poor frame times? 7970 CF faster than 680 SLI but more frametime variance. CF/SLI was all the rage then.

Nowadays CF 390/X > SLI 970/980 and even CF Fury X > SLI 980Ti... the mantra has certainly changed, "Multi-GPU is niche!" or "Only 300K PC gamers even use Multi-GPU"...

We see GameWorks developers just outright say CF/SLI is too much effort/hassle, so they won't do it, while the good devs continue to push on, improving visuals and giving excellent CF/SLI scaling from day 1.

ps. Enthusiasts have always been niche. Nothing has changed. Back then it was 1600p, you needed multi-GPU. Today it's multi-monitor or 4K, that's the new enthusiast playground, and they have always been niche. But these folks buy fancy setups to play fancy AAA games in all their glory. What happens when these AAA titles ship without CF/SLI? No buy.

Someone was talking about a dual-GPU Pascal chip from NV, but there is no dual-GPU chip from NV now, right? And is there one planned with Pascal?

So maybe SLI is just being slowly phased out as NV moves to a powerhouse single-GPU solution instead?
Pure speculation, but it doesn't seem like SLI/CF has that many users to begin with, and NV is willing to do things that upset enthusiasts as long as it doesn't hurt the base.

We'll see how it plays out with DX12, though.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It didn't occur to me but your point is very spot on. Anyone remember when Tahiti had poor frame times? 7970 CF faster than 680 SLI but more frametime variance. CF/SLI was all the rage then.

Nowadays CF 390/X > SLI 970/980 and even CF Fury X > SLI 980Ti... the mantra has certainly changed, "Multi-GPU is niche!" or "Only 300K PC gamers even use Multi-GPU"...

We see GameWorks developers just outright say CF/SLI is too much effort/hassle, so they won't do it, while the good devs continue to push on, improving visuals and giving excellent CF/SLI scaling from day 1.

ps. Enthusiasts have always been niche. Nothing has changed. Back then it was 1600p, you needed multi-GPU. Today it's multi-monitor or 4K, that's the new enthusiast playground, and they have always been niche. But these folks buy fancy setups to play fancy AAA games in all their glory. What happens when these AAA titles ship without CF/SLI? No buy.

Yes. When Tahiti was shown to have issues you couldn't ask for a recommendation for a card, even a single one, without hearing that it was so important in case you might decide to buy a second one. Since XDMA though? Not so important.

Seems like VR might bring back multi GPU though. We'll see.
 
Feb 19, 2009
10,457
10
76
This is a good blog post about what the future will bring:

http://blog.mecheye.net/2015/12/why-im-excited-for-vulkan/

Basically, as the onus shifts to developers, we can expect a VAST separation between the top devs and the mediocre ones who relied heavily on AMD/NV to do the heavy lifting with driver hacks and workarounds referred to as "optimizations" or "game ready drivers".

Guys like Ubisoft are screwed with a thin-layer API like DX12/Vulkan. I suspect they will desperately cling to DX11 for as long as they can.
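For a sense of how thin that layer is: in D3D12 linked SLI/CF GPUs appear as explicit "nodes" on a single device, and deciding which node renders what, and when resources get copied between them, is entirely the application's job. A minimal enumeration sketch, assuming a Windows/D3D12 toolchain (the actual cross-node scheduling is the hard part and isn't shown):

Code:
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cwchar>

#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Enumerate hardware adapters; with the explicit API the driver no longer
    // hides CF/SLI behind automatic AFR.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // Linked (SLI/CF) GPUs show up as multiple nodes on one device.
            // Which node renders which frame, and when resources move between
            // nodes, is now the developer's problem, not the driver's.
            std::wprintf(L"%ls: %u node(s)\n", desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}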
 

Timmah!

Golden Member
Jul 24, 2010
1,565
914
136
nVidia just isn't pushing multi GPU right now.

Given that they apparently had a dual-Maxwell board to show off to journalists back in September, but it's still nowhere to be seen for customers, I would say this might hold a bit of truth.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
A software solution is no solution. I dropped SLI myself and only run a single GPU; I play games the day they are released and I have no patience for waiting on incompetent developers to fix a game. This was the same reason I dropped triple screens: too niche and not good enough support. ATM the gold standard for an enthusiast is 1440p on a single GPU, if you ask me. Weird screen ratios and ghetto FPS with SLI/CF are just trouble, though if you're an early adopter of 4K you need SLI/CF since no single-GPU card is fast enough.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Perhaps just coincidental, AMD recently had the sense to make it possible to turn off CF on their dual-GPU cards with the Crimson drivers. Previously it was forced on at all times.

nVidia just isn't pushing multi GPU right now.
Eh, I don't know. NVIDIA would prefer to sell you two video cards all 365 days of the year, even if they don't have a specific dual-GPU card.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Eh, I don't know. NVIDIA would prefer to sell you two video cards all 365 days of the year, even if they don't have a specific dual-GPU card.

Not if it costs more to support multi-GPU than what they make from the small number sold.
 

Timmah!

Golden Member
Jul 24, 2010
1,565
914
136
Perhaps just coincidental, AMD recently had the sense to make it possible to turn off CF on their dual-GPU cards with the Crimson drivers. Previously it was forced on at all times.

Eh, I don't know. NVIDIA would prefer to sell you two video cards all 365 days of the year, even if they don't have a specific dual-GPU card.

Unbelievable.

I had that option with my GTX 590 from day one, and that was in May 2011.

AMD really dropped the ball here if they only made this possible now, because if you want to use the card for a purpose other than gaming, SLI/CrossFire can actually be detrimental and should be switched off.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
I'm pretty happy with my 290X xfire setup overall. It works for most of the games I play, and when it does the scaling is very good. You can't beat Battlefield 4 at 100+ FPS at 1440p/144Hz on Highest settings. Crimson also vastly improved xfire in Guild Wars 2, which I play regularly.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Perhaps just coincidental, AMD recently had the sense to make it possible to turn off CF on their dual-GPU cards with the Crimson drivers. Previously it was forced on at all times.

Eh, I don't know. NVIDIA would prefer to sell you two video cards all 365 days of the year, even if they don't have a specific dual-GPU card.
Unbelievable.

I had that option with my GTX 590 from day one, and that was in May 2011.

AMD really dropped the ball here if they only made this possible now, because if you want to use the card for a purpose other than gaming, SLI/CrossFire can actually be detrimental and should be switched off.



I was able to turn off CF on my 5970s: make a game profile, select disable CF, job done, and the game was running on a single GPU.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I was able to turn off CF on my 5970s: make a game profile, select disable CF, job done, and the game was running on a single GPU.
That didn't actually fully turn off CF. The card was still operating in linked adapter mode. The profile setting just tried to tell some games to ignore that. The latest driver update actually allows you to turn off linked adapter mode now.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
That didn't actually fully turn off CF. The card was still operating in linked adapter mode. The profile setting just tried to tell some games to ignore that. The latest driver update actually allows you to turn off linked adapter mode now.

But the fact is it worked, and that's all that matters, not how it was achieved.
 

NotAgOat

Junior Member
Dec 1, 2010
8
0
0
What nobody has mentioned yet is the fact that we are on the precipice of VR.

The Vive will require rendering two images, one per eye, at 90 FPS each. Star Citizen at an effective 180 FPS at 2K res is not easy.

With NVIDIA's GameWorks VR and AMD's LiquidVR both supporting one GPU per eye, I believe multi-GPU is about to see a huge boom in developer adoption for VR games.

I'd wait for H2 2016 to make judgments about the demise of Multi-GPU. ;)
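Rough numbers, taking the Vive's published 1080x1200-per-eye panels at 90 Hz as given; the 1.4x per-axis supersampling factor is a common recommendation, not a spec:

Code:
#include <cstdio>

int main() {
    const double vivePerEye = 1080.0 * 1200.0;         // panel pixels per eye
    const double viveRaw    = vivePerEye * 2 * 90;      // both eyes at 90 Hz
    const double viveSS     = viveRaw * 1.4 * 1.4;       // ~1.4x per-axis supersampling (assumption)
    const double mon1440p60 = 2560.0 * 1440.0 * 60;      // common single-GPU target
    const double mon4k60    = 3840.0 * 2160.0 * 60;

    std::printf("Vive, raw panels:   %4.0f Mpix/s\n", viveRaw    / 1e6);
    std::printf("Vive, supersampled: %4.0f Mpix/s\n", viveSS     / 1e6);
    std::printf("1440p @ 60 Hz:      %4.0f Mpix/s\n", mon1440p60 / 1e6);
    std::printf("4K @ 60 Hz:         %4.0f Mpix/s\n", mon4k60    / 1e6);
    return 0;
}

Raw VR throughput is already in 1440p/60 territory, supersampled it sits closer to 4K/60, and it all has to land inside an 11 ms frame budget, which is why one GPU per eye starts to look very sensible.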
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes. When Tahiti was shown to have issues you couldn't ask for a recommendation for a card, even a single one, without hearing that it was so important in case you might decide to buy a second one. Since XDMA though? Not so important.

Seems like VR might bring back multi GPU though. We'll see.

What nobody has mentioned yet is the fact that we are on the precipice of VR.

The Vive will require rendering two images, one per eye, at 90 FPS each. Star Citizen at an effective 180 FPS at 2K res is not easy.

With NVIDIA's GameWorks VR and AMD's LiquidVR both supporting one GPU per eye, I believe multi-GPU is about to see a huge boom in developer adoption for VR games.

I'd wait for H2 2016 to make judgments about the demise of Multi-GPU. ;)

I mentioned it. :D

This all hangs on whether or not nVidia is ready to support it. If not, then the general perception will be poor: they will push every negative they can about VR as if they were immense issues that no consumer would ever want to deal with.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
It didn't always work well. That was the problem.:(

Nothing always works in the context of gaming, but the disabled-CF profile always worked when I needed it to.
If we only went by absolutes we would have nothing, and there would be no need to worry about disabling multi-GPU because there would be no multi-GPU in the first place: multi-GPU itself is far from absolute and fails far more often than a profile-disabled CF setup does.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
What nobody has mentioned yet is the fact that we are on the precipice of VR.

The Vive will require rendering two images, one per eye, at 90 FPS each. Star Citizen at an effective 180 FPS at 2K res is not easy.

With NVIDIA's GameWorks VR and AMD's LiquidVR both supporting one GPU per eye, I believe multi-GPU is about to see a huge boom in developer adoption for VR games.

I'd wait for H2 2016 to make judgments about the demise of Multi-GPU. ;)
That's really the saving grace for multi-GPU right now. It's one of the few cases where you absolutely need the power, and it doesn't suffer intra-frame scalability issues. :)
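A sketch of why it scales so cleanly, assuming the one-GPU-per-eye model GameWorks VR and LiquidVR expose (hypothetical code, not their actual APIs): both GPUs work on the same frame, one eye each, so there is no previous-frame handoff like AFR has and no extra frame of latency.

Code:
#include <cstdio>

// Hypothetical per-eye split: GPU 0 renders the left eye, GPU 1 the right,
// then the right eye is copied to GPU 0 for lens correction and scan-out.
void renderEye(int gpu, const char* eye, int frame) {
    std::printf("frame %d: GPU %d renders %s eye\n", frame, gpu, eye);
}

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        renderEye(0, "left",  frame);   // runs in parallel with...
        renderEye(1, "right", frame);   // ...this, on real hardware
        std::printf("frame %d: copy right eye to GPU 0, composite, present\n", frame);
    }
    return 0;
}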