I sold one of my GTX 1080 Ti cards

urvile

Golden Member
Aug 3, 2017
1,575
2
96
#1
So I have been running CrossFire setups since the 7970 GHz Edition. I was running Fury X CrossFire until I upgraded to GTX 1080 Ti SLI.

The support for multi-GPU in games is not there, and when it is, it's often poorly implemented. So I decided to get rid of one of my cards.

I game at 3440x1440 and I find that a single GTX 1080 Ti is more than enough, especially when coupled with an overclocked 8700K.

Is SLI/Crossfire dead? Personally, I think so, especially with games being developed for consoles first.
 
Dec 10, 2010
59
0
81
#2
Perhaps the need, and support, for it will rise again with hybrid rendering, given how a single RTX 2080 Ti is supposedly only capable of 1080p at 60 fps with ray tracing enabled. 3440x1440 is nearly 5 million pixels, whereas 1080p is only about 2 million; in theory, you wouldn't be able to play at 60 fps on your current monitor even with two 2080 Tis.
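The pixel-count comparison above can be checked with quick arithmetic. This is only a rough sketch: the fps estimate assumes performance scales linearly with pixel count, which real games don't do exactly.

```python
# Pixel counts at each resolution
uwqhd = 3440 * 1440   # ultrawide 1440p: 4953600, nearly 5 million pixels
fhd = 1920 * 1080     # standard 1080p: 2073600, about 2 million pixels

print(uwqhd, fhd)

# Naive linear scaling: if one card manages ~60 fps at 1080p with
# ray tracing, the same workload at 3440x1440 would land around:
est_fps = 60 * fhd / uwqhd
print(est_fps)  # roughly 25 fps per card, so ~50 fps even with two cards
```

Even under this optimistic linear assumption, two cards at 3440x1440 fall short of 60 fps, which is the point being made above.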

Wasn't there a low-level API feature in DX12 that solves some, or many, multi-GPU issues? Developers need to put in more work to enable it, though, if I'm not mistaken. Maybe some will now if they want their games using ray tracing to be playable at higher resolutions.

P.S. I think it'd be better if you added "Is SLI/Crossfire dead?" to the title.
 
Mar 10, 2004
28,523
238
126
#3
So I have been running CrossFire setups since the 7970 GHz Edition. I was running Fury X CrossFire until I upgraded to GTX 1080 Ti SLI.

The support for multi-GPU in games is not there, and when it is, it's often poorly implemented. So I decided to get rid of one of my cards.

I game at 3440x1440 and I find that a single GTX 1080 Ti is more than enough, especially when coupled with an overclocked 8700K.

Is SLI/Crossfire dead? Personally, I think so, especially with games being developed for consoles first.
NVLink apparently pools VRAM, and is much faster, so maybe it will come back?
 

guskline

Diamond Member
Apr 17, 2006
5,340
65
126
#4
urvile, will you use the proceeds to buy a 2080ti?
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#6
NVLink apparently pools VRAM, and is much faster, so maybe it will come back?
Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around 38 minutes in.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question of pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically the only thing it really does today is make super-high-res, super-high-refresh setups that would have choked SLI in the past run smoother.

All the other SLI negatives remain. It's just SLI with a faster bridge.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,191
134
126
#7
1080 Ti SLI is sort of a waste anyway, imo. I still run a pair of GTX 980s to game at the same resolution and, for the most part, it works really well when the support is there. The same happened with AMD's Crossfire. There were so many titles with lousy support that I just stopped caring; the last CF setup I had was a pair of 7970s. Not only did those cards use a ton of power and throw off a ridiculous amount of heat, but most games had issues when CF was enabled.
 
Mar 10, 2004
28,523
238
126
#8
Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around 38 minutes in.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question of pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically the only thing it really does today is make super-high-res, super-high-refresh setups that would have choked SLI in the past run smoother.

All the other SLI negatives remain. It's just SLI with a faster bridge.
I'm sure that when the 2080 and 2080 Ti pages were first posted on the NV site, there was a page or blurb that described NVLink 2 as pooling the VRAM, even listing 22 GB of VRAM, but it seems to have since been removed.

The Quadro Turing cards still seem to say that the VRAM is pooled.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#9
The Quadro Turing cards still seem to say that the VRAM is pooled.
The latency issue for compute is negligible compared to graphics.

That's why all multiprocessor CPU systems have always had pooled memory, even over much slower interfaces than NVLink. Now you even have parts like the Threadripper 2990WX, where two of the dies don't have any local memory at all.
 

Headfoot

Diamond Member
Feb 28, 2008
4,408
55
126
#10
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.
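For readers unfamiliar with the two acronyms above: AFR (alternate-frame rendering) hands each GPU whole frames in turn, while SFR (split-frame rendering) divides each frame among the GPUs. A toy sketch of the two work-distribution schemes follows; the function names are made up for illustration, and this is nothing like real D3D12 code.

```python
def afr_owner(frame_index: int, num_gpus: int = 2) -> int:
    """Alternate-frame rendering: GPU i renders every num_gpus-th frame."""
    return frame_index % num_gpus

def sfr_bands(height: int, num_gpus: int = 2) -> list[range]:
    """Split-frame rendering: divide one frame's scanlines into
    contiguous bands, one band per GPU."""
    band = height // num_gpus
    return [range(i * band, height if i == num_gpus - 1 else (i + 1) * band)
            for i in range(num_gpus)]

# Frames 0, 2, 4, ... go to GPU 0; frames 1, 3, 5, ... go to GPU 1.
print([afr_owner(f) for f in range(4)])
# A 1440-line frame split two ways: GPU 0 gets lines 0-719, GPU 1 gets 720-1439.
print(sfr_bands(1440))
```

AFR scales fairly transparently but inherits frame-pacing and inter-frame-dependency problems; SFR keeps latency low but needs the application to partition resources sensibly, which is why app-aware rendering beats driver-side hacks.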
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
98,274
519
126
#11
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.
Didn't everyone's favorite "is it a game or is it an AMD demo" AotS have built-in support for DX12 multi-adapter that not only worked, but worked well?
 

SPBHM

Diamond Member
Sep 12, 2012
4,856
81
126
#12
The focus is clearly moving away from it. You used to have most of the range, from the low end up, supporting it; now that's mostly gone. I think games are more complex to add support to now, and there aren't really more people using dual-GPU solutions.
 

Guru

Senior member
May 5, 2017
646
234
86
#13
The big issue with SLI is that it requires extra effort from developers and Nvidia programmers. That is why it can never be really successful: support for it is always going to lag, and performance gains are going to vary too much. It's just the way it is.
 

Timmah!

Senior member
Jul 24, 2010
734
9
91
#14
Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around 38 minutes in.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question of pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically the only thing it really does today is make super-high-res, super-high-refresh setups that would have choked SLI in the past run smoother.

All the other SLI negatives remain. It's just SLI with a faster bridge.


"Now its true you can set it to up to do that...you could set up the memory map so that you know effectively it would look like giant frame buffer, but it would be a terrible performance there..."

I am not a native English speaker, but this sounds to me like YES, BUT... rather than NO. No would mean it got disabled, compared to the Quadros, but why would he then say that you can do it?

EDIT: Even though I don't agree on the interpretation (I might be wrong), thanks for the heads-up in the other thread.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#15
"Now its true you can set it to up to do that...you could set up the memory map so that you know effectively it would look like giant frame buffer, but it would be a terrible performance there..."

I am not a native English speaker, but this sounds to me like YES, BUT... rather than NO. No would mean it got disabled, compared to the Quadros, but why would he then say that you can do it?

EDIT: Even though I don't agree on the interpretation (I might be wrong), thanks for the heads-up in the other thread.
I think he is speaking theoretically. I am sure end users won't be able to. It is possible a developer might be able to achieve something like that, but given that it would be more work for terrible performance, they won't. So this isn't happening.
 

Timmah!

Senior member
Jul 24, 2010
734
9
91
#16
I think he is speaking theoretically. I am sure end users won't be able to. It is possible a developer might be able to achieve something like that, but given that it would be more work for terrible performance, they won't. So this isn't happening.
Well, it's possible he meant it that way. But he is no doubt aware those cards are going to be used by people not just for gaming, but for compute too, where the increased latency would not be an issue.
 

alcoholbob

Diamond Member
May 24, 2005
5,924
87
106
#17
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.
Not to mention a game like Civ is basically staring at static images from a fixed camera angle all game; I'm not sure what the point of programming SFR into that game even was.
 
Mar 24, 2017
148
0
71
#18
SLI has been worthless for Nvidia/ATI since it came out. The technology will never take off until they are able to get it working without needing any developer support. I'm surprised SLI needs developer support at all. 3dfx seemed to master multi-GPU designs with the Voodoo 5 5500. If I remember correctly, it needed no developer support. I had a Voodoo 4 4500 and a 5 5500, and every game I played was always significantly faster on the 5500, well after 3dfx disappeared.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
98,274
519
126
#19
SLI has been worthless for Nvidia/ATI since it came out. The technology will never take off until they are able to get it working without needing any developer support. I'm surprised SLI needs developer support at all. 3dfx seemed to master multi-GPU designs with the Voodoo 5 5500. If I remember correctly, it needed no developer support. I had a Voodoo 4 4500 and a 5 5500, and every game I played was always significantly faster on the 5500, well after 3dfx disappeared.
Rendering worked completely differently back then. Once we started doing deferred rendering, scan-line interleaving stopped working as a scaling method.
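For context: 3dfx's original SLI stood for Scan-Line Interleave. Each chip rasterized alternating scanlines of the same frame, which needed no app support because every chip processed the same immediate-mode geometry independently. A toy illustration of the ownership rule (the function name is invented for this sketch, not actual driver code):

```python
def scanline_owner(line: int, num_chips: int = 2) -> int:
    """3dfx-style Scan-Line Interleave: chip i draws every
    num_chips-th scanline, offset by i."""
    return line % num_chips

# With two chips, even lines go to chip 0 and odd lines to chip 1.
print([scanline_owner(l) for l in range(6)])
```

Deferred renderers break this scheme because a pixel's final shading depends on full-screen intermediate buffers, so individual scanlines can no longer be computed in isolation.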
 

Stuka87

Diamond Member
Dec 10, 2010
4,256
182
126
#20
Removed the quoted spammer

AT Moderator ElFenix


Just a note, you don't have a sig ;)
 

Headfoot

Diamond Member
Feb 28, 2008
4,408
55
126
#21
Didn't everyone's favorite "is it a game or is it an AMD demo" AotS have built-in support for DX12 multi-adapter that not only worked, but worked well?
That's true. I suppose I should clarify: speaking just to SFR rendering using DX12, to my knowledge it was only Civ:BE. It worked great on a title that didn't really need it and wasn't even that good :(

At least with AOTS it was real-time, where higher frame rate can matter. Really though, I'm sad we've seen no SFR adoption (or even multi-adapter AFR) in many (any?) FPS or twitch action games that really need it. If I could count on my mainstay multiplayer FPS titles working well with SLI/CF, I would probably do it so I could run both 144 Hz and high resolution/details. Would love to run high-refresh-rate 4K.
 
Feb 27, 2003
15,110
19
126
#22
If it scales well for ray tracing, it might make a comeback.
 

bystander36

Diamond Member
Apr 1, 2013
5,153
10
106
#23
I'm sure that when the 2080 and 2080 Ti pages were first posted on the NV site, there was a page or blurb that described NVLink 2 as pooling the VRAM, even listing 22 GB of VRAM, but it seems to have since been removed.

The Quadro Turing cards still seem to say that the VRAM is pooled.
It's most likely a case where they described how multi-GPU support could use pooled VRAM with such high transfer rates. SLI is only one form of multi-GPU support, and not what they were referring to. SLI is pretty much locked into using a buffer per video card, but with DX12 you can do multi-GPU support without SLI or CF.
 

Sergei Ivanov

Junior Member
Aug 7, 2018
6
0
6
#24
SLI and especially CrossFire will live as long as the demand is there. The demand will come from console gaming, and with the way prices are now, I don't see it in the near future. It's a narrow market for VR gamers and developers, so you end up paying the early adopter fees. But will it die? I doubt it; it's just not worth the money.
 
Nov 10, 2005
153
0
76
#25
SLI never seems to be worth it in real-world gaming, in my experience. I just get the best single card I can afford and roll with that.
 

