I sold one of my GTX 1080 Ti cards

urvile

Golden Member
Aug 3, 2017
1,575
474
96
So I have been running Crossfire setups since the 7970 GHz Edition. I was running Fury X Crossfire until I upgraded to GTX 1080 Ti SLI.

The support for multi-GPU in games is not there, and when it is, it's often poorly implemented. So I decided to get rid of one of my cards.

I game at 3440x1440 and I find that a single GTX 1080 Ti is more than enough, especially when coupled with an overclocked 8700K.

Is SLI/Crossfire dead? Personally, I think so, especially with games being developed for consoles first.
 
Last edited:
  • Like
Reactions: Headfoot

zliqdedo

Member
Dec 10, 2010
59
10
81
Perhaps the need, and the support, for it will rise again with hybrid rendering, given that a single RTX 2080 Ti is supposedly only capable of 1080p at 60fps with ray tracing enabled. 3440x1440 is nearly 5 million pixels, whereas 1080p is only about 2 million, roughly a 2.4x gap; in theory, you wouldn't be able to play at 60fps on your current monitor even with two 2080 Tis.

Wasn't there a low-level API feature in DX12 that solves some, or many, multi-GPU issues? Developers need to put in more work to enable it, though, if I'm not mistaken. Maybe some will now if they want their ray-traced games to be playable at higher resolutions.
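For reference, the feature is DX12 "explicit multi-adapter." A minimal C++ sketch of its starting point, enumerating every GPU and creating a device on each (assuming Windows 10 with the D3D12/DXGI headers; error handling omitted):

```cpp
// Enumerate hardware adapters and create a D3D12 device on each one --
// the entry point for DX12 explicit (heterogeneous) multi-adapter.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // from here, the app must split work itself
    }
    return devices;
}
```

And that's exactly the catch: once the app holds two devices, scheduling work, synchronizing, and copying resources between them is entirely the developer's job.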

P.S. I think it'd be better if you added "Is SLI/Crossfire dead?" to the title.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
So I have been running Crossfire setups since the 7970 GHz Edition. I was running Fury X Crossfire until I upgraded to GTX 1080 Ti SLI.

The support for multi-GPU in games is not there, and when it is, it's often poorly implemented. So I decided to get rid of one of my cards.

I game at 3440x1440 and I find that a single GTX 1080 Ti is more than enough, especially when coupled with an overclocked 8700K.

Is SLI/Crossfire dead? Personally, I think so, especially with games being developed for consoles first.
NVLink apparently pools VRAM and is much faster, so maybe it will come back?
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
NVLink apparently pools VRAM and is much faster, so maybe it will come back?

Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around the 38-minute mark.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question about pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs do have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically, the only thing it really does today is make the super-high resolutions and refresh rates that would have choked SLI in the past run more smoothly.

All the other SLI negatives remain. It's just SLI with a faster bridge.
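For what it's worth, the compute side already works exactly this way. A minimal C++ sketch of the kind of explicit, developer-planned transfer described above, using the CUDA runtime API (assuming two NVLink-connected cards as devices 0 and 1; the buffer names are hypothetical):

```cpp
// Two GPUs, two separate memories: the app allocates on each device and
// schedules the copy itself. Over NVLink the same call is just faster;
// the buffers never become one pooled address space.
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int canAccess = 0;
    cudaDeviceCanAccessPeer(&canAccess, 0, 1); // direct P2P path (e.g. NVLink) available?

    const size_t bytes = 64u << 20; // 64 MiB test buffer
    float *buf0 = nullptr, *buf1 = nullptr;

    cudaSetDevice(0);
    cudaMalloc(&buf0, bytes);
    if (canAccess)
        cudaDeviceEnablePeerAccess(1, 0); // let device 0 reach device 1 directly

    cudaSetDevice(1);
    cudaMalloc(&buf1, bytes);

    // The "planned transfer": the developer decides what moves, and when.
    cudaMemcpyPeer(buf1, 1, buf0, 0, bytes);
    cudaDeviceSynchronize();

    printf("peer access: %s\n", canAccess ? "direct (NVLink/PCIe P2P)" : "staged through host");

    cudaFree(buf1);
    cudaSetDevice(0);
    cudaFree(buf0);
    return 0;
}
```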
 

EXCellR8

Diamond Member
Sep 1, 2010
3,979
839
136
A 1080 Ti in SLI is sort of a waste anyway, imo. I still run a pair of GTX 980s to game at the same resolution and, for the most part, it works really well when the support is there. The same happened with AMD's Crossfire: there were so many titles with lousy support that I just stopped caring; the last CF setup I had was a pair of 7970s. Not only did those cards use a ton of power and throw off a ridiculous amount of heat, but most games had issues when CF was enabled.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around the 38-minute mark.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question about pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs do have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically, the only thing it really does today is make the super-high resolutions and refresh rates that would have choked SLI in the past run more smoothly.

All the other SLI negatives remain. It's just SLI with a faster bridge.
I'm sure that when the 2080 and 2080 Ti pages were first posted on the NV site, there was a page or blurb that described NVLink 2 as pooling the VRAM, even listing 22GB of VRAM, but it seems to be gone now.

The Quadro Turing cards still seem to say that the VRAM is pooled.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
The Quadro Turing cards still seem to say that the VRAM is pooled.

The latency issue for compute is negligible compared to graphics.

Which is why all multiprocessor CPU systems have always had pooled memory, even over much slower interfaces than NVLink. Now you even have things like the Threadripper 2990WX, where two of the dies don't have any local memory at all.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good of a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.
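To make the contrast concrete, here is a rough C++ sketch (heavily simplified; swap-chain wiring and fence synchronization omitted) of what app-owned AFR looks like on a DX12 linked-node adapter, where each physical GPU is a "node" the application addresses through a NodeMask:

```cpp
// App-controlled AFR on a linked-node D3D12 device: one direct queue per
// GPU node, with frames assigned round-robin. The application -- not the
// driver -- decides where each frame's work runs and what data crosses GPUs.
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12CommandQueue>> CreatePerNodeQueues(ID3D12Device* device)
{
    std::vector<ComPtr<ID3D12CommandQueue>> queues(device->GetNodeCount());
    for (UINT n = 0; n < queues.size(); ++n)
    {
        D3D12_COMMAND_QUEUE_DESC qd = {};
        qd.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        qd.NodeMask = 1u << n; // pin this queue to GPU node n
        device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queues[n]));
    }
    return queues;
}

// Classic AFR: frame N renders on node (N % nodeCount). Command lists are
// recorded with the matching node mask, and any cross-frame resource (e.g.
// the previous frame for temporal effects) must be copied between nodes
// explicitly -- the "appropriate resource decisions" the app gets to make.
UINT NodeForFrame(UINT frameIndex, UINT nodeCount)
{
    return frameIndex % nodeCount;
}
```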
 
  • Like
Reactions: SirDinadan

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good of a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.

Didn't everyone's favorite "is it a game or is it an AMD demo" title, AOTS, have built-in support for DX12 multi-adapter that not only worked, but worked well?
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
The focus is clearly moving away from it. You used to have most of the range, from the low end up, supporting it; now that's mostly gone. I think games are more complex to add support to now, and there aren't really more people using dual-GPU solutions than before.
 

Guru

Senior member
May 5, 2017
830
361
106
The big issue with SLI is that it requires extra effort from developers and from Nvidia's programmers. That is why it can never be really successful: support for it is always going to lag, and performance gains are going to vary too much. It's just the way it is.
 

Timmah!

Golden Member
Jul 24, 2010
1,396
603
136
Actually, that was a question in the Tom Petersen interview. The NVLink discussion happens around the 38-minute mark.
https://youtu.be/YNnDRtZ_ODM?t=38m2s

The exact question about pooling VRAM:
https://youtu.be/YNnDRtZ_ODM?t=42m50s

Basically NO on pooling VRAM.

Though devs do have the option of doing planned transfers between those buffers faster than before.

We are still really in the situation of SLI having two separate buffers, with developer intervention required.

Basically, the only thing it really does today is make the super-high resolutions and refresh rates that would have choked SLI in the past run more smoothly.

All the other SLI negatives remain. It's just SLI with a faster bridge.



"Now its true you can set it to up to do that...you could set up the memory map so that you know effectively it would look like giant frame buffer, but it would be a terrible performance there..."

I am not native english speaker, but this sounds to me like YES, BUT.... rather than NO. No would mean it got disabled - when compared to Quadros - but why would he then say that you can do that....

EDIT: Even though i dont agree on interpretation (might be wrong) - still thanks for the heads-up in the other thread.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
"Now its true you can set it to up to do that...you could set up the memory map so that you know effectively it would look like giant frame buffer, but it would be a terrible performance there..."

I am not native english speaker, but this sounds to me like YES, BUT.... rather than NO. No would mean it got disabled - when compared to Quadros - but why would he then say that you can do that....

EDIT: Even though i dont agree on interpretation (might be wrong) - still thanks for the heads-up in the other thread.

I think he is speaking theoretically. I am sure end users won't be able to do it. It is possible a developer might be able to achieve something like that, but given that it would be more work for terrible performance, they won't. So this isn't happening.
 

Timmah!

Golden Member
Jul 24, 2010
1,396
603
136
I think he is speaking theoretically. I am sure end users won't be able to do it. It is possible a developer might be able to achieve something like that, but given that it would be more work for terrible performance, they won't. So this isn't happening.

Well, it's possible he meant it that way. But he is no doubt aware those cards are going to be used not just for gaming but for compute too, where the increased latency would not be an issue.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
DX12 multi-adapter, despite great promise and usefulness, has failed to materialize outside of one game (to my knowledge): Civ:BE, which wasn't even that good of a game in the first place.

Too bad. SFR/AFR with the app aware of it and making appropriate resource decisions is a much better idea than trying to hack it on later via drivers. Overall, new DX12 feature usage has been pretty poor, with a few exceptional outliers.

Not to mention that a game like Civ is basically staring at static images from a fixed camera angle all game; I'm not sure what the point of programming SFR into that game even was.
 
  • Like
Reactions: Headfoot

slashy16

Member
Mar 24, 2017
151
59
71
SLI has been worthless for Nvidia/ATI since it came out. The technology will never take off until they can get it working without needing any developer support. I'm surprised SLI needs developer support at all. 3dfx seemed to master multi-GPU designs with the Voodoo 5 5500; if I remember correctly, it needed no developer support. I had a Voodoo 4 4500 and a Voodoo 5 5500, and every game I played was always significantly faster on the 5500, well after 3dfx disappeared.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
SLI has been worthless for Nvidia/ATI since it came out. The technology will never take off until they can get it working without needing any developer support. I'm surprised SLI needs developer support at all. 3dfx seemed to master multi-GPU designs with the Voodoo 5 5500; if I remember correctly, it needed no developer support. I had a Voodoo 4 4500 and a Voodoo 5 5500, and every game I played was always significantly faster on the 5500, well after 3dfx disappeared.

Rendering worked completely differently back then. Once we started doing deferred rendering, scan-line interleaving didn't work as a scaling method.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Removed the quoted spammer

AT Moderator ElFenix


Just a note, you don't have a sig ;)
 
Last edited by a moderator:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Didn't everyone's favorite "is it a game or is it an AMD demo" title, AOTS, have built-in support for DX12 multi-adapter that not only worked, but worked well?
That's true; I suppose I should clarify: speaking just to SFR rendering using DX12, to my knowledge it was only Civ:BE. It worked great on a title that didn't really need it and wasn't even that good :(

At least with AOTS it was real-time, where a higher frame rate can matter. Really, though, I'm sad we've seen no SFR adoption (or even multi-adapter AFR) in many (any?) FPS or twitch action games that really need it. If I could count on my mainstay multiplayer FPS titles working well with SLI/CF, I would probably do it so I could run both 144Hz and high resolution/details. I would love to run high-refresh-rate 4K.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm sure that when the 2080 and 2080 Ti pages were first posted on the NV site, there was a page or blurb that described NVLink 2 as pooling the VRAM, even listing 22GB of VRAM, but it seems to be gone now.

The Quadro Turing cards still seem to say that the VRAM is pooled.
It's most likely a case where they described how multi-GPU support could use pooled VRAM given such high transfer rates. SLI is only one form of multi-GPU support, and not what they were referring to. SLI is pretty much locked into using one buffer per video card, but with DX12 you can do multi-GPU support without SLI or CF.
 

Sergei Ivanov

Junior Member
Aug 7, 2018
6
0
6
SLI, and especially Crossfire, will live as long as the demand is there. The demand would come from console gaming, and with the way prices are now, I don't see it in the near future. It's a narrow market for VR gamers and developers, so you end up paying the early-adopter fees. But will it die? I doubt it; it's just not worth the money.
 

Highmodulus

Member
Nov 10, 2005
153
0
76
SLI never seems to be worth it in real-world gaming, in my experience. I just get the best single card I can afford and roll with that.