Discussion: The Future of NVLink availability? (SLI viability has finally reached its lower threshold, personally)

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
I edited this to be more in the spirit of discussion rather than me just ranting.

[Original; skip unless you're interested in the full context] I'm running into a lot of titles that do not natively support SLI. That's cool. I knew SLI support would be spotty at best, expected to have to play with it, and expected it to get progressively less support as an aging and niche technology. In the past this wasn't a big deal, since there were compatibility bits and settings that could be tried with NvidiaInspector. Now, however, it seems more and more development teams are actively removing SLI compatibility from their engines, breaking most, if not all, of the compatibility bits and settings (Anthem did this, and I'm seeing similar, though less heavy-handed, behavior from The Division 2). (To clarify: I had discovered several compatibility bits that would at least attempt to run both of my cards in SLI if set with NvidiaInspector. Not always well, but the bits would allow SLI to at least try. Then, after the game was patched, those same settings wouldn't even attempt to run SLI, and the engine/drivers would only allow one GPU to function.) Tweaking the settings to get an unofficially supported, working SLI profile has become impossible.

Why would a company forcefully prevent SLI from even attempting to function in their game unless specifically requested to do so by Nvidia? Is there some egregious issue with running an SLI compatibility bit when it doesn't function very well? Does it cause explosions? (I haven't seen one yet.) I'm failing to see why a game company would stop an engine from even acknowledging SLI compatibility settings and attempting to use them, unless Nvidia expressly asked for it.

Nvidia's lack of support for SLI makes sense in terms of development costs and running a business, but it is nonetheless disagreeable as an end user. NVLink certainly has a lot of advantages over SLI, and I'd really love to use it, but right now that's a minimum of $1,400. That's insane.

I'm not oblivious to the fact that the market has been trending away from multi-GPU setups for many years now. I think NVLink offers a viable rebirth of it. The problem is that Nvidia seems to realize this and wants to capitalize on profits far more than it wants to let end users configure their systems more cost-effectively as they see fit.

(Focused topic) I'll wait for the next line-up to see if NVLink is offered on anything but the two most expensive GPUs. What are the chances this will happen?
 
Last edited:

killster1

Banned
Mar 15, 2007
6,208
475
126
You are running two cards as one, but $1,400 is insane? What kind of cards do you SLI with? Why not sell them both and buy one good card if money is tight?
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Game companies are removing SLI support due to issues getting it to work properly and the fact that not many people have dual- or triple-GPU setups.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
It is honestly a pity multi-GPU has not caught on more recently. Sometimes one top-tier card is just not enough for a given resolution or setting, and one cannot wait another 2 years for the next gen to play the game.
 
  • Like
Reactions: pandemonium and ZGR

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
Lots of modern engines use information from across multiple frames for e.g. temporal AA, or motion blur. That makes it extremely difficult to render alternate frames on separate cards, as you need to shuffle a bunch of buffers back and forth between the cards, and serialise large parts of the pipeline.
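A minimal toy sketch of that dependency, for illustration only (the function and frame counts below are made up; this is not real engine or driver code): with alternate-frame rendering, every new frame needs the previous frame's history buffer, which always sits on the other card, so nearly every frame stalls on a cross-GPU copy.

```python
# Toy model of alternate-frame rendering (AFR) with a temporal-AA history
# buffer; purely illustrative, not real engine or driver code.

def cross_gpu_history_copies(num_frames, num_gpus):
    """Count frames that must wait for last frame's history buffer to be
    copied over from the other GPU before they can start rendering."""
    copies = 0
    for frame in range(num_frames):
        gpu = frame % num_gpus             # AFR: frames alternate between GPUs
        prev_gpu = (frame - 1) % num_gpus  # GPU holding the previous frame
        if frame > 0 and gpu != prev_gpu:
            # TAA / motion blur reads last frame's result, which lives on the
            # other card, so the history buffer has to cross the bridge/PCIe
            # and parts of the pipeline serialise while it does.
            copies += 1
    return copies

print(cross_gpu_history_copies(120, 1))  # 0   -> single GPU: no transfers
print(cross_gpu_history_copies(120, 2))  # 119 -> AFR: nearly every frame stalls
```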
 

DrMrLordX

Lifer
Apr 27, 2000
21,619
10,827
136
Lots of modern engines use information from across multiple frames for e.g. temporal AA, or motion blur. That makes it extremely difficult to render alternate frames on separate cards, as you need to shuffle a bunch of buffers back and forth between the cards, and serialise large parts of the pipeline.

DLSS?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
Now, however, it seems more and more development teams are actively removing SLI compatibility from their engines, breaking most, if not all, of the compatibility bits and settings
That's not how it works. Game engines don't enable/disable SLI outside of very rare cases under DX12 where the engine has to be specifically coded for it. I don't think multi-GPU is even part of the OGL/DX spec.

What actually happens is nVidia implements game-specific SLI code in the driver to work around specific versions of game engines.

You can take your chances with user flags, but then you're simply rolling the dice to see how a particular game version reacts to that.

An SLI bit on one driver doesn't necessarily mean the same thing on another driver. Also the profiles don't expose full driver functionality, meaning there's still a lot of detection and driver code behind the scenes you can't see or control.

So more than likely what's actually happening is nVidia is reducing driver code across the board to support SLI, and gradually shutting it down completely for newer titles.

The current SLI situation is a lot like forcing AA was about a decade ago.
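To make that concrete, here is a purely hypothetical toy model of the driver-side lookup described above. None of the names, bit values, or structures are NVIDIA's; they only illustrate why a user-set SLI bit is an opaque hint whose meaning can change from one driver release (or game patch) to the next.

```python
# Purely hypothetical toy model of a driver-side SLI profile lookup.
# None of these names or values come from NVIDIA.

DRIVER_PROFILES = {
    # driver version -> per-executable profile shipped with that driver
    "430.64": {"game.exe": {"sli_bits": 0x080000F5, "sli_enabled": True}},
    "441.20": {"game.exe": {"sli_bits": 0x080000F5, "sli_enabled": False}},
}

def driver_attempts_sli(driver_version, exe_name, user_bits=None):
    """Return True if this (toy) driver would even attempt SLI for the game."""
    profile = DRIVER_PROFILES.get(driver_version, {}).get(exe_name)
    if profile is None:
        return False                 # no profile at all -> single GPU
    if not profile["sli_enabled"]:
        return False                 # driver dropped SLI support for this title
    bits = user_bits if user_bits is not None else profile["sli_bits"]
    # The driver interprets 'bits' with internal code the user never sees,
    # so the same value can behave differently on another driver version.
    return bits != 0

print(driver_attempts_sli("430.64", "game.exe"))              # True
print(driver_attempts_sli("441.20", "game.exe", 0x080000F5))  # False: flag ignored
```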
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
That's not how it works. Game engines don't enable/disable SLI outside of very rare cases under DX12 where the engine has to be specifically coded for it. I don't think multi-GPU is even part of the OGL/DX spec.

What actually happens is nVidia implements game-specific SLI code in the driver to work around specific versions of game engines.

You can take your chances with user flags, but then you're simply rolling the dice to see how a particular game version reacts to that.

An SLI bit on one driver doesn't necessarily mean the same thing on another driver. Also the profiles don't expose full driver functionality, meaning there's still a lot of detection and driver code behind the scenes you can't see or control.

So more than likely what's actually happening is nVidia is reducing driver code across the board to support SLI, and gradually shutting it down completely for newer titles.

The current SLI situation is a lot like forcing AA was about a decade ago.

This makes me wonder why they even bothered with NVLink on RTX. Because the PCBs were already going to have it for the Quadros and Teslas?
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This makes me wonder why they even bothered with NVLink on RTX. Because the PCBs were already going to have it for the Quadros and Teslas?

Makes the PCBs a bit cheaper. But mostly I think it was just to be more at parity with AMD. Crossfire has been significantly better than SLI ever since AMD moved to using the PCIe bus for data instead of the bridge cable. It got rid of all the weird hiccups and stutters that multi-GPU was known for. I would be curious to see some testing done comparing them now vs. the older implementation.

There is also still the rumor of GPUs moving to a multi-die setup as it gets harder and harder to shrink process nodes. Instead of a single die, a high-end card may have 2 or 4 dies. SLI and CF tech would come into play there. But these rumors have not been substantiated in any way, so who knows if it will happen.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
You are running two cards as one, but $1,400 is insane? What kind of cards do you SLI with? Why not sell them both and buy one good card if money is tight?

I would've thought the $1,400 would've been an indication of what I was running (less expensive). I have two 970s. I ran SLI (and prior to that, CF) to achieve better performance for less cost.

That's not how it works. Game engines don't enable/disable SLI outside of very rare cases under DX12 where the engine has to be specifically coded for it. I don't think multi-GPU is even part of the OGL/DX spec.

What actually happens is nVidia implements game-specific SLI code in the driver to work around specific versions of game engines.

That's how I thought it worked, but you missed some of what I said. SLI was working until Anthem did an update. I did not update the Nvidia drivers, and several compatibility bits that previously worked no longer even attempt to run. This indicates the compatibility bits were forcefully denied by the engine, not the Nvidia drivers.

It is honestly a pity multi-GPU has not caught on more recently. Sometimes one top-tier card is just not enough for a given resolution or setting, and one cannot wait another 2 years for the next gen to play the game.

I'm not even in that category, but higher-end users with top-tier systems almost always require multiple GPUs to drive their displays at sufficient rates.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
I would've thought the $1,400 would've been an indication of what I was running (less expensive). I have two 970s. I ran SLI (and prior to that, CF) to achieve better performance for less cost.



That's how I thought it worked, but you missed some of what I said. SLI was working until Anthem did an update. I did not update the Nvidia drivers, and several compatibility bits that previously worked no longer even attempt to run. This indicates the compatibility bits were forcefully denied by the engine, not the Nvidia drivers.



I'm not even in that category, but higher-end users with top-tier systems almost always require multiple GPUs to drive their displays at sufficient rates.


Perhaps 970 SLI will perform between a 1070 and a 1080 in a best-case scenario? (Which is pretty darn cool.) But realistically the extra electricity is not saving you the $30 difference in graphics card prices (and then there are the games that don't support it ;( ).

And then I see that in some games 970 SLI does worse than a single 970! Cool that you are trying to make the 970 last forever, but a single card of ANY kind will be better (2060? used 1080?). Or maybe you can throw in another 970 and do tri-SLI. <-- joke.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
They would usually perform at a little less than 1080 levels, yes. That's a decent comparison. Though, when not using the second card in SLI (I'd normally tweak it to get both to work at 80%+ SLI effectiveness), it would be used for PhysX, which does offload processing from the primary card and/or the CPU. That difference is worth mentioning, since we're considering my long-term ownership.

They're nearing ~4 years of use now. Of course, had I known about the 3.5GB memory issue, I would not have purchased them and would have chosen something else at the time, or waited. Also, Nvidia had a class settlement with a $30 rebate per card, so that's -$60 to my initial costs. The initial price per card was $339. So, $620 total. A GTX 1080 of the same tier and from the same company, new, would've been $950 at the time of that review. That's a difference of $330.

My entire rig draws around 410W at peak gaming load from the wall (I have a Kill A Watt permanently positioned for my PC). According to Tom's review of the 1080, it averages around 130W. A 970 also averages around 130W. From this I can estimate my total extra power cost over the course of my ownership to be $93. So far, I've still saved myself a conservative $237.
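For anyone who wants to sanity-check that $93 figure, here is a quick back-of-the-envelope sketch. It uses the usage pattern described later in the thread (7 hours of load, 310 days a year, over roughly 4 years of ownership) plus an assumed electricity rate of about $0.082/kWh, which is not stated anywhere in the thread and is only picked to show the arithmetic.

```python
# Back-of-the-envelope check of the extra-power estimate above.
extra_watts = 130          # roughly one extra GTX 970 under load
hours_per_day = 7          # assumption stated later in the thread
days_per_year = 310        # assumption stated later in the thread
years = 4
rate_per_kwh = 0.082       # assumed $/kWh (not stated in the thread)

extra_kwh = extra_watts / 1000 * hours_per_day * days_per_year * years
extra_cost = extra_kwh * rate_per_kwh
print(f"{extra_kwh:.0f} kWh extra, about ${extra_cost:.0f}")  # ~1128 kWh, ~$93
```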

Is it worth the extra headache? Subjectively? Yes. I don't mind tweaking my rig. Lately, though, it's become too frequent, and I feel as though I'm being slapped in the face by Nvidia. So, no, it's not worth it today.

Though, to get this thread back on topic: is anyone inclined to believe, or privy to, NVLink coming to future, lower-tiered GPUs?
 
  • Like
Reactions: Ranulf

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
Though, to get this thread back on topic: is anyone inclined to believe, or privy to, NVLink coming to future, lower-tiered GPUs?

NVLink adds more pins to the GPU, more traces to the PCB, more complexity to the chip, and higher max power consumption (meaning higher rated power circuitry). It increases cost for 100% of customers- I don't see them adding it for the benefit of <1% of customers who will use it in SLI (especially when NVidia would rather just sell them one big GPU). I don't see it coming to low end GPUs.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
$93 for 4 years of extra power for the second card? I wonder at what point selling and repurchasing a card would have been optimal (a few hours of overtime vs. a few hours tweaking settings for each game).


I can only think of a few circumstances where I would want to use low-end SLI (being in jail with no extra parts, or free electricity in Antarctica?).

I have used lower-end Crossfire and was so pleased with it, BUT the next summer, 2 years ago, I purchased a $300 980 Ti, then 1+ years ago went to an undervolted 1080 Ti for $450 off Craigslist. I would love it if someone hacked a way to make it all work perfectly, but I would never expect a budget lower-end GPU to be more expensive because of it.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I have been very disappointed with support lately. I was a big-time SLI user from the GeForce 8800 through Maxwell, and support was acceptable to me. I knew it took a dip with Pascal and the move to DX12/Vulkan, but I took the plunge anyway this time for a few reasons: 4K 144Hz needs it, NVLink made me think support would improve since they bothered with it, and finally the RTX performance hit seems like a perfect case for SLI to counteract.

The reality has been terrible. None of the new stuff I've been playing lately has support. I tried Apex, and it was initially supported, which was nice, and now they've even yanked that with the latest driver. I had to hack together a solution with NVinspector for BFV, which was surprising as that series has traditionally supported multi-GPU really well.

Does anyone know about SOTTR? I heard rumors of DX12 mGPU + the new RTX effects, which would be nice.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
NVLink adds more pins to the GPU, more traces to the PCB, more complexity to the chip, and higher max power consumption (meaning higher rated power circuitry). It increases cost for 100% of customers- I don't see them adding it for the benefit of <1% of customers who will use it in SLI (especially when NVidia would rather just sell them one big GPU). I don't see it coming to low end GPUs.

This is what I figured, but perhaps we'll be surprised? With the way NVLink functions, it's extremely beneficial, and I'm really intrigued by it. I'd also think it would be a good thing to market to the mass of gamers out there, driving more sales from many more consumers.

I won't hold my breath. Lately the offerings from both Nvidia (far too expensive) and AMD (far too underwhelming) have me desperately hoping that Intel will swoop in and offer a decent third option with their discrete GPUs. That could be a while, though.

...NVLink made me think support would improve since they bothered with it, and finally the RTX performance hit seems like a perfect case for SLI to counteract.

The reality has been terrible. None of the new stuff I've been playing lately has support. I tried Apex, and it was initially supported, which was nice, and now they've even yanked that with the latest driver. I had to hack together a solution with NVinspector for BFV, which was surprising as that series has traditionally supported multi-GPU really well.

Exactly my sentiments. One would think they'd be promoting the use of multiple GPUs at this stage of the game to drive more sales - especially with how NVLink functions - but that doesn't seem to be the case.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
$93 for 4 years of extra power for the second card? I wonder at what point selling and repurchasing a card would have been optimal (a few hours of overtime vs. a few hours tweaking settings for each game).


I can only think of a few circumstances where I would want to use low-end SLI (being in jail with no extra parts, or free electricity in Antarctica?).

I have used lower-end Crossfire and was so pleased with it, BUT the next summer, 2 years ago, I purchased a $300 980 Ti, then 1+ years ago went to an undervolted 1080 Ti for $450 off Craigslist. I would love it if someone hacked a way to make it all work perfectly, but I would never expect a budget lower-end GPU to be more expensive because of it.

I've always considered buying used for cheaper, but could not bring myself to do it. The reason is that while you pay a premium for new, it can last that much longer. When you save money initially by buying used, the architecture is less efficient and won't last nearly as long. I've always considered buying used to be playing a perpetual catch-up game. I'm not knocking those that do it at all, but I just prefer to open my case as little as necessary because I'm lazy. :p

With the way newer GPU costs are being driven up, though, buying used is becoming more cost-effective, so that option is becoming more relevant.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
I've always considered buying used for cheaper, but could not bring myself to do it. The reason is that while you pay a premium for new, it can last that much longer. When you save money initially by buying used, the architecture is less efficient and won't last nearly as long. I've always considered buying used to be playing a perpetual catch-up game. I'm not knocking those that do it at all, but I just prefer to open my case as little as necessary because I'm lazy. :p

With the way newer GPU costs are being driven up, though, buying used is becoming more cost-effective, so that option is becoming more relevant.

If you can't own the best, then why own the second best used? YIKES. You want a second-tier card anyway (you are complaining NVLink is only on the highest models), yet you complain used won't provide a new enough model. I guess anyone can argue about anything. I understand, yeah, it would be cool if the company put NVLink on the cards for free, but these cards already cost twice what they should; we don't need more cost going into them.

So if you bought a 1080 Ti, it wouldn't work better than your cards until something amazing comes out (in two years)? Really, that's what you think? If they had actually upgraded anything in the past 3 years I'd agree, but this 2080 Ti BS has me super thrilled to be a 1080 Ti owner. And yeah, right now used 1080 Ti prices are terrible... but guess why? They are the best bang for your money ;( I'm really curious how much electricity a 970 really does cost. I guess it depends how much time it idles/games/is even turned on, and if you factor in that your games have WAY less detail. What screen do you game on?
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
You're misinterpreting what I'm saying. Newer graphics cards typically last longer because of advances in their architecture and engineering, and the higher demands of newer technologies going into the game engines that run on them. I'm speaking generally. Look at benchmarks for any of the latest games: the newer cards generally produce higher results than the older ones.

Sort this list by Avg. bench % to get the idea. If you buy used for cheaper, it won't last as long as a new card that costs more. All that's happening is you're increasing your upgrade frequency and playing behind the curve. Microprocessors have continuously gotten more powerful, so to me there isn't much of an argument for buying old, because newer is objectively better for overall use in gaming.

My point in that respect is that you can buy once in 4 years new, or twice in 4 years used. I prefer the former. I also said there's growing room for considering buying used due to the exaggerated cost of the newer cards. The market balances performance/cost out quite well, especially in the used sector. It's all relative.

I estimated my energy expense at 7 hours of full load for 310 days out of the year, to provide a realistically high estimate relative to my actual use (but not completely out of the question). I already stated my Kill A Watt readings and cited Tom's articles for the actual energy figures as well.

Also, I'm pretty acquainted with energy costs. :)

I game at 2560x1440 @60Hz.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
Oh, it figures you would use electricity prices from 10 years ago too. Read a little; it said 40 bucks a year extra for the 200-watt-extra PC. The 1080 Ti barely loses to newer cards, so how is this huge gap coming about? Ray tracing? My electric bill is often $300 to $400, so cool, you pay 4x less.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
Oh, it figures you would use electricity prices from 10 years ago too. Read a little; it said 40 bucks a year extra for the 200-watt-extra PC. The 1080 Ti barely loses to newer cards, so how is this huge gap coming about? Ray tracing? My electric bill is often $300 to $400, so cool, you pay 4x less.

...what? I wrote that.

Either you're trolling or severely lack reading comprehension. Either way, we're done here as you're not staying on topic and seem to be focused on attacking my intellect.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
You say used cards are so far behind the new ones; that's false. If a review shows it, it's because they modified drivers for the newer card to show more gain. You complain about SLI, but you could have greater performance with a single card and sell the two. I said nothing about your intellect, just that our electricity usage estimates are so far off. If you want to do SLI, they assume you want better performance than a single card, so it makes sense it only works with the more expensive models. What stops you from buying used every 4 years? What's the difference?