
GTX 1080 Ti SLI Performance in 25 Games

http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/

[Image: main-chart.jpg]
 
I wish Nvidia could improve their SLI scaling; CrossFire has been ahead for about five years. I would like 80-95%, but SLI seems forever stuck at 40-70%. It's piss-poor performance and needs sorting out.
 
But... why would you not test on 1440p Surround or 4K Surround? Surely no one buys two 1080 Tis just for one display. I can imagine someone buying two cards for a single 4K display, or maybe for 3440x1440, but even bothering to test SLI scaling at 1080p or 2560x1440 seems pointless.
 
Did anyone actually read the article, or just look at the benchmarks?

Quote:
"We have tested SLI and CrossFire before with rather mixed results. We concluded from our last evaluation of the TITAN X vs. GTX 1070 SLI: “It is pretty clear that CrossFire or SLI scaling in the newest games, especially with DX12, are going to depend on the developers’ support for each game.”"


Quote:
"We also note that GTX 1080 Ti SLI is overkill for 1920×1080 as it generally scales rather poorly compared with scaling at higher resolutions. Ideally, a gamer would pick GTX 1080 Ti SLI for 4K resolution"
 
A few decent results in there, but by and large this is exactly why I ditched 980 Ti SLI for a single 1080 Ti. Couldn't be happier. Unless scaling improves magically, I will never go back to SLI, and I've had SLI since Nvidia started doing it with the GeForce 6 series. Whether or not multi-GPU will work is a huge gamble, and the odds are not in our favor. The one thing that is 100% certain, though, is that we will pay exactly 100% more money for a second card. If Nvidia/AMD priced a customer's second card based on the average MGPU scaling for that year, then I'd consider it.
For instance, you pay 100% of the price for one card, and if the average scaling for games that year or the previous year was only 18%, then you pay 18% of the cost for your second card. Sounds like a legit idea to me. More than fair. In most of those games you can get similar or better performance just by overclocking and properly cooling a single card.
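That pricing idea is simple arithmetic; here's a quick sketch with hypothetical numbers (the $699 price and the `second_card_price` helper are just for illustration, not anyone's actual pricing):

```python
# Hypothetical sketch: price the second card in proportion to the
# average multi-GPU scaling observed that year. All numbers are made up.
def second_card_price(card_price, avg_scaling):
    """Second card costs only the fraction of performance it actually adds."""
    return card_price * avg_scaling

msrp = 699.0        # assumed single-card price
avg_scaling = 0.18  # the 18% average scaling from the example above
print(round(second_card_price(msrp, avg_scaling), 2))  # prints 125.82
```

So at 18% average scaling, the second card would run you about $126 instead of another $699.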
 
Did anyone actually read the article, or just look at the benchmarks?
Read it. Wish I could get that time back. The numbers explain it well. SLI on this level is a bad joke. Single card performance for nV is strong, SLI has gotten weaker and weaker. Maybe Vega-Fire will scale better, but I generally doubt it.

And single-card remains the way to minimize headaches.

Not sure why you assumed people unimpressed by this performance hadn't read it.

What's your opinion?
 
Well, there are 5 or 6 games that go from below 60 fps to clearly above it at 4K, and the negative scaling tends to be fairly small except in a game or two. So if you have money to burn, I suppose knock yourself out. Overall, though, I have never thought a dual-card solution was a very desirable choice.
 
Dual-card driver-level AFR is pretty much dead. Too many new rendering techniques are incompatible without directed work from the developers, and too many developers don't care. Our only hope is a plug-and-play DX12 + Vulkan multi-card middleware or framework that's easy to use, cheap or free, and well designed. I don't see developers stepping up to do multi-card development in DX12 or Vulkan until there's more market share, and I can't see more market share on PC until there's more support.

Only dropping a multi-card setup into upcoming consoles somehow would change this death spiral, IMO. I don't find that very likely at all, at least not in the next 5 years. Maybe this would happen around the Navi time frame. I could see a console where they bifurcate again (for the sake of argument, PS5 and PS5 Pro/VR), have a base chip and add some extra power to it via an extra GPU on an interposer using Infinity Fabric or similar high-performance fabric tech, and then do developer-code-level load balancing across them (SFR/AFR/new future techniques). And even that's unlikely.
 
Only dropping a multi-card setup into upcoming consoles somehow would change this death spiral, IMO. I don't find that very likely at all, at least not in the next 5 years. Maybe this would happen around the Navi time frame. I could see a console where they bifurcate again (for the sake of argument, PS5 and PS5 Pro/VR), have a base chip and add some extra power to it via an extra GPU on an interposer using Infinity Fabric or similar high-performance fabric tech, and then do developer-code-level load balancing across them (SFR/AFR/new future techniques). And even that's unlikely.

Reminds me a bit of the Expansion Pak for the Nintendo 64. Unless they planned on doing this for any of the upcoming consoles, I don't see it happening for at least five years. It would be an interesting way to handle mid-cycle upgrades though.
 
I suspect at the lower resolutions you're just committing the GPU-bound CPU benchmark sin in reverse: the CPU becomes the bottleneck, which would explain the poor scaling. I don't know if you can test resolutions lower than 1080p to find the limit of your CPU, but if so, that would be valuable insight.

That said, multi-gpu is, in my opinion, dead.
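That CPU-cap effect can be illustrated with a toy model: if the CPU can only feed the GPUs so many frames per second, measured SLI "scaling" collapses at low resolutions even when the GPUs themselves scale fine. All numbers below are hypothetical:

```python
# Toy model (hypothetical numbers): measured FPS is the lesser of what the
# GPUs can render and what the CPU can feed them.
def measured_fps(single_gpu_fps, sli_scaling, cpu_cap_fps):
    sli_fps = single_gpu_fps * (1 + sli_scaling)  # ideal two-card throughput
    return min(sli_fps, cpu_cap_fps)

# 1080p: a single card already runs near the CPU cap, so 80% GPU scaling
# shows up as only ~14% measured gain (160 vs 140).
print(measured_fps(140, 0.8, 160))  # prints 160
# 4K: the GPUs are the bottleneck, so the real scaling shows through.
print(measured_fps(45, 0.8, 160))   # prints 81.0
```

Same cards, same driver, wildly different apparent scaling, purely because of where the CPU ceiling sits.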
 
Confirms why I can't do Nvidia SLI. Performance is just horrendous for SLI. No wonder they're slowly dropping support for it.
 
Exactly what one would expect to happen when the burden of making it work moves from the specialists (hardware vendors) to the generalists (game devs).

Not sure what you mean; most of the DX12 engines have good MGPU support. It takes driver support as well, though: last I saw, Nvidia doesn't allow 1060 MGPU in DX12 for Deus Ex and Hitman.

AMD gets very close to 100% scaling on 480s in Hitman, Sniper Elite 4, ROTTR and others.
 
I'm not sure about SLI (I gave up on it years ago due to more expensive motherboards, the proprietary bridge being just silly today, and generally worse scaling), but CrossFire works pretty well in my recent experience (with Furys and 480s). It's pretty common to see new CrossFire profiles and support for AAA games added with Crimson driver releases. I agree it's not really worth buying two cards with the intent to CrossFire (unless you can't wait for Vega and need the fastest Radeon cards), but there are worse ways to spend money to get more performance. It's important to evaluate the games you play before splurging on a second card, but it's not that big a hit to the wallet to add another 470 or 480, especially if you have the power supply and cooling to handle it.

http://amdcrossfire.wikia.com/wiki/Crossfire_Game_Compatibility_List

Quite a few games listed, many with "good" or "excellent" ratings, but to each his own. Also keep in mind that if you have a FreeSync display with a wide range, a lot of the traditional frametime issues are mitigated.
 
Eh, I'd disagree that CrossFire works pretty well, having 2x 290s myself. That list doesn't take the time component into account. The only AAA game I've bought in recent memory with working CF on day 1 was Star Wars Battlefront. So if there's a game I'm excited about as an enthusiast and buy on day 1, I don't get my other card. The Division had game-breaking bugs with CF until 6 months after it came out, and I had already finished playing it months before. Pretty useless to me.

CrossFire is maybe worth it if you're a "buy relatively new games at half off 6 months later" kind of buyer. But if you wait longer than 6 months, you're already getting into a new generation of faster single cards. So if you wait a year before you buy the games, then CrossFire is back to being a bad idea.

I'd say in my own fairly typical gaming usage pattern (50% of my AAA games are day-1 buys of franchises I'm a huge fan of, 50% are Steam sale pickups later), the times I need the extra CrossFire GPU power and CrossFire is actually working is one in four. It's extremely disappointing, and I will never be doing CrossFire or SLI again.

Really nice list though; I'm bookmarking that.
 
A few decent results in there, but by and large this is exactly why I ditched 980 Ti SLI for a single 1080 Ti. Couldn't be happier. Unless scaling improves magically, I will never go back to SLI, and I've had SLI since Nvidia started doing it with the GeForce 6 series. Whether or not multi-GPU will work is a huge gamble, and the odds are not in our favor. The one thing that is 100% certain, though, is that we will pay exactly 100% more money for a second card. If Nvidia/AMD priced a customer's second card based on the average MGPU scaling for that year, then I'd consider it.
For instance, you pay 100% of the price for one card, and if the average scaling for games that year or the previous year was only 18%, then you pay 18% of the cost for your second card. Sounds like a legit idea to me. More than fair. In most of those games you can get similar or better performance just by overclocking and properly cooling a single card.

This pretty much. I used SLI for a long time myself, but I grew tired of the bad scaling, lackluster implementation per game and compatibility issues that would cause weird bugs and glitches. Single GPU is WAY better overall, to the point where it's not even funny. 😀
 
Here are three reasons why people continue to purchase two or more higher-end GPUs:
  • Several encoding and 3D apps like Rhino and 3D Studio Max render faster with multiple GPUs, but most don't use SLI, instead managing the cards directly themselves.
  • Several folding applications like BOINC recognize and utilize multiple GPUs to massively crunch WUs.
  • Despite the noted neglect, benchmarks indicate 4K and multiple-display gaming still benefits from SLI in a number of games.
There are other scientific apps that also benefit from multiple GPUs but these are very specialized applications.
 
Not sure how SLI scaling fell off a cliff with Witcher 3. It was like 60%+ scaling at launch. Then around patch 1.07 performance just started plummeting, and CDPR never bothered to fix it.
 
Here are three reasons why people continue to purchase two or more higher-end GPUs:
  • Several encoding and 3D apps like Rhino and 3D Studio Max render faster with multiple GPUs, but most don't use SLI, instead managing the cards directly themselves.
  • Several folding applications like BOINC recognize and utilize multiple GPUs to massively crunch WUs.
  • Despite the noted neglect, benchmarks indicate 4K and multiple-display gaming still benefits from SLI in a number of games.
There are other scientific apps that also benefit from multiple GPUs but these are very specialized applications.
I thought we were talking about SLI. Multi GPU apps don't use SLI. SLI is specific to gaming.
 