Gears of War 4 May Update Includes MGPU Support

Bacon1

Diamond Member
Feb 14, 2016
Multi-GPU Support Arrives on Windows 10

We are excited to announce that Multi-GPU support arrives with the May Update for our Windows 10 players! Our Windows 10 team have been hard at work implementing support for users who use multiple graphics card setups, and now fans can take full advantage of their setup to really push the stunning visuals of Gears of War 4.

https://news.xbox.com/2017/05/01/gears-4-may-content-update/

Will be interesting to see how it performs and which type of mGPU they are supporting.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
Wish they would patch in shader model 6 or the new shader model 6.1 instead ...

Would be a dream like how Doom used SPIR-V all over again ...
 

cbn

Lifer
Mar 27, 2009
When I think of mGPU I think of mixing different brands of dGPU:

http://www.anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta/4

[Image: mixed AMD + Nvidia GPU benchmark chart from the linked AnandTech review]
 

Red Hawk

Diamond Member
Jan 1, 2011
Woot. Hope it scales well on both vendors! And if they manage to make it so you can mix and match cards from different vendors, that would be tons of fun...

Wish they would patch in shader model 6 or the new shader model 6.1 instead ...

Would be a dream like how Doom used SPIR-V all over again ...

Patience, grasshopper. The Shader Model 6.1 games will come. :p
 

Bacon1

Diamond Member
Feb 14, 2016
I assume it's just your standard alternate frame rendering.

Not sure; I haven't seen any tech sites cover it yet, but that is basically 100% scaling on 980 Tis. I don't think I've ever seen scaling that good from SLI. A few games hit 100% with CFX / mGPU on AMD, but this is the first time I've seen Nvidia hit 100%. That's just awesome. It would be even better if more devs used the cross-vendor mGPU version (multi display adapter / explicit multi-adapter).

Would be perfect to be able to use AMD + Nvidia in the same system if you wanted; it would really boost competition and everyone would win. You could use whichever card you wanted as the primary for a G-Sync/FreeSync monitor and have the other card render half the frames.
 

alcoholbob

Diamond Member
May 24, 2005
Not sure; I haven't seen any tech sites cover it yet, but that is basically 100% scaling on 980 Tis. I don't think I've ever seen scaling that good from SLI. A few games hit 100% with CFX / mGPU on AMD, but this is the first time I've seen Nvidia hit 100%. That's just awesome. It would be even better if more devs used the cross-vendor mGPU version (multi display adapter / explicit multi-adapter).

Would be perfect to be able to use AMD + Nvidia in the same system if you wanted; it would really boost competition and everyone would win. You could use whichever card you wanted as the primary for a G-Sync/FreeSync monitor and have the other card render half the frames.

I've seen 95% scaling or so before, and it's only ever been alternate frame rendering. Split frame rendering (the ideal method visually, as it reduces microstutter and discarded frames significantly) rarely scales better than 35% per extra GPU.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
I've seen 95% scaling or so before, and it's only ever been alternate frame rendering. Split frame rendering (the ideal method visually, as it reduces microstutter and discarded frames significantly) rarely scales better than 35% per extra GPU.

Well, that's because split frame rendering doesn't truly split the workload between the two viewports. You still have to process the same geometry load on both GPUs, and then you have other issues like resource synchronization with some engines, which can cause PCIe transfer bottlenecks. Another problem for the scalability of split frame rendering is uneven workloads between the two viewports, such as the top half of the frame rendering a mostly empty sky with clouds while the bottom half renders dense foliage ...
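To make that concrete, here's a toy model of SFR scaling (all costs below are invented illustrative numbers, not measurements from any real GPU): each GPU pays the full geometry cost, and the frame is done only when the slower half finishes.

```python
# Toy model of split-frame-rendering (SFR) scaling. Illustrates the two
# points above: geometry work is duplicated on both GPUs, and an uneven
# split (empty sky on top, dense foliage on bottom) leaves one GPU idle.
# All costs are made-up illustrative numbers in milliseconds.

def single_gpu_time(geometry, shading_top, shading_bottom):
    """One GPU renders the whole frame."""
    return geometry + shading_top + shading_bottom

def sfr_time(geometry, shading_top, shading_bottom):
    """Two GPUs split the frame at the midline. Each still processes the
    full geometry load; the frame completes when the slower half finishes."""
    gpu0 = geometry + shading_top      # renders the top viewport
    gpu1 = geometry + shading_bottom   # renders the bottom viewport
    return max(gpu0, gpu1)

# Balanced shading load: scaling is limited only by duplicated geometry.
t1 = single_gpu_time(geometry=4.0, shading_top=6.0, shading_bottom=6.0)
t2 = sfr_time(geometry=4.0, shading_top=6.0, shading_bottom=6.0)
print(f"balanced split: {t1 / t2 - 1:.0%} speedup")   # → 60% speedup

# Uneven load (sky vs foliage): the sky GPU finishes early and idles.
t1 = single_gpu_time(geometry=4.0, shading_top=2.0, shading_bottom=10.0)
t2 = sfr_time(geometry=4.0, shading_top=2.0, shading_bottom=10.0)
print(f"uneven split:   {t1 / t2 - 1:.0%} speedup")   # → 14% speedup
```

Even with a perfectly balanced split, the duplicated geometry cost caps the speedup well below 100%; an uneven scene drags it down further, which lines up with the ~35% figure quoted above.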

Alternate frame rendering is the more performant of the two methods, and the sad part is that it doesn't have to have frame pacing issues anymore, now that we have solutions like adaptive refresh rates that let the display refresh whenever a frame has completed rendering ...

Performance, good frame pacing, and no screen tearing: we could have it all with alternate frame rendering and adaptive refresh rates ...
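A quick numeric sketch of that point (the completion times and 16.7 ms refresh interval below are invented for illustration): with a fixed-refresh display, AFR frames that finish at uneven times get quantized to vsync boundaries, and two frames can even land in the same refresh window (one gets dropped); with adaptive refresh, the display simply follows frame completion.

```python
import math

# Toy comparison of fixed vsync vs adaptive refresh (VRR) presentation
# for uneven AFR frame-completion times. Times are in milliseconds.

def display_times_fixed(completions, refresh_ms=16.7):
    """Each frame is shown at the next fixed vsync boundary after it completes."""
    return [math.ceil(t / refresh_ms) * refresh_ms for t in completions]

def display_times_vrr(completions):
    """Adaptive refresh: the display refreshes the moment a frame is ready."""
    return list(completions)

def intervals(times):
    """Gaps between successive displayed frames."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# Uneven AFR completion times: the two GPUs alternate frames.
completions = [10.0, 14.0, 30.0, 34.0, 50.0, 54.0]

# Fixed refresh: 0.0 gaps mean two frames hit one refresh (one is dropped).
print(intervals(display_times_fixed(completions)))  # → [0.0, 16.7, 16.7, 0.0, 16.7]

# VRR: every frame is shown, at the cadence it was actually produced.
print(intervals(display_times_vrr(completions)))    # → [4.0, 16.0, 4.0, 16.0, 4.0]
```

Note VRR doesn't smooth out uneven completion by itself, but it removes the quantization and the discarded frames that fixed vsync adds on top.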
 

IllogicalGlory

Senior member
Mar 8, 2013
Not sure; I haven't seen any tech sites cover it yet, but that is basically 100% scaling on 980 Tis. I don't think I've ever seen scaling that good from SLI. A few games hit 100% with CFX / mGPU on AMD, but this is the first time I've seen Nvidia hit 100%. That's just awesome. It would be even better if more devs used the cross-vendor mGPU version (multi display adapter / explicit multi-adapter).

Would be perfect to be able to use AMD + Nvidia in the same system if you wanted; it would really boost competition and everyone would win. You could use whichever card you wanted as the primary for a G-Sync/FreeSync monitor and have the other card render half the frames.
It's (probably?) not SLI, but it's still AFR.

Multi GPU in DX12 works really well though. All three examples have great scaling and smooth frametimes - Ashes of the Singularity, Deus Ex and now this.
 

Bacon1

Diamond Member
Feb 14, 2016
It's (probably?) not SLI, but it's still AFR.

Right, I meant that with old SLI (pre-DX12 mGPU) I don't recall ever seeing close to 100% scaling. We've seen it from CFX in ROTTR DX11 with 390s and in other titles for AMD, but I don't recall ever seeing scaling that good for Nvidia.

Multi GPU in DX12 works really well though. All three examples have great scaling and smooth frametimes - Ashes of the Singularity, Deus Ex and now this.

ROTTR also has DX12 mGPU support, which is very good, alongside its good DX11 implementation.
 

dogen1

Senior member
Oct 14, 2014
Well, that's because split frame rendering doesn't truly split the workload between the two viewports. You still have to process the same geometry load on both GPUs, and then you have other issues like resource synchronization with some engines, which can cause PCIe transfer bottlenecks. Another problem for the scalability of split frame rendering is uneven workloads between the two viewports, such as the top half of the frame rendering a mostly empty sky with clouds while the bottom half renders dense foliage ...

https://forum.beyond3d.com/posts/1919323/

According to sebbbi, with a GPU-driven renderer you can split the viewport in half and still evenly balance the load between the GPUs.
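As a rough sketch of that idea (my own toy model, not sebbbi's code; the cost numbers are invented): the split line doesn't have to sit at 50%. If the renderer knows how long each GPU took, it can nudge the split toward the more expensive half each frame until the two halves cost the same.

```python
# Toy dynamic load balancing for a split-frame renderer: move the split
# line each frame based on the measured per-GPU cost of the previous
# frame. The linear cost model below is invented for illustration.

def render_cost(split, top_density=1.0, bottom_density=3.0):
    """Cost of rendering [0, split) and [split, 1) of the screen height,
    for a scene whose bottom (foliage) is 3x denser than its top (sky)."""
    top = split * top_density
    bottom = (1.0 - split) * bottom_density
    return top, bottom

def rebalance(split, top_cost, bottom_cost, rate=0.3):
    """Nudge the split line toward the more expensive half."""
    total = top_cost + bottom_cost
    return split + rate * (bottom_cost - top_cost) / total

split = 0.5  # start with a naive 50/50 split
for _ in range(30):
    top, bottom = render_cost(split)
    split = rebalance(split, top, bottom)

top, bottom = render_cost(split)
print(f"split={split:.2f}  top={top:.2f}  bottom={bottom:.2f}")
# → split=0.75  top=0.75  bottom=0.75  (GPU rendering the sky takes 75% of
#   the screen, and both halves now cost the same)
```

The same feedback loop works when the scene changes frame to frame, which is why a GPU-driven renderer can keep a half-screen split balanced where a fixed 50/50 split can't.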
 

Carfax83

Diamond Member
Nov 1, 2010
Multi GPU games are dead. Nobody is making them anymore.

There might be an increase in the number of mGPU-capable games in the future, as UE4 now supports SLI by default starting with version 4.15. In fact, I wonder if The Coalition implemented mGPU in Gears of War 4 because of UE4's newly standardized support?

Gears of War 4 uses UE4.11, but I doubt it would be too hard for them to backport aspects of 4.15 into their engine.
 

AdamK47

Lifer
Oct 9, 1999
It always seemed like Nvidia and AMD would focus on SLI / Crossfire implementations if they knew a game would be used as a benchmark in hardware reviews. As frustrating as that sounds, it seemed as though they did it to drive sales over their competitor. Since AMD has been drastically lagging behind, Nvidia's efforts to support SLI have diminished quite a bit. I don't think we would be in this situation if AMD were more competitive; that would have driven Nvidia to at least keep focusing on multi-GPU configurations.