Microsoft Refines DirectX 12 Multi-GPU with Simple Abstraction Layer

biostud

Lifer
Feb 27, 2003
18,238
4,755
136
Microsoft is sparing no effort in promoting DirectX 12 native multi-GPU as the go-to multi-GPU solution for game developers, rendering proprietary technologies like SLI and CrossFire obsolete. The company recently announced that it is making it easier for game developers to have their games take advantage of multiple GPUs with far less code than is required today. This involves a new hardware abstraction layer that simplifies pooling multiple GPUs in a system, letting developers avoid coding directly against the Explicit Multi-Adapter (EMA) mode of graphics cards.

This is the first major step by Microsoft since its announcement that DirectX 12, in theory, supports true Mixed Multi-Adapter configurations. The company stated that it will release the new abstraction layer as part of a comprehensive framework on its GitHub repository, along with two sample projects: one that takes advantage of the new multi-GPU tech and one that does not. With this code to study, game developers' learning curve should be significantly reduced, and they will have a template for implementing multi-GPU in their DirectX 12 projects with minimal effort. With this, Microsoft is supporting game developers in implementing API-native multi-GPU, even as GPU manufacturers have stated that while their GPUs will support EMA, the onus will be on game developers to keep their games optimized.

https://www.techpowerup.com/223923/...tx-12-multi-gpu-with-simple-abstraction-layer
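For context, the "hard way" the abstraction layer wraps is explicit multi-adapter, where the application itself enumerates every adapter and creates an independent D3D12 device on each one. A minimal, hypothetical sketch of that setup using only standard DXGI/D3D12 calls (this is not Microsoft's unreleased framework):

#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Enumerate every hardware adapter and create an independent D3D12 device on each.
// Under explicit multi-adapter the application owns all of these devices and all
// of the scheduling between them; the new layer is meant to hide that bookkeeping.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}

Everything after this point (cross-adapter copies, per-GPU command queues, synchronization) is the application's problem, which is exactly the work the two GitHub samples are supposed to template.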
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
If this becomes a reality in the near future, 2+ 480-level cards could become king for everything.
 

brandonmatic

Member
Jul 13, 2013
199
21
81
If this becomes a reality in the near future, 2+ 480-level cards could become king for everything.

My guess is that this "quick and easy" method would not be the most efficient way of doing multi-GPU. So you're probably not looking at ideal scaling for games that use this. But it's certainly better than not having any support for multi-GPU at all.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I think they just mean same-card scaling, similar to current SLI (CrossFire has been able to mix different GPUs of the same architecture for a while now).

Cross vendor / advanced functionality will take more work.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Gotta go that extra mile before the free Windows 10 upgrade date passes July 29! Then, coast...

The implication of that press release is that we would experience the same effects of EMA done the "hard way", but using a simpler abstraction layer. This can only be good for DX12 gamers.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Gotta go that extra mile before the free Windows 10 upgrade date passes July 29! Then, coast...

The implication of that press release is that we would experience the same effects of EMA done the "hard way", but using a simpler abstraction layer. This can only be good for DX12 gamers.

If it's the same tech I read about elsewhere, you have to use similar GPUs for it. With EMA you can use whatever. The other version was basically the same as CrossFire/SLI.
 

cytg111

Lifer
Mar 17, 2008
23,174
12,835
136
My guess is that this "quick and easy" method would not be the most efficient way of doing multi-GPU. So you're probably not looking at ideal scaling for games that use this. But it's certainly better than not having any support for multi-GPU at all.

We recently saw that CF 480s didn't surpass the 1080/1070:

http://arstechnica.co.uk/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/

CF/SLI/mGPU scaling being what it is... I wouldn't plan my purchase now around going mGPU later down the line... IMO it's single GPU or bust.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
We recently saw that CF 480s didn't surpass the 1080/1070:

http://arstechnica.co.uk/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/

CF/SLI/mGPU scaling being what it is... I wouldn't plan my purchase now around going mGPU later down the line... IMO it's single GPU or bust.

Once more games support DX12 mGPU, CFX 480s will do very well.

They already pass the 1070 in DX11 games:

[chart: RX 480 CF vs. GTX 1070 across 17 DX11 games]

And they do pass the 1080 in DX12 so far:

[chart: RX 480 CF vs. GTX 1080 in DX12 games]

Considering how CPU-limited AMD is in DX11, especially in CFX, these cards will be monsters in DX12 mGPU, as shown by Ashes.

I'd like to see how well it does in TWWH; the Nano scaled very well:

[chart: R9 Nano CrossFire scaling in Total War: Warhammer]
 

biostud

Lifer
Feb 27, 2003
18,238
4,755
136
DX12 TWWH CF/SLI is not working, so I wonder how they got the R9 nano to run in CF.
 

Krteq

Senior member
May 22, 2015
991
671
136
DX12 TWWH CF/SLI is not working, so I wonder how they got the R9 nano to run in CF.
It is possible to run TWWH in multi-GPU mode on DX12
Patch 1 also allows for beta DirectX 12 multi-GPU support (CrossFire and SLI). However, this has to be enabled manually, as it requires the Steam Client beta and a manual edit of preferences.script. The Steam Client beta is needed to get around a Steam issue that prevents launching the game in DirectX 12 multi-GPU mode.

First, opt into the Steam client beta:
- Open the Steam client
- Select Steam in the top left corner
- Select Settings in the drop-down menu
- In the Account settings, look for the 'Beta participation' settings
- Opt into the Steam Beta Update
- Click OK to accept this change and then restart the Steam client

Then edit your Warhammer preferences.script to enable multi-GPU support in DX12:
- Boot Warhammer on Patch 1 and quit the game at the main menu (this forces the game to generate/update the preferences.script)
- Open the preferences.script.txt located in %AppData%\The Creative Assembly\Warhammer\scripts
- Search for the line: gfx_dx12_multi_gpu false;
- Change it to: gfx_dx12_multi_gpu true;
- Save and close the preferences.script

After that you should have multi-GPU support enabled in DirectX 12.
Taken from Reddit: "Dx12 - Crossfire support gone?"
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Either way, multi-GPU at similar performance is not as good as a single card, but I wonder if this means cards that can't normally be put in CF or SLI could be used with this system, such as two 1060s or 3-way 1080s.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
You'd be insane to think MS isn't laying the groundwork for a multi-GPU console.
 

cytg111

Lifer
Mar 17, 2008
23,174
12,835
136
Once more games support DX12 mGPU, CFX 480s will do very well.

They already pass the 1070 in DX11 games:

[chart: RX 480 CF vs. GTX 1070 across 17 DX11 games]

And they do pass the 1080 in DX12 so far:

[chart: RX 480 CF vs. GTX 1080 in DX12 games]

Considering how CPU-limited AMD is in DX11, especially in CFX, these cards will be monsters in DX12 mGPU, as shown by Ashes.

I'd like to see how well it does in TWWH; the Nano scaled very well:

[chart: R9 Nano CrossFire scaling in Total War: Warhammer]

You know how in academia you employ Bayesian reasoning? Remove the outermost left and right studies (i.e. Ashes), evaluate what's left; if 90 out of 100 say left, it is probably left. My conclusion here is that we need more studies, not enough data... from AIBs, we probably won't know the complete picture until a few months from now.
On a general note, I hold Ars's validity and integrity pretty high.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
You know how in academia you employ Bayesian reasoning? Remove the outermost left and right studies (i.e. Ashes), evaluate what's left; if 90 out of 100 say left, it is probably left. My conclusion here is that we need more studies, not enough data... from AIBs, we probably won't know the complete picture until a few months from now.
On a general note, I hold Ars's validity and integrity pretty high.

I'm sorry, did you miss the first graph with 17 games in it, where it was faster than the 1070 in every one?

If the game supports MGPU, 2 480s are faster. GCN gets big boosts from DX12 / Vulkan. More MGPU games in DX12/Vulkan = very fast GCN cards and amazing price/performance.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm sorry, did you miss the first graph with 17 games in it, where it was faster than the 1070 in every one?

If the game supports MGPU, 2 480s are faster. GCN gets big boosts from DX12 / Vulkan. More MGPU games in DX12/Vulkan = very fast GCN cards and amazing price/performance.

Except that you still have to live with twice the latency, less consistent frame times, more noise and heat. Along with some games not working with it.

When it's close, even when it's 10-20% slower, the single card will give a better overall experience.
 
Feb 19, 2009
10,457
10
76
Except that you still have to live with twice the latency, less consistent frame times, more noise and heat. Along with some games not working with it.

When it's close, even when it's 10-20% slower, the single card will give a better overall experience.

Do you remember CF in BF4 & Hardline? In DX11 it was as you describe, but with Mantle it was smooth and fluid. Though the extra noise & heat is an issue, of course.

But if 2x RX 480 is 30% faster than a 1070 in BF1, then the extra draw (say 450W total system power vs. 300W) isn't that far behind in terms of perf/W.

Well, at least that's what I expect given DICE is at the helm of DX12 in BF1.
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Except that you still have to live with twice the latency, less consistent frame times, more noise and heat. Along with some games not working with it.

When it's close, even when it's 10-20% slower, the single card will give a better overall experience.

This doesn't work like SLI or xFire with their AFR. You will be using split frame rendering, which addresses the latency and brings it back in line with a single card. More noise and heat will still be an issue for some, and as for frame times, we will see whether that is a big issue or not. I suspect that might vary game to game.
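To picture what split frame rendering means at the API level: each GPU records the same draw calls but clips them to its own slice of the screen with a scissor rect, so both GPUs work on the current frame instead of alternating whole frames as in AFR. Whether the new layer actually uses SFR isn't confirmed anywhere in this thread; the sketch below is purely illustrative (the function and the horizontal split are invented, not taken from any shipped engine):

#include <windows.h>
#include <d3d12.h>

// Record one GPU's share of a frame: full viewport, but a scissor rect that
// limits rasterization to this GPU's horizontal slice of the render target.
void RecordSlice(ID3D12GraphicsCommandList* cmdList,
                 UINT gpuIndex, UINT gpuCount,
                 UINT width, UINT height)
{
    const UINT sliceHeight = height / gpuCount;
    const LONG top    = static_cast<LONG>(gpuIndex * sliceHeight);
    const LONG bottom = (gpuIndex + 1 == gpuCount) ? static_cast<LONG>(height)
                                                   : top + static_cast<LONG>(sliceHeight);

    D3D12_VIEWPORT viewport = { 0.0f, 0.0f,
                                static_cast<float>(width), static_cast<float>(height),
                                0.0f, 1.0f };
    D3D12_RECT scissor = { 0, top, static_cast<LONG>(width), bottom };

    cmdList->RSSetViewports(1, &viewport);
    cmdList->RSSetScissorRects(1, &scissor);
    // ... identical draw calls are recorded on each GPU's command list here ...
}

The usual trade-off is that geometry and per-frame work don't split as cleanly as pixels do, which is why SFR scaling is harder to get right.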
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Except that you still have to live with twice the latency, less consistent frame times, more noise and heat. Along with some games not working with it.

When it's close, even when it's 10-20% slower, the single card will give a better overall experience.

This thread is about DX12 mGPU, so how would games not work with it?

Also, the latency and frame times are much better with DX12 mGPU than with regular SLI/CFX.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Do you remember CF in BF4 & Hardline? In DX11 it was as you describe, but with Mantle it was smooth and fluid. Though the extra noise & heat is an issue, of course.

But if 2x RX 480 is 30% faster than a 1070 in BF1, then the extra draw (say 450W total system power vs. 300W) isn't that far behind in terms of perf/W.

Well, at least that's what I expect given DICE is at the helm of DX12 in BF1.

Where does it say they are using split frame rendering? If they did use split frame rendering, the scaling would be much worse, which means you either get all the issues of CF/SLI or you get poor scaling.

Neither outcome is going to make CF and SLI as good as a single card.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
This thread is about DX12 mGPU, so how would games not work with it?

Also, the latency and frame times are much better with DX12 mGPU than with regular SLI/CFX.

As long as they use AFR, latency is doubled. It's not an API thing, but a simple fact that every frame you see took two frames of time to render (every other frame is done by a different GPU, and each one took two frames' worth of time to render).

If they use SFR, scaling will be much worse.

Neither results in a great mid-range CF/SLI experience. Multi-GPU is really only good when you want to go beyond what a single GPU can do at the high end.
 
Feb 19, 2009
10,457
10
76
Where does it say they are using split frame rendering? If they did use split frame rendering, the scaling would be much worse, which means you either get all the issues of CF/SLI or you get poor scaling.

Neither outcome is going to make CF and SLI as good as a single card.

It doesn't have to be SFR. It's DX12 explicit multi-adapter (or other modes). It works without an SLI bridge.

It minimizes latency because of direct hardware access, bypassing the driver/OS layer.

Since Johan is still at DICE & it's NOT a GameWorks title, I have no concerns for BF1's DX12 implementation.
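On the "direct hardware access" point: under explicit multi-adapter the application creates its own command queue and fence per device and handles the cross-GPU synchronization itself, rather than a driver AFR profile doing it behind the scenes. A hypothetical minimal sketch (the struct and function names are invented for illustration):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// One queue and one fence per GPU: the application decides what work each GPU
// gets and when to wait for it. No SLI/CrossFire bridge or driver profile involved.
struct GpuNode
{
    ComPtr<ID3D12Device>       device;
    ComPtr<ID3D12CommandQueue> queue;
    ComPtr<ID3D12Fence>        fence;
    UINT64                     fenceValue = 0;
};

GpuNode MakeNode(ComPtr<ID3D12Device> device)
{
    GpuNode node;
    node.device = device;

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    node.device->CreateCommandQueue(&desc, IID_PPV_ARGS(&node.queue));
    node.device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&node.fence));
    return node;
}

// After submitting command lists to node.queue, the app signals the fence and
// blocks until that GPU has finished: explicit, application-controlled sync.
void WaitForGpu(GpuNode& node, HANDLE fenceEvent)
{
    const UINT64 value = ++node.fenceValue;
    node.queue->Signal(node.fence.Get(), value);
    if (node.fence->GetCompletedValue() < value)
    {
        node.fence->SetEventOnCompletion(value, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }
}

How well that turns into smooth frame pacing is entirely up to the engine, which is why having DICE in charge of it matters.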
 
Feb 19, 2009
10,457
10
76
As long as they use AFR, latency is doubled. It's not an API thing, but a simple fact that every frame you see took two frames of time to render (every other frame is done by a different GPU, and each one took two frames' worth of time to render).

If they use SFR, scaling will be much worse.

Neither results in a great mid-range CF/SLI experience. Multi-GPU is really only good when you want to go beyond what a single GPU can do at the high end.

What?

You need to re-think that.

Even in DX11, there are some titles where CF/SLI has zero issues with latency. Higher frame rate and smoother.

Let's imagine a 60 FPS scenario, 16ms per frame.

1 GPU = 60 FPS = 16ms per frame.
2 GPUs with near-perfect scaling (95% is possible) = ~120 FPS = ~8ms per frame.

The problem is when it's done poorly: GPU #1 and #2 are not synced well, leading to big frame-time variance.

All DX12/Vulkan mGPU does is give developers more control. If they are capable, the result should be better. If they are not, well, no mGPU support at all. :/