DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

One of the big things that we will be seeing is DirectX 12's Explicit Asynchronous Multi-GPU capabilities. What this means is that the API combines all the different graphics resources in a system and puts them all into one "bucket." It is then left to the game developer to divide the workload up however they see fit, letting different hardware take care of different tasks.
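On the developer side, "dividing the workload up" presumably starts with enumerating every adapter in the system and creating a device on each one. A minimal sketch of what that might look like in C++/D3D12 (the function name and the feature-level choice here are just illustrative, not from the article):

```cpp
// Minimal sketch: enumerate every adapter in the system (discrete cards and
// iGPUs alike) and create an independent D3D12 device on each one. What work
// each device gets afterwards is entirely up to the engine.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDeviceOnEveryAdapter()
{
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP/software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one entry in the "bucket" per GPU
    }
    return devices;
}
```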

The article goes further and explains that this would also apply to APUs, e.g. any iGPU. I'll believe it when it's here. Still pretty skeptical anyone will actually implement this. Maybe the BF series and very few other AAA titles.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Scary stuff! :eek:

Kinda reminds me of this.
Ghostbusters said:
Dr Ray Stantz: Fire and brimstone coming down from the skies! Rivers and seas boiling!

Dr. Egon Spengler: Forty years of darkness! Earthquakes, volcanoes...

Winston Zeddemore: The dead rising from the grave!

Dr. Peter Venkman: Human sacrifice, dogs and cats living together... mass hysteria!
 
Feb 19, 2009
10,457
10
76
"Split Frame Rendering"

Works very nicely in Civ: BE's Mantle mode: frame latency is extremely flat and smooth, but overall raw performance scaling is lower than with AFR.

It also doesn't need to duplicate the frame buffer, so multi-GPU VRAM will truly scale! :D

Looks like MS and AMD have been working very, very closely on the Mantle/Xbone -> DX12 evolution.
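For reference, the split-frame idea is roughly this: each GPU renders one horizontal slice of the same frame, so render-target memory is divided across the cards rather than mirrored. A toy sketch (not how Firaxis actually implemented the Civ: BE Mantle path; the helper name and the even split are illustrative):

```cpp
// Rough illustration of the SFR idea: each GPU owns one horizontal slice of
// the same frame, so render-target memory is split across the cards rather
// than duplicated (unlike AFR, where each GPU needs the full working set).
#include <d3d12.h>
#include <vector>

std::vector<D3D12_RECT> SplitFrameAcrossGpus(UINT width, UINT height, UINT gpuCount)
{
    std::vector<D3D12_RECT> slices;
    const UINT sliceHeight = height / gpuCount;

    for (UINT gpu = 0; gpu < gpuCount; ++gpu)
    {
        D3D12_RECT r;
        r.left   = 0;
        r.right  = static_cast<LONG>(width);
        r.top    = static_cast<LONG>(gpu * sliceHeight);
        // The last GPU also picks up any leftover rows from the integer division.
        r.bottom = (gpu + 1 == gpuCount) ? static_cast<LONG>(height)
                                         : static_cast<LONG>((gpu + 1) * sliceHeight);
        slices.push_back(r);
        // Each rect would be set as the scissor on that GPU's command list,
        // e.g. via ID3D12GraphicsCommandList::RSSetScissorRects.
    }
    return slices;
}
```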
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
Still pretty skeptical anyone will actually implement this. Maybe the BF series and very few other AAA titles.

Yup. I'm basically going to copy and paste one of the comments from /r/games, and say that, since the developer is given complete control over this, most of them probably aren't going to bother. (but hey, I'd like to be proven wrong.)

"...It is then left to the game developer to divide the workload up however they see fit"

Most developers don't even bother nailing multicore support at the moment, so I think it will take a big shift in attitude for the benefits of this system to be seen.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Yup. I'm basically going to copy and paste one of the comments from /r/games, and say that, since the developer is given complete control over this, most of them probably aren't going to bother. (but hey, I'd like to be proven wrong.)
I wonder if this means the end of driver-level shader optimizations as well (and such).
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I didn't write the following. But I know of no better summary for why the concept is absurd.

You make it sound like having a multi-GPU setup's performance limited to whichever vendor's card is slowest for a given scene, featureset limited to the intersection of card features both could support reliably, on drivers never meant to work in concert with their competitors, with a PC installed with two proprietary driver packages with separate update methods and schedules, drivers that frequently have problems with conflicts with older versions of themselves, with different .NET or other environmental requirements, on hardware that was not designed/developed/tested in the presence of a competitor that frequently has problems with differing hardware from the same vendor, with low-level differences in execution behavior, little history of cooperative implementations, multiple render paths in an engine that were never designed/coded/tested to run simultaneously, in a platform that has at best problematically supported switching between one GPU or the other in mobile, between vendors who have every interest and an ongoing history of getting in each other's way could lead to undesirable outcomes.
https://forum.beyond3d.com/posts/1827166/
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
"Split Frame Rendering"

Works very nicely in Civ: BE's Mantle mode: frame latency is extremely flat and smooth, but overall raw performance scaling is lower than with AFR.

It also doesn't need to duplicate the frame buffer, so multi-GPU VRAM will truly scale! :D

Looks like MS and AMD have been working very, very closely on the Mantle/Xbone -> DX12 evolution.

The SFR thing in Civ may be "more playable" compared to AFR, but it's barely above single-GPU performance. Actually makes a great case against multi-GPU.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
The SFR thing in Civ may be "more playable" compared to AFR, but it's barely above single-GPU performance. Actually makes a great case against multi-GPU.

When you say single-GPU performance, I assume you mean average frame rate.

The difference, which can be seen in the instantaneous frame rates in AnandTech's review, is that instead of the frame rate jumping between 140 and 20 fps, it stays at a constant 60. In this case it is not the mean that is most affected but the median and mode, which are just as important for playability. The intent was never to get a higher mean, although they did: at 2560x1440 the minimum frame rate was 33% higher with the CrossFire Mantle solution and the average was 26% higher. It isn't 90% scaling, but this is Civilization we're talking about, and it really wouldn't benefit much from 144 fps.
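A quick worked example of why the mean misleads here (toy numbers, not the actual AnandTech data): a trace that alternates between a 140 fps frame and a 20 fps frame averages out to roughly 35 fps by frame time, and every other frame still takes 50 ms, which is what you actually feel.

```cpp
// Toy numbers, not the AnandTech data: a trace alternating between a 140 fps
// frame and a 20 fps frame, versus a steady 60 fps trace.
#include <cstdio>
#include <vector>

static void Report(const char* name, const std::vector<double>& frameTimesMs)
{
    double total = 0.0, worst = 0.0;
    for (double t : frameTimesMs) { total += t; if (t > worst) worst = t; }
    const double avgMs = total / frameTimesMs.size();
    std::printf("%s: avg %.1f ms (%.0f fps), worst frame %.1f ms (%.0f fps)\n",
                name, avgMs, 1000.0 / avgMs, worst, 1000.0 / worst);
}

int main()
{
    // Milliseconds per frame.
    const std::vector<double> alternating = { 1000.0 / 140.0, 1000.0 / 20.0 };
    const std::vector<double> steady      = { 1000.0 / 60.0,  1000.0 / 60.0 };

    Report("alternating 140/20 fps", alternating); // avg ~28.6 ms (~35 fps), worst 50 ms (20 fps)
    Report("steady 60 fps",          steady);      // avg ~16.7 ms (60 fps),  worst ~16.7 ms (60 fps)
    return 0;
}
```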
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
When you say single-GPU performance, I assume you mean average frame rate.

The difference, which can be seen in the instantaneous frame rates in AnandTech's review, is that instead of the frame rate jumping between 140 and 20 fps, it stays at a constant 60. In this case it is not the mean that is most affected but the median and mode, which are just as important for playability. The intent was never to get a higher mean, although they did: at 2560x1440 the minimum frame rate was 33% higher with the CrossFire Mantle solution and the average was 26% higher. It isn't 90% scaling, but this is Civilization we're talking about, and it really wouldn't benefit much from 144 fps.

Minimum fps: 290X CrossFire Mantle is only 15% higher than a single 290X with Mantle in the best case, and at full HD it's slower. So that's why I say it's a great case against multi-GPU.

[Chart: Civilization: Beyond Earth minimum frame rates, single GPU vs. CrossFire (fixed SFR)]
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Nvidia will block this, so who cares? This could be cool with IGPs though. Finally giving gamers a reason to care about IGP progress... at least if you have an AMD card.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
Nvidia will block this, so who cares? This could be cool with IGPs though. Finally giving gamers a reason to care about IGP progress... at least if you have an AMD card.

I was just about to say that.
I've previously seen graphics programmers post about how good it would be if Intel's or AMD's IGPs could be used for dedicated GPGPU when paired with a discrete GPU.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
If we can't SLI between camps, what about mixing different cards from the same vendor in DX12? Say I pair up my 770 with a 780 Ti?
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It might theoretically be possible, but you know it'll never actually work. Not that I suspect that will stop some from evangelizing this as the solution to all our problems.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I can't see AMD/NVIDIA cross-card working due to politics. I'm unsure whether Intel will get their iGPUs to work in tandem with dGPUs, but I doubt they'll care enough without Microsoft getting directly involved.

I could see this allowing mixed-card, same-vendor multi-GPU, e.g. running a 7950 with a 290, or a 770 with a 780.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
It would take a dev really optimizing this to take advantage of two different architectures. I'm dubious that will happen, and that's even assuming NV doesn't 'update' their drivers to restrict it.

Based on NV's actions recently, I highly doubt they will allow this to happen.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Let's just hypothesize that it did work perfectly.

What is the market? No meaningful number of people are running both cards in the same system. Yes, it could be said that in the future people would be able to buy whatever was best at the time and use it with their older stuff.

But I just do not see it being worthwhile from the developers' point of view.

And of course, nVidia will do everything they can to break AMD cards in the system.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Let's just hypothesize that it did work perfectly.

What is the market? No meaningful number of people are running both cards in the same system. Yes, it could be said that in the future people would be able to buy whatever was best at the time and use it with their older stuff.

But I just do not see it being worthwhile from the developers' point of view.

And of course, nVidia will do everything they can to break AMD cards in the system.

Yep, even in a perfect world it doesn't have any relevance. Not that it had any to begin with.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I can believe a developer could make almost anything possible along these lines, but to believe this has any chance of happening is like fantasizing about winning the Powerball.

Example: I could "SLI" (if that term even applies) an old GTX 460 with an HD 4600. But that would probably take many man-hours to get working properly, and it would have to be done for multitudes of hardware combinations.
We can't even get multi-GPU profiles for some AAA games on current-class hardware!
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
There is also the potential for GPU compute to be split between GPUs,
i.e. the Intel GPU does the GPGPU work and the discrete card does the rendering.
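A minimal sketch of how that split might be set up in D3D12, assuming you already have a device per adapter as discussed earlier in the thread (the function names here are made up for illustration, and the cross-adapter resource sharing is omitted):

```cpp
// Minimal sketch of the idea: a compute-only queue on the integrated GPU for
// GPGPU work, and a direct (graphics) queue on the discrete GPU for rendering.
// Adapter selection and cross-adapter resource sharing are omitted.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

static ComPtr<ID3D12CommandQueue> MakeQueue(ID3D12Device* device,
                                            D3D12_COMMAND_LIST_TYPE type)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type;

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

void SetUpSplitWorkloads(ID3D12Device* integratedGpu, ID3D12Device* discreteGpu)
{
    // Physics, particles, post-processing, etc. would be submitted here...
    ComPtr<ID3D12CommandQueue> computeQueue =
        MakeQueue(integratedGpu, D3D12_COMMAND_LIST_TYPE_COMPUTE);

    // ...while the discrete card keeps the main rendering queue.
    ComPtr<ID3D12CommandQueue> graphicsQueue =
        MakeQueue(discreteGpu, D3D12_COMMAND_LIST_TYPE_DIRECT);

    // Results would move between the two devices via cross-adapter shared
    // resources and fences; that plumbing is beyond this sketch.
    (void)computeQueue;
    (void)graphicsQueue;
}
```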