DX12 Multi-GPU is a challenge

Page 3 - AnandTech Forums

tential

Diamond Member
May 13, 2008
7,348
642
121
Define "scales well"

It's all about raw FPS. No one cares about experience. Freesync/Gsync don't matter anymore because they don't improve raw FPS....
So because SFR doesn't scale at even close to the rates of AFR, it doesn't matter.

I guess I'll throw my Freesync monitor away when it gets here. Don't need it, doesn't increase raw FPS!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I guess I'll throw my Freesync monitor away when it gets here. Don't need it, doesn't increase raw FPS!

It does!
[image: AMD FreeSync official PR slide]

:awe:
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Define "scales well"

Scaling well generally means 70% or better performance added from a 2nd card.

Scaling poorly is getting less than 50% more performance from the 2nd card.
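In other words (a quick sketch with toy numbers, just to illustrate the definition):

```python
def scaling_percent(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Performance added by the 2nd card, as a percent of the
    single-card baseline (rounded to one decimal place)."""
    return round((dual_gpu_fps / single_gpu_fps - 1.0) * 100.0, 1)

# "Scales well": the 2nd card adds 70% or more.
print(scaling_percent(60, 102))  # 70.0
# "Scales poorly": the 2nd card adds less than 50%.
print(scaling_percent(60, 84))   # 40.0
```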

Yes, I get it, SFR gets a better experience, but we are talking about spending double the money for a little better experience than a single card will give. There will be people who will buy into it, but it'll be a lot less than with current implementations, unless somehow people can convince the average user they will get twice as good of an experience.

I'm not arguing to argue. You guys are simply ignoring an obvious fact. Users don't spend double the money unless they are getting something worth that investment. Tons of people see benchmarks of current AFR SLI/CF numbers, and perceive that means they will get 75% more frames and that much better of an experience. Are they going to feel the same way when they see 40% better frames on average? I highly doubt it.

And let's not forget that, until recently, most attempts at SFR have resulted in far worse experiences than AFR. I wonder how many games that use SFR will actually deliver a better experience. It is not a given. It will result in lower latency, but it may not result in smoother frame delivery in all, or even most, cases.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Scaling well generally means 70% or better performance added from a 2nd card.

Scaling poorly is getting less than 50% more performance from the 2nd card.

Yes, I get it, SFR gets a better experience, but we are talking about spending double the money for a little better experience than a single card will give. There will be people who will buy into it, but it'll be a lot less than with current implementations, unless somehow people can convince the average user they will get twice as good of an experience.

I'm not arguing to argue. You guys are simply ignoring an obvious fact. Users don't spend double the money unless they are getting something worth that investment. Tons of people see benchmarks of current AFR SLI/CF numbers, and perceive that means they will get 75% more frames and that much better of an experience. Are they going to feel the same way when they see 40% better frames on average? I highly doubt it.

And let's not forget that, until recently, most attempts at SFR have resulted in far worse experiences than AFR. I wonder how many games that use SFR will actually deliver a better experience. It is not a given. It will result in lower latency, but it may not result in smoother frame delivery in all, or even most, cases.
This isn't about selling sfr. It's the fact that I couldn't care less if afr gives 3 times the frames and sfr gives 1.3x the frames. If sfr feels better than afr, then I'll use sfr.

That's why there are reviews and tests done.

I guess you ignore frame time testing and other tests too since they aren't raw fps?

Very few people purchase multigpu setups as it is..... If you think people purchase mgpu setups for the efficiency then just lol at you.....
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
This isn't about selling sfr. It's the fact that I couldn't care less if afr gives 3 times the frames and sfr gives 1.3x the frames. If sfr feels better than afr, then I'll use sfr.

That's why there are reviews and tests done.

I guess you ignore frame time testing and other tests too since they aren't raw fps?

Very few people purchase multigpu setups as it is..... If you think people purchase mgpu setups for the efficiency then just lol at you.....

There is only one recent game that I know of with SFR (I don't mind seeing others if you have them). The concept is nice, especially when it comes to latency, and in that one game it works well. But if you go back and Google previous attempts, SFR gave a far worse experience. It has far less scaling. It is far harder to make work well. It stuttered like crazy. I do not expect it to be of much relevance in the future.

And while that one example looks pretty good, it still gave less than 50% improvement. It will only be useful for those wanting to use the fastest cards, and even then, is it really giving you much return for the money?

Gsync and Freesync are likely much better investments if you had to pick one. At least those will be good for several generations of GPUs. Buying multiple cards every generation kind of makes it a little tough to get behind.

Maybe SFR will be much improved from the past, but I'll want more than one example, in a game that it may actually matter.

Edit:
This isn't about selling sfr. It's the fact that I couldn't care less if afr gives 3 times the frames and sfr gives 1.3x the frames. If sfr feels better than afr, then I'll use sfr.
Would you go out and buy a 2nd 980ti if the games you played only gained 30% more FPS and slightly less consistent frame times?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
There is only one recent game that I know of with SFR (I don't mind seeing others if you have them). The concept is nice, especially when it comes to latency, and in that one game it works well. But if you go back and Google previous attempts, SFR gave a far worse experience. It has far less scaling. It is far harder to make work well. It stuttered like crazy. I do not expect it to be of much relevance in the future.

And while that one example looks pretty good, it still gave less than 50% improvement. It will only be useful for those wanting to use the fastest cards, and even then, is it really giving you much return for the money?

Gsync and Freesync are likely much better investments if you had to pick one. At least those will be good for several generations of GPUs. Buying multiple cards every generation kind of makes it a little tough to get behind.

Maybe SFR will be much improved from the past, but I'll want more than one example, in a game that it may actually matter.

Edit:
Would you go out and buy a 2nd 980ti if the games you played only gained 30% more FPS and slightly less consistent frame times?
You're still so insistent on raw FPS, you completely forget about minimums and experience.

I game for the experience, not because of a raw FPS counter. I've already sunk a lot into my rig and the things surrounding it, so if I needed another 980 Ti, that wouldn't really be a massive cost compared to what I've already purchased to make my gaming experience as immersive as possible.

At the bolded: no one is forcing you to buy multiple cards every generation, or even mgpu in the first place. I already am going mgpu; I HAVE to in order to utilize 4K. So I really don't care. I'm not the person to talk to about the cost of PC gaming. If you're worried about the cost of a mGPU setup, you're not ready for it. I want smooth 4K gaming. If SLI GTX 980 Ti was what it took, I'd get that.

Crossfire R9 290s with Freesync is what I need for the games I'm currently playing. I'll then upgrade to an Arctic Islands dual-GPU card next year. Wasabi Mango's 65-inch 4K Freesync monitor made the decision for me. It really seems like you think raw FPS is everything, and that just isn't the case. If you need raw FPS, then great, go for that. But if SFR is an option and testing shows it provides a smoother experience than AFR, then I don't care what the FPS counter says, as long as my experience is better.

Experience > FPS.

Simple as that, it's why I'm getting a $1500 Freesync monitor, and not SLI GTX 980Ti. Because I don't need high FPS right now, I need my moments below 60 FPS to be smooth....
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Scaling well generally means 70% or better performance added from a 2nd card.

Scaling poorly is getting less than 50% more performance from the 2nd card.

Yes, I get it, SFR gets a better experience, but we are talking about spending double the money for a little better experience than a single card will give. There will be people who will buy into it, but it'll be a lot less than with current implementations, unless somehow people can convince the average user they will get twice as good of an experience.

I'm not arguing to argue. You guys are simply ignoring an obvious fact. Users don't spend double the money unless they are getting something worth that investment. Tons of people see benchmarks of current AFR SLI/CF numbers, and perceive that means they will get 75% more frames and that much better of an experience. Are they going to feel the same way when they see 40% better frames on average? I highly doubt it.

And let's not forget that, until recently, most attempts at SFR have resulted in far worse experiences than AFR. I wonder how many games that use SFR will actually deliver a better experience. It is not a given. It will result in lower latency, but it may not result in smoother frame delivery in all, or even most, cases.

70% more what? 50% more what? Frame rate? Frame rate minus latency increase? Frame rate minus latency increase only when it goes over 50ms?

You ignore that frame rate is not the only measure of actual gaming performance (i.e., how good it looks at the user's eyeballs). Smoothness in motion is obviously important in making something look good. Higher frame rate = smoother, all else being equal, and this is why we even started evaluating frame rate in the first place!

The thing is, between SFR and AFR not everything else is equal. You can have the same frame rate and have it be smoother. If you've ever played a game with Mantle turned on (even without SFR), you'll know what I'm talking about.

You are ignoring that AFR adds latency, which is a negative scaling factor while SFR decreases latency which is a positive scaling factor. You're throwing out valuable data and coming to an ill informed conclusion because of it.

It could be that in DX12 we will see in-engine SFR mgpu schemes which scale worse than in-engine AFR schemes with all relevant performance factors included. It could also be that SFR scales better with all relevant performance factors included.

Relevant performance factors: subjective smoothness of frame delivery, objective frame time 99th percentile figures, objective frame time "time spent over 16.6ms" figures, objective frame rates as recorded at the monitor via FCAT, objective frame rates as recorded by FRAPS or other ingame utility, minimum frame rates, maximum frame rates, average frame rates, maximum frame times, minimum frame times, average frame times, even objectively measured frame persistence figures when displayed on the monitor (e.g. blur reduction modes and the like). There are probably more that I haven't thought of.

Looking only at frame rate as measured by FRAPs alone has fallen by the wayside as incomplete.
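For example, two of those frame-time figures can be computed from a raw frame-time log like this (a rough sketch; the sample data is invented for illustration):

```python
import math

def frame_time_metrics(frame_times_ms):
    """Nearest-rank 99th-percentile frame time, plus total time spent
    over the 16.6 ms (60 fps) frame budget."""
    ordered = sorted(frame_times_ms)
    p99 = ordered[math.ceil(0.99 * len(ordered)) - 1]
    over_budget = sum((t - 16.6 for t in frame_times_ms if t > 16.6), 0.0)
    return p99, round(over_budget, 1)

# Two runs with identical average frame rates but different smoothness:
smooth = [15.75] * 100
spiky = [15.0] * 97 + [40.0] * 3  # three stutter frames
print(frame_time_metrics(smooth))  # (15.75, 0.0)
print(frame_time_metrics(spiky))   # (40.0, 70.2)
```

Both runs average the same fps, but the 99th-percentile and over-budget figures expose the stutter that a plain FPS counter hides.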
 

AdamK47

Lifer
Oct 9, 1999
15,675
3,529
136
I see no mention in Brad Wardell's tweets that he is implementing DX12 SFR. Where are you guys getting this from?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You're still so insistent on raw FPS, you completely forget about minimums and experience.

I game for the experience, not because of a raw FPS counter. I've already sunk a lot into my rig and the things surrounding it, so if I needed another 980 Ti, that wouldn't really be a massive cost compared to what I've already purchased to make my gaming experience as immersive as possible.

At the bolded: no one is forcing you to buy multiple cards every generation, or even mgpu in the first place. I already am going mgpu; I HAVE to in order to utilize 4K. So I really don't care. I'm not the person to talk to about the cost of PC gaming. If you're worried about the cost of a mGPU setup, you're not ready for it. I want smooth 4K gaming. If SLI GTX 980 Ti was what it took, I'd get that.

Crossfire R9 290s with Freesync is what I need for the games I'm currently playing. I'll then upgrade to an Arctic Islands dual-GPU card next year. Wasabi Mango's 65-inch 4K Freesync monitor made the decision for me. It really seems like you think raw FPS is everything, and that just isn't the case. If you need raw FPS, then great, go for that. But if SFR is an option and testing shows it provides a smoother experience than AFR, then I don't care what the FPS counter says, as long as my experience is better.

Experience > FPS.

Simple as that, it's why I'm getting a $1500 Freesync monitor, and not SLI GTX 980Ti. Because I don't need high FPS right now, I need my moments below 60 FPS to be smooth....

You are assuming that there is a big experience improvement with SFR.

Are you under the impression that the minimums will be improved more with SFR? I find that hard to believe. Most of the time, minimums are limited by CPU bottlenecks, not by a lack of GPU power. If the CPU is the cause of low minimums, SFR will not help. If the CPU is not the bottleneck, AFR helps minimums too. What makes the most sense is that improvements will come from devs being directly in charge of multi-GPU, and that should improve both SFR and AFR alike.

Yes, experience is the goal, but is 30% more FPS with less consistent frame times better than just using one card? It's going to be close. You seem to think the experience is going to be massively better. That isn't likely the case. Lower latency is the only given.

And you do know the reason why every game developer uses AFR now, right? AMD and Nvidia asked them to, because they want to sell cards, and it gave a better experience in the past. Maybe this time around they can work out the problems, but if it suffers from low scaling and less consistent frame rates, it won't catch on.

What good is 30% more FPS if the frame times are not as consistent as one card's? The key to this all working is to have either frame consistency equal to a single card along with reasonable scaling, or else great scaling. If neither of those holds, is it going to be better than just using a single card?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
70% more what? 50% more what? Frame rate? Frame rate minus latency increase? Frame rate minus latency increase only when it goes over 50ms?

You ignore that frame rate is not the only measure of actual gaming performance (i.e., how good it looks at the user's eyeballs). Smoothness in motion is obviously important in making something look good. Higher frame rate = smoother, all else being equal, and this is why we even started evaluating frame rate in the first place!

The thing is, between SFR and AFR not everything else is equal. You can have the same frame rate and have it be smoother. If you've ever played a game with Mantle turned on (even without SFR), you'll know what I'm talking about.

You are ignoring that AFR adds latency, which is a negative scaling factor while SFR decreases latency which is a positive scaling factor. You're throwing out valuable data and coming to an ill informed conclusion because of it.

It could be that in DX12 we will see in-engine SFR mgpu schemes which scale worse than in-engine AFR schemes with all relevant performance factors included. It could also be that SFR scales better with all relevant performance factors included.

Relevant performance factors: subjective smoothness of frame delivery, objective frame time 99th percentile figures, objective frame time "time spent over 16.6ms" figures, objective frame rates as recorded at the monitor via FCAT, objective frame rates as recorded by FRAPS or other ingame utility, minimum frame rates, maximum frame rates, average frame rates, maximum frame times, minimum frame times, average frame times, even objectively measured frame persistence figures when displayed on the monitor (e.g. blur reduction modes and the like). There are probably more that I haven't thought of.

Looking only at frame rate as measured by FRAPs alone has fallen by the wayside as incomplete.

I have made numerous notes about latency improvements with SFR. I am not ignoring that.

I'm not ignoring that experience matters most, but you are ignoring that 30% scaling isn't a lot, and that any multi-GPU setup is not going to deliver performance as smooth as a single card's.

I'm not only looking at FRAPS; I just happen to know that it takes a fair bit more FPS than a single-card setup to overcome the flaws of multi-GPU. You don't seem to understand this. You seem to believe SFR will have perfect frame times, and I don't believe that will be the case. In the past, it was far worse than AFR; this time around it will have to be better to be worth it.

If they can up the scaling and minimize the frame time inconsistencies, it can catch on, but if the past is any indication, that seems unlikely.

The only hope for it is that game devs will do it better this time around. I'm all for lower latency, but low scaling and less consistent frame times would kill the advantages.

And let's not forget the influence of AMD and Nvidia, who want to sell cards. They killed SFR in the past, and they can go the same route again if things aren't lining up.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I'm not ignoring that experience matters most, but you are ignoring that 30% scaling isn't a lot, and that any multi-GPU setup is not going to deliver performance as smooth as a single card's.

Yet again, 30% of what? How are you incorporating all performance figures into your 30% figure? You are NOT comparing apples to apples.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
And you do know the reason why every game developer uses AFR now, right? AMD and Nvidia asked them to, because they want to sell cards, and it gave a better experience in the past. Maybe this time around they can work out the problems, but if it suffers from low scaling and less consistent frame rates, it won't catch on.

And let's not forget the influence of AMD and Nvidia, who want to sell cards. They killed SFR in the past, and they can go the same route again if things aren't lining up.


Do you have proof for these claims?


We could just read Firaxis's thoughts on SFR; we don't need to make stuff up.

One of the less publicized advantages of AMD’s Mantle API is how it exposes explicit control over all GPUs in a machine to the game developer.

So what makes the most sense for Civilization: Beyond Earth? We believe that response time, the time between a user action and when that action is displayed on the screen, is one of the most important factors in providing a good user experience for gaming.

Current multi-GPU solutions are implemented in the driver, without knowledge of, or help from, the game rendering engine. With the limited information available drivers are almost forced to implement AFR, or Alternate Frame Rendering, which is an approach where individual frames are rendered entirely on a single GPU. By alternating the GPU used each frame, rendering for a given frame can be overlapped with rendering of previous frames, resulting in higher overall frame rates. The cost, however, is an extra frame of latency for each GPU past the first one. This means that AFR multi-GPU solutions have worse response time than a single GPU capable of similar frame rates.

In Civilization: Beyond Earth we have decided to go in a different direction. Rather than trying to maximize frame rates while lowering quality, we asked ourselves a question: How fast can we get a dual-GPU solution without lowering quality at all? In order to answer this question, we implemented a split-screen (SFR) multi-GPU solution for the Mantle version of the game. Unlike AFR, SFR breaks a single frame into multiple parts, one per GPU, and processes the parts in parallel, gathering them into the final image at the end of the frame. As you might expect, SFR has very different characteristics than AFR, and our choice was heavily motivated by our design of the Civilization rendering engine, which fits the more demanding requirements of SFR well. Playing the game with SFR enabled will provide exactly the same quality of experience as playing with a single, more powerful GPU.
http://www.firaxis.com/?/blog/single/mantle-multi-gpu-for-civilization-beyond-earth
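The latency trade-off Firaxis describes can be reduced to a toy model (a simplified sketch; the fps numbers are illustrative, and it ignores input sampling, driver queues, and display scan-out):

```python
def render_latency_ms(fps: float, n_gpus: int, mode: str) -> float:
    """Rough latency of one displayed frame. In AFR, each GPU past the
    first adds one extra frame of queueing latency; in SFR, all GPUs
    cooperate on the same frame, so latency is just the frame time."""
    frame_time = 1000.0 / fps
    if mode == "AFR":
        return frame_time * n_gpus
    if mode == "SFR":
        return frame_time
    raise ValueError("mode must be 'AFR' or 'SFR'")

# With two GPUs, AFR at a higher frame rate can still respond more
# slowly than SFR at a lower one:
print(round(render_latency_ms(90, 2, "AFR"), 1))  # 22.2
print(round(render_latency_ms(65, 2, "SFR"), 1))  # 15.4
```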
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
Also...

Some views on the matter from an Nvidia employee.

SFR was an adequate alternative several years ago when games were not using such advanced rendering techniques, but now that we see geometric tessellation and complex shading effects becoming much more common, the pitfalls of screen-portioned rendering (split-frame, scissor frame, supertiling, etc.) become a lot more pronounced. Overdrawing is the biggest problem here; all of the vertices for scene geometry have to be transformed by each GPU even if they are not within the GPU's assigned region, meaning geometry performance cannot scale like it does with AFR, and any polygon between multiple rendering regions has to be fully textured and shaded by each GPU whose region it occupies, which is wasteful. Of course, there are also complications that can rise from inaccurate workload allocations.

https://forums.geforce.com/default/topic/527523/modern-sfr-split-frame-rendering-compatabilty/
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
It's interesting reading Bystander's points today and remembering the smoothness fad when Nvidia disinterestedly handed PcPer the FCAT tool (because they are such good guys). Now people are dismissing a smoother approach to multi-GPU because "fps scaling matters".


It is the textbook definition of a moved goalpost.


Pro tip: FPS only matters as a broad indicator of gaming experience, and even today it is widely used to gauge how well a game is running on your PC/console/whatever. The moment frametimes came into the discussion, they became a new indicator of performance, because they can describe the gaming experience more precisely than FPS can. You can't dismiss frametimes and smoothness now just because we all know Maxwell struggles with them in multi-GPU configurations. I feel sorry for the brand loyalists, but I won't be moving the point of discussion regarding gaming performance just because some company changed its performance characteristics from one uarch to the next. Frametimes and smoothness are here to stay, and the introduction of SFR reflects this. The moar-FPS mantra should be put to rest already.

PS: Just reading the above quote, you can devise techniques to reinforce the seams between tessellated polys where the screen divides, without needing to render the whole geometry on both GPUs. I think even today's SFR solutions render regions larger than the final buffer, overlapping a fixed amount of pixel rows in the middle to make a seamless rendered frame. I would take a small FPS hit for this any day if it guarantees a smoother experience.
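That overlapping-rows idea might look like this (a hypothetical illustration of the concept, not any shipping implementation):

```python
def split_regions(height: int, n_gpus: int, overlap_rows: int):
    """Horizontal SFR bands (top, bottom) in pixel rows, with each
    interior seam padded on both sides so neighbouring bands overlap
    and can be blended into a seamless final frame."""
    band = height // n_gpus
    regions = []
    for i in range(n_gpus):
        top = i * band
        bottom = height if i == n_gpus - 1 else (i + 1) * band
        if i > 0:
            top -= overlap_rows       # pad upward into the neighbour's band
        if i < n_gpus - 1:
            bottom += overlap_rows    # pad downward into the neighbour's band
        regions.append((top, bottom))
    return regions

# A 2160-row (4K) frame on two GPUs with 8 overlapped seam rows:
# each GPU renders 1088 rows instead of 1080, a small extra cost.
print(split_regions(2160, 2, 8))  # [(0, 1088), (1072, 2160)]
```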
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
It's interesting reading Bystander's points today and remembering the smoothness fad when Nvidia disinterestedly handed PcPer the FCAT tool (because they are such good guys). Now people are dismissing a smoother approach to multi-GPU because "fps scaling matters".


It is the textbook definition of a moved goalpost.


Pro tip: FPS only matters as a broad indicator of gaming experience, and even today it is widely used to gauge how well a game is running on your PC/console/whatever. The moment frametimes came into the discussion, they became a new indicator of performance, because they can describe the gaming experience more precisely than FPS can. You can't dismiss frametimes and smoothness now just because we all know Maxwell struggles with them in multi-GPU configurations. I feel sorry for the brand loyalists, but I won't be moving the point of discussion regarding gaming performance just because some company changed its performance characteristics from one uarch to the next. Frametimes and smoothness are here to stay, and the introduction of SFR reflects this. The moar-FPS mantra should be put to rest already.

PS: Just reading the above quote, you can devise techniques to reinforce the seams between tessellated polys where the screen divides, without needing to render the whole geometry on both GPUs. I think even today's SFR solutions render regions larger than the final buffer, overlapping a fixed amount of pixel rows in the middle to make a seamless rendered frame. I would take a small FPS hit for this any day if it guarantees a smoother experience.

If you have an issue with the post content of another member, you address that member directly.
-- stahlhart
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126

After-the-fact SFR is a completely different conversation from in-engine SFR. There should be different names for these things because they are entirely different. With DX12 we finally have an API that allows in-engine SFR. Previously it was only done after the fact by the GPU manufacturers, which carries very different limitations and challenges, and that is what led to AFR being the most effective method in those circumstances.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
After-the-fact SFR is a completely different conversation from in-engine SFR. There should be different names for these things because they are entirely different. With DX12 we finally have an API that allows in-engine SFR. Previously it was only done after the fact by the GPU manufacturers, which carries very different limitations and challenges, and that is what led to AFR being the most effective method in those circumstances.


Not sure why you're quoting me?
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Not sure why you're quoting me?

To elaborate: the nVidia employee you're quoting is talking about old SFR methods like supertiling, which are after-the-fact GPU-vendor solutions. AFR made the most sense in the context of "I'm a vendor without control or code access to most games, I have to get a solution that scales maximally across as many games as possible, and even if I did have source access, my graphics API doesn't allow me to explicitly handle multiple GPUs." But the context is changing in DX12. Now it's "I'm a game dev; as I design my engine and game, how do I best address multi-GPU systems?" This is why in-engine SFR is fundamentally different from after-the-fact SFR. AFAIK the only modern in-engine SFR implementation is Civ: BE.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
If you have an issue with the post content of another member, you quote them.
-- stahlhart

Fixed it for you. I didn't have time to quote each of his posts, so I just mentioned him as a means to summarize his thoughts in this thread. Save me the lecture, please.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
To elaborate: the nVidia employee you're quoting is talking about old SFR methods like supertiling, which are after-the-fact GPU-vendor solutions. AFR made the most sense in the context of "I'm a vendor without control or code access to most games, I have to get a solution that scales maximally across as many games as possible, and even if I did have source access, my graphics API doesn't allow me to explicitly handle multiple GPUs." But the context is changing in DX12. Now it's "I'm a game dev; as I design my engine and game, how do I best address multi-GPU systems?" This is why in-engine SFR is fundamentally different from after-the-fact SFR. AFAIK the only modern in-engine SFR implementation is Civ: BE.


I thought that was obvious, seeing as Nvidia didn't have access to a close-to-metal API until now. And Firaxis stated that because Mantle let them access the GPU hardware directly, they were able to utilize SFR.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Fixed it for you. I didn't have time to quote each of his posts, so I just mentioned him as a means to summarize his thoughts in this thread. Save me the lecture, please.
I found it more convenient than a massive quote, but anyway, it does help to quote, even if you just quote them, delete all of their text, and say (snip).

Anyway, even if we use the 30% vs. 80% numbers, let's spell it out, and I'll give my scenario.

I'll be gaming at 4K (mgpu needed).
If I use afr, I may go from 50 fps average to 90 fps. Not bad!
But my display only goes to 60 Hz at 4K, so 90 fps doesn't help me as much. With afr it's known that minimums can be bad (correct me if I'm wrong, guys!).
So my minimums can drop out of freesync range.
With sfr, I may go from 50 fps average to 65 fps average. However, now I'm above 60 fps, my minimums can stay within freesync range, and I have better response time and smoothness.

So afr gives me a high framerate (that I can't use in an optimal mgpu setup gaming at 4K), higher latency, a worse experience, etc.

Sfr gives me an experience that gets me close to a 60 fps lock while still keeping me within freesync range for low-latency, smooth gaming.

Without mgpu, I'm out of freesync range and having a bad experience.

So yes, mgpu is worth it in this scenario where I "only" gain 30%, because I'm not worried about raw fps; I just want to game without stutters or screen tearing, and with decent input lag.
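Putting rough numbers on that scenario (a toy model; the FreeSync window, base fps, and scaling factors are all illustrative, not measurements):

```python
FREESYNC_LO, PANEL_CAP = 40.0, 60.0  # hypothetical 4K FreeSync window (Hz)

def evaluate(base_avg: float, base_min: float, scaling: float):
    """Scaled average and minimum fps, the portion of the average a
    60 Hz panel can actually display, and whether the minimum stays
    inside the FreeSync window."""
    avg = base_avg * (1.0 + scaling)
    low = base_min * (1.0 + scaling)
    displayable = round(min(avg, PANEL_CAP), 1)
    return round(avg, 1), round(low, 1), displayable, low >= FREESYNC_LO

# Single-card baseline: 50 fps average, 35 fps minimum (made-up numbers).
print(evaluate(50, 35, 0.30))  # SFR-like +30%: (65.0, 45.5, 60.0, True)
print(evaluate(50, 35, 0.80))  # AFR-like +80%: (90.0, 63.0, 60.0, True)
```

Past the 60 Hz cap, the extra AFR frames aren't displayable anyway, which is why the window check and the minimums matter more here than the raw average.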
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yet again, 30% of what? How are you incorporating all performance figures into your 30% figure? You are NOT comparing apples to apples.

Also, that 30% figure is just a made-up convenient number. We don't know what it will be with DX12/Vulkan. It might be 50% or 80%.