D3D12 is Coming! AMD Presentation


sontin

Diamond Member
Sep 12, 2011
3,273
149
106
How is it up to nVidia? Star Swarm ran great on nVidia hardware with DX12, so at that point in time their drivers were good enough for the engine.

Developers have asked for a low-level API. Now it's their turn to make the best of it. Blaming a graphics company for worse performance is just wrong...
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
How is it up to nVidia? Star Swarm ran great on nVidia hardware with DX12, so at that point in time their drivers were good enough for the engine.

Developers have asked for a low-level API. Now it's their turn to make the best of it. Blaming a graphics company for worse performance is just wrong...

I said the blame is on all of them. The developers and the hardware vendors need to work together to fix the problem. If they're unable to, both will be blamed for it. Believe me, if games continue to run like crap on a particular vendor's hardware, expect people to switch to one that works. It doesn't matter whose fault it is. In the end, Nvidia will lose customers if DX12 games run worse on their cards than on AMD's, or vice versa. How can you not get this?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I got it.
You didn't answer the question of how nVidia could have done more. They are not responsible for the performance of the DX12 path. It is up to the developer to work with nVidia and to optimize their low-level API code for the hardware.

Like I said: Snail ported their DX11 CryEngine 3 game to DX12 and got 20% more performance. So why is Oxide not able to get positive performance scaling on nVidia with an engine they have been working on for over a year?
 
Last edited:

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I got it.
You didn't answer the question of how nVidia could have done more. They are not responsible for the performance of the DX12 path. It is up to the developer to work with nVidia and to optimize their low-level API code for the hardware.

Like I said: Snail ported their DX11 CryEngine 3 game to DX12 and got 20% more performance. So why is Oxide not able to get positive performance scaling on nVidia with an engine they have been working on for over a year?

Who knows? Maybe because AMD's work with Mantle gave them a head start with DX12 (considering DX12 is nearly identical to Mantle)? Maybe because Oxide and AMD are partners and are therefore able to better optimize the game? Maybe Nvidia's hardware is less suited to running this DX12 engine than AMD's? Maybe Nvidia's DX12 driver isn't as optimized at this point in time? It could be a number of things.

But I'm glad you got my point. In the end, it hurts Nvidia's customer base if it's not fixed.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Maybe because AMD's work with Mantle gave them a head start with DX12 (considering DX12 is nearly identical to Mantle)?

Star Swarm Benchmark? :|

Maybe Nvidia's hardware is less suited to running this DX12 engine than AMD's? Maybe Nvidia's DX12 driver isn't as optimized at this point in time? It could be a number of things.

DX12 on nVidia is slower than DX11. We don't need to compare it with AMD.
This should be a red flag for the developer, and it is their job to prevent such misbehaviour of the engine. That they released the benchmark in this state says more about the company than about nVidia's DX12 drivers.

Maybe because Oxide and AMD are partners and are therefore able to better optimize the game?

Bingo. Reading the blog from Oxide, it is clear that they just don't use nVidia hardware. Otherwise they would have found the MSAA bug earlier and wouldn't need nVidia's help to optimize their code...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Star Swarm Benchmark? :|



DX12 on nVidia is slower than DX11. We don't need to compare it with AMD.
This should be a red flag for the developer, and it is their job to prevent such misbehaviour of the engine. That they released the benchmark in this state says more about the company than about nVidia's DX12 drivers.



Bingo. Reading the blog from Oxide, it is clear that they just don't use nVidia hardware. Otherwise they would have found the MSAA bug earlier and wouldn't need nVidia's help to optimize their code...

What MSAA bug?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106

Wow!!!!!! That is the single most cherry-picked statement I've ever seen. EVER!

By Dan Baker, Co-Founder, Oxide Games

This past week we made available a pre-beta of Ashes of the Singularity, our upcoming massive-scale real-time strategy game. Amongst other firsts, it utilizes DirectX 12 which became available as part of the Windows 10 launch last month. Our game also includes a 3D benchmark for users to play with.

Unfortunately, we have to make some corrections because as always there is misinformation. There are incorrect statements regarding issues with MSAA. Specifically, that the application has a bug in it which precludes the validity of the test. We assure everyone that is absolutely not the case. Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.

So what is going on then? Our analysis indicates that any D3D12 problems are quite mundane. New API, new drivers. Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet. Oxide believes it has identified some of the issues with MSAA and is working to implement workarounds on our code. This in no way affects the validity of a DX12 to DX12 test, as the same exact workload gets sent to everyone’s GPUs. This type of optimization is just the nature of brand new APIs with immature drivers.

Immature drivers are nothing to be concerned about. This is the simple fact that DirectX 12 is brand-new and it will take time for developers and graphics vendors to optimize their use of it. We remember the first days of DX11. Nothing worked, it was slower than DX9, buggy and so forth. It took years for it to be solidly better than previous technology. DirectX 12, by contrast, is in far better shape than DX11 was at launch. Regardless of the hardware, DirectX 12 is a big win for PC gamers. It allows games to make full use of their graphics and CPU by eliminating the serialization of graphics commands between the processor and the graphics card.

I don’t think anyone will be surprised when I say that DirectX 12 performance, on your hardware, will get better and better as drivers mature.

Why don't we post the whole quote? I've highlighted where they specifically stated that there is no bug. Then I highlighted the one sentence which you used to claim that there was.
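
As an aside, for anyone wondering what that quoted bit about "eliminating the serialization of graphics commands" actually looks like in practice: under DX11 every draw call funnels through one immediate context, while DX12 lets an engine record command lists on several worker threads and hand them to the GPU queue in a single cheap call. Here's a rough sketch of the generic D3D12 pattern (not Oxide's code, just illustrative; error handling, pipeline setup and the actual draws are omitted):

Code:
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Record one command list per worker thread, then submit them all at once.
// In DX11 this recording work was effectively serialized on one thread.
void RecordAndSubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));

        // Each worker records its slice of the frame independently.
        workers.emplace_back([cl = lists[i].Get()]()
        {
            // ... set pipeline state, bind resources, issue this chunk's draw calls ...
            cl->Close(); // finish recording on the worker thread
        });
    }

    for (auto& t : workers) t.join();

    // Submission itself is one call; the expensive recording happened in parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    // A real renderer would wait on a fence before reusing or releasing the allocators.
}

That per-thread recording is where the CPU-side gains come from, and it's also why so much responsibility shifts from the driver to the engine.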
 
Feb 19, 2009
10,457
10
76
He failed to mention this follow-up from Oxide directly:

Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.

http://forums.oxidegames.com/470406

So either NV is lying again or Oxide is lying. Who do you trust? 970 4GB!! 3.5GB... It's a feature.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
He failed to mention this follow-up from Oxide directly:



http://forums.oxidegames.com/470406

So either NV is lying again or Oxide is lying. Who do you trust? 970 4GB!! 3.5GB... It's a feature.

Well, considering Oxide called nVidia's statement incorrect and told them to "tread lightly" on Twitter, and nVidia didn't defend themselves, I'd say it's pretty obvious who was correct in their statements.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Wow!!!!!! That is the single most cherry-picked statement I've ever seen. EVER!



Why don't we post the whole quote? I've highlighted where they specifically stated that there is no bug. Then I highlighted the one sentence which you used to claim that there was.

And yet they have found problems on their side and they are working to fix them on their side.

Funny, huh? :hmm:

This happens when you only use hardware from one company.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
And yet they have found problems on their side and they are working to fix them on their side.

Funny, huh? :hmm:

This happens when you only use hardware from one company.

It's typical to have to optimize both drivers and code for each brand. That's not the same as a bug, and it's the inherent issue with GameWorks: not being able to do that optimization.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Sure, it is normal to release a broken application and blame a graphics vendor for it when they have never tested the application on their hardware. The fact that they can solve the issue on their side is proof enough that the driver has nothing to do with it.

The fact that Oxide has found problems in their code says enough. They could have fixed them before the release of the benchmark. They could have fixed the broken DX12 path, too.

They did neither.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
People read too much into an early alpha-stage benchmark.

However, it will be more interesting to see which developers can actually handle DX12 without some sort of penalty, and how newer cards will run with older games. BF4/Thief with Mantle on GCN 1.2 comes to mind here.
 
Feb 19, 2009
10,457
10
76
Sure, it is normal to release a broken application and blame a graphics vendor for it when they have never tested the application on their hardware. The fact that they can solve the issue on their side is proof enough that the driver has nothing to do with it.

The fact that Oxide has found problems in their code says enough. They could have fixed them before the release of the benchmark. They could have fixed the broken DX12 path, too.

They did neither.

Read it, it's in English.

Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Sure, it is normal to release a broken application and blame a graphics vendor for it when they have never tested the application on their hardware. The fact that they can solve the issue on their side is proof enough that the driver has nothing to do with it.

The fact that Oxide has found problems in their code says enough. They could have fixed them before the release of the benchmark. They could have fixed the broken DX12 path, too.

They did neither.

nVidia has had the source for every single build for the last year. If there was a problem, why didn't they notice it? Are they only testing with AMD hardware?
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
@sontin Nvidia had plenty of optimization time with the software, even releasing a beta driver, rewriting shaders and having source code access.

Also, before you try discrediting Oxide, remember how polished the DX11 version is and how they even supported command lists.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
How many times will you repeat it?
The bug gets exposed through the driver. Oxide has never tested the application on nVidia hardware. nVidia fixed the MSAA problem on their side in their DX11 driver. This is not possible with DX12, so it needs to be fixed in the code.

Oxide even mentioned they could have done it:
We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver.
But it is easier to blame nVidia instead of doing the job. :lol:

nVidia has had the source for every single build for the last year. If there was a problem, why didn't they notice it? Are they only testing with AMD hardware?

nVidia noticed the problem and made it public. :awe:
Oxide only discovered it days after they sent keys to the media.
 
Last edited:
Feb 19, 2009
10,457
10
76
So Oxide says the bug is in NV's DX12 driver and OFFERED to fix it for NV... wow, nice guys!

NV should go on the offensive some more, take it to a Twitter war.. I'll prepare the popcorn..

hmm.. wait, *crickets chirping*, nope, that's the end of that. NV lied, got caught out, and is now smart enough not to dig a deeper hole. But its supporters will continue to bash Oxide in NV's place.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
There are close to 40M AMD GCN-based consoles today (PS4 and XBONE) and 4M more are added every quarter.
Now add all those Desktop/Notebook GCN APUs and dGPUs and we are looking at a minimum of 100M GCN hardware gamers.

Just for those who believe AMD GPU market share is small.

2016 Nintendo NX coming too.

Why is this even important? There were even more DX9-based unified shader (Terascale) Nintendo Wii+360+Wii U users than there currently are "console" GCN users. It didn't help AMD a lick during the DX9 or DX11 eras of PC gaming.

Big-name ports coming to PC got hijacked by Nvidia. I'd expect the same to continue to happen.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
And yet they have found problems on their side and they are working to fix them on their side.

Funny, huh? :hmm:

This happens when you only use hardware from one company.

Wow, it's amazing how they can get everything optimized on the DX11 front without using Nvidia's hardware at all. That's some impressive feat. I can only imagine how well Oxide would do on DX12 optimization when Nvidia decides to ship them some hardware.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Wow, it's amazing how they can get everything optimized on the DX11 front without using Nvidia's hardware at all. That's some impressive feat. I can only imagine how well Oxide would do on DX12 optimization when Nvidia decides to ship them some hardware.

nVidia is optimizing DX11, not Oxide. Or why do you think they are nearly 2x faster than AMD with DX11? :hmm:

So Oxide says the bug is in NV's DX12 driver and OFFERED to fix it for NV... wow, nice guys!

NV should go on the offensive some more, take it to a Twitter war.. I'll prepare the popcorn..

hmm.. wait, *crickets chirping*, nope, that's the end of that. NV lied, got caught out, and is now smart enough not to dig a deeper hole. But its supporters will continue to bash Oxide in NV's place.

Yeah, because nVidia is the company using Twitter to complain about software companies... :|

And it is really nice that Oxide offers to fix their game engine. :awe:
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Reduced detail means lower poly counts, which means lower setup costs for the CPU.

The difference is almost always negligible. Build a system that isn't giving you your desired performance because of a CPU bottleneck, and start lowering settings... It's going to be difficult to reduce the settings low enough to free up enough CPU resources to make an appreciable difference. There are exceptions, like the number of cars on track when you're playing a racing game, but those are exceptions. Most of the time, if you're CPU-limited, game settings are going to do very little to help you.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Why is this even important? There were even more DX9-based unified shader (Terascale) Nintendo Wii+360+Wii U users than there currently are "console" GCN users. It didn't help AMD a lick during the DX9 or DX11 eras of PC gaming.

Big-name ports coming to PC got hijacked by Nvidia. I'd expect the same to continue to happen.

This isn't an accurate comparison.

First of all, the Wii didn't use AMD's unified shader Terascale architecture. Its hardware was recycled from the GameCube, with a DirectX 8 feature level, fixed-function GPU probably more similar to the original Radeon. It's irrelevant to the discussion.

The Xbox 360 did use a unified shader architecture, but it was more of a testbed for the ideas that would later become Terascale, not quite the same thing. There's also the fact that when Terascale first arrived on PC with the Radeon HD 2000 series it was sort of a massive turd that made AMD back away from producing "big die" graphics chips for a while. Beyond the graphics chip, the 360 used the IBM PowerPC CPU architecture rather than the PC-standard x86 architecture, requiring the code to be different on an even more fundamental level than GPU differences. And the 360 is only half of the 7th gen HD consoles; the PlayStation 3 used an Nvidia GPU (a non-unified GeForce 7000 series chip, at that) and the even more exotic Cell architecture for its CPU. With the 8th generation consoles, both the Xbox One and the PlayStation 4 are using graphics chips all but identical to the GCN 1.1 architecture, and they both use the same x86 architecture CPUs. The only significant remaining hardware differences between those consoles and a PC with GCN-based graphics cards are the unified CPU/GPU memory pools on both and the 32 MB of ESRAM on the Xbox One.

That's the hardware side; on the software side, the whole point of DirectX 12 is to bring PC methods of game development into convergence with console development. While the 360/PS3 hardware was more or less DirectX 9 feature level, developers had much closer access to the hardware. PC development also quickly moved on from the features of 7th gen consoles to DirectX 10 and 11 feature levels, where console development really didn't have influence one way or the other. DirectX 12 does a lot to give developers the same kind of low-level access to the hardware, and it has the same feature set as the hardware in consoles.

The similarities in both hardware and software between the Xbox One/PS4 and a PC running GCN graphics chips in DirectX 12 (or Vulkan) are unprecedented. It's very possible that AMD stands to gain an advantage from console games already being coded and optimized for GCN hardware before they're ported to PC. This is ultimately just conjecture until we really start seeing cross-platform DirectX 12 games like Deus Ex Mankind Divided and Fable Legends. But I don't think you can point to the results of the previous console generation as a reason AMD won't have an advantage.

Come to think of it, Deus Ex Mankind Divided is an AMD-sponsored game, while Fable Legends is a Microsoft game not really sponsored by either IHV. Is there any upcoming Nvidia-sponsored game that is known to use DirectX 12?
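
To make the "low-level access" point a bit more concrete: instead of the driver quietly deciding things behind the scenes, a DX12 application queries the hardware's capability tiers up front and picks its own code paths, which is much closer to how console development works. A minimal, generic sketch of such a query (nothing game-specific; assumes the default adapter supports D3D12 and links against d3d12.lib):

Code:
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the minimum feature level D3D12 accepts.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    // The app asks what the hardware supports and chooses code paths itself,
    // rather than relying on the driver to paper over the differences.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    std::printf("Resource binding tier:    %d\n", static_cast<int>(opts.ResourceBindingTier));
    std::printf("Tiled resources tier:     %d\n", static_cast<int>(opts.TiledResourcesTier));
    std::printf("Conservative raster tier: %d\n", static_cast<int>(opts.ConservativeRasterizationTier));
    return 0;
}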
 
Last edited:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
This isn't an accurate comparison.

First of all, the Wii didn't use AMD's unified shader Terascale architecture. Its hardware was recycled from the GameCube, with a DirectX 8 feature level, fixed-function GPU probably more similar to the original Radeon. It's irrelevant to the discussion.

The Xbox 360 did use a unified shader architecture, but it was more of a testbed for the ideas that would later become Terascale, not quite the same thing. There's also the fact that when Terascale first arrived on PC with the Radeon HD 2000 series it was sort of a massive turd that made AMD back away from producing "big die" graphics chips for a while. Beyond the graphics chip, the 360 used the IBM PowerPC CPU architecture rather than the PC-standard x86 architecture, requiring the code to be different on an even more fundamental level than GPU differences. And the 360 is only half of the 7th gen HD consoles; the PlayStation 3 used an Nvidia GPU (a non-unified GeForce 7000 series chip, at that) and the even more exotic Cell architecture for its CPU. With the 8th generation consoles, both the Xbox One and the PlayStation 4 are using graphics chips all but identical to the GCN 1.1 architecture, and they both use the same x86 architecture CPUs. The only significant remaining hardware differences between those consoles and a PC with GCN-based graphics cards are the unified CPU/GPU memory pools on both and the 32 MB of ESRAM on the Xbox One.

That's the hardware side; on the software side, the whole point of DirectX 12 is to bring PC methods of game development into convergence with console development. While the 360/PS3 hardware was more or less DirectX 9 feature level, developers had much closer access to the hardware. PC development also quickly moved on from the features of 7th gen consoles to DirectX 10 and 11 feature levels, where console development really didn't have influence one way or the other. DirectX 12 does a lot to give developers the same kind of low-level access to the hardware, and it has the same feature set as the hardware in consoles.

The similarities in both hardware and software between the Xbox One/PS4 and a PC running GCN graphics chips in DirectX 12 (or Vulkan) are unprecedented. It's very possible that AMD stands to gain an advantage from console games already being coded and optimized for GCN hardware before they're ported to PC. This is ultimately just conjecture until we really start seeing cross-platform DirectX 12 games like Deus Ex Mankind Divided and Fable Legends. But I don't think you can point to the results of the previous console generation as a reason AMD won't have an advantage.

Come to think of it, Deus Ex Mankind Divided is an AMD-sponsored game, while Fable Legends is a Microsoft game not really sponsored by either IHV. Is there any upcoming Nvidia-sponsored game that is known to use DirectX 12?

TW3 and... Batman.