Someone explain the AMD Nvidia DX12 difference?

Yakk

Golden Member
May 28, 2016
1,574
275
81
So let me repeat it: 2 major DX12 features are supported by Intel and NVIDIA HW, but not by AMD. Given the limited resources AMD has, this is hardly surprising. Let's hope Vega fixes this problem, though after seeing how little was delivered in terms of new features on Polaris, I am not holding my breath.

AMD have integrated what they call their "Primitive Discard Accelerator" into the Polaris architecture. It has not been talked about much yet, but we can be sure it will be enabled should Conservative Rasterization be used. ;)
 

renderstate

Senior member
Apr 23, 2016
237
0
0
AMD have integrated what they call their "Primitive Discard Accelerator" into the Polaris architecture. It has not been talked about much yet, but we can be sure it will be enabled should Conservative Rasterization be used. ;)



Primitive discarding is not a DX12 feature and it has nothing to do with conservative rasterization. It's just an HW optimization for small polygons.
 

Yakk

Golden Member
May 28, 2016
1,574
275
81
Primitive discarding is not a DX12 feature and it has nothing to do with conservative rasterization. It's just an HW optimization for small polygons.

What?!?!

Emm... Primitive Discard works hand in hand with Conservative Rasterization.

It's a bit much to type, but to start at the beginning, here is a basic introductory video on Conservative Rasterization and DX12, directly from Microsoft Graphics Education.

https://m.youtube.com/watch?v=zL0oSY_YmDY
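For reference, whether a given card exposes these optional features can be queried straight from the API. Below is a minimal D3D12 sketch, assuming an already-created ID3D12Device* named `device`; the feature fields are the real D3D12 options, and the claim that these are the two features the earlier post refers to is an assumption:

```cpp
#include <d3d12.h>
#include <cstdio>

// Sketch: ask the device which optional rasterizer features it exposes.
// ConservativeRasterizationTier and ROVsSupported are (assumed here to be)
// the two optional features this thread is arguing about.
void PrintOptionalFeatureSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options))))
        return;

    if (options.ConservativeRasterizationTier ==
        D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED)
        std::printf("Conservative rasterization: not supported\n");
    else
        std::printf("Conservative rasterization: tier %d\n",
                    static_cast<int>(options.ConservativeRasterizationTier));

    std::printf("Rasterizer ordered views: %s\n",
                options.ROVsSupported ? "yes" : "no");
}
```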
 

dogen1

Senior member
Oct 14, 2014
739
40
91
What?!?!

Emm... Primitive Discard works hand in hand with Conservative Rasterization.

It's a bit much to type, but to start at the beginning, here is a basic introductory video on Conservative Rasterization and DX12, directly from Microsoft Graphics Education.

Everyone knows what con...ort, they probably would have said something.
 

renderstate

Senior member
Apr 23, 2016
237
0
0
What?!?!

Emm... Primitive Discard works hand in hand with Conservative Rasterization.

It's a bit much to type, but to start at the beginning, here is a basic introductory video on Conservative Rasterization and DX12, directly from Microsoft Graphics Education.

https://m.youtube.com/watch?v=zL0oSY_YmDY


I was talking about the new Polaris primitive discard unit, not primitive discarding in general. As I mentioned earlier in this thread, conservative rasterization is the cornerstone of countless rendering algorithms, which is why it's such an important feature.
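For the record, enabling conservative rasterization in D3D12 is a single rasterizer-state flag on the pipeline state object. A minimal sketch, with all the unrelated PSO fields (shaders, root signature, formats) assumed to be set up elsewhere:

```cpp
// Sketch: the only change needed in a graphics PSO to rasterize
// conservatively. Creating this PSO is rejected on hardware that reports
// D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED.
D3D12_GRAPHICS_PIPELINE_STATE_DESC psoDesc = {};
// ... root signature, VS/PS bytecode, blend/depth state, RT formats ...
psoDesc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
psoDesc.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;
psoDesc.RasterizerState.ConservativeRaster =
    D3D12_CONSERVATIVE_RASTERIZATION_MODE_ON;
```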
 

hsjj3

Member
May 22, 2016
127
0
36
Could it be that, due to Nvidia's terrible DX12/Async performance, devs could target AMD for DX12 and Nvidia for DX11?

And correct me if I am wrong, but is it fair to say that DX12 doesn't offer anything over DX11 in terms of visuals?

Also, wasn't DX10 adoption fairly low-key? Could the same happen to DX12?
 

Mikeduffy

Member
Jun 5, 2016
27
18
46
Could it be that, due to Nvidia's terrible DX12/Async performance, devs could target AMD for DX12 and Nvidia for DX11?

And correct me if I am wrong, but is it fair to say that DX12 doesn't offer anything over DX11 in terms of visuals?

Also, wasn't DX10 adoption fairly low-key? Could the same happen to DX12?

The vast majority of high-profile games released in 2016 use DirectX 12 - this is a fact and won't change.

DX12 is here to stay; many games released on the XBone will be ported over to PC with minimal effort. I don't believe this is limited to just MS games - every developer will follow suit.

Enthusiasts here care about the latest hardware, but they don't seem to give a crap about how their hardware performs on cutting-edge APIs - this is something I'll never understand.

Anyways, Nvidia was just stupid when they turned their back on the consoles; they need to get back into that market. Perhaps they should sell their hardware at lower margins, because GCN is the industry standard as it stands today - taking consoles into account, of course.

I mean to say that they should lower margins to win console contracts only.
 
Last edited:
Mar 10, 2006
11,715
2,012
126
Anyways, Nvidia was just stupid when they turned their back on the consoles; they need to get back into that market. Perhaps they should sell their hardware at lower margins, because GCN is the industry standard as it stands today - taking consoles into account, of course.

NVIDIA's business is bringing in record revenue and margins, and it has an overwhelmingly dominant share of every high-margin GPU market segment.

I think they did OK.

By the way, AMD really needs the console revenue just to stay in business barring a sharp improvement in its CPU/dGPU businesses -- it will offer prices lower than what NVIDIA is likely willing to, as AMD's near-term survival depends on winning the consoles.
 

linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
Anyways, Nvidia was just stupid when they turned their back on the consoles; they need to get back into that market. Perhaps they should sell their hardware at lower margins, because GCN is the industry standard as it stands today - taking consoles into account, of course.

Why? Nvidia is completely owning the $250+ price points; even the (year-old) AIB 980 Ti cards are still unmatched. Their margins/profits have never been higher.

Even in DX12 the Fury X easily loses to the 1080, and usually loses to or at best equals the 1070. I assume that Vega 10 will beat the 1080 at DX12, but by then we will have a Titan/1080 Ti which will, again, beat Vega 10.

All the while, nvidia is killing it with the 1080 and 1070, and the 1060 will probably outsell (and outperform, with lower power usage) the 480.
 
Aug 11, 2008
10,451
642
126
Well, to be fair, nVidia doesn't have a CPU anyway, so it would be pretty hard for them to make a competitive offering for consoles.

The big threat to AMD is that the consoles go ARM at some point, so the same code could be used for phones and consoles, just with less demanding settings for phones. It seems like the console makers want to stick with x86, though.

nVidia really needs to step up their DX12 performance. I actually think nVidia cards might age much better this generation, because they have a lot of optimizations left for DX12 (just guessing), while AMD is already pretty much optimized for it.
 

Mikeduffy

Member
Jun 5, 2016
27
18
46
Turning their backs on the consoles has given AMD their best shot in years.

I'm just saying it was a stupid move on their part to think that they didn't need the market and that their Shield would do the job for them. They publicly said that they didn't want the business, and I don't see that as a wise move.

Anyways, back on topic about DX12 - I guess nobody can argue with me about the fact that almost every high-profile game released in 2016 will have a DX12 codepath. Anyone who says DX12 doesn't matter isn't in touch with reality.
 
Last edited:
Mar 10, 2006
11,715
2,012
126
Well, to be fair, nVidia doesn't have a CPU anyway, so it would be pretty hard for them to make a competitive offering for consoles.

The big threat to AMD is that the consoles go ARM at some point, so the same code could be used for phones and consoles, just with less demanding settings for phones. It seems like the console makers want to stick with x86, though.

nVidia really needs to step up their DX12 performance. I actually think nVidia cards might age much better this generation, because they have a lot of optimizations left for DX12 (just guessing), while AMD is already pretty much optimized for it.

If consoles go ARM, NVIDIA can play with Denver cores or licensed ARM cores. I don't think they will go ARM; there's lots of value in keeping it x86, IMHO.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Turning their backs on the consoles has given AMD their best shot in years.

I'm just saying it was a stupid move on their part to think that they didn't need the market and that their Shield would do the job for them. They publicly said that they didn't want the business, and I don't see that as a wise move.

Anyways, back on topic about DX12 - I guess nobody can argue with me about the fact that almost every high-profile game released in 2016 will have a DX12 codepath. Anyone who says DX12 doesn't matter isn't in touch with reality.
as long as nv can handle the pressure from the console effect for 1 more year, nv will have nothing to worry about. a new design takes about 3 years? we are already more than 2/3 of the way through the period of lacking support for some of the dx12 features. by late 2017 or early 2018, nv would have an answer for async compute or other dx12/console features for sure. with how good their marketing is, and a full spread of sponsored games, I don't foresee any problems for nv at all. it is pretty incredible how well nv has handled dx12 and the console effects so far.

that amd couldn't capitalize on their advantage shows how bad the top management at amd is.

as a gamer, I wish amd luck. they are the only thing standing in the way of $500 mid-range gpus. :(
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Why? Nvidia is completely owning the $250+ price points; even the (year-old) AIB 980 Ti cards are still unmatched. Their margins/profits have never been higher.

Wow what a great time to be a gamer, I love paying more for less!
 

linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
Wow what a great time to be a gamer, I love paying more for less!

This is going way off topic. I was just answering his comment about nvidia's supposedly bad business decisions. I never said anything about gamers being better off.

Anyways, back on topic about DX12 - I guess nobody can argue with me about the fact that almost every high-profile game released in 2016 will have a DX12 codepath. Anyone who says DX12 doesn't matter isn't in touch with reality.

It doesn't matter at the high end, where the 1070 and 1080 are beating the Fury X in DX12. We'll see how things pan out with the 1060 and 480 - I think it'll be interesting to see, as DX12 is more important in the midrange and lower segments.

However, even though Vega 10 will probably beat the 1080 in DX12, it'll probably lose to the Titan P or 1080 Ti. AMD are too early with their DX12 advantage, and too late with their high-end lineup (Vega). Once DX12 is important and has more than a handful of games, we'll see nvidia having fully DX12-capable cards thanks to their domination and profits from DX11, and I'm not sure AMD will keep the advantage they have today.

I do agree, though, that reviewers are not highlighting AMD's current DX12 advantage at all, except maybe Hardware Canucks.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
as long as nv can handle the pressure from the console effect for 1 more year, nv will have nothing to worry about. a new design takes about 3 years? we are already more than 2/3 of the way through the period of lacking support for some of the dx12 features. by late 2017 or early 2018, nv would have an answer for async compute or other dx12/console features for sure. with how good their marketing is, and a full spread of sponsored games, I don't foresee any problems for nv at all. it is pretty incredible how well nv has handled dx12 and the console effects so far.

that amd couldn't capitalize on their advantage shows how bad the top management at amd is.

as a gamer, I wish amd luck. they are the only thing standing in the way of $500 mid-range gpus. :(

The console effect and async compute may be an advantage for AMD, but will it be enough to compensate for the 50% higher clock speed Pascal has compared to Polaris?
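For context, "async compute" at the D3D12 level just means submitting work on a dedicated compute queue alongside the graphics queue; the API only expresses the potential overlap, and whether the GPU actually runs the two streams concurrently is up to the hardware and driver - which is exactly where AMD and NVIDIA differ. A minimal sketch, assuming a valid ID3D12Device* named `device`:

```cpp
#include <d3d12.h>

// Sketch: create a graphics (direct) queue plus a separate compute queue.
// D3D12 only expresses the potential parallelism; whether the GPU really
// overlaps the two streams of work is up to the hardware/driver.
ID3D12CommandQueue* CreateQueue(ID3D12Device* device,
                                D3D12_COMMAND_LIST_TYPE type)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type;  // DIRECT for graphics, COMPUTE for async compute
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Usage (device assumed created elsewhere):
//   ID3D12CommandQueue* gfx     = CreateQueue(device, D3D12_COMMAND_LIST_TYPE_DIRECT);
//   ID3D12CommandQueue* compute = CreateQueue(device, D3D12_COMMAND_LIST_TYPE_COMPUTE);
// The two queues are then synchronized with an ID3D12Fence
// (Signal on one queue, Wait on the other) wherever their work overlaps.
```

Either way the API guarantees correctness via fences; only the performance of overlapping the queues differs between vendors.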
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
I've been wondering about this DX12 thing for a while, so I guess I will post this here.

Will Nvidia be able to add DX12 performance through drivers, or is it an issue with the hardware not supporting DX12 features?

I'm looking at buying a 1070 and am trying to figure out if that's a good idea, as DX12 will surely become more widely used as time goes on, making AMD's better DX12 performance more appealing over the long term. I just wish AMD had a current-gen card above the $200 price point.
 

littleg

Senior member
Jul 9, 2015
355
38
91
Could it be that, due to Nvidia's terrible DX12/Async performance, devs could target AMD for DX12 and Nvidia for DX11?

And correct me if I am wrong, but is it fair to say that DX12 doesn't offer anything over DX11 in terms of visuals?

Also, wasn't DX10 adoption fairly low-key? Could the same happen to DX12?

DX12 is one of the selling points for Windows 10. Microsoft will be pushing it to get people to upgrade. And Microsoft has a lot of 'push' when they put their mind to it.
 

24601

Golden Member
Jun 10, 2007
1,683
39
86
I've been wondering about this DX12 thing for a while, so I guess I will post this here.

Will Nvidia be able to add DX12 performance through drivers, or is it an issue with the hardware not supporting DX12 features?

I'm looking at buying a 1070 and am trying to figure out if that's a good idea, as DX12 will surely become more widely used as time goes on, making AMD's better DX12 performance more appealing over the long term. I just wish AMD had a current-gen card above the $200 price point.

In DX11 and before, the IHVs wrote the drivers.

In DX12, game developers effectively write the drivers.

It shouldn't be too much of a surprise that game developers don't actually write drivers themselves; they invite the AMD and/or Nvidia team to come and write that code for them.

In DX11, there isn't really much a developer + IHV agreement can do to actually hinder the other IHV, as the other IHV can always just write their own driver to run the game as they see fit.

In DX12, the IHV that is excluded (deliberately or not) can do literally nothing to rectify badly written low-level code from the game developers (in league with their sponsoring IHV).

AKA, the people crying for low-level hardware access got their wish, and now we are basically back to the bad old days of API wars, only it's with just one API now: DX12.

It means you will have to buy an Nvidia GPU for Nvidia exclusives and an AMD GPU for AMD exclusives now.

If you want DX12 that is.

2c
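To make the responsibility shift concrete: under DX11 the driver tracked resource hazards automatically, while under DX12 the game code must issue every state transition itself - and a mistake there is not something the excluded IHV's driver can paper over. A minimal sketch, with `cmdList` and `texture` assumed to exist:

```cpp
// Sketch: an explicit state transition the application must now issue
// itself in D3D12; in DX11 the driver inferred this automatically.
D3D12_RESOURCE_BARRIER barrier = {};
barrier.Type  = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
barrier.Flags = D3D12_RESOURCE_BARRIER_FLAG_NONE;
barrier.Transition.pResource   = texture;  // assumed ID3D12Resource*
barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
cmdList->ResourceBarrier(1, &barrier);  // missing/wrong barriers = corruption
```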
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
you can't compare amd and nv clock by clock.

Actually, they can be compared by clock. Just like in the CPU world, it's not instructions per clock, but performance per clock. I'm sure someone's written LINPACK/LAPACK (or similar) benchmarks for these things.
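As a rough worked example of a per-clock comparison, using the advertised FP32 specs of the cards discussed in this thread (the shader counts and boost clocks below are the published figures; this is a back-of-the-envelope peak, not a benchmark):

```cpp
#include <cstdio>

// Back-of-the-envelope per-clock comparison from advertised specs:
// peak FP32 = shaders * 2 FLOPs per cycle (one FMA) * clock.
int main()
{
    struct Gpu { const char* name; int shaders; double boostGHz; };
    const Gpu gpus[] = {
        { "GTX 1080", 2560, 1.733 },  // advertised boost clock
        { "RX 480",   2304, 1.266 },
    };
    for (const Gpu& g : gpus) {
        double flopsPerClock = g.shaders * 2.0;            // FLOPs per cycle
        double tflops = flopsPerClock * g.boostGHz / 1000; // peak TFLOPS
        std::printf("%-8s  %4.0f FLOPs/clock  %.2f TFLOPS peak\n",
                    g.name, flopsPerClock, tflops);
    }
}
```

On those numbers Pascal is only about 11% wider per clock than Polaris, so most of the 1080's raw-throughput lead over the 480 comes from clock speed - which is exactly the trade-off being debated above.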
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Actually, they can be compared by clock. Just like in the CPU world, it's not instructions per clock, but performance per clock. I'm sure someone's written LINPACK/LAPACK (or similar) benchmarks for these things.

Like comparing a 4.7GHz 9590 vs. a 3GHz 5960X? How does that comparison work out?
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Actually, they can be compared by clock. Just like in the CPU world, it's not instructions per clock, but performance per clock. I'm sure someone's written LINPACK/LAPACK (or similar) benchmarks for these things.
could it work? has it ever been done? I am super curious now. please respond when you can :thumbsup:
 

PigSkinWhiteBoy

Junior Member
Jul 6, 2016
10
1
11
as long as nv can handle the pressure from the console effect for 1 more year, nv will have nothing to worry about. a new design takes about 3 years? we are already more than 2/3 of the way through the period of lacking support for some of the dx12 features. by late 2017 or early 2018, nv would have an answer for async compute or other dx12/console features for sure. with how good their marketing is, and a full spread of sponsored games, I don't foresee any problems for nv at all. it is pretty incredible how well nv has handled dx12 and the console effects so far.

that amd couldn't capitalize on their advantage shows how bad the top management at amd is.

as a gamer, I wish amd luck. they are the only thing standing in the way of $500 mid-range gpus. :(

People said Pascal was supposed to be their answer for async compute, and they're still bad at it.

This is going way off topic. I was just answering his comment about nvidia's supposedly bad business decisions. I never said anything about gamers being better off.



It doesn't matter at the high end, where the 1070 and 1080 are beating the Fury X in DX12. We'll see how things pan out with the 1060 and 480 - I think it'll be interesting to see, as DX12 is more important in the midrange and lower segments.

However, even though Vega 10 will probably beat the 1080 in DX12, it'll probably lose to the Titan P or 1080 Ti. AMD are too early with their DX12 advantage, and too late with their high-end lineup (Vega). Once DX12 is important and has more than a handful of games, we'll see nvidia having fully DX12-capable cards thanks to their domination and profits from DX11, and I'm not sure AMD will keep the advantage they have today.

I do agree, though, that reviewers are not highlighting AMD's current DX12 advantage at all, except maybe Hardware Canucks.

You're comparing two different generations. The Fury X destroys the 980 Ti in async games. Even the 390 gives it trouble, and that's a midrange card vs a flagship.

They have at least four cards left. A 490 will likely beat the 1070. A 490X will likely beat the 1080. Then they have a Vega Fury and a Vega Fury X. This time, they won't have to wait a year for devs to take advantage of their hardware, since DX12 and async are becoming the norm.

The console effect and async compute may be an advantage for AMD, but will it be enough to compensate for the 50% higher clock speed Pascal has compared to Polaris?

We'll see. As of now, Nvidia is not performing very well in async games. Maxwell is terrible at it, and even Pascal isn't seeing gains as big as I thought it would, considering the 1080 is a $700 card (no one sells it at the claimed $600 MSRP).

The only thing they have left is a Pascal Titan and a 1080 Ti (x80 Ti cards are usually just cheaper Titans, more or less).

I was on the Pascal hype train early on, until I saw benches showing that it doesn't gain much from DX12 AND TweakTown's benches showing that two 480s can beat it.