
GCN... Direct3D/DirectX 12?


cmdrdredd

Lifer
Dec 12, 2001
I think he is talking about the Xbox One.

Could I ask you to be more specific? Which company? I presume you're suggesting AMD, but then which cards?

Yes, sorry, I mean the Xbox. Microsoft said during one of their presentations before the launch (which I followed closely, as I pre-ordered) that it uses a low-level API that is based on DirectX 11, highly optimized, and built specifically for the console. Later they announced plans to bring DX12 to it. They did say, though, that DX12 won't bring significant improvements, but rather will ease development for developers making games for both PC and Xbox.

http://www.xbitlabs.com/news/multim...ft_Xbox_One_Will_Rely_on_Direct3D_11_API.html

http://www.bit-tech.net/news/gaming/2013/06/28/directx-11-2/

http://www.kitguru.net/gaming/anton...tx-12-will-not-dramatically-improve-xbox-one/
 

garagisti

Senior member
Aug 7, 2007
Yes, sorry, I mean the Xbox. Microsoft said during one of their presentations before the launch (which I followed closely, as I pre-ordered) that it uses a low-level API that is based on DirectX 11, highly optimized, and built specifically for the console. Later they announced plans to bring DX12 to it. They did say, though, that DX12 won't bring significant improvements, but rather will ease development for developers making games for both PC and Xbox.

http://www.xbitlabs.com/news/multim...ft_Xbox_One_Will_Rely_on_Direct3D_11_API.html

http://www.bit-tech.net/news/gaming/2013/06/28/directx-11-2/

http://www.kitguru.net/gaming/anton...tx-12-will-not-dramatically-improve-xbox-one/
Thank you!
 

RussianSensation

Elite Member
Sep 5, 2003
MSFT already said that current cards like GCN can run DX12, but that doesn't mean they can run all of DX12's features. To analogize: a DX10 card might be compatible with a game written for DX11, but the DX10 card can't do tessellation. That's a DX11 feature the card can't handle. Same thing for DX12 and current-generation DX11 cards.

Which DX12 feature not found in DX11 will be missing on GCN or Kepler cards in DX12 games? No one has provided even one graphical example. Your example of tessellation was a crucial feature of DX11 GPUs; it was advertised and marketed everywhere. There is no such graphical feature that DX12 brings; at least NV, AMD and MSFT have never mentioned one to this day. The killer feature of DX12 is lower CPU overhead, not mind-blowing new graphical features. If I missed where DX12 is about new graphical effects, I would love to see pictures :)
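
For what it's worth, here is what lower CPU overhead looks like in practice: a minimal sketch of the multithreaded command-list recording DX12 enables, assuming the public D3D12 headers, an already-initialized device and queue, and with the actual draw recording elided. In DX11, all of this would funnel through a single immediate context:

```cpp
// Minimal sketch: recording D3D12 command lists on several threads.
// Assumes an initialized ID3D12Device* and ID3D12CommandQueue*;
// error handling and the actual draw recording are omitted.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int threadCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each thread records its own slice of the frame's draw calls.
        workers.emplace_back([&lists, i] {
            // ... record state changes and draws into lists[i] here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submission is one cheap call; the expensive per-draw validation
    // a DX11 driver does at draw time simply isn't there.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```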

Only 1 million GTX 970/980 cards have been sold so far vs. hundreds of millions of gaming PCs out there. What developer will throw 99% of the PC gaming market under the bus and make a DX12-exclusive game?

Can anyone even list the DX12 games coming out in 2015-2017? As far as I know, none of the 2015 games are DX12. Star Citizen, in 2016, is also not DX12. By the time DX12 matters a lot, a $250 14/16nm card will smash a 290X or a 970.
 

Carfax83

Diamond Member
Nov 1, 2010
Which DX12 feature not found in DX11 will be missing on GCN or Kepler cards in DX12 games? No one has provided even one graphical example. Your example of tessellation was a crucial feature of DX11 GPUs; it was advertised and marketed everywhere. There is no such graphical feature that DX12 brings; at least NV, AMD and MSFT have never mentioned one to this day. The killer feature of DX12 is lower CPU overhead, not mind-blowing new graphical features. If I missed where DX12 is about new graphical effects, I would love to see pictures :)

Microsoft has not disclosed the new DX12 features that will actually require DX12 hardware, but they said they will do so at this year's GDC.

Only 1 million GTX970/980 cards have been sold so far vs. Hundreds of millions of gaming PCs out there. What developer will throw 99% of PC gaming market under the bus and make a DX12 game?

Some of the DX12 feature levels will be backward compatible with DX11.x hardware, so making a DX12 game doesn't mean that DX11.x hardware owners won't be able to play it.
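
A minimal sketch of what that looks like in code, assuming the public D3D12 headers: any DX11-class GPU can create a full D3D12 device by requesting feature level 11_0, so the low-overhead API itself doesn't lock out existing hardware:

```cpp
// Minimal sketch: creating a D3D12 device on DX11-class hardware.
// FL 11_0 cards get the low-overhead API; the new optional features
// are gated by separate capability checks, not by device creation.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool CreateDeviceOnDx11ClassHardware(ComPtr<ID3D12Device>& device)
{
    // nullptr selects the default adapter; D3D_FEATURE_LEVEL_11_0
    // means any DX11-class GPU qualifies.
    return SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                       IID_PPV_ARGS(&device)));
}
```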

I wouldn't be surprised if certain major DX11 games get a post-release DX12 performance/IQ-enhancing patch that adds the reduced CPU overhead and whatever IQ-boosting DX12-exclusive effects Microsoft has yet to disclose. The Witcher 3 will be a good candidate for this, I think; in fact, CDPR said in an interview that they are actively looking at doing it.
 

RussianSensation

Elite Member
Sep 5, 2003
I would imagine you would need an uber-powerful GPU setup to run into a major CPU bottleneck in a game like TW3 with a modern overclocked i5/i7. Ubersampling and everything on max in a game like The Witcher 3 should drop a 295X2/970 SLI well below 60 fps. DX12 games made on next-gen game engines will come out later than The Witcher 3 and should be even more graphically demanding.

For example, we aren't talking about AC Victory but about 2-3 sequels beyond it. Put it this way: the best way to future-proof for DX12 is to save money today and upgrade to NEXT-generation cards. For example, run R9 290 CF or 970 SLI instead of 980 SLI, and when DX12 kicks into play, dump those cards and get something way faster at 14/16nm.

Look how quickly the GPU landscape changes:

The 780 was $650 and the 780 Ti $699; now that performance is easily obtainable in a $275 R9 290X or a $330 970. Before that we had the 295X2 for $1,500, and now it sells for $660.

In November 2010 a GTX 580 was $499; now an R9 280 is $150. March 2012 had the GTX 680 at $499; now an after-market 290 is $240-250 and a GTX 960 is $200. By the time DX12 matters, the 290/290X and 970 will be low end.

Most first-generation DX (insert DX8/9/10/11) videocards are not good enough to play next-generation games of that DX generation. By summer 2015 we will already have cards way faster than a 290/970. Therefore, even if you argue that DX12 matters right now, both the 290 and the 970 will be obsoleted by a new $350 card in 2015.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Microsoft has not disclosed the new DX12 features that will actually require DX12 hardware, but they said they will do so at this year's GDC.
Indeed. There will be a new feature level to go along with Direct3D 12. You don't need it to get the advantages of the low-level API, but its features will be available to developers who want to take advantage of them.

http://www.anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features

Which DX12 feature not found in DX11 will be missing on GCN or Kepler cards in DX12 games? No one has provided even one graphical example.
Conservative rasterization, which not-so-coincidentally is also planned as a D3D FL 11_3 feature.
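
And hardware support for it is exposed as a capability query rather than a blanket requirement; a minimal sketch, assuming the D3D12 options structure from the public headers:

```cpp
// Minimal sketch: querying whether the GPU rasterizes conservatively
// in hardware. Cards without it still create a D3D12 device fine;
// they simply report the tier as not supported.
#include <d3d12.h>

bool SupportsConservativeRaster(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return false;
    return opts.ConservativeRasterizationTier !=
           D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
}
```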
 

RussianSensation

Elite Member
Sep 5, 2003
Conservative rasterization, which not-so-coincidentally is also planned as a D3D FL 11_3 feature.

When can I see an actual graphical difference in games with this feature on vs. off, on a DX11 vs. a DX12 videocard?
 

cmdrdredd

Lifer
Dec 12, 2001
With the overhead-reducing features built into DX12, I think it's more likely that cards supporting the DX12 feature sets will actually be able to run them properly, unlike DX11, where the first cards were too slow for some of the major features. It would run, just not that well. I think this time will be different.
 

Enigmoid

Platinum Member
Sep 27, 2012
When can I see an actual graphical difference in games with this feature on vs. off, on a DX11 vs. a DX12 videocard?

Stop shifting goalposts.

[Image: DX12_Features_Conservative_Rasterization_Cropped_Wide.jpg]

https://developer.nvidia.com/content/dont-be-conservative-conservative-rasterization

Off: regular shadow map
[Image: fig8.jpg]

Regular raytracing
[Image: fig9.jpg]

Conservative rasterization
[Image: fig10.jpg]
 

garagisti

Senior member
Aug 7, 2007
@cmdrdredd
The 9xx series has brought efficiency to the table, but it's not really an upgrade. GM200, well, let's see; the pricing may mean it won't be the first pick of every enthusiast.
 

cmdrdredd

Lifer
Dec 12, 2001
@cmdrdredd
The 9xx series has brought efficiency to the table, but it's not really an upgrade. GM200, well, let's see; the pricing may mean it won't be the first pick of every enthusiast.

What? That's not the topic, it's not relevant to the conversation in the last few posts, and I said nothing about specific cards. We are talking about DX12. DX12 brings the benefits of lower overhead and increased performance to any card that can use it. There are some new graphical abilities that may not be usable on older cards that lack support for them. As far as I know, the GTX 900 series is the first to have full feature support, but other cards can benefit from DX12 as well.
 

RussianSensation

Elite Member
Sep 5, 2003
Stop shifting goalposts.

[Image: DX12_Features_Conservative_Rasterization_Cropped_Wide.jpg]

https://developer.nvidia.com/content/dont-be-conservative-conservative-rasterization

Off: regular shadow map
[Image: fig8.jpg]

Regular raytracing
[Image: fig9.jpg]

Conservative rasterization
[Image: fig10.jpg]

I am not shifting anything. I asked for examples and was given them. You linked more data; I appreciate that. When DX12 games use the benefits you depicted, I will simply dump my DX11 videocards and buy something much faster than a 290X/980. That's why I keep saying that for most here, DX12 future-proofing with existing cards on the market is just a checkbox item. You provided the differences that DX12 brings; now you have to answer the second part: when are we going to see DX12 games with those differences?

If it's in 2017, most of us couldn't care less. By that point, cards like the 290/970 will be 3-4 years old. Unless you have a list of DX12 games for 2015-2017 that proves why DX12 matters for a GPU purchase right now, you are just talking theoreticals. In practice, most of us will have upgraded to Pascal or Volta or whatever else is out there. Don't make a mountain out of a molehill.

I actually paid attention to the OP. I recommended in other threads that he buy reference Best Buy blower 970s in SLI, which appears to be the best option for him, since he can't fit the 295X2's radiator in his case but wants the heat exhausted out of it. So while I did recommend 970 SLI for him, that is not because of DX12. By the time DX12 rolls around, a $400-500 videocard will be as fast as both 970s, and maybe cheaper too. It's possible that a late-2016/early-2017 $350 Pascal will match or beat GM200, like the 970 vs. the 780 Ti. That took only 10 months.

Unless you think otherwise, I think we have a solid 2-3 years before DX12 games become mainstream. In 2.5 years we went from $1,000 690/680 SLI/7990 performance to a $550 980. The primary reasons to buy 970 SLI now are its excellent price/performance and its great power usage and efficiency over the similarly priced 295X2. For now DX12 is just a marketing feature and nothing else, until games actually start using it and showing vast advantages over DX11 cards.
 

garagisti

Senior member
Aug 7, 2007
What? That's not the topic, it's not relevant to the conversation in the last few posts, and I said nothing about specific cards. We are talking about DX12. DX12 brings the benefits of lower overhead and increased performance to any card that can use it. There are some new graphical abilities that may not be usable on older cards that lack support for them. As far as I know, the GTX 900 series is the first to have full feature support, but other cards can benefit from DX12 as well.
You said something about new cards being sufficient for DX12, and that is what I was addressing. The only cards that were launched as DX12 cards are the 9xx series, are they not? IIRC, you're wrong about full feature support; I've read that DX12 is to be supported piecemeal on older cards, as DX11 was by Nvidia. If you have more information on this, I'd like you to share it.
 

destrekor

Lifer
Nov 18, 2005
I actually paid attention to the OP. I recommended in other threads that he buy reference Best Buy blower 970s in SLI, which appears to be the best option for him, since he can't fit the 295X2's radiator in his case but wants the heat exhausted out of it. So while I did recommend 970 SLI for him, that is not because of DX12. By the time DX12 rolls around, a $400-500 videocard will be as fast as both 970s, and maybe cheaper too. It's possible that a late-2016/early-2017 $350 Pascal will match or beat GM200, like the 970 vs. the 780 Ti. That took only 10 months.

To be fair, I haven't fully determined that the 295X2 won't work; I need to get the tape measure out and measure the clearance at the top of the case (assuming those 380mm tubes will reach from the card to the top). The other main concern is Hackintosh compatibility, which is still up in the air; I'm asking for help at tonymacx86 on this topic too.

Some initial research seems to indicate it might be too large for the top, but if I DID go with a Corsair H100i for the CPU, which would go up top and is compatible with my case, I would have all the room in the world at the rear exhaust. I just don't know if I want to go through the trouble of taking apart my system; considering the NH-D14 is an excellent cooler, it wouldn't be a real upgrade (some charts suggest a handful of degrees centigrade), mostly a side-grade just to make room.

And I've basically completely ignored the idea of the Best Buy reference blower 970s. The reports of fan noise and GPU temperature, even at reference clocks, at THAT price... nope. If they were $300, perhaps. Even after discounts, you can't get them that low.

If I don't get the 295X2, I will be upgrading with additional fans and likely replacing the stock rear exhaust with one that has a higher CFM, unless I go with an H100i regardless of the GPU decision.

Now I'm just trying to figure out which will be fully compatible in OS X and net a good perf/$ ratio.
 

Enigmoid

Platinum Member
Sep 27, 2012
I am not shifting anything. I asked for examples and was given them. You linked more data; I appreciate that. When DX12 games use the benefits you depicted, I will simply dump my DX11 videocards and buy something much faster than a 290X/980. That's why I keep saying that for most here, DX12 future-proofing with existing cards on the market is just a checkbox item. You provided the differences that DX12 brings; now you have to answer the second part: when are we going to see DX12 games with those differences?

If it's in 2017, most of us couldn't care less. By that point, cards like the 290/970 will be 3-4 years old. Unless you have a list of DX12 games for 2015-2017 that proves why DX12 matters for a GPU purchase right now, you are just talking theoreticals. In practice, most of us will have upgraded to Pascal or Volta or whatever else is out there. Don't make a mountain out of a molehill.

I actually paid attention to the OP. I recommended in other threads that he buy reference Best Buy blower 970s in SLI, which appears to be the best option for him, since he can't fit the 295X2's radiator in his case but wants the heat exhausted out of it. So while I did recommend 970 SLI for him, that is not because of DX12. By the time DX12 rolls around, a $400-500 videocard will be as fast as both 970s, and maybe cheaper too. It's possible that a late-2016/early-2017 $350 Pascal will match or beat GM200, like the 970 vs. the 780 Ti. That took only 10 months.

Unless you think otherwise, I think we have a solid 2-3 years before DX12 games become mainstream. In 2.5 years we went from $1,000 690/680 SLI/7990 performance to a $550 980. The primary reasons to buy 970 SLI now are its excellent price/performance and its great power usage and efficiency over the similarly priced 295X2. For now DX12 is just a marketing feature and nothing else, until games actually start using it and showing vast advantages over DX11 cards.

You said that GCN 1.0 (7970) supports all DX12 features. Someone mentioned conservative rasterization. You then said that it doesn't matter.

Maybe for people on this forum, but there are still plenty of people playing on older cards. Look at the Steam hardware survey and see how many are on 460/450/550/560-class cards.

Fermi may be old, but it will still play most games at modest 1080p settings. The 5xxx series is showing its age with respect to tessellation and such.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
It's a misconception that the only GPU architecture to support conservative rasterization is 2nd-gen Nvidia Maxwell ...

Any DX11-capable GPU can support conservative rasterization by implementing the algorithms in the geometry shader. Hell, even DX9 GPUs can perform conservative rasterization in their vertex shaders!

Also, 2nd-gen Nvidia Maxwell is not the first GPU to support hardware conservative rasterization; that honor goes to Intel Haswell!
 

Carfax83

Diamond Member
Nov 1, 2010
It's a misconception that the only GPU architecture to support conservative rasterization is 2nd-gen Nvidia Maxwell ...

Any DX11-capable GPU can support conservative rasterization by implementing the algorithms in the geometry shader. Hell, even DX9 GPUs can perform conservative rasterization in their vertex shaders!

Also, 2nd-gen Nvidia Maxwell is not the first GPU to support hardware conservative rasterization; that honor goes to Intel Haswell!

Yes, but at what performance cost? DX12 will likely make the technique efficient and performance-friendly enough to actually be worth using.
 

Enigmoid

Platinum Member
Sep 27, 2012
It's a misconception that the only GPU architecture to support conservative rasterization is 2nd-gen Nvidia Maxwell ...

Any DX11-capable GPU can support conservative rasterization by implementing the algorithms in the geometry shader. Hell, even DX9 GPUs can perform conservative rasterization in their vertex shaders!

Also, 2nd-gen Nvidia Maxwell is not the first GPU to support hardware conservative rasterization; that honor goes to Intel Haswell!

The article I linked mentioned that, but it said doing so would incur a massive performance hit.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Yes, but at what performance cost? DX12 will likely make the technique efficient and performance-friendly enough to actually be worth using.
Both of which are very good points. In theory, you can do almost anything on Shader Model 3.0 hardware. It's not quite Turing-complete, I believe, but it's very close. Tessellation, order-independent transparency (via linked lists), conservative rasterization, typed UAV loads: all of that can be emulated to some degree in shaders.

It's just completely, utterly, and absolutely slow. The algorithms you'd need to emulate a feature in shaders are quite different from the much more efficient algorithms you can implement in fixed-function hardware. GPUs aren't CPUs, and we still have a ton of fixed-function hardware. That's an important distinction, both because of the shader vs. FF performance gap and because there are certain techniques you want to run immediately before or after other FF steps in the rendering pipeline (e.g. conservative rasterization), before the data even reaches the shaders.

This is why we implement some of these features in FF hardware rather than in shaders, and why having hardware support for a feature can be an important distinction for development purposes.
 

cmdrdredd

Lifer
Dec 12, 2001
You said something about new cards being sufficient for DX12, and that is what I was addressing. The only cards that were launched as DX12 cards are the 9xx series, are they not? IIRC, you're wrong about full feature support; I've read that DX12 is to be supported piecemeal on older cards, as DX11 was by Nvidia. If you have more information on this, I'd like you to share it.


Here's the thing: because DX12 brings performance benefits and lower overhead, it's likely that when DX12 games come out they won't be unplayable on first-gen hardware. With DX11, the first cards didn't have enough horsepower for all the new features.
 

garagisti

Senior member
Aug 7, 2007
Here's the thing: because DX12 brings performance benefits and lower overhead, it's likely that when DX12 games come out they won't be unplayable on first-gen hardware. With DX11, the first cards didn't have enough horsepower for all the new features.
Hmm, good point, but let's see. There will always be games like Civ: BE (not that I play that genre, but I can respect good programming for what it is) which can use all you can give them and then some.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
Yes, but at what performance cost? DX12 will likely make the technique efficient and performance-friendly enough to actually be worth using.

Conservative rasterization can be done efficiently using today's hardware capabilities ...

There are two algorithms to look at. The first computes the optimal bounding polygon, with aprons, in the geometry shader; the second computes a bounding triangle and then has the pixel shader discard fragments that do not overlap the primitive's axis-aligned bounding box ...

Each of them has an overestimated and an underestimated variant, but we're more interested in the overestimated variant since it has more applications in computer graphics ...

In theory Nvidia thought the first algorithm would be the better candidate for higher performance, and that a much better implementation could be achieved using a geometry shader, but ironically it is the second algorithm that prevailed ...

The first algorithm is costly in terms of vertex processing and, to make matters worse, leans heavily on the geometry shader. The second algorithm is costly in terms of fillrate and pixel shader processing, but it's faster in practice ...

I'm not saying the performance cost is negligible, but it's moderate, so it's pretty usable as far as performance impact goes. Heck, attempting conservative rasterization is pretty easy on the PS4.

As GPUs gain more Tflops, bandwidth, and fillrate, the performance impact of conservative rasterization will disappear altogether ...
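
To make the overestimated variant concrete, here is a small CPU-side sketch of the core test (my own illustration, not code from Nvidia's article; the Edge and ConservativeCovers helpers are hypothetical). A standard edge-function rasterizer becomes conservative by offsetting each edge outward by the half-extent of a pixel, so any pixel square that merely touches the triangle is accepted; the GPU variants perform roughly the same expansion per triangle in the vertex or geometry shader instead:

```cpp
// Minimal sketch: overestimated conservative rasterization on the CPU.
// An edge function E(x, y) = a*x + b*y + c is >= 0 inside the edge's
// half-plane. Adding an apron of (|a| + |b|) * 0.5 accepts any pixel
// whose 1x1 square touches the half-plane, not just its center.
#include <cmath>
#include <cstdio>

struct Edge { float a, b, c; };

Edge MakeEdge(float x0, float y0, float x1, float y1)
{
    // Line through (x0,y0)-(x1,y1); the positive side is the triangle
    // interior for counter-clockwise winding.
    return { y0 - y1, x1 - x0, x0 * y1 - x1 * y0 };
}

bool ConservativeCovers(const Edge e[3], float px, float py)
{
    for (int i = 0; i < 3; ++i) {
        float apron = (std::fabs(e[i].a) + std::fabs(e[i].b)) * 0.5f;
        // Standard rasterization tests against 0; the apron grows each
        // half-plane so a pixel-square overlap is never missed.
        if (e[i].a * px + e[i].b * py + e[i].c + apron < 0.0f)
            return false;
    }
    return true;
}

int main()
{
    // A counter-clockwise triangle; pixel centers at integer + 0.5.
    Edge e[3] = { MakeEdge(1.2f, 1.1f, 6.8f, 2.0f),
                  MakeEdge(6.8f, 2.0f, 3.0f, 6.5f),
                  MakeEdge(3.0f, 6.5f, 1.2f, 1.1f) };
    int covered = 0;
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x)
            covered += ConservativeCovers(e, x + 0.5f, y + 0.5f) ? 1 : 0;
    std::printf("conservatively covered pixels: %d\n", covered);
    return 0;
}
```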
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
The article I linked mentioned that, but it said doing so would incur a massive performance hit.

The article didn't say MASSIVE ...

That's an overstatement on your part ...

There is overhead, but not to the degree you imply, since voxelization (which enforces conservative rasterization by default in its pipeline) was easily achieved in The Tomorrow Children on the PS4 ...