AMD confirms feature-level 12_0 for GCN maximum


sontin

Diamond Member
Sep 12, 2011
It is a bit odd that NV is all of a sudden spinning DX12_1 so much and creating marketing slides specifically for it. Why would journalists even care to write so many articles on the DX12_0 vs. DX12_1 feature sets unless NV is doing a background marketing push for it? There are no DX12 games out today, and not a single DX12.1 title has even been announced or confirmed to be in development. When are those coming? In 2-3 years, or will NV work with developers to put them into GameWorks titles?

Is nV trying to spin DX12_1 as more "future-proof"? That's funny.

AMD vs. NV marketing starting June 16, 2015
R9 390 8GB vs. 970 3.5GB
R9 390X 8GB vs. 980 4GB
Fiji - next generation HBM GPU memory 512GB/sec+ vs. GM200 336GB/sec

NV's response is DX12.1? Hilarious spin. :D

Yeah, and don't forget:
Fiji 4GB vs. GTX 980 Ti 6GB.

But for whatever reason this is not a problem for you. Hilarious spin. :D

BTW, here is an example of the difference between DX11.0 and DX11.1:
[image: bf4_windows_m.png]

http://pclab.pl/art55318-10.html
 

3DVagabond

Lifer
Aug 10, 2009
Yeah, and don't forget:
Fiji 4GB vs. GTX 980 Ti 6GB.

But for whatever reason this is not a problem for you. Hilarious spin. :D

BTW, here is an example of the difference between DX11.0 and DX11.1:
[image: bf4_windows_m.png]

http://pclab.pl/art55318-10.html

There is no DX12.1. It's DX12_1, with _1 denoting a feature level. Do we know for sure that you have to completely support 12_0 to be 12_1 capable? I haven't seen anywhere that the 12_1 feature level also includes all 12_0 feature support. Maybe I missed it?
 

sontin

Diamond Member
Sep 12, 2011
The DX12 API is supported on hardware from feature level 11_0 up. In fact, DX12 redefines the feature levels.

With DX11.1, Microsoft introduced a system whereby vendors can support individual features without supporting a particular feature level. In the end, feature levels are largely irrelevant; what matters are the supported features.

And yes, you need to support the lower feature levels to qualify for a higher one.
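
For reference, here is a minimal sketch (an editor's illustration, not code from the thread) of what that looks like in practice with the stock D3D12 API from the Windows 10 SDK: create the device against FL 11_0, then ask the driver which higher feature level, if any, it reports.

Code:
// Editor's sketch: DX12 device creation only needs FL 11_0; anything above
// that is queried after the fact rather than required up front.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;  // no FL 11_0 capable adapter

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = 4;          // entries in 'requested'
    levels.pFeatureLevelsRequested = requested;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        std::printf("Max feature level: 0x%x\n",
                    static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
    return 0;
}

Individual capabilities (tiled resources tier, conservative rasterization, ROVs, and so on) are queried the same way through D3D12_FEATURE_D3D12_OPTIONS, which is the "supported features matter more than the level" point above.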
 

RussianSensation

Elite Member
Sep 5, 2003
Yeah, and don't forget:
Fiji 4GB vs. GTX 980 Ti 6GB.

But for whatever reason this is not a problem for you. Hilarious spin. :D

BTW, here is an example of the difference between DX11.0 and DX11.1:
[image: bf4_windows_m.png]

http://pclab.pl/art55318-10.html

That BF4 example has nothing to do with DX11.1. It is a driver issue on W7. You must be on something if you think DX11.1 gave Kepler a 50% increase in performance in that game. Not a single professional site corroborates that result in BF4. I can't read Polish, but that site always has THE most worthless data on the Internet. It's worse than PC Perspective. They were caught deliberately fudging GTA V results in NV's favor before being called out. Don't bother linking anything from them if you want to present a solid argument.

As far as 4GB HBM1 vs. 6GB GDDR5, I can't tell you who will win. 6GB did squat to help the OG Titan, which today gets dropped easily by a 290.

980 SLI beats Titan X in every 2560x1600 benchmark I've seen where SLI scales well. If you can produce data that shows a real benefit of > 4GB of VRAM at 2560x1600 with good fps on a Titan X, I would be interested to see it.

Once benchmarks come out, we will have a better idea. To benefit from > 4GB, one is then comparing 980 Ti SLI vs. Fiji CF, but that isn't a clear-cut advantage for NV since CF is often smoother than SLI.
 

desprado

Golden Member
Jul 16, 2013
That BF4 example has nothing to do with DX11.1. It is a driver issue on W7. You must be on something if you think DX11.1 gave Kepler a 50% increase in performance in that game. Not a single professional site corroborates that result in BF4. I can't read Polish, but that site always has THE most worthless data on the Internet. It's worse than PC Perspective. They were caught deliberately fudging GTA V results in NV's favor before being called out. Don't bother linking anything from them if you want to present a solid argument.

As far as 4GB HBM1 vs. 6GB GDDR5, I can't tell you who will win. 6GB did squat to help the OG Titan, which today gets dropped easily by a 290.

980 SLI beats Titan X in every 2560x1600 benchmark I've seen where SLI scales well. If you can produce data that shows a real benefit of > 4GB of VRAM at 2560x1600 with good fps on a Titan X, I would be interested to see it.

Once benchmarks come out, we will have a better idea. To benefit from > 4GB, one is then comparing 980 Ti SLI vs. Fiji CF, but that isn't a clear-cut advantage for NV since CF is often smoother than SLI.

So funny, now you are defending that 4GB is more than enough. If AMD had 8GB HBM, I swear you would be the first member here to say 4GB is not enough and Nvidia got owned by AMD.
Be fair in your analysis, because reviewers know better than you and they know their job.

You are giving yourself the excuse that AA is not needed at 4K, but the fact is you cannot max out games on 4GB.

Moderator action taken for callout and trolling. If you have an argument, then use technical information to make your case instead of accusing others of bias.
Moderator Subyman
 
Last edited by a moderator:

RussianSensation

Elite Member
Sep 5, 2003
So funny, now you are defending that 4GB is more than enough. If AMD had 8GB HBM, I swear you would be the first member here to say 4GB is not enough and Nvidia got owned by AMD.
Be fair in your analysis, because reviewers know better than you and they know their job.

You are giving yourself the excuse that AA is not needed at 4K, but the fact is you cannot max out games on 4GB.

Did you see me say anything about 4GB being enough for 4K with MSAA? Don't make up stuff I didn't say.

You should try to read my posts. I am not recommending Fiji over the 980 Ti, as I haven't seen any benchmarks for AMD's card. Not going to bother arguing with you for 20 pages, as you only buy NV and think anyone who disses NV is automatically an AMD fan. Same for the guy above you - he only buys NV, probably has green bedsheets. I still remember his drivel about how the OG Titan's DP was worth the premium and how he argued it wouldn't become worthless for next-gen games -- he was wrong, of course, but he defended that overpriced turd...
 

Zanovar

Diamond Member
Jan 21, 2011
Did you see me say anything about 4GB being enough for 4K with MSAA? Don't make up stuff I didn't say.

You should try to read my posts. I am not recommending Fiji over the 980 Ti, as I haven't seen any benchmarks for AMD's card. Not going to bother arguing with you for 20 pages, as you only buy NV and think anyone who disses NV is automatically an AMD fan. Same for the guy above you - he only buys NV, probably has green bedsheets. I still remember his drivel about how the OG Titan's DP was worth the premium and how he argued it wouldn't become worthless for next-gen games -- he was wrong, of course, but he defended that overpriced turd...


Hahah, that was funny. *thumbs up*
 

flopper

Senior member
Dec 16, 2005
It is known that the Win 8 scheduler does a better job with core affinity on AMD than Win 7 (in this case the CPU takes a big penalty). That, and very likely different drivers, and there you have it.

Yeah, only someone uneducated would use that as an example of DX.
 

garagisti

Senior member
Aug 7, 2007
It is known that the Win 8 scheduler does a better job with core affinity on AMD than Win 7 (in this case the CPU takes a big penalty). That, and very likely different drivers, and there you have it.

Thank you, my good sir, for bringing that up. The performance increase was supposedly considerable. What is funnier is that AMD processors work even better on Linux.
 

Innokentij

Senior member
Jan 14, 2014
The only thing this thread seems to confirm is how good a card the AMD 7970 is. Can't believe it was released almost 3.5 years ago and can still effortlessly play any game near max at 1080p. Now compare it to the GTX 580 and have a laugh. AMD's 8800GT.

This puppy has some grunt left in it.

The 7970 non-GHz edition launched 22 December 2011 and the GTX 680 on 22 March 2012 - that's 3 months' difference. Now compare the two in modern titles with updated drivers. Calling it AMD's 8800GT is a far stretch, since the 680 was a midrange GPU and the 7970 GHz edition was a high-end card :whistle:

Some benches taken from the AnandTech site; the 7970 doesn't seem to get a big enough performance leap to claim it can run games the 680 can't:

http://www.anandtech.com/bench/product/508?vs=555
 

xthetenth

Golden Member
Oct 14, 2014
The 7970 non-GHz edition launched 22 December 2011 and the GTX 680 on 22 March 2012 - that's 3 months' difference. Now compare the two in modern titles with updated drivers. Calling it AMD's 8800GT is a far stretch, since the 680 was a midrange GPU and the 7970 GHz edition was a high-end card :whistle:

Some benches taken from the AnandTech site; the 7970 doesn't seem to get a big enough performance leap to claim it can run games the 680 can't:

http://www.anandtech.com/bench/product/508?vs=555

What was the 680's high end counterpart then? And how's it doing now?
 

Innokentij

Senior member
Jan 14, 2014
What was the 680's high end counterpart then? And how's it doing now?

I don't think they released a 680 Ti in that series because of weak competition; they moved on to the refreshed 700 series (May 2013) to meet AMD's 200 series, which came later in October 2013.
 

Innokentij

Senior member
Jan 14, 2014
Then that would make the 680 the high end product of its range, no?

Yes, if you go by that logic, I guess. I go by the codename of the product: as an example, XX110 is "hell yes, we're talking high end", XX104 is "meh, midrange, moving along". Hope that made sense? Not sure how to explain it properly; maybe someone can assist me? :oops:
 

xthetenth

Golden Member
Oct 14, 2014
Yes, if you go by that logic, I guess. I go by the codename of the product: as an example, XX110 is "hell yes, we're talking high end", XX104 is "meh, midrange, moving along". Hope that made sense? Not sure how to explain it properly; maybe someone can assist me? :oops:

Yeah, I'm saying that they're similarly positioned in the stack, both had a similar larger core coming down the road, so it's not like the 680 was a much smaller core.
 

destrekor

Lifer
Nov 18, 2005
I thought this ended when the obvious statement of "let's see how DX12 games turn out first" was made.

Seriously.

First, we have to wait until the big hardcore developers put in the work to actually utilize DirectX 12. You won't see small studios develop for it, not immediately. One reason why Mantle hasn't seen significant attention: it's not just that AMD has a lower market share, it's that AMD has a lower market share COMBINED with the fact that it is difficult to develop closer to the metal. DICE is really the only developer that's genuinely crazy about the idea; Crytek still hasn't delivered the Mantle support it promised a while ago.

With that work now moving into DX12 and Vulkan, it's still the big dogs promising support. Due to the complexity of developing for bare metal (or closer to it), more games will use the big developers' packaged engines. UE4, Frostbite, CryEngine, and Unity will basically be the main players, for better or worse.

I doubt studios like CD Projekt RED will take to DX12 any time soon, likely not bringing RED Engine 3 to support DX 12 until at least a year or two after the fact, or more likely never. I'd love to see a TW3 Enhanced Edition with DX12 and see Cyberpunk 2077 launch with DX12 support, but I highly, highly doubt that will happen. I'd love to be wrong, oh boy would I. :)


The short of it is, I just don't see 12_1 being utilized much at all, not before the next generation of cards. I say this as someone who prefers Nvidia.
There might be some minor, very minor support of FL 12_1, but it's going to be very minimal and more akin to Crysis 2's DX11 patch-in, which is to say, inefficient and heavy-handed. It'll take a more focused, long-haul approach to get quality, efficient utilization of 12_1 features added to the major engines, likely the next-gen engines too, ones coded from the ground up with it in mind. I don't expect the current-gen engines (Frostbite 3, for example) to get anything more than efficiency-focused 12_0 features. Battlefield 5, or whatever is next for DICE after SW: Battlefront, will likely carry the guidon for Frostbite 4 with a focus on "native" support for 12_0 and 12_1.

I do suspect a Frostbite engine with efficient support for DX12 will be first out of the gate, and more than that, the first engine wholly developed around DX12 will be an engine from DICE. They helped pioneer Mantle, so they are the most prepared for DX12. I don't think they can do much beyond building on top of Frostbite 3 to showcase some features and efficiency, but whether they code from the ground up or just work support into Frostbite 4 more organically, it'll be among the better examples of how to do it right.

Not saying that the engine or games based on it will be the greatest thing since sliced bread, or that netcode problems or other issues that BF games have faced will be absent and resolved, but I do suspect they'll have the lead in development.
 

atticus14

Member
Apr 11, 2010
I still don't think it'll be overnight, since games are in development for 2 years or so and we still have to get those games out of the pipeline before engines are updated, but... I'm pinning my hopes on the fact that the Xbox One has DX12 and that it will spur a faster adoption rate. Hopefully Xbox One DX12 is pretty much interchangeable with PC DX12. We (PC gamers) already seem to be on the radar for more ports than ever, which I'm going to attribute to the growth of PC gaming and the ease of porting from x86 + standard GPU consoles.

With Win10 being free for what I think would be the majority of modern PC gamers, the appeal (to devs) of supporting DX12 should be much higher than what we have seen in the past.
 

krumme

Diamond Member
Oct 9, 2009
Coding to the metal and creating new engines requires a huge studio and specialized labor. Over time it will mean a consolidation of the engines used.
The target is obviously where the huge market is, and that's the consoles. You don't write off 50% of the market because of a few features in DX12. Of course not.

And the entire idea that someone should program for far less than 1% of the market is idiotic.

The base level is the consoles. But hey, it's a HUGE step forward from DX10/11, and even DX9, which is still used in some derivation in many games.
Mantle (DX12/Vulkan) is THE big step since 3D arrived.
 

Samwell

Senior member
May 10, 2015
There is an FL 12_1; the documentation just isn't ready yet. It's sad that David Kanter doesn't know that, but he is more of a CPU than a GPU guy. Max McMullen actually showed the 12_1 features clearly in his GDC talk.
I was reading through posts from a game developer at B3D. His opinion on the topic was roughly: great features, especially for voxelization, but they require big changes in engines and probably won't be adopted quickly since the consoles don't have them.
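
As a rough illustration (an editor's sketch, not part of the post): the two hardware features FL 12_1 adds on top of 12_0, conservative rasterization and rasterizer ordered views, are exposed as optional caps, so an engine can probe for them on an existing device and fall back on 12_0-only hardware such as GCN.

Code:
// Editor's sketch, continuing from the earlier example ('device' is a valid
// ID3D12Device): probe the FL 12_1 hardware features as optional caps.
D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                          &opts, sizeof(opts))))
{
    const bool conservativeRaster =
        opts.ConservativeRasterizationTier !=
        D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
    const bool rovs = (opts.ROVsSupported != FALSE);

    // A 12_0-only GPU reports both as unsupported, so a voxelization path
    // built on them needs a separate fallback on that hardware.
    std::printf("Conservative raster: %d, ROVs: %d\n", conservativeRaster, rovs);
}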
 

Headfoot

Diamond Member
Feb 28, 2008
There is an FL 12_1; the documentation just isn't ready yet. It's sad that David Kanter doesn't know that, but he is more of a CPU than a GPU guy. Max McMullen actually showed the 12_1 features clearly in his GDC talk.
I was reading through posts from a game developer at B3D. His opinion on the topic was roughly: great features, especially for voxelization, but they require big changes in engines and probably won't be adopted quickly since the consoles don't have them.

If you read a few posts further in that thread, someone links him the FL 12_1 documentation and he accepts it. Kanter was waiting for official confirmation from MS.