ComputerBase: Ashes of the Singularity Beta 1 DirectX 12 Benchmarks


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
All GCN GPUs are DX12 compatible. They all support the optional Async Compute feature.

They do not support Feature 12_1 which is currently for Maxwell 2 and perhaps Skylake.

GCN 1.0 doesn't support FL12.0. Only GCN 1.1 and 1.2 do.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Nope, all GCN cards support FL12_0 (FL11_1).

Where is your information from?

FL11.1 isn't the same as FL12.0.

They all support DX12, but only GCN 1.1 and 1.2 support FL12.0.

[Images: DX feature level comparison charts and a 7970 screenshot]


http://www.computerbase.de/2015-06/directx-12-amd-radeon-feature-level-12-0-gcn/

Where are you getting your information from? :)
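For anyone who wants to check this on their own card, here is a minimal C++ sketch (assuming an already-created ID3D12Device) that asks the runtime for the highest feature level the adapter reports; the FL 11_1 vs 12_0 vs 12_1 distinction discussed above is exactly what this query returns.

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns the highest D3D feature level the adapter reports.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL candidates[] = {
        D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels        = sizeof(candidates) / sizeof(candidates[0]);
    info.pFeatureLevelsRequested = candidates;

    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_FEATURE_LEVELS, &info, sizeof(info))))
    {
        return info.MaxSupportedFeatureLevel;
    }
    return D3D_FEATURE_LEVEL_11_0; // conservative fallback if the query fails
}
```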
 

Krteq

Golden Member
May 22, 2015
1,007
719
136
You are right, my bad.

Anyway, GCN 1.0's Tiled Resources support is somewhere between Tier 1 and Tier 2 (almost full Tier 2 support) via PRT.

MSDN - D3D12_TILED_RESOURCES_TIER enumeration
D3D12_TILED_RESOURCES_TIER_2

Indicates that a superset of Tier_1 functionality is supported, including this additional support:
When the size of a texture mipmap level is at least one standard tile shape for its format, the mipmap level is guaranteed to be nonpacked. For more info, see D3D12_PACKED_MIP_INFO.

Shader instructions are available for clamping level-of-detail (LOD) and for obtaining status about the shader operation. For info about one of these shader instructions, see Sample(S,float,int,float,uint).

Reading from NULL-mapped tiles treat that sampled value as zero. Writes to NULL-mapped tiles are discarded. Adapters that support feature level 12_0 all support TIER_2 or greater.
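That tier can be queried at runtime too. A minimal sketch, assuming a valid ID3D12Device pointer; the value returned is the same D3D12_TILED_RESOURCES_TIER enumeration quoted above:

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns the tiled resources tier the adapter reports.
D3D12_TILED_RESOURCES_TIER QueryTiledResourcesTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
    {
        // NOT_SUPPORTED, TIER_1, TIER_2 or TIER_3; FL 12_0 adapters report TIER_2 or greater.
        return options.TiledResourcesTier;
    }
    return D3D12_TILED_RESOURCES_TIER_NOT_SUPPORTED;
}
```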
[Images: AMD DirectX 11.2 slides]


The DirectX 11.2 API makes use of 3D Tiled Resources and exposes AMD's partially-resident texture feature. It also allows hardware-managed virtual memory for the graphics processing unit and supports several Tier 2 features such as shader LOD clamp and mapped-status feedback, min/max reduction filtering, and reads from non-mapped tiles returning 0.

Read more: http://wccftech.com/review/radeon-r...0-icooler-graphic-cards-review/#ixzz3zZnL61Ci
GCN Architecture whitepaper
While many of the critical improvements in GCN are related to general purpose programmability, they are often beneficial in the context of graphics. For
example, developers can use Partially Resident Textures (PRTs) to create a world with near-infinite texture detail. The GPU's virtual memory system will only load
the necessary parts into memory. On GCN, up to 32TB of virtual memory is available for textures. As this dwarfs the available physical memory, the driver and
GPU page in 64KB tiles of the texture as needed. This process is transparent, and relieves the programmer of explicitly preloading data into memory. PRTs are
already used in leading edge game engines such as id Tech 5 and will be common in the next generation of game engines and hit games.
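To illustrate the PRT mechanism the whitepaper describes, here is a hypothetical D3D12 sketch of a reserved (tiled) texture: only virtual address space is allocated up front, and individual 64 KB tiles would be mapped to heap memory later via ID3D12CommandQueue::UpdateTileMappings as they are streamed in. The size and format below are arbitrary example values, not anything taken from the whitepaper.

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical example: reserve a large tiled 2D texture with no backing memory yet.
HRESULT CreateReservedTexture(ID3D12Device* device, ID3D12Resource** outTexture)
{
    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
    desc.Width            = 16384;                 // example virtual extent
    desc.Height           = 16384;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 0;                     // 0 = full mip chain
    desc.Format           = DXGI_FORMAT_BC1_UNORM; // example compressed format
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_64KB_UNDEFINED_SWIZZLE; // required for reserved resources

    return device->CreateReservedResource(
        &desc, D3D12_RESOURCE_STATE_COMMON, nullptr,
        __uuidof(ID3D12Resource), reinterpret_cast<void**>(outTexture));
}
```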
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
None of those things Silverforce mentioned improved performance on NVIDIA hardware, all they did was potentially decrease performance on AMD hardware. How do end users with NVIDIA graphics cards benefit?

NV cards lose less performance than AMD cards with GimpWorks, so NV cards look better in benches. The losers are us consumers, because performance is worse across all cards due to GimpWorks, NV's included.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
None of those things Silverforce mentioned improved performance on NVIDIA hardware, all they did was potentially decrease performance on AMD hardware. How do end users with NVIDIA graphics cards benefit?

Yeah, NV wins, but not by increasing their performance, but by limiting the other side's performance. What benefit is that to any of us? The only one it helps is NVIDIA. NVIDIA users aren't affected one way or the other, except that if they want to buy a new card for better performance, they're not being offered all the options.

The real serious NV fans win because their team wins and that's what really matters to them, since they can bask in the radiated glory of their chosen brand and enjoy the sweet dopamine pumping action of being right on the internet without any real work. Generally though, it's not nearly as important to create value for customers as to create the perception of value; that'll get you a huge fraction of buyers.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The real serious NV fans win because their team wins and that's what really matters to them, since they can bask in the radiated glory of their chosen brand and enjoy the sweet dopamine pumping action of being right on the internet without any real work.

Is that what's going on? Well, the inverse explains the AMD fan reaction. It all makes sense now.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Is that what's going on? Well, the inverse explains the AMD fan reaction. It all makes sense now.

I have a feeling there are far fewer of those than you see. It actually takes a bit of work that's harder than looking at pictures and saying green bar bigger, my team's the winner.

There's also a nice sized contingent who just utterly loathe NV and what they've been doing. Have we gotten a GW game yet that isn't a disappointment of some sort technically? It's like an inverse Nintendo Seal of Quality.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
There may be a lot of Nvidia fans who are disappointed with Nvidia's approach to hardware, and they are voicing it.

I am one of them.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I have a feeling there are far fewer of those than you see. It actually takes a bit of work that's harder than looking at pictures and saying green bar bigger, my team's the winner.

There's also a nice sized contingent who just utterly loathe NV and what they've been doing. Have we gotten a GW game yet that isn't a disappointment of some sort technically? It's like an inverse Nintendo Seal of Quality.

Dunno about the rest of the posters here, but I'm happy with FFXIV :D

Outside of the usual suspects lathering on their hatred for NV, as an old red army ant - I don't think I realized how terrible both sides were until I was in the middle. Either way, I'm enjoying my hardware (and purchases in the past, minus 660 Ti SLI).

Interestingly enough, this is the second conversation that brings up the Nintendo Seal of Quality.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Dunno about the rest of the posters here, but I'm happy with FFXIV :D

Outside of the usual suspects lathering on their hatred for NV, as an old red army ant - I don't think I realized how terrible both sides were until I was in the middle. Either way, I'm enjoying my hardware (and purchases in the past, minus 660 Ti SLI).

Interestingly enough, this is the second conversation that brings up the Nintendo Seal of Quality.

Yeah, I was reading that thread (and the review it's based on), so it's in my mind. And I'm not qualified to talk about FFXIV, not my cup of leaf water at all.

And I might be somewhat bitter about NV after seeing what they did with the market share I helped them get by being a big old sucker for evga (I literally did not purchase a single ATI/AMD card until the R9 290, and the first card I bought was iirc a 6600 GT).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yeah, I was reading that thread (and the review it's based on), so it's in my mind. And I'm not qualified to talk about FFXIV, not my cup of leaf water at all.

And I might be somewhat bitter about NV after seeing what they did with the market share I helped them get by being a big old sucker for evga (I literally did not purchase a single ATI/AMD card until the R9 290, and the first card I bought was iirc a 6600 GT).

I'm from the other side. I've owned all AMD/ATI cards except for a one-night stand with an 8800 GTS.

I was loving ATI when they were sponsoring games, such as FFXIV. Then either the money ran out or they threw so much focus into Mantle that it stopped. And NV crept in.

Where I stand, of the games I've played with GameWorks, only about a handful have been disappointing. But then again, I'm not running out and buying every single GameWorks title, just the ones that interest me.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
The Witcher 3, GTA 5.

The Witcher 3 of "HairWorks tanked the framerates so badly that AMD added a tessellation factor override to their drivers" fame?

GTA 5 might have been fine, I don't remember it being particularly good but I don't remember it being particularly bad.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
And I might be somewhat bitter about NV after seeing what they did with the market share I helped them get by being a big old sucker for evga (I literally did not purchase a single ATI/AMD card until the R9 290, and the first card I bought was iirc a 6600 GT).

The 6600 GT was a legendary card; I used mine for a year and then bought the 7900 GTX. I'm just mad at NV because of Kepler performance in the last few months, but other than that I'm in love with my GTX 980 Ti.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
The Witcher 3 of "HairWorks tanked the framerates so badly that AMD added a tessellation factor override to their drivers" fame?

GTA 5 might have been fine, I don't remember it being particularly good but I don't remember it being particularly bad.

GTA 5 is thankfully from a studio big enough to tell Nvidia or AMD which orifice south of their collective waistlines they can stuff their proprietary wares into if there were any funny uncompetitive business going on. I wish all studios were like this.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
The 6600 GT was a legendary card; I used mine for a year and then bought the 7900 GTX. I'm just mad at NV because of Kepler performance in the last few months, but other than that I'm in love with my GTX 980 Ti.

I pretty much managed to sync up with NV having really good releases somehow, and starting with the 6600 GT and 8800 GT gave me a real good impression of NV. I'm going to be buying AMD whenever possible and appropriate until it's not viable or their market share recovers, but that's as much trying to make sure we have competition as because I dislike NV (brand allegiance is a fool's bet).
 

tential

Diamond Member
May 13, 2008
7,348
642
121
GTA 5 is thankfully from a studio big enough to tell Nvidia or AMD which orifice south of their collective waistlines they can stuff their proprietary wares into if there were any funny uncompetitive business going on. I wish all studios were like this.

GTA 5 was delayed until it worked...

That's why.

Ubisoft uses GameWorks, but they don't give it time to be implemented well. They rush the port, because MONEY. Then everything barely works together, but who cares?

Everyone will buy the game anyway, no matter how much it sucks.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
The Witcher 3 of "HairWorks tanked the framerates so badly that AMD added a tessellation factor override to their drivers" fame?

GTA 5 might have been fine, I don't remember it being particularly good but I don't remember it being particularly bad.

AMD's tessellation factor override has been available since Crysis 2.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
GTA 5 was not a gameworks game. It had vendor tech from both AMD and nVidia AND their own home-coded tech too, which ended up both looking better (imo) and running better.

GTA 5 was a fantastic PC port, by the way: triple-monitor support out of the box, lots of special sliders for super-extended view distances and other stuff you can still hardly run maxed even on 980 Ti SLI rigs, and lots of cool effects. Decent mod support (though it gets tricky with online modes). Keybindings are good out of the box, with support to change them. I think it even has old-school joystick support for flying, but I haven't tested that yet. My only complaint is that the menu systems are still console-focused, but beyond that they did a great job.

I'll take a port that takes another year but has great PC support over a sloppy fast port every single time.
 
Feb 19, 2009
10,457
10
76
Is there a difference in rendering for the recent Ashes build on AMD vs NV?

Side by side comparison:
https://youtu.be/o2b6Nncu6zY?t=1m14s

Where are all the dynamic lights in the 980 scene? Aircraft engine exhaust, missile engine exhaust, weapons charging up. On the 980 all of these lights are missing.
 
Feb 19, 2009
10,457
10
76
Both sets of footage were likely captured on different monitors, I assume ...

Probably yes, but monitors don't affect rendering. Certainly the glowing lights are really obviously missing.

The reason I brought this up: Oxide said they use Async Compute for some of their lighting. Here it looks disabled.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Probably yes, but monitors don't affect rendering. Certainly the glowing lights are really obviously missing.

The reason I brought this up: Oxide said they use Async Compute for some of their lighting. Here it looks disabled.

Then all those commands just get pushed to a single command queue ...
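For context, "async compute" at the API level just means submitting work to a second command queue of type COMPUTE alongside the DIRECT (graphics) queue; whether the GPU actually overlaps that work is up to the hardware and driver, which are free to serialize it onto one queue as described above. A minimal sketch, assuming a valid ID3D12Device pointer:

```cpp
#include <windows.h>
#include <d3d12.h>

// Create a graphics (DIRECT) queue plus a separate COMPUTE queue.
void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&gfxDesc, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&cmpDesc, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(computeQueue));
}
```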
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,784
3,101
146
Probably yes, but monitors don't affect rendering. Certainly the glowing lights are really obviously missing.

The reason I brought this up: Oxide said they use Async Compute for some of their lighting. Here it looks disabled.

The video is not available for me to view in the US. Any other links?

I was going to look to compare what the benchmark looks like on my PC (with a 980 Ti) to see if I can reproduce the difference you are noticing.

Based on your description, I definitely have the glowing lights from weapons (both lasers and missile engines) and smoke from aircraft and missiles. These didn't use to be in the benchmark for me, but a few months ago, after an update, they were all there.
 