All GCN GPUs are DX12 compatible. They all support the optional Async Compute feature.
They do not support Feature Level 12_1, which is currently limited to Maxwell 2 and perhaps Skylake.
GCN 1.0 doesn't support FL 12_0. Only GCN 1.1 and 1.2 do.
Nope, all GCN cards support FL 12_0 (FL 11_1).
Where is your information from?
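For what it's worth, anyone who wants to check this themselves can just ask the runtime. Below is a minimal sketch (assuming the standard d3d12.h headers, the Microsoft::WRL::ComPtr helper, the default adapter, and with error handling trimmed) that asks an ID3D12Device for the highest feature level it reports via CheckFeatureSupport:

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device at the 11_0 baseline on the default adapter,
    // then ask which of the requested feature levels it actually supports.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    info.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &info, sizeof(info))))
        std::printf("Max supported feature level: 0x%x\n",
                    static_cast<unsigned>(info.MaxSupportedFeatureLevel));
    return 0;
}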
D3D12_TILED_RESOURCES_TIER_2
Indicates that a superset of Tier_1 functionality is supported, including this additional support:
When the size of a texture mipmap level is at least one standard tile shape for its format, the mipmap level is guaranteed to be nonpacked. For more info, see D3D12_PACKED_MIP_INFO.
Shader instructions are available for clamping level-of-detail (LOD) and for obtaining status about the shader operation. For info about one of these shader instructions, see Sample(S,float,int,float,uint).
Reads from NULL-mapped tiles treat the sampled value as zero. Writes to NULL-mapped tiles are discarded. Adapters that support feature level 12_0 all support TIER_2 or greater.
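Since the quote above ties TIER_2 to FL 12_0, the same CheckFeatureSupport call can report the tiled resources tier directly, along with the headline FL 12_1 bits (ROVs, conservative rasterization). A rough sketch, assuming a device created as in the earlier snippet:

// Query the D3D12 options block from an existing ID3D12Device.
D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                          &opts, sizeof(opts))))
{
    // D3D12_TILED_RESOURCES_TIER_2 is the tier discussed above.
    std::printf("Tiled resources tier:            %d\n", opts.TiledResourcesTier);
    std::printf("Resource binding tier:           %d\n", opts.ResourceBindingTier);
    // The two headline FL 12_1 features (Maxwell 2 / perhaps Skylake territory):
    std::printf("Conservative rasterization tier: %d\n", opts.ConservativeRasterizationTier);
    std::printf("Rasterizer-ordered views:        %d\n", opts.ROVsSupported);
}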
The DirectX 11.2 API makes use of 3D Tiled Resources and exposes AMD's partially-resident texture feature. It also allows hardware-managed virtual memory for the graphics processing unit and supports several Tier-2 features, such as shader LOD clamp and mapped-status feedback, min/max reduction filtering, and reads from non-mapped tiles returning 0.
Read more: http://wccftech.com/review/radeon-r...0-icooler-graphic-cards-review/#ixzz3zZnL61Ci

GCN Architecture whitepaper:
While many of the critical improvements in GCN are related to general purpose programmability, they are often beneficial in the context of graphics. For example, developers can use Partially Resident Textures (PRTs) to create a world with near-infinite texture detail. The GPU's virtual memory system will only load the necessary parts into memory. On GCN, up to 32TB of virtual memory is available for textures. As this dwarfs the available physical memory, the driver and GPU page in 64KB tiles of the texture as needed. This process is transparent, and relieves the programmer of explicitly preloading data into memory. PRTs are already used in leading edge game engines such as id Tech 5 and will be common in the next generation of game engines and hit games.
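To make the PRT bit concrete: in D3D12 this surfaces as "reserved" resources, where the texture is described up front but its 64KB tiles are only backed by heap memory when you map them. A rough sketch (assuming an existing device and command queue from the earlier snippets; sizes are arbitrary and error handling is omitted):

// Describe a large texture without committing memory for it.
D3D12_RESOURCE_DESC desc = {};
desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
desc.Width            = 16384;
desc.Height           = 16384;
desc.DepthOrArraySize = 1;
desc.MipLevels        = 1;
desc.Format           = DXGI_FORMAT_BC1_UNORM;
desc.SampleDesc.Count = 1;
desc.Layout           = D3D12_TEXTURE_LAYOUT_64KB_UNDEFINED_SWIZZLE; // required for reserved resources

ComPtr<ID3D12Resource> sparseTexture;
device->CreateReservedResource(&desc, D3D12_RESOURCE_STATE_COMMON, nullptr,
                               IID_PPV_ARGS(&sparseTexture));

// A small heap supplies the physical 64KB pages that back mapped tiles.
D3D12_HEAP_DESC heapDesc = {};
heapDesc.SizeInBytes     = 64 * 1024;                       // one tile's worth
heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
heapDesc.Flags           = D3D12_HEAP_FLAG_ALLOW_ONLY_NON_RT_DS_TEXTURES;
ComPtr<ID3D12Heap> heap;
device->CreateHeap(&heapDesc, IID_PPV_ARGS(&heap));

// Map the first tile of mip 0 to the first page of the heap.
D3D12_TILED_RESOURCE_COORDINATE coord = {};                 // x/y/z = 0, subresource 0
D3D12_TILE_REGION_SIZE region = {};
region.NumTiles = 1;
UINT heapStartOffset = 0;
UINT rangeTileCount  = 1;
D3D12_TILE_RANGE_FLAGS rangeFlags = D3D12_TILE_RANGE_FLAG_NONE;

queue->UpdateTileMappings(sparseTexture.Get(), 1, &coord, &region,
                          heap.Get(), 1, &rangeFlags, &heapStartOffset,
                          &rangeTileCount, D3D12_TILE_MAPPING_FLAG_NONE);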
None of those things Silverforce mentioned improved performance on NVIDIA hardware, all they did was potentially decrease performance on AMD hardware. How do end users with NVIDIA graphics cards benefit?
Yeah, NV wins, but not by increasing their own performance; they win by limiting the other side's. What benefit is that to any of us? The only one it helps is NVIDIA. NVIDIA users aren't affected one way or the other, except that if they want to buy a new card for better performance, they're not being offered all the options.
The real serious NV fans win because their team wins and that's what really matters to them, since they can bask in the radiated glory of their chosen brand and enjoy the sweet dopamine pumping action of being right on the internet without any real work.
Is that what's going on? Well, the inverse explains the AMD fan reaction. It all makes sense now.
I have a feeling there are far fewer of those than you think. It actually takes a bit of work, which is harder than looking at pictures and saying "green bar bigger, my team's the winner."
There's also a nice sized contingent who just utterly loathe NV and what they've been doing. Have we gotten a GW game yet that isn't a disappointment of some sort technically? It's like an inverse Nintendo Seal of Quality.
Dunno about the rest of the posters here, but I'm happy with FFXIV 😀
Outside of the usual suspects lathering on their hatred for NV, as an old red army ant - I don't think I realized how terrible both sides were until I was in the middle. Either way, I'm enjoying my hardware (and purchases in the past, minus 660 Ti SLI).
Interestingly enough, this is the second conversation that brings up the Nintendo Seal of Quality.
The Witcher 3, GTA 5.
Yeah, I was reading that thread (and the review it's based on), so it's in my mind. And I'm not qualified to talk about FFXIV, not my cup of leaf water at all.
And I might be somewhat bitter about NV after seeing what they did with the market share I helped them get by being a big old sucker for evga (I literally did not purchase a single ATI/AMD card until the R9 290, and the first card I bought was iirc a 6600 GT).
The Witcher 3? The one where HairWorks tanked the framerates so badly that AMD added a tessellation factor override to their drivers?
GTA 5 might have been fine; I don't remember it being particularly good, but I don't remember it being particularly bad either.
The 6600 GT was a legendary card; I used mine for a year and then bought the 7900 GTX. I'm just mad at NV because of Kepler performance over the last few months, but other than that I'm in love with my GTX 980 Ti.
GTA 5 is thankfully from a studio big enough to tell NVIDIA or AMD which orifice south of their collective waistlines they can stuff their proprietary wares into if any funny uncompetitive business were going on. I wish all studios were like this.
Is there a difference in rendering for the recent Ashes build on AMD vs NV?
Side by side comparison:
https://youtu.be/o2b6Nncu6zY?t=1m14s
Where are all the dynamic lights in the 980 scene? Aircraft engine exhaust, missile engine exhaust, weapons charging up. On the 980 all of these lights are missing.
The two videos were likely captured on different monitors, I assume...
Probably yes, but monitors don't affect rendering. Certainly the glowing lights are really obviously missing.
The reason I brought this up is that Oxide said they use Async Compute for some of their lighting. Here it looks like it's disabled.
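For anyone wondering what "using async compute for lighting" looks like at the API level, here's a rough, hypothetical sketch (the engine-specific lighting work and command list recording are omitted; an existing device is assumed): the renderer creates a separate compute queue next to the graphics queue, submits the lighting compute work there, and fences the graphics queue against it. On hardware with working async compute the two queues can overlap; otherwise the driver effectively serializes them.

// A graphics (direct) queue and a separate compute queue.
D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;

D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));

// A fence lets the graphics queue wait for the compute lighting results
// only at the point where it actually needs them.
ComPtr<ID3D12Fence> fence;
UINT64 fenceValue = 0;
device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

// ... execute the lighting command lists on computeQueue,
//     and the frame's graphics command lists on gfxQueue ...

computeQueue->Signal(fence.Get(), ++fenceValue);  // compute work finished
gfxQueue->Wait(fence.Get(), fenceValue);          // graphics consumes results after this point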