
Hitman DirectX 12 Benchmarks update (ComputerBase)

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Another implication is that nVidia was probably aware of this the entire time and has designed compute back into Pascal. If this is true, Pascal will not have the same leap over Maxwell that Polaris will be getting with GCN.

I believe Pascal will do great in DX12, but it will not scale that much in older DX11 GameWorks games (2014/2015). Wait and you will see how people praise DX12 once Pascal is released 😉
 
You really don't know what you're talking about.

http://images.techhive.com/images/a..._of_war_19x10_medium_dx12-100647959-large.png

http://images.techhive.com/images/a...e_rate_high_quality_19x10-100647718-large.png

GoW uses only 2 cores. True DX12 will use all cores (as far as possible). I think the AotS bench really hurts Nvidia users, because it's faster than a 980 Ti, right? Oh dear.

GoW "scales" up to 8 threads at least. I'm losing 30 FPS with two cores in the benchmark, and with 4 cores all of them are nearly at 100%. With 4c/8t I get 10% more frames over 4c/4t.

So GoW is a "true" DX12 game. :thumbsup:
 
^^Psst, links are dead.

Hmm, I can see them. Can you see the bottom images?

dx12_cpu_gears_of_war_19x10_medium_dx12-100647959-large.png


dx12_cpu_ashes_of_the_singularity_beta_2_average_cpu_frame_rate_high_quality_19x10-100647718-large.png
 
GoW "scales" up to 8 threads at least. I'm losing 30 FPS with two cores in the benchmark, and with 4 cores all of them are nearly at 100%. With 4c/8t I get 10% more frames over 4c/4t.

So GoW is a "true" DX12 game. :thumbsup:

True DX12 doesn't need to be ported from an old engine! And you're funny. Above 4 cores they're all slower than 4 cores. 82.7 vs 82.9, wow!!!

It says:

Error 403 Forbidden

Edit: Sorry, here's the direct link, I googled it.

http://www.pcworld.com/article/3039...es-you-really-need-for-directx-12-gaming.html
 
Seems like DX12 is broken on Maxwell and Kepler. But never mind, you can upgrade to precious Pascal for "small" money very soon.
 
GoW "scales" up to 8 threads at least. I'm losing 30 FPS with two cores in the benchmark, and with 4 cores all of them are nearly at 100%. With 4c/8t I get 10% more frames over 4c/4t.

So GoW is a "true" DX12 game. :thumbsup:
Alright, then Crysis 3 is a true DX12 game as well. :thumbsup:
 
Seems like DX12 is broken on Maxwell and Kepler. But never mind, you can upgrade to precious Pascal for "small" money very soon.

It's not just DX12, RTG is pulling ahead in almost all recent games. GCN architecture is finally starting to pay off it seems.
 
It's not just DX12, RTG is pulling ahead in almost all recent games. GCN architecture is finally starting to pay off it seems.

A bit late if you ask me. Hopefully this carries over to Polaris and they can truly leverage the console wins (and give them a reason to win the next round of console designs).
 
It's definitely too late, seeing how many 970s nVidia has already sold. Very curious indeed to see whether this trend carries over to Polaris/Pascal.
 
Why is AotS a "true DX12" game? It doesn't even use CR, ROV or Tiled Resources Tier 3. :\

A true "DX12" game would be unplayable on any DX11 hardware.

BTW: The Division uses CR for nVidia's HFTS. Guess this makes it a true DX12 game even without the DX12 API, huh? 😀

That's because those are... wait for it... DX11.3 features, not DX12 ones.

Pretty obvious, since, well, it's not using a DX12 API.

DX12 features are all related to lower overhead for better engines, such as async compute.

DX12 doesn't have any unique graphical features; it's purely for optimization. So trying to hold up the worst-optimized release in recent history as a hallmark of DX12 is just ignorant.

Oh yeah, and those cool new features in The Division?

They take a whopping 28% performance hit on a 980 Ti.

http://www.computerbase.de/2016-03/...schnitt_gameworks_mit_komplett_neuen_schatten

Way to make a subtle effect destroy FPS again, Nvidia.
 
True DX12 doesn't need to be ported from an old engine! And you're funny. Above 4 cores they're all slower than 4 cores. 82.7 vs 82.9, wow!!!

What? Gears of War is true DX12. Rendering scales over 8 threads. That a third-person shooter doesn't show a huge improvement should be clear, but this game doesn't have just one "render" thread.

So it is a true DX12 game.

That's because those are... wait for it... DX11.3 features, not DX12 ones.

Pretty obvious, since, well, it's not using a DX12 API.

DX12 features are all related to lower overhead for better engines, such as async compute.

DX12 doesn't have any unique graphical features; it's purely for optimization. So trying to hold up the worst-optimized release in recent history as a hallmark of DX12 is just ignorant.

It is available under DX11.3, too. The fact is it was announced as a new graphics feature for DX12; Microsoft ported it back to DX11.3 so that developers don't need to use a low-level API for it. nVidia is using their NVAPI to make it usable below DX11.2.

That's not its only purpose. The fact that Oxide and IO Interactive don't use nVidia features to improve performance makes it clear that these games are not "true" DX12 games either. But for you, "AC" is the only reason to use DX12, right?
 
Yeah, GoW is a true DX12 game the way League of Legends is a true DX11 game. Who is this guy trying to deceive with such an inane claim?
 
What? Gears of War is true DX12. Rendering scales over 8 threads. That a third-person shooter doesn't show a huge improvement should be clear, but this game doesn't have just one "render" thread.

So it is a true DX12 game.



It is available under DX11.3, too. The fact is it was announced as a new graphics feature for DX12; Microsoft ported it back to DX11.3 so that developers don't need to use a low-level API for it. nVidia is using their NVAPI to make it usable below DX11.2.

That's not its only purpose. The fact that Oxide and IO Interactive don't use nVidia features to improve performance makes it clear that these games are not "true" DX12 games either. But for you, "AC" is the only reason to use DX12, right?

Do you understand how those "nVidia features" are meant to be used?

Do you not understand that Nvidia has claimed to support async compute for months, almost a full year now, and has been selling their cards under that guise?

It's not Oxide's fault that Nvidia doesn't support async compute while claiming to do so. They are trying to write the best-performing engine they can.

Why not get angry at Nvidia for lying about supporting it, instead of blaming Oxide for using a feature you are supposed to have?
 
Sontin, at this point, after all of what you have written on every topic, I do not believe anyone takes seriously what you are writing.

But that is only your fault.
 
What? Gears of War is true DX12. Rendering scales over 8 threads. That a third-person shooter doesn't show a huge improvement should be clear, but this game doesn't have just one "render" thread.

So it is a true DX12 game.

GoW does not scale up to 8 threads.

http://www.pcworld.com/article/3039...es-you-really-need-for-directx-12-gaming.html

But because Nvidia cards are faster than AMD cards, it's true DX12. Hitman/Fable/Deus/AotS are not true DX12, because AMD cards perform better than Nvidia cards. :thumbsup:
 
It'll be interesting to see where the 390(X) lands. My money's on another "and this is why the AMD fans didn't jump on the Fury in huge numbers" release.

Now that we've got Hawaii numbers, I think I get to be smug about predicting that Hawaii would kill it, and also for trading in a 970 for a 290. 😎

Yes, in an AMD-paid game.
Look, a GTX 980 is much faster than a Fury in Gears of War: http://wccftech.com/amd-radeon-crimson-16-3-drivers/

:sneaky:

It's customary to use games that actually work for your cherry-picking, just so you know.

Sontin, at this point, after all of what you have written on every topic, I do not believe anyone takes seriously what you are writing.

But that is only your fault.

Consistently wrong is nearly as good as consistently right, especially on the internet, where people don't let incorrect information go uncorrected. We owe sontin for some really great effortposts by Mahigan.
 
GoW "scales" up to 8 threads at least. I'm losing 30 FPS with two cores in the benchmark, and with 4 cores all of them are nearly at 100%. With 4c/8t I get 10% more frames over 4c/4t.

So GoW is a "true" DX12 game. :thumbsup:

Just post screenshots, then...
Unless you don't have them, can't recreate them, or refuse to do so.
 