[legitreviews] Hitman To Feature Best Implementation Of DX12 Async Compute Yet

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Should be interesting to compare this to Tomb Raider.

AMD is once again partnering with IO Interactive to bring an incredible Hitman gaming experience to the PC. As the newest member to the AMD Gaming Evolved program, Hitman will feature top-flight effects and performance optimizations for PC gamers.

Hitman will leverage unique DX12 hardware found in only AMD Radeon GPUs—called asynchronous compute engines—to handle heavier workloads and better image quality without compromising performance. PC gamers may have heard of asynchronous compute already, and Hitman demonstrates the best implementation of this exciting technology yet. By unlocking performance in GPUs and processors that couldn’t be touched in DirectX 11, gamers can get new performance out of the hardware they already own.
AMD is also looking to provide an exceptional experience to PC gamers with high-end PCs, collaborating with IO Interactive to implement AMD Eyefinity and ultrawide support, plus super-sample anti-aliasing for the best possible AA quality.

This partnership is a journey three years in the making, which started with Hitman: Absolution in 2012, a top seller in Europe and widely critically acclaimed. PC technical reviewers lauded all the knobs and dials that pushed GPUs of the time to their limit. That was no accident. With on-staff game developers, source code and effects, the AMD Gaming Evolved program helps developers to bring the best out of a GPU. And now in 2016, Hitman gets the same PC-focused treatment with AMD and IO Interactive to ensure that the series’ newest title represents another great showcase for PC gaming!


Read more at http://www.legitreviews.com/hitman-beta-pc-system-requirements-announced_178991#2Fr31M2JxQMA2wKd.99
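Since the article leans on "asynchronous compute engines" without saying what that means for developers: in D3D12, async compute is exposed through command queues. A game creates a compute queue alongside the usual graphics (direct) queue, and on GCN the ACEs can schedule that queue's work concurrently with rasterization. A minimal sketch of the queue setup (my own illustration, not code from the article or the game):

```cpp
// Minimal D3D12 sketch: "async compute" is just a second command queue of
// type COMPUTE next to the usual DIRECT (graphics) queue. Whether the two
// actually overlap on the GPU is up to the hardware scheduler (the ACEs
// on GCN).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> MakeQueue(ID3D12Device* device,
                                     D3D12_COMMAND_LIST_TYPE type) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type;  // D3D12_COMMAND_LIST_TYPE_DIRECT or _COMPUTE
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Usage: work recorded on computeQueue may run while gfxQueue is busy.
// auto gfxQueue     = MakeQueue(device, D3D12_COMMAND_LIST_TYPE_DIRECT);
// auto computeQueue = MakeQueue(device, D3D12_COMMAND_LIST_TYPE_COMPUTE);
```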
 
Feb 19, 2009
10,457
10
76
What is wrong with those guys @ LR? They are so ill-informed.

NVIDIA is having to rely on their software drivers to handle asynchronous compute. The good news is that both companies are able to fully support Async Compute.

That's like saying AMD GPUs support GPU-accelerated PhysX, because it can be emulated on the CPU. o_O

They took AMD's press release where they mention:

Unique DX12 hardware found in only AMD Radeon GPUs

This is on AMD's site:

AMD's advanced Graphics Core Next (GCN) architecture, which is currently the only architecture in the world that supports DX12's asynchronous shading.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
What is wrong with those guys @ LR? They are so ill-informed.

NVIDIA is having to rely on their software drivers to handle asynchronous compute. The good news is that both companies are able to fully support Async Compute.

That's like saying AMD GPUs support GPU-accelerated PhysX, because it can be emulated on the CPU. o_O

They took AMD's press release where they mention:

Unique DX12 hardware found in only AMD Radeon GPUs

This is on AMD's site:

AMD's advanced Graphics Core Next (GCN) architecture, which is currently the only architecture in the world that supports DX12's asynchronous shading.

I'm not surprised one bit. NVIDIA does not support async compute in hardware under DX12. We know this now.

You'd think everyone was aware by now.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Come on now, AMD doesn't have the marketing muscle to make it into a big deal :) That's all there is to it: marketing money.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,111
9,364
136
Well if AMD is behind it, we can be sure that it will run just fine on NV hardware without NV users having to make IQ/performance sacrifices. AMD's dev teams do a better job selling NV hardware than AMD hardware...

Prove me wrong, AMD.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
Except a 290 is often as fast as a 780 Ti in recent games. Need to get with the times.
 

flopper

Senior member
Dec 16, 2005
739
19
76
DX12 async compute: if it sucks on Maxwell, you'll find NVIDIA will find a way to shut down the game.
 

Franzi

Member
Nov 18, 2012
45
0
61
Well if AMD is behind it, we can be sure that it will run just fine on NV hardware without NV users having to make IQ/performance sacrifices. AMD's dev teams do a better job selling NV hardware than AMD hardware...

Prove me wrong, AMD.

Yeah, this is true. I noticed all AMD Gaming Evolved titles always performed great on my NVIDIA card, but actual GameWorks titles have a lot of performance problems.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Seems odd to recommend a GTX660 and a GTX770 if those cards might struggle to run a DX12 game?
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Seems odd to recommend a GTX660 and a GTX770 if those cards might struggle to run a DX12 game?

Why would a 770 struggle to run a DX12 game? It should easily be able to brute-force what an Xbone does with magic async efficiency.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Most probably those are recommended for DX-11.

Yes, it recommends that you have DX11, and not DX12.

I would think the minimum would be DX11 and the recommended would be DX12, if they want to feature DX12 and AMD cards.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Just found out Hitman is being sponsored by Sony/PS4, which isn't DX12 compatible (correct?). It uses its own DX11 variant? [EDIT: But I also remember an article saying it was PS4 that pushed for async compute/shaders in consoles, not MSFT. Perhaps Sony is the partner AMD needs to get good games on console AND PC, not MSFT. And if true, Sony seems a winning bet on the console side right now; the Xbone is struggling in hardware/software sales. January NPD already showed Sony outselling MSFT almost 2:1.]

Might explain why AMD jumped on the title to get their own code in there. I get the feeling that after Tomb Raider bombed, Square/Crystal (whose CEO/president or something was fired/quit) scrambled to recover some lost sales/money (knowing the PS4 version is locked out for a year) and sold their product to NV.

With Hitman being partnered with PS4, chances are it will sell well on consoles, so the PC side didn't need an infusion of cash to save face.

I need to stop wearing my tinfoil hat, but it would be interesting if my theories are true. Maybe Tomb Raider's last-minute DX12 ejection had more to do with the game bombing on Xbone than with NV throwing money around.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Actually, my theory would also tie into Quantum Break. MSFT is probably not confident in their current platform and will thus try to branch out to the PC side sooner/faster.

They've already hinted that their top first-party titles would get PC versions. Killer Instinct and Forza were mentioned, and the Quantum Break announcement caught everyone by surprise.

I'll go play a game and take my tin foil hat off. Haha.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
How would that translate for PC ports? Friendly, or PS3-level of "WTF is this?"

I would bet it's the "WTF is this" part. The FreeBSD-based OS isn't helping either. The high-level layer is the most "friendly".
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Just found out Hitman is being sponsored by Sony/PS4, which isn't DX12 compatible (correct?). It uses its own DX11 variant? [EDIT: But I also remember an article saying it was PS4 that pushed for async compute/shaders in consoles, not MSFT. Perhaps Sony is the partner AMD needs to get good games on console AND PC, not MSFT. And if true, Sony seems a winning bet on the console side right now; the Xbone is struggling in hardware/software sales. January NPD already showed Sony outselling MSFT almost 2:1.]

Might explain why AMD jumped on the title to get their own code in there. I get the feeling that after Tomb Raider bombed, Square/Crystal (whose CEO/president or something was fired/quit) scrambled to recover some lost sales/money (knowing the PS4 version is locked out for a year) and sold their product to NV.

With Hitman being partnered with PS4, chances are it will sell well on consoles, so the PC side didn't need an infusion of cash to save face.

I need to stop wearing my tinfoil hat, but it would be interesting if my theories are true. Maybe Tomb Raider's last-minute DX12 ejection had more to do with the game bombing on Xbone than with NV throwing money around.

PS4 used async shaders in one of the very early games and it looked great.

http://www.redgamingtech.com/asynch...eir-role-on-ps4-xbox-one-pc-according-to-amd/

http://www.redgamingtech.com/infamo...rt-2-ps4-performance-compute-particle-system/

Seems it was also in Killzone: Shadow Fall.

http://www.redgamingtech.com/killzo...w-fall-technology-ps3-vs-ps4-analysis-part-1/

Both very good looking games.

BF4 may also have used it on PS4.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Sony was talking a lot about async compute even before the PS4 was released; it had 8 ACEs when other AMD GPUs had 2. I'm sure they are using it...
 
Feb 19, 2009
10,457
10
76
Yep, from 2009 (there was a great article interviewing Sony officials, but I can't find the link ATM).

Sony didn't like GCN as it stood then (it was co-developed by AMD and MS), so they requested major changes: increasing the ACEs from 2 to 8 and the queue depth to 8 per engine, leading to the 8 x 8 compute queues we see in modern GCN.

Their reason: they expected games from 2015 onwards to rely heavily on compute for effects, so they wanted those effects to run without bottlenecking graphics rendering.

It's been nearly all PS4 developers who have showcased Async Compute in games. It's just starting to come to the PC.
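To make that 8 x 8 arrangement concrete on the PC side: under DX12, each additional compute queue a game creates can map onto those ACE queues, and fences coordinate the two timelines. A rough sketch of overlapping a compute dispatch with graphics work (my own illustration of the general pattern, with hypothetical names; not anything from Sony or a shipped game):

```cpp
// Sketch: graphics and compute submitted on separate D3D12 queues. On GCN
// the compute queue's work is fed to the ACEs, so it can execute while the
// graphics queue is rasterizing; the fence makes later graphics submissions
// wait for the compute results without blocking the CPU.
#include <d3d12.h>

void SubmitOverlapped(ID3D12CommandQueue* gfxQueue,
                      ID3D12CommandQueue* computeQueue,
                      ID3D12CommandList* gfxList,      // e.g. shadow/G-buffer pass
                      ID3D12CommandList* computeList,  // e.g. particles, post-FX
                      ID3D12Fence* fence, UINT64& fenceValue) {
    // The two submissions are independent, so the GPU is free to run them
    // concurrently if it has the hardware scheduling to do so.
    gfxQueue->ExecuteCommandLists(1, &gfxList);
    computeQueue->ExecuteCommandLists(1, &computeList);

    // GPU-side sync: anything submitted to gfxQueue after this Wait will not
    // start until the compute queue has signaled completion.
    computeQueue->Signal(fence, ++fenceValue);
    gfxQueue->Wait(fence, fenceValue);
}
```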
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
How would that translate for PC ports? Friendly, or PS3-level of "WTF is this?"
From GNM to DX12 or Vulkan, decently friendly. (GNM is an even lower-level API, but it shares similarities.)
GNMX should be similar to DX11 or OpenGL. (It's built on top of GNM.)

The problem for the PS3 was the CPU and what it took to make it work properly, not the graphics APIs.

I have mostly heard good things about GNM; it really seems to be a great API that exposes GCN in ways PC developers can only dream about.

Many games use async compute as it works nicely on the machine.
E.g. the technology of The Tomorrow Children (tech talk, PDF).
 
Last edited:

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,596
6,069
136
From GNM to DX12 or Vulkan, decently friendly. (GNM is an even lower-level API, but it shares similarities.)
GNMX should be similar to DX11 or OpenGL. (It's built on top of GNM.)

The problem for the PS3 was the CPU and what it took to make it work properly, not the graphics APIs.

I have mostly heard good things about GNM; it really seems to be a great API that exposes GCN in ways PC developers can only dream about.

Many games use async compute as it works nicely on the machine.
E.g. the technology of The Tomorrow Children (tech talk, PDF).

They found a great way to do global illumination using cone tracing while keeping it efficient. It's really a very clever implementation.
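For anyone curious what "cone tracing" means here, the core loop of the generic voxel cone tracing technique looks roughly like this (my own paraphrase of the published approach, not The Tomorrow Children's actual code; the voxel-grid sampler is a stub):

```cpp
// Rough sketch of generic voxel cone tracing for GI: march a cone through a
// pre-filtered (mip-mapped) voxel grid, sampling coarser mips as the cone
// widens, compositing front-to-back until the cone is effectively occluded.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
struct Sample { Vec3 radiance; float opacity; };

// Stub stand-in for a trilinear lookup into a mip-mapped 3D radiance/opacity
// texture; a real renderer samples the voxelized scene here.
Sample SampleVoxelGrid(Vec3 /*pos*/, float /*mip*/) {
    return { {0.1f, 0.1f, 0.1f}, 0.05f };
}

Vec3 TraceCone(Vec3 origin, Vec3 dir, float halfAngleTan,
               float voxelSize, float maxDist) {
    Vec3 radiance = {0.0f, 0.0f, 0.0f};
    float occlusion = 0.0f;
    float t = voxelSize;                          // start one voxel from origin
    while (t < maxDist && occlusion < 0.99f) {
        float diameter = 2.0f * halfAngleTan * t; // cone footprint at distance t
        float mip = std::log2(std::max(diameter / voxelSize, 1.0f));
        Vec3 p = {origin.x + dir.x * t, origin.y + dir.y * t,
                  origin.z + dir.z * t};
        Sample s = SampleVoxelGrid(p, mip);
        float w = (1.0f - occlusion) * s.opacity; // front-to-back blending
        radiance.x += w * s.radiance.x;
        radiance.y += w * s.radiance.y;
        radiance.z += w * s.radiance.z;
        occlusion += w;
        t += std::max(diameter * 0.5f, voxelSize * 0.5f); // step grows with cone
    }
    return radiance;
}
```

The appeal is that one pre-filtered grid answers "how much light arrives from this direction" for diffuse and glossy bounces alike, at a cost far below path tracing; the mip-level trick is what keeps it efficient.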