
[legitreviews] Hitman To Feature Best Implementation Of DX12 Async Compute Yet

Azix

Golden Member
Should be interesting to compare this to Tomb Raider.

AMD is once again partnering with IO Interactive to bring an incredible Hitman gaming experience to the PC. As the newest member to the AMD Gaming Evolved program, Hitman will feature top-flight effects and performance optimizations for PC gamers.

Hitman will leverage unique DX12 hardware found in only AMD Radeon GPUs—called asynchronous compute engines—to handle heavier workloads and better image quality without compromising performance. PC gamers may have heard of asynchronous compute already, and Hitman demonstrates the best implementation of this exciting technology yet. By unlocking performance in GPUs and processors that couldn’t be touched in DirectX 11, gamers can get new performance out of the hardware they already own.
AMD is also looking to provide an exceptional experience to PC gamers with high-end PCs, collaborating with IO Interactive to implement AMD Eyefinity and ultrawide support, plus super-sample anti-aliasing for the best possible AA quality.

This partnership is a journey three years in the making, which started with Hitman: Absolution in 2012, a top seller in Europe and widely critically acclaimed. PC technical reviewers lauded all the knobs and dials that pushed GPUs of the time to their limit. That was no accident. With on-staff game developers, source code and effects, the AMD Gaming Evolved program helps developers bring the best out of a GPU. And now in 2016, Hitman gets the same PC-focused treatment with AMD and IO Interactive to ensure that the series’ newest title represents another great showcase for PC gaming!


Read more at http://www.legitreviews.com/hitman-beta-pc-system-requirements-announced_178991#2Fr31M2JxQMA2wKd.99
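For anyone unclear on what the article is describing: DX12 lets an engine submit compute work on a separate queue that runs concurrently with graphics, so compute can fill shader-unit idle time instead of waiting for the frame to finish. Here's a toy timing model of that idea (illustrative only; the function names and all numbers are invented, this is not real GPU code):

```python
# Toy model (not real GPU code): why async compute helps.
# A graphics workload rarely saturates all shader units, so compute
# work can run in the gaps. Each graphics pass is modeled as a
# (busy_ms, idle_ms) pair; compute jobs are plain costs in ms.

def serial_frame_time(graphics_phases, compute_jobs):
    """DX11-style: compute runs only after graphics finishes."""
    gfx = sum(busy + idle for busy, idle in graphics_phases)
    return gfx + sum(compute_jobs)

def async_frame_time(graphics_phases, compute_jobs):
    """DX12-style: compute jobs fill idle gaps in the graphics pipeline."""
    gfx = sum(busy + idle for busy, idle in graphics_phases)
    idle = sum(idle for _, idle in graphics_phases)
    spill = max(0, sum(compute_jobs) - idle)  # compute that overflows the gaps
    return gfx + spill

phases = [(4, 1), (6, 2), (3, 1)]  # (busy_ms, idle_ms) per graphics pass
jobs = [2, 1]                      # compute job costs in ms

print(serial_frame_time(phases, jobs))  # 20
print(async_frame_time(phases, jobs))   # 17
```

Same total work in both cases; the async version is faster only because the compute fits into time the GPU would have spent idle anyway, which is why the benefit depends so much on the workload and the hardware's ability to interleave.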
 
What is wrong with those guys @ LR? They are so ill-informed.

NVIDIA is having to rely on their software drivers to handle asynchronous compute. The good news is that both companies are able to fully support Async Compute.

That's like saying AMD GPUs support GPU-accelerated PhysX, because it can be emulated on the CPU. 😵

They took AMD's press release where they mention:

Unique DX12 hardware found in only AMD Radeon GPU

This is on AMD's site:

AMD's advanced Graphics Core Next (GCN) architecture, which is currently the only architecture in the world that supports DX12's asynchronous shading.
 

I'm not surprised one bit. NVIDIA does not support async compute in hardware under DX12. We know this now.

You'd think everyone was aware by now.
 
come on now, amd doesn't have the proper marketing to make it into a big deal 🙂 that is all there is to it, marketing money.
 
Well if AMD is behind it, we can be sure that it will run just fine on NV hardware without NV users having to make IQ/performance sacrifices. AMD's dev teams do a better job selling NV hardware than AMD hardware...

Prove me wrong, AMD.
 
If DX12 async compute sucks on Maxwell, you will find NVIDIA will find a way to shut the feature down in the game.
 
Well if AMD is behind it, we can be sure that it will run just fine on NV hardware without NV users having to make IQ/performance sacrifices. AMD's dev teams do a better job selling NV hardware than AMD hardware...

Prove me wrong, AMD.

Yeah, this is true. I've noticed AMD Gaming Evolved titles always perform great on my Nvidia card, but actual GameWorks titles have a lot of performance problems.
 
Just found out Hitman is being sponsored by Sony/PS4, which isn't DX12 compatible (correct?). It uses its own DX11 variant? [EDIT: But I also remember an article saying that it was the PS4 that pushed for Async Compute/Shaders in consoles, not MSFT. Perhaps Sony is the partner AMD needs to get good games on console AND PC, not MSFT. And if true, Sony seems to be the winning bet on the console side right now; the Xbone is screwed on hardware/software sales. January NPD already showed Sony outselling MSFT almost 2:1.]

Might explain why AMD jumped on the title to get their own code in there. I get the feeling that after Tomb Raider bombed, Square/Crystal (whose CEO/president or something had to be fired/he quit) scrambled to recover some lost sales/money (knowing the PS4 version is locked out for a year) and sold their product to NV.

With Hitman being partnered to PS4, chances are it will sell well on consoles thus the PC side didn't need an infusion of cash to save face.

I need to stop wearing my tinfoil hat, but it would be interesting if my theories are true. Tomb Raider's last minute DX12 ejection had more to do with the game bombing on Xbone than NV throwing money around.
 
Actually, my theory would also tie to Quantum Break. MSFT is probably not confident with their current platform and will thus try to branch out to the PC side soon/faster.

They've already alluded that their top 1st party titles would get a PC version. Killer Instinct and Forza were mentioned, and the Quantum Break announcement caught everyone by surprise.

I'll go play a game and take my tin foil hat off. Haha.
 

PS4 used async shaders in one of the very early games and it looked great.

http://www.redgamingtech.com/asynch...eir-role-on-ps4-xbox-one-pc-according-to-amd/

http://www.redgamingtech.com/infamo...rt-2-ps4-performance-compute-particle-system/

It seems it was also used in Killzone: Shadow Fall.

http://www.redgamingtech.com/killzo...w-fall-technology-ps3-vs-ps4-analysis-part-1/

Both very good looking games.

BF4 also may have used it on PS4.
 
Sony was talking a lot about async compute even before the PS4 was released; it had 8 ACEs when other AMD GPUs had 2. I'm sure they are using it...
 
Yep, from 2009 (there was a great article interviewing Sony officials, but I can't find the link ATM).

Sony didn't like GCN as it stood then (co-developed by AMD and MS); they requested major changes, increasing the ACEs from 2 to 8 and increasing queue depth to 8 per engine, leading to the 8 x 8 compute layout (8 ACEs, 8 queues each) we see in modern GCN.

Their reason: They expected games from 2015 onwards to rely heavily on compute for effects, thus they wanted those to be able to run without bottlenecking graphics rendering.

It's been nearly all PS4 developers who have showcased Async Compute in games. It's just starting to come to the PC.
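To make the numbers in the post above concrete, here's a hypothetical sketch of that compute front end (the class, the round-robin submit policy, and the pre-change queue depth of 1 are all invented for illustration; this is not AMD's actual scheduler):

```python
# Hypothetical model of GCN's compute front end as described above:
# n_aces Asynchronous Compute Engines, each exposing queues_per_ace
# hardware queues. Real dispatch is handled by the hardware scheduler;
# the round-robin policy here is purely illustrative.

class ComputeFrontEnd:
    def __init__(self, n_aces, queues_per_ace):
        self.queues = [(ace, q)
                       for ace in range(n_aces)
                       for q in range(queues_per_ace)]

    def submit(self, jobs):
        """Spread jobs round-robin over all (ACE, queue) slots."""
        assignment = {slot: [] for slot in self.queues}
        for i, job in enumerate(jobs):
            assignment[self.queues[i % len(self.queues)]].append(job)
        return assignment

# Pre-change layout: 2 ACEs (queue depth of 1 per ACE is an assumption).
early_gcn = ComputeFrontEnd(n_aces=2, queues_per_ace=1)
# The 8 x 8 layout Sony pushed for, as seen in modern GCN.
modern_gcn = ComputeFrontEnd(n_aces=8, queues_per_ace=8)

print(len(early_gcn.queues))   # 2
print(len(modern_gcn.queues))  # 64
```

The point of the change is visible in the counts: 64 independently schedulable compute queues alongside the graphics queue means many more chances to find compute work that can slot into gaps in the graphics workload.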
 
How would that translate for PC ports? Friendly, or PS3-level of "WTF is this?"
From GNM to DX12 or Vulkan, decently friendly. (GNM is an even lower-level API, but shares similarities.)
GNMX should be similar to DX11 or OpenGL. (It's built on top of GNM.)

The problem for the PS3 was the CPU and what it needed to work properly, not the graphics APIs.

I have mostly heard good things about GNM; it really seems to be a great API that exposes GCN in ways PC developers can only dream about.

Many games use Async Compute, as it works nicely on the machine.
E.g. the Technology of the Tomorrow Children (tech talk, PDF).
 

The Tomorrow Children developers found a great way to do global illumination using cone tracing while keeping it efficient. It's really a very clever implementation.
 