
Deus Ex DX11 performance

Status
Not open for further replies.

Innokentij

Senior member
New Deus Ex is out (yay), but Nvidia's DX11 performance in this game is extremely bad. And it's ported by a studio that at least used to know how to do their job back in the day?

An AMD RX 480 basically giving the same experience as a 980 Ti? This is worse than GameWorks. Did AMD steal Nvidia's tactics and perfect them on DX11 too?


What is going on here, guys?

If you think the gap is bad with DX11 just wait until you see DX12 ...

I wouldn't be surprised if AMD is planning something worse than Nvidia's GameWorks, since their goal is to bridge the gap between console and PC optimizations (for their own GPUs), and Shader Model 6 is just one more step toward that goal ...

Nothing interesting is going on with Deus Ex: Mankind Divided, since the Dawn Engine, which is heavily based on Glacier 2, also favoured AMD, so it's not a coincidence. Had DE: MD been developed on Unreal Engine 4, we'd see different results ...
 
Do you know what Gaming Evolved does? It reduces CPU overhead for AMD GPUs...

Also look at the compute performance of GPUs, and compare the results.

A 9 TFLOPs GPU (GTX 1080) is slightly faster than an 8.6 TFLOPs GPU (Fury X), which is faster than a 6.5 TFLOPs GPU (GTX 1070), which is faster than a 6 TFLOPs GPU (GTX 980 Ti), which is faster than a 5.8 TFLOPs GPU (RX 480), which is faster than a 5.2 TFLOPs GPU (R9 390).

DX11 still bottlenecks AMD GPUs, however, even with Gaming Evolved. How come? A 6.5 TFLOPs GPU from Nvidia is still faster than 7 TFLOPs cards (Fury and R9 Nano), and the 4.3 TFLOPs GTX 1060 is faster than the 4.9 TFLOPs RX 470.

Full DX12/Vulkan titles will reflect those differences in compute.
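For context on where those TFLOPs numbers come from: peak FP32 throughput is just shader cores × boost clock × 2 (a fused multiply-add counts as two floating-point operations per cycle). A minimal sketch; the core counts and clocks below are approximate published specs, not figures from this thread:

```python
# Rough peak-FP32 estimate: cores * clock (Hz) * 2 FLOPs per FMA per cycle.
# Core counts and boost clocks are approximate published specs.
GPUS = {
    "GTX 1080":   (2560, 1.733e9),
    "Fury X":     (4096, 1.050e9),
    "GTX 1070":   (1920, 1.683e9),
    "GTX 980 Ti": (2816, 1.075e9),
    "RX 480":     (2304, 1.266e9),
}

def peak_tflops(cores, clock_hz):
    """Theoretical FP32 throughput in TFLOPs (2 FLOPs per FMA per core)."""
    return cores * clock_hz * 2 / 1e12

for name, (cores, clock) in GPUS.items():
    print(f"{name:10s} ~{peak_tflops(cores, clock):.1f} TFLOPs")
```

These theoretical peaks line up with the figures quoted above; actual game performance also depends on memory bandwidth, geometry throughput, and driver overhead.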
 
New Deus Ex is out (yay), but Nvidia's DX11 performance in this game is extremely bad. And it's ported by a studio that at least used to know how to do their job back in the day?

An AMD RX 480 basically giving the same experience as a 980 Ti? This is worse than GameWorks. Did AMD steal Nvidia's tactics and perfect them on DX11 too?

Huh? I can't find a single setting that eats Nvidia GPUs and doesn't eat AMD GPUs by an equivalent amount. Troll much?
 
Huh? I can't find a single setting that eats Nvidia GPUs and doesn't eat AMD GPUs by an equivalent amount. Troll much?
What are you on about? This is not trolling. If you think it's trolling to ask why a 980 Ti is on par with a 480 in a DX11 title, well, I don't even know what to tell you.
 
Do you know what Gaming Evolved does? It reduces CPU overhead for AMD GPUs...

Also look at the compute performance of GPUs, and compare the results.

A 9 TFLOPs GPU (GTX 1080) is slightly faster than an 8.6 TFLOPs GPU (Fury X), which is faster than a 6.5 TFLOPs GPU (GTX 1070), which is faster than a 6 TFLOPs GPU (GTX 980 Ti), which is faster than a 5.8 TFLOPs GPU (RX 480), which is faster than a 5.2 TFLOPs GPU (R9 390).

DX11 still bottlenecks AMD GPUs, however, even with Gaming Evolved. How come? A 6.5 TFLOPs GPU from Nvidia is still faster than 7 TFLOPs cards (Fury and R9 Nano), and the 4.3 TFLOPs GTX 1060 is faster than the 4.9 TFLOPs RX 470.

Full DX12/Vulkan titles will reflect those differences in compute.

So what you're saying is the game is optimized to run on compute, which I assume means parallel workloads? And Nvidia only does compute/parallel workloads well through CUDA and doesn't really excel outside that ecosystem? Or am I missing something?
 
So what you're saying is the game is optimized to run on compute, which I assume means parallel workloads? And Nvidia only does compute/parallel workloads well through CUDA and doesn't really excel outside that ecosystem? Or am I missing something?
Yes, parallel workloads. Compute performance reflects the general performance of GPUs. At least it should in a non-bottlenecked environment.
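As a rough back-of-the-envelope model of that claim: if a workload is purely compute-bound and nothing else bottlenecks, the relative performance of two cards is just the ratio of their peak TFLOPs. A hypothetical sketch, using the throughput figures quoted earlier in the thread:

```python
def relative_perf(tflops_a, tflops_b):
    """Predicted speed of card A relative to card B, assuming the
    workload is purely compute-bound (no driver/CPU bottleneck)."""
    return tflops_a / tflops_b

# RX 480 (5.8 TFLOPs) vs GTX 980 Ti (6 TFLOPs): near parity in this
# simple model, consistent with the benchmark the thread is discussing.
print(f"RX 480 vs 980 Ti: {relative_perf(5.8, 6.0):.2f}x")
```

In practice DX11 driver overhead, bandwidth, and shader optimization shift these ratios considerably, which is the whole point of the disagreement in this thread.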
 
Yes, parallel workloads. Compute performance reflects the general performance of GPUs. At least it should in a non-bottlenecked environment.

I see, thanks for the information, this explains a lot. The 900 series and below only do blazing-fast "single" workloads at a time outside CUDA, and usually the more parallel work you throw at them, the more they slow down and wait for one task to finish before starting the next, so in the real world they appear "slower". Well, that's a bummer, but such is life.
 
What are you on about? This is not trolling. If you think it's trolling to ask why a 980 Ti is on par with a 480 in a DX11 title, well, I don't even know what to tell you.

Gimpworks represents two things:

* A complete blackout on involvement.
* Settings like HairWorks that murder the competition's offerings.

Do you see either of these two attributes in the Deus Ex game?
 
Yes, parallel workloads. Compute performance reflects general performance of GPUs. At least it should be in non-bottlenecking environment.

Compute performance is just one aspect of the general performance of a GPU. Other engines use compute just as much as Deus Ex MD, if not more, and the performance disparity is nowhere near as great as it is here. Frostbite 3 and id Tech 6 are just two examples of engines that make heavy use of compute, and both perform well on AMD and Nvidia hardware alike.

No, the truth is that Deus Ex MD heavily favors AMD GPUs, likely because it's a Gaming Evolved title and the Dawn Engine is based on the Glacier 2 engine, as ThatBuzzKiller said. I don't think we'll see good performance on Nvidia hardware in this game until months from now, after both Nvidia and Nixxes have done some heavy optimization that will likely include replacing some of the GCN-centric shaders.
 
Seems like we've got a new Ken M o'er here!

Haven't AMD's past few series of GPUs had a wee nugget tae them, in that they aren't as drastically affected by higher resolutions as their Nvidia counterparts? Probably what we're seein' 'ere.
 
Seems like we've got a new Ken M o'er here!

Haven't AMD's past few series of GPUs had a wee nugget tae them, in that they aren't as drastically affected by higher resolutions as their Nvidia counterparts? Probably what we're seein' 'ere.

I love Ken M, and I can tell he's no Ken M. That's my last post in this thread. Hopefully the mods won't bring down the hammer.
 