Deus Ex MD - CPU Thread Scaling and Performance

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
General Deus Ex: Mankind Divided thread for CPU thread scaling and performance.

First official review is from GameGPU.ru; I will update the OP with every new review that becomes available.

Just a reminder, official DX-12 support for Deus Ex will be available via a patch on the 5th of September, so the numbers below may change.

https://steamcommunity.com/games/337000/announcements/detail/930377969893113169
The Deus Ex franchise originated on PC, and we’re passionate about continuing to provide the best experience possible to our long-time fans and players on the PC.

Contrary to our previous announcement, Deus Ex: Mankind Divided, which is shipping on August 23rd, will unfortunately not support DirectX 12 at launch. We have some extra work and optimizations to do for DX12, and we need more time to ensure we deliver a compelling experience. Our teams are working hard to complete the final push required here though, and we expect to release DX12 support on the week of September 5th!

We thank you for your patience, passion, and support.

- The Deus Ex team

http://gamegpu.com/action-/-fps-/-tps/deus-ex-mankind-divided-test-gpu

[image: proz_12_amd.png]

[image: proz_12_nv.png]


Edit: DX-11

[image: proz_11_amd.png]

[image: proz_11_nv.png]



This is amazing CPU thread scaling.

[image: intel.png]

[image: amd.png]
 
Last edited:
Aug 11, 2008
10,451
642
126
So the fastest result is the GTX 1080 in DX11. Interesting. Even under DX12, Vishera is still getting demolished by Intel.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
A stock Haswell i5 continues to be a better gaming CPU than the FX-9590. It's nice to see the FX-8350 match a stock 2600K, though.

Scaling is impressive; hopefully we see this in more games.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
Wow, welcome to sub-60 FPS 720p gaming with high-end systems!
And keep in mind that this is the scaling of the GPU benchmark that comes with the game (video on GameGPU), not the CPU scaling of the game itself.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
DX-12 is not optimized in its current state; we should wait for the DX-12 patch on September 5th.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
It looks like DX11 and Nvidia are the way to go for this game.
Also, the 6700 non-K is over 60 FPS minimum and almost on par with the 5960X with the fastest API/drivers/GPU.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
The Fury X loses performance going to DX12 at 1920x1080 and 2560x1440 but gains a lot of performance at 1280x720.

The GTX 1080 loses very little going to DX12 at 1920x1080 and gains very little at 2560x1440, but loses a lot of performance at 1280x720.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Like I said in the thread in the VC&G forums, these benchmarks should be ignored as they aren't using final code.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
DX11 beats DX12 in performance and CPU usage. Not exactly the story of the promised land we were told.

But the game is another reason to upgrade to Skylake, or soon Kaby Lake, if you haven't yet. :)
 
  • Like
Reactions: Sweepr

f2bnp

Member
May 25, 2015
156
93
101
DX12 deniers will never cease to amaze me.

Another rushed AAA release. Let's see the DX12 patch in September.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I expect HUGE gains from DX12 when finally released.

For hardware that has real support for it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
DX11 beats DX12 in performance and CPU usage. Not exactly the story of the promised land we were told.

But the game is another reason to upgrade to Skylake, or soon Kaby Lake, if you haven't yet. :)

There is no proper DX12 game patch, and that means no proper AMD or NV DX12-optimized drivers. It is too early to start making sweeping conclusions regarding DX12 vs. DX11 performance. The DX12 patch and AMD driver update resulted in an 18-25% performance boost for RoTR:

[image: 5a4242a2-0d5e-4afa-a033-fc16a532210f.png]


DX12 deniers will never cease to amaze me.
Another rushed AAA release. Let's see the DX12 patch in September.

It's natural they are downplaying it considering NV has touted DX12 support with Kepler, Maxwell and now Pascal, but on a hardware level, they are still behind 2012-2013 GCN as they haven't even incorporated hardware ACEs into their 2016 architecture. Fingers crossed Volta is the true DX12 NV architecture because by 2018 DX12 should be a lot more common.

Ironic considering how these same posters spent months criticizing and dissecting every sub-line of DX12 support during Kepler/Maxwell vs. GCN 1.0/1.1/2.0 generations. Since then the R9 290 (aka 390) has managed to take out the 780, 780Ti, and the 970. At this pace, both the RX 480 and the R9 290/390 may just drop the 980 in this title under DX12. I look forward to seeing the $250-275 R9 290 and $240 RX 480 take out the $550 980 and $700 780Ti in Deus Ex under DX12.

DX12 is showing a lot of promise, when implemented correctly, at least on GPU architectures that can perform massively parallel workloads without a context switch penalty.

[image: XxhXDJKP2x6Z9jPMP7XgjD.png]
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
My poor 2500K. :(

I don't expect that framerate to be entirely representative of my experience, since I've got it overclocked to like 4.2 GHz, but still. I hope DirectX 12 helps take the load off the CPU, or I might just need to upgrade...
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I expect HUGE gains from DX12 when finally released.

For hardware that has real support for it.

If AMD does have "HUGE" gains, it will be because their DX11 performance is so abysmal. It's the same thing with Doom.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
If AMD does have "HUGE" gains, it will be because their DX11 performance is so abysmal. It's the same thing with Doom.

According to the GameGPU review at 1080p DX-11, it's nothing like DOOM with OpenGL:

R9 280X is faster than GTX 960
R9 380X is faster than GTX 780Ti
R9 290 is faster than GTX 970
R9 290X is faster than GTX 980
RX 480 is faster than 290X
Fury X is faster than GTX 980Ti/1070
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's natural they are downplaying it considering NV has touted DX12 support with Kepler, Maxwell and now Pascal, but on a hardware level, they are still behind 2012-2013 GCN as they haven't even incorporated hardware ACEs into their 2016 architecture. Fingers crossed Volta is the true DX12 NV architecture because by 2018 DX12 should be a lot more common.

Right, the lack of ACEs really seems to be hurting NVidia right now in DX12 titles. Let's take Ashes of the Singularity for instance, one of the best showcases for DX12, if not the best, as it supports asynchronous compute and has an exceptionally parallel engine.

[image: GTX-1070-ZOTAC-74.jpg]


In an AMD-sponsored game, with an engine that is geared towards GCN, NVidia is still dominating AMD. The Fury X, which has the benefit of water cooling, is losing to an air-cooled, reference-clocked GTX 980 Ti, let alone an aftermarket version which can be up to 20% faster. And Pascal is outright decimating the Fury cards.

DX12 is showing a lot of promise, when implemented correctly, at least on GPU architectures that can perform massively parallel workloads without a context switch penalty.

All GPUs are massively parallel by necessity, and asynchronous compute adds only about 10% extra on average. On consoles the benefit is greater, since developers have much more control over the workflow and can do more focused optimization. It's going to be a while before asynchronous compute becomes a big thing on PCs, and by then NVidia will likely have a better solution than what they are currently using for Pascal.
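For anyone unclear on what "asynchronous compute" actually means at the API level, here is a minimal D3D12 sketch of the idea (my own illustration, not code from Ashes or Deus Ex; the helper name SubmitFrame is made up, and queues would normally be created once at startup rather than per call). Work recorded on a separate COMPUTE queue may overlap with work on the graphics (DIRECT) queue, with a fence ordering only the dependent part; whether the two queues actually run concurrently is up to the hardware and driver, which is exactly the GCN-vs-Maxwell/Pascal argument.

```cpp
// Minimal D3D12 async-compute sketch (illustration only).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: submits one frame's compute and graphics work so
// the two can overlap on hardware that schedules queues independently.
void SubmitFrame(ID3D12Device* device,
                 ID3D12CommandList* computeWork,   // pre-recorded compute list
                 ID3D12CommandList* graphicsWork)  // pre-recorded graphics list
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc  = {};
    gfxDesc.Type  = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only

    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
    device->CreateCommandQueue(&gfxDesc,  IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // Kick off the compute work on its own queue. On GCN the ACEs can feed
    // this to idle CUs while graphics runs; other hardware may serialize it.
    computeQueue->ExecuteCommandLists(1, &computeWork);
    computeQueue->Signal(fence.Get(), 1);

    // Graphics work that consumes the compute results waits on the fence,
    // so only the dependent portion is ordered behind the compute queue.
    gfxQueue->Wait(fence.Get(), 1);
    gfxQueue->ExecuteCommandLists(1, &graphicsWork);
}
```

The point of the sketch is that the API only expresses the opportunity to overlap; the ~10% average gain quoted above depends entirely on how the GPU schedules those two queues.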

[image: XxhXDJKP2x6Z9jPMP7XgjD.png]


The above is an old graph taken from when the game first launched. After numerous driver updates and patches, the performance looks much different:

[image: index.php]

And Hilbert's test bed includes a 5960X at 4.4 GHz on all cores.
 
  • Like
Reactions: zentan

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
According to the GameGPU review at 1080p DX-11, it's nothing like DOOM with OpenGL:

R9 280X is faster than GTX 960
R9 380X is faster than GTX 780Ti
R9 290 is faster than GTX 970
R9 290X is faster than GTX 980
RX 480 is faster than 290X
Fury X is faster than GTX 980Ti/1070

As I've said numerous times, the code GameGPU reviewed isn't final, so let's refrain from making final judgments, shall we?

The day-one patch might change things, and the DX12 update certainly will. It usually takes a few months until we see the final performance of a game, as it takes a while for the cumulative patches and driver updates to take effect.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
[image: index.php]

And Hilbert's test bed includes a 5960X at 4.4 GHz on all cores.

Warhammer is broken. When GPU utilization is 60-70% and DX11 is 30% faster, you know something is really wrong. But it's a nice sabotage of another IHV due to sponsorship.

Funnily enough, Warhammer is also another example where DX12 did nothing to solve the CPU bottlenecks outside of the scripted benches.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Warhammer is broken. When GPU utilization is 60-70% and DX11 is 30% faster, you know something is really wrong. But it's a nice sabotage of another IHV due to sponsorship.

Funnily enough, Warhammer is also another example where DX12 did nothing to solve the CPU bottlenecks outside of the scripted benches.

Well, if it really runs that much slower in DX12, then it definitely is broken. DX12 isn't for schoolboy devs, apparently. Ashes of the Singularity is still by far the best DX12 implementation available; you see fairly large performance increases going from DX11 to DX12.

Rise of the Tomb Raider is also fairly good after the second DX12 patch, but it still has a few kinks to work out.

[image: GTX-1070-ZOTAC-56.jpg]

[image: GTX-1070-ZOTAC-74.jpg]
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Wow, VRAM usage at 1080p is 4.7 GB. I would like to see how a GTX 1060 3GB plays this game at 1080p max settings. IMO, buying a mid-range 4GB card in 2016, even for 1080p, is shortsighted. I would rather recommend a custom RX 470 8GB than a custom RX 480 4GB or GTX 1060 3GB. With 8 GHz memory, the RX 470 8GB Nitro is very close to the reference RX 480 8GB and 8-9% slower than a custom RX 480 8GB running at the same clocks. It's a much better investment to go with the RX 470 8GB rather than opt for 8-9% higher performance clock for clock with the RX 480 4GB, as we are seeing RX 470 8GB and RX 480 4GB prices that are almost the same.

https://www.computerbase.de/2016-08/radeon-rx-470-test/3/
https://www.computerbase.de/2016-07/sapphire-radeon-rx-480-nitro-oc-test/3/
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Wow, VRAM usage at 1080p is 4.7 GB. I would like to see how a GTX 1060 3GB plays this game at 1080p max settings. IMO, buying a mid-range 4GB card in 2016, even for 1080p, is shortsighted. I would rather recommend a custom RX 470 8GB than a custom RX 480 4GB or GTX 1060 3GB. With 8 GHz memory, the RX 470 8GB Nitro is very close to the reference RX 480 8GB and 8-9% slower than a custom RX 480 8GB running at the same clocks. It's a much better investment to go with the RX 470 8GB rather than opt for 8-9% higher performance clock for clock with the RX 480 4GB, as we are seeing RX 470 8GB and RX 480 4GB prices that are almost the same.

https://www.computerbase.de/2016-08/radeon-rx-470-test/3/
https://www.computerbase.de/2016-07/sapphire-radeon-rx-480-nitro-oc-test/3/

I know, right? The VRAM creep is real. Guess all those people last year who said 4 GB was fine for the Fury X were mistaken as well (4 GB for super high end in 2015). If history has shown us anything, it's that more VRAM never seems to be a bad thing.
 
  • Like
Reactions: zentan

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Reported VRAM usage doesn't really mean the game is actually using all of that VRAM, though. Most of it is being used as a cache, which is a good thing as that reduces texture swapping.
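One way to see the distinction yourself: on Windows 10 you can compare the VRAM a process has actually committed against the budget the OS grants it via IDXGIAdapter3::QueryVideoMemoryInfo. The rough sketch below is just my own illustration (the function name and the "adapter 0" choice are mine), but it shows why an on-screen "usage" counter doesn't tell you how much of that memory is genuinely needed each frame versus sitting around as a streaming cache.

```cpp
// Rough illustration: query dedicated-VRAM budget vs. committed usage
// for the first adapter (requires Windows 10 / dxgi1_4.h, link dxgi.lib).
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

void PrintLocalVramInfo()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return;

    ComPtr<IDXGIAdapter1> adapter;
    if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND) return;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    // LOCAL = the dedicated VRAM segment on a discrete GPU.
    if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
    {
        std::printf("Committed: %llu MB of a %llu MB OS budget\n",
                    static_cast<unsigned long long>(info.CurrentUsage >> 20),
                    static_cast<unsigned long long>(info.Budget >> 20));
    }
}
```

In practice, engines respond to the Budget value shrinking by evicting cached assets, which is why reported "usage" tends to grow to fill whatever card you put in.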
 

tg2708

Senior member
May 23, 2013
687
20
81
I see the 6700K pushing 72 FPS vs. 49 FPS for the 4670K at max in the chart above. Can someone please explain what that means for gaming performance? A 22 FPS increase is huge, so why do people across the internet keep saying an i5 will not hamper a strong GPU in games? So without an i7, is a strong GPU pretty much worthless?