[GameGPU] Dying Light - Horrible game engine CPU optimizations

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Usually this is where someone links GPU benchmarks, but I'll leave those for last... because a Haswell Core i3-4330 is faster than a Sandy Bridge Core i7-2600K or an FX-9590.

:eek:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__proz.jpg


Here is the answer why... WTH! A CPU-threading optimization failure. :thumbsdown:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__intel.jpg


2GB VRAM GPUs - Have fun!

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__vram.jpg


Up to 8GB of RAM usage - Yup, that too!

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__ram2.jpg


Now the GPU benches

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__1920.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl__2560.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dying_Light/test/dl_3840.jpg

Source

If this is what's to come for 2015 PC games in terms of optimization and VRAM usage, WOW!

Please link any other benchmarks of this game if you can find them, to validate or disprove this data.

Note:
For anyone wondering, this is an NV GameWorks title (low performance on AMD cards and no CF support for now). The game seems optimized for Maxwell, with the 980 unusually fast against the 780 Ti: a 22% average-fps advantage and minimums 26% higher at 1600p!
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Looks like the engine is using a single thread for rendering, which explains the CPU bottleneck.

So basically, it's more a case of an inefficient engine rather than Sandy Bridge losing steam.
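That single serialized render thread is basically Amdahl's law in action: once most of the frame is stuck on one core, extra cores barely move the needle. A quick illustrative sketch (the 30%-parallel split is my assumption for illustration, not Techland's actual numbers):

```python
def speedup(p, n):
    """Amdahl's law: overall speedup on n cores when only fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# If ~70% of each frame is serialized on one render thread (p = 0.3),
# even 8 cores give well under 1.4x overall:
for cores in (2, 4, 8):
    print(f"{cores} cores: {speedup(0.3, cores):.2f}x")
```

Which is exactly the pattern in the charts: core count stops mattering and per-core speed becomes everything.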

On another note, GameGpu needs to start including Ivy Bridge in their reviews.
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
Techland is using their home-grown technology here, Chrome Engine. Dying Light is the debut of Chrome Engine 6.

Haven't played it myself yet, but it looks pretty good, and there are lots of moments when you're in high locations atop towers, skyscrapers, etc. In those cases the scenery and draw distance look spectacular. Cranking up the draw distance slider in the options is a good way to kill your performance, though, AFAIK. A good chunk of that seems to be due to poor multi-core optimization.

You can view the launch trailer on YouTube in 1080p/60fps.
 
Feb 19, 2009
10,457
10
76
Didn't realize the per-thread IPC gain was that massive. Something else must be the cause, i.e. something specific to Haswell -- instructions and such?
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
Yeah, looks like a single-thread/dual-thread performance benchmark only.

Also, the "3.5GB" 970 in SLI @ 1600p is rearing its memory limitation's ugly head. Going from 1080p to 1600p is a 20% drop for the 980, but a 30% drop for the gimped 970. Memory usage at 1600p for both the 980 and 290X is over 3.5GB.

In other words, going from a 3GB 780 Ti to a supposedly 4GB 970 gained me hardly any VRAM longevity.
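For what it's worth, the resolution jump alone doesn't explain the gap: 1600p is about 1.98x the pixels of 1080p, so both cards face the same extra shading load, and the 970's additional drop (30% vs 20%, per the numbers above) points at something else, like spilling past its fast 3.5GB segment. Quick arithmetic:

```python
# Pixel counts for the two test resolutions
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1600p = 2560 * 1600   # 4,096,000
ratio = pixels_1600p / pixels_1080p
print(f"pixel ratio: {ratio:.2f}x")  # ~1.98x
```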
 
Last edited:

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Too bad gamegpu doesn't test IB CPUs; it would be interesting to see whether whatever SB is choking on here is solved on IB, or only addressed on Haswell as these results suggest. It'd also be interesting to see what a 4.5GHz 2500K/2600K could do here, but then we have the example of the 8350 -> 9590 not gaining much.

Still, seeing a puny little i3-4330 walk all over even SB-E is shocking. It's also strange that the 4670K's avg fps is lower than the 4330's; no way that's valid data (4C/4T > 2C/4T), even in a game that loads two cores at most.

Let's see what other sites have to say about this game when they test it.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Looks like the engine is using a single thread for rendering, which explains the CPU bottleneck.

So basically, it's more a case of an inefficient engine rather than Sandy Bridge losing steam.

Looks like the game only uses two primary threads.

You guys are right! Thanks for pointing this out. I was just so pissed seeing that level of performance on Core i5/i7 SB CPUs that I missed the part where the engine's threading is limiting CPU performance. I corrected the title and added the necessary chart.
 

Udgnim

Diamond Member
Apr 16, 2008
3,679
122
106
no crossfire support

game loves Nvidia hardware and/or hates AMD hardware

anyone buying a new GPU that only has 2GB VRAM is going to regret it, but that's stating the obvious
 
Last edited:

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
I can attest to the poor CPU optimization. My 4670K at 4.7GHz can't stay above 60 fps in many places; nothing below 40 fps, though, so it's basically playable.
 
Feb 19, 2009
10,457
10
76
no crossfire support

game loves Nvidia hardware and/or hates AMD hardware

NV sponsored title with GameWorks.

Nothing new, seen it happen time & again.

Also, single-threaded* in 2015 is unacceptable for such an expensive AAA title. Note, this isn't some $19.95 indie game; it's selling for $70 here.

*Or even 2.
 
Last edited:

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
So, the 4670K has more cache and a higher clock, yet loses to a crippled i3. But the lower-clocked 5960X trumps the 4770K. Hmm.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Still, to see a puny little i3 4330 walk all over even SB-E, it's shocking.

This game engine has me scratching my head. The i7-4770K has 32% higher avg fps and 36% higher minimums vs. a 2600K. How?!

anyone buying a new GPU that only has 2GB VRAM is going to regret it, but that's stating the obvious

This game is a total coding mess. Even 3GB of VRAM isn't enough.

From NV's GeForce Guide:

"Unfortunately, Dying Light doesn't allow for Texture Quality to be changed mid-game. This, combined with checkpoint-style respawn locations and a constantly-changing time of day make direct comparisons in interesting locations impossible. From testing, it appears that there is no difference in quality between the two settings, and that High may merely be storing more textures in memory on suitably equipped GPUs. For example, running around an area on Medium resulted in a modicum of texture pop-in and VRAM usage of around 2GB. Repeating the test on High resulted in zero pop-in and VRAM usage that topped out at 3.3GB, though during longer gameplay sessions usage of nearly 4GB has been observed."
http://www.geforce.com/whats-new/gu...performance-guide#dying-light-texture-quality

^^
So we need a 3.3-4GB videocard to have no texture pop-in, even though the texture quality hardly changes?!

Is Techland's idea of a next-gen PC game one that wastes VRAM resources and isn't optimized beyond 2 cores?


--

TotalBiscuit's Twitter Rant:
"At some point between this morning and now FPS cut itself in half"

or this on NeoGaf:

"Both screens were taken with almost everything maxed out (Draw distance is not maxed out). I've tweaked some settings but saw no performance improvement. Maybe getting one or two frames per second when lowering shadowmaps from high to medium.

My specs are
Core i5 2500k @ 4.3Ghz (might be the issue here)
780Ti 3GB

16GB RAM (Can't remember frequency, sorry)

PS: Issues aside I still think this is a fun game and framerate isnt bothering me that much. I also have the newest Nvidia Drivers installed. Maybe this one has a day 1 update that hasnt been released yet.

Edit 2: upon further testing it seems i am able to get 40+ frames per second if I play it with everything off and on 1024x768."

http://www.neogaf.com/forum/showthread.php?t=980126

I don't know if I should laugh or... cry at modern Day 1 "AAA" PC gaming.
 
Last edited:

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Damn.

This makes Unity look like a well-coded game :D



Still, what is it that Haswell does so much better than SB to pull off these gains? I highly doubt they developed this game against Haswell and left everything else behind.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It's a console port.

But I played it today. Didn't notice any issues, performance drops or flaws yet.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Haven't played it for myself yet, but it looks pretty good, and there are lots of moments when you're in high locations on top of towers, skyscrapers, etc etc. In those cases the scenery and draw distance look spectacular.

Thanks to your comment I scrolled down to this section in the NV guide and what do we have here:

"Of all settings, View Distance has the largest impact on performance, reducing framerates by more than half at 1920x1080."

dying-light-view-distance-screenshot.png


dying-light-view-distance-performance-overclocked-cpu-640px.png


There is no way a GM200 would hit 60 fps at 1080P with maxed out draw distance. WOW!

Nvidia Guide: "Usage of Core 1 will typically be pegged at 100% in Dying Light, but as the View Distance setting is raised GPU usage plummets, as does the framerate. Testing at other resolutions further confirms these results, with performance varying by only a few frames per second."

dying-light-cpu-performance.png


A 5930K @ 4.4GHz showing 98% CPU usage on Core 1.

BBamXwx.jpg


Didn't notice any issues, performance drops or flaws yet.

I am pretty sure you don't have everything maxed out, like draw distance. @ 1080P, a 980 barely gets above 30 fps in the wide open areas at 100% draw distance. See above.
 
Last edited:
Aug 11, 2008
10,451
642
126
Well, it utilizes a dual core well. Not to quibble over semantics, but it uses multiple cores poorly. It would be interesting to see benchmarks of an overclocked Pentium in this game.

There does seem to be something about the Haswell architecture that gives a boost, evidenced by the Sandy i3 vs the Haswell i3: a 13% increase in clockspeed vs 75% higher FPS.
And, as Russian also noted, the same effect for the 4770K vs the 2600K.
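Working those numbers through: a 13% clock advantage turning into 75% more fps implies roughly a 55% per-clock advantage in this game, far beyond the uplift usually quoted for Haswell over Sandy Bridge, which points at the engine hitting something architecture-specific rather than general IPC. Back-of-envelope:

```python
# Figures from the i3 comparison above
clock_gain = 1.13  # Haswell i3 clocked ~13% higher than the Sandy i3
fps_gain = 1.75    # ~75% higher fps in the chart
per_clock = fps_gain / clock_gain
print(f"implied per-clock advantage: {per_clock:.2f}x")  # ~1.55x
```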

I agree with another poster, though, that the 5960X results look weird. A very single-threaded game, but somehow a 3GHz 8-core is the fastest???

How does this thing run on a console at all, with such poor multi-core usage and such slow single-thread performance on the console CPUs?
 

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
The consoles probably run with a pretty low view distance and a 30 FPS target.

Also, I'd guess the console version runs with lower CPU overhead and less dependency on one thread...
I guess DX12 is needed.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Most big games these days are cross-platform, designed with consoles in mind. PC features are optional or "bonus" extras.

So it's kinda pointless to say it's a console port, since they nearly all are. ;)

But you know what unifies all of the worst-coded/performing/stuttering recent console ports so far? GameWorks -- AC Unity, Watch Dogs, FC4, Dying Light. That's 4/4 GW games with major issues. Having said that, I find it hard to believe GameWorks resulted in a game engine that barely scales beyond 2 cores. Sounds like Chrome Engine 6 is just garbage in its current state.

Take a look at the TW3 thread - we might be 5/5 soon.
- i7 4790K and 980 to run the PC version on High, not Ultra.
- They are contemplating locking the PC version to 30 fps.
- "True next gen PC games" -- unfinished M+K controls 4 months before launch :sneaky:
http://forums.anandtech.com/showthread.php?t=2418609

If this is the sign of 2015 PC gaming console ports, count me out unless Adam wants to make a donation for a 5960X and Quad-980s. ;)
 
Last edited:

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
I can attest that it runs at 100% on my first core and the rest are somewhat used.

Honestly, none of the other video settings had much of an impact on framerate besides the view distance slider. Even HBAO+ only cost 1 or 2 fps.

My biggest problem with the game is that the included anti-aliasing solution is poor, and the chromatic aberration looks especially horrible.

I will say I'm having a blast with it. Co-op is fun, just running around stabbing zombies or dropkicking them. My friend with an HD 7970, however, is getting a shit framerate -- like 10-20 fps in open areas even on the lowest settings, while mine hovers around 40-60 on a mix of high and medium. Maybe AMD needs to optimize their drivers a bit.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
BBamXwx.jpg




I am pretty sure you don't have everything maxed out, like draw distance. @ 1080P, a 980 barely gets above 30 fps in the wide open areas at 100% draw distance. See above.


Is it just me, or is that really using over 4GB of VRAM? :eek:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126