[GameGPU] Dying Light - Horrible game engine CPU optimizations

Page 5

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
How is it possible that AMD is performing this well with GameWorks and no driver?

Both consoles have GCN hardware. Games are inherently optimized for AMD's cards this round, save for whatever GameWorks features favor Nvidia's.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Yes, 980 SLI support is there, but GPU utilization was around 60-70% outdoors and, strangely, peaks indoors. I haven't tested it with the 1.2.1 patch yet; I will later today, but they claim there are some performance improvements.
 

Udgnim

Diamond Member
Apr 16, 2008
3,680
124
106
How is it possible that AMD is performing this well with GameWorks and no driver?

GameGPU benchmarks show Nvidia cards demolishing AMD

TechSpot benchmarks show AMD cards being very competitive with Nvidia

I wonder what methodology GameGPU used.

TechSpot's was mostly indoors.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That might be the difference, indoor vs outdoor. Maybe the draw distance slider affects AMD more right now. Indoors that probably has very little effect.
 

Udgnim

Diamond Member
Apr 16, 2008
3,680
124
106
That might be the difference, indoor vs outdoor. Maybe the draw distance slider affects AMD more right now. Indoors that probably has very little effect.

the thing is, the higher the draw distance slider, the more the game becomes CPU limited

so I'd think there should be less of a difference between AMD and Nvidia GPUs as the draw distance slider increases

could just be something GameWorks-related that Nvidia GPUs make use of well
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
I made a somewhat long gameplay video (22mins), recorded with an external recorder at 1080p/60fps, so the recording itself would not consume system resources at all. The OSD shows how the system gets strained at various parts.

Dying Light - PC Gameplay on gtx 970 @1.5ghz - 1080p 60fps

Once more, we witness CPU limits on a new AAA title.

I also made this video that illustrates how draw distance affects CPU performance and what the rest of the settings have to say about all this.

Dying Light - Graphics settings impact on performance - 1080p 60fps

All that on a GTX 970 boosting at 1.5GHz and a Core i5 2500K @ 4.8GHz.

I will do proper benchmarking today on my other systems as well.

I redid a quick test with the new patch.

[patch 1.2.1] Dying Light quick test on how settings impact performance

With max Viewing distance on the same outdoor location, the framerate jumped to 72fps from 45fps. Pretty awesome!



I just finished uploading my benchmark session as well, if anyone would be interested. :)

The first two runs use the same settings, as do the last two.

Dying Light 1920x1080 High, GTX 970 @ 1.5GHz, Core i5 2500K @ 4.8GHz - 65fps

Dying Light 1920x1080 High, Radeon 7950 @ 1.1GHz, Core i7-860 @ 3.9GHz - 38fps

Dying Light 1920x1080 Medium, GTX 570 @ 850MHz, Q9550 @ 4GHz - 43fps

Dying Light 1920x1080 Medium, HD 5850 @ 950MHz, Q9550 @ 4GHz - 36fps

To tell the truth, even with reduced settings the 570 and the 5850 don't look too bad. It's certainly more playable, and better looking, than Unity.

I also did a whole run on the GTX 970 benchmark.

I'll note that I did not use max viewing distance in either of my custom benchmarks, so I did not see much difference.

before
2015-01-28 21:52:38 - DyingLightGame
Frames: 19597 - Time: 300988ms - Avg: 65.109 - Min: 0 - Max: 98

after
2015-01-30 22:59:03 - DyingLightGame
Frames: 20603 - Time: 303890ms - Avg: 67.798 - Min: 33 - Max: 100
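As a quick sanity check, the averages in those summaries match total frames divided by elapsed seconds. A small sketch (the summary-line pattern is simply reconstructed from the logs quoted above):

```python
# Sanity-check the benchmark summaries above: average FPS should equal
# total frames divided by elapsed seconds. The line format here is just
# reconstructed from the logs quoted in this post.
import re

def parse_summary(line):
    """Pull frames, elapsed milliseconds, and reported average from a summary line."""
    m = re.search(r"Frames: (\d+) - Time: (\d+)ms - Avg: ([\d.]+)", line)
    return int(m.group(1)), int(m.group(2)), float(m.group(3))

def computed_avg(frames, time_ms):
    """Average FPS implied by the frame count and elapsed time."""
    return frames / (time_ms / 1000.0)

before = "Frames: 19597 - Time: 300988ms - Avg: 65.109 - Min: 0 - Max: 98"
after = "Frames: 20603 - Time: 303890ms - Avg: 67.798 - Min: 33 - Max: 100"

for line in (before, after):
    frames, time_ms, avg = parse_summary(line)
    print(f"reported {avg:.3f} fps, computed {computed_avg(frames, time_ms):.3f} fps")
```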

Still, the huge frametime spikes our friend noticed earlier did not occur. The game felt more relaxed. There were still spikes, but much smaller ones this time. Maybe that's why the split-second 0 fps drop did not occur on this run.

edit:

Ooh, ooh, I forgot to mention something important. The benchmark result with reduced viewing distance may not have changed much, but what did change was the CPU load. Before, one core was pegged at 100% the whole time; now it hovered around 80%, and the rest of the cores had a somewhat more even spread.
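For anyone who wants to put a number on that "more even spread," here's a quick stdlib-only sketch; the per-core percentages below are illustrative placeholders, not real measurements from the game:

```python
# Quantify how evenly CPU load is spread across cores: the peak core and
# the coefficient of variation (std dev / mean) both drop when work is
# distributed more evenly. The sample loads are made-up placeholders.
from statistics import mean, pstdev

def load_spread(per_core_loads):
    """Return (peak core %, coefficient of variation) for one load sample."""
    avg = mean(per_core_loads)
    return max(per_core_loads), (pstdev(per_core_loads) / avg if avg else 0.0)

one_core_pegged = [100, 35, 30, 25]   # pre-patch pattern: one core at 100%
more_even = [80, 65, 60, 55]          # post-patch pattern: peak ~80%, flatter

print(load_spread(one_core_pegged))
print(load_spread(more_even))
```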
 
Last edited:

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
A friend of mine who owns a 980 came over the day after Dying Light launched, and his 980 worked flawlessly in my machine, while my 970 stuttered in his. Something is wrong with these cards; I just want my money back so I can get a working product, that would be great. I don't think I'm being unreasonable to ask for an official response and a refund, so that everyone who bought the product can get theirs.
It's very hot down at the Nvidia campus tonight. Is anybody else having similar issues here, or is the guy making it up?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
With the 1.2.1 patch for Dying Light, CPU threading is MUCH better. Using HWiNFO I'm seeing up to 10 threads being used concurrently with load pretty evenly spread out so they did a nice job there. I'm also getting much smoother and consistent gameplay vs 1.2.0. They're going to be addressing multigpu even further in another patch so things are looking good for this game. Right now at 1440p with all settings except view distance (set at about 60%) maxed, I'm consistently over 60 fps and often hitting 120+.

If they can raise GPU utilization to 90%+ for both GPUs consistently, then staying at 100+ fps with all settings maxed should be easy.
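That projection can be sanity-checked with a back-of-the-envelope estimate. The sketch below assumes FPS scales linearly with GPU utilization while GPU-bound, which ignores CPU limits, so treat it as an upper bound:

```python
# Rough upper-bound projection: if FPS scaled linearly with GPU utilization
# (a simplification that ignores CPU limits), raising utilization from the
# observed 60-70% to 90%+ would scale frame rates proportionally.
def projected_fps(current_fps, current_util, target_util):
    """Linear-scaling projection of FPS at a higher GPU utilization."""
    return current_fps * target_util / current_util

# From ~65% utilization, at the 60 fps and 120 fps marks mentioned above:
print(projected_fps(60, 0.65, 0.90))    # ~83 fps
print(projected_fps(120, 0.65, 0.90))   # ~166 fps
```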
 

caswow

Senior member
Sep 18, 2013
525
136
116
With the 1.2.1 patch for Dying Light, CPU threading is MUCH better. Using HWiNFO I'm seeing up to 10 threads being used concurrently with load pretty evenly spread out so they did a nice job there. I'm also getting much smoother and consistent gameplay vs 1.2.0. They're going to be addressing multigpu even further in another patch so things are looking good for this game. Right now at 1440p with all settings except view distance (set at about 60%) maxed, I'm consistently over 60 fps and often hitting 120+.

If they can raise GPU utilization to 90%+ for both GPUs consistently, then staying at 100+ fps with all settings maxed should be easy.

Sorry to burst your bubble, but this is not correct.

http://www.neogaf.com/forum/showpost.php?p=149821649&postcount=712

http://www.neogaf.com/forum/showpost.php?p=149816618&postcount=703

This whole thing is a disaster. I thought GameWorks libraries would help the developers save some time to improve other things in their games :thumbsdown:
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Sorry to burst your bubble, but this is not correct.

http://www.neogaf.com/forum/showpost.php?p=149821649&postcount=712

http://www.neogaf.com/forum/showpost.php?p=149816618&postcount=703

This whole thing is a disaster. I thought GameWorks libraries would help the developers save some time to improve other things in their games :thumbsdown:

I have had a hardware monitor open in front of me before and after the patch and can see CPU utilization has gotten much better, so why should I care what Joe Blow on NeoGAF thinks? If they adjusted the view distance for performance reasons, what does that have to do with better CPU utilization? That's right, nothing.

And what does GameWorks have to do with any of this? They added NV-specific effects, and those seem to be working just fine. If the title is unoptimized in other areas, that's the developer's fault; it has nothing to do with NV or GW.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
Yup, 1.2.1 made it a little better. I didn't have many issues with the release version, except for the freezing before cutscenes start. That has gotten a little better, but still happens sometimes.

Still a fun game. Wondering how many complaints here are from people who actually own the game...
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Yet another game I'll pick up 12-24 months after release, when a small avalanche of patches and driver updates have finally rendered the thing acceptable for the 5 bucks I'll spend on it.

Early adopters: your generosity is appreciated
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
This game is fun and running like butter on a GTX 980 with everything maxed and draw distance at 70%.

Sorry, I have to say this: I tried this game on an R9 290X Lightning Edition and I can't get above 30fps no matter what settings I try, even on low, but if I use a GTX 980 I get an average of 75fps at max settings.
 
Last edited:

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
This game is fun and running like butter on a GTX 980 with everything maxed and draw distance at 70%.

Sorry, I have to say this: I tried this game on an R9 290X Lightning Edition and I can't get above 30fps no matter what settings I try, even on low, but if I use a GTX 980 I get an average of 75fps at max settings.

http://greengamers.com/dying-light-how-to-disable-film-grain-increase-fps-performance/

Disable film grain.

I am also pretty annoyed by the poor performance I'm getting with quad 290X and everything at medium at 8040x1440 (11.5MP). I disabled film grain and shadows and now average 55-70 fps. It sucks that it runs so badly, because I'm a fanatic of this kind of game.

But my cards only reach a MAX of 40% usage, and this is using a custom CrossFire profile of Optimize 1x1.
 
Last edited:

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Apparently they moved some settings around to make it seem like they did better "optimization", shadows for example. Hilarious.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
http://greengamers.com/dying-light-how-to-disable-film-grain-increase-fps-performance/

Disable film grain.

I am also pretty annoyed by the poor performance I'm getting with quad 290X and everything at medium at 8040x1440 (11.5MP). I disabled film grain and shadows and now average 55-70 fps. It sucks that it runs so badly, because I'm a fanatic of this kind of game.

But my cards only reach a MAX of 40% usage, and this is using a custom CrossFire profile of Optimize 1x1.

I doubt the developers even thought anybody would run this with 4 cards at that resolution. It's such an outlier that I doubt anyone would bother optimizing for it.
 

DownTheSky

Senior member
Apr 7, 2013
800
167
116
I'm running @ 1080p with an R9 280X and get a max of 52% GPU utilization.

How much do you Nvidia guys get?

EDIT: If you're playing with v-sync, turn v-sync off. It gave me like +30% FPS.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I'm running @ 1080p with an R9 280X and get a max of 52% GPU utilization.

How much do you Nvidia guys get?

EDIT: If you're playing with v-sync, turn v-sync off. It gave me like +30% FPS.

FPS is all over the place for me in this game. I reduced the view distance to 50%, because any higher and it was pegging one core at near 100%, and the game still runs like ass. I still only get 50-70% GPU usage on both cards.

Totally inconsistent framerates regardless of what's on screen: 90+fps, then it plummets to 30fps. Another Nvidia Crapworks title to toss in the bin with Watch Dogs and AC Unity.