Dumbfounded. Metro Last Light CPU and GPU usage - makes no sense!


Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I just ran this benchmark last night. I haven't started playing this game yet :$. I did the same run with the same settings as you, Balla: 1200/1675, with the 3570K @ 4.8GHz and my RAM @ 1600MHz. I got 56 fps, but the interesting thing is that when I upped the clocks to 1250MHz, I only saw an average fps increase of 2. Looks to me like it's not responding very well to any additional clocks :confused:. My CPU usage was around 40-50% and my GPU usage was 99%. I will post a screenshot when I get the chance.
 
Last edited:

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
If you want to find out whether you are CPU limited, drop the resolution to the absolute minimum the program lets you (e.g. 640x480 or 800x600).
After this, increase the resolution until FPS starts to go down; at that point you are GPU limited.

Whenever you overclock your GPU, the GPU % metrics change, so they are no longer valid for direct comparison.
You can try to compensate for it, but it's not simple math.
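
To put rough numbers on why the raw percentages aren't comparable, here's a minimal Python sketch of the naive linear correction. The readings are made up, and as I said, real scaling isn't this simple:

Code:
# Rough, back-of-the-envelope normalization of GPU utilization readings
# taken at different core clocks. Assumes utilization scales linearly
# with clock, which it does NOT in practice (memory bandwidth, boost
# states, and engine behavior all interfere) -- a sanity check only.

def effective_load(util_percent, core_mhz, reference_mhz):
    """Scale a utilization reading to a common reference clock."""
    return util_percent * (core_mhz / reference_mhz)

# Hypothetical readings: 99% at stock 800MHz vs. 75% at 1250MHz.
stock = effective_load(99, 800, 800)    # 99.0 "units" of work
oc = effective_load(75, 1250, 800)      # ~117.2 "units" of work
print(f"stock: {stock:.1f}, overclocked: {oc:.1f}")
# The overclocked card is doing more absolute work despite the lower
# percentage, which is why raw % can't be compared across clocks.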
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
If you want to find out whether you are CPU limited, drop the resolution to the absolute minimum the program lets you (e.g. 640x480 or 800x600).
After this, increase the resolution until FPS starts to go down; at that point you are GPU limited.
Yep, and turn off AA too.

The fact is, CPU % can't really tell you anything because some game engines peg the CPU at full bore regardless of their actual workload.

About the only counter you can rely on is a flat-line GPU 99%, which tells you that you're completely GPU limited. Like this:

[screenshot: GPU utilization graph flat-lined at 99%]
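
If you want to check for that flat-line yourself instead of eyeballing a graph, here's a rough Python sketch. It assumes an NVIDIA card with nvidia-smi on the PATH, and the 97% threshold is an arbitrary choice:

Code:
# Polls GPU utilization once a second while a benchmark runs and reports
# whether it stayed flat-lined near 99% (i.e. completely GPU limited).
# Assumes an NVIDIA card with nvidia-smi available on the PATH.
import subprocess
import time

samples = []
for _ in range(30):  # sample for ~30 seconds
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]).decode()
    samples.append(int(out.split()[0]))  # first GPU only
    time.sleep(1)

if min(samples) >= 97:
    print("Flat-lined near 99% -- completely GPU limited.")
else:
    print(f"Dipped to {min(samples)}% -- not purely GPU limited.")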
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
You're not bottlenecked. It's just that that particular area of the game isn't very GPU intensive. Generally speaking, Metro Last Light isn't very system intensive at all (unless you have PhysX and/or SSAA turned on). Not only is it a lot more optimized than its predecessor, but the engine is highly optimized for consoles in terms of resource usage.

It uses comparatively much less VRAM and system memory than, say, Crysis 3, and even BioShock Infinite.

I guess they took the criticism of Metro 2033's lack of optimization to heart :D

If a game is failing to utilise all of either the CPU or GPU (if not both, ideally) and is still below 150-300fps, then in my book that means it could be optimised a hell of a lot BETTER.

I agree that in this case Balla has a sort of GPU bottleneck. But if the game were truly well optimised, his GPU would be running at near-100% load rather than 70-75% load.
 

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Okay OP, I firmly believe that Metro Last Light is at fault and not your GPUs. Some games are rather buggy and require a fix to obtain full GPU utilization.

Fallout 3, for example, is a very light game, but I was getting tons of stuttering. My processor ran each core at 1.2GHz since the game is so old, my GPU utilization kept dropping like a rock, and my framerate went down when that happened. I downloaded a user-made patch that prevents that from happening.

Look how Crysis 3 keeps your GPUs working hard. That is what we expect in all of our games, but that is not the case here. It sounds like people with high-end GFX setups cannot fully utilize their GPUs in Metro LL.

I also remember reading that Metro Last Light is far more CPU dependent than Metro 2033. Anyone have SLI/Crossfire GPU usage numbers for Metro 2033? We also do not know how many threads Metro LL can utilize. If the game can only use up to 4 threads, 100% CPU utilization will not occur with hyperthreading.
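
To make that last point concrete with a quick bit of arithmetic (the 4-core/8-thread chip is hypothetical, since we don't know the game's real thread count):

Code:
# Upper bound on reported CPU usage when a game can only run N threads.
# Hypothetical: a 4-thread game on a 4-core/8-thread hyperthreaded CPU.
game_threads = 4
logical_cpus = 8
print(f"max reported CPU usage ~ {100 * game_threads / logical_cpus:.0f}%")
# -> 50%, so "only" 50% CPU usage can still mean completely CPU bound.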

I advise lowering your resolution to 1024x768 and then telling us your GPU usage numbers and FPS. If your usage numbers are lower and FPS is higher, that will also prove your GPUs are at fault.

There could be another possibility at play. GPUs can be bottlenecked in more than one way, and there could be something we are not monitoring (VPU usage, for example).

If someone in here with SLI could post their GPU utilization numbers, that would be appreciated.
 
Last edited:

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Did some benchmarks with a single 7950 in this game at different clock speeds. Here are my results:

3570K @ 4.8GHz
1600MHz DDR3

Everything maxed @ 1080P - no SSAA, no PhysX

800/1250 (AMD stock reference clocks): [benchmark screenshot]

880/1250 (MSI TF3 stock clocks): [benchmark screenshot]

1000/1250: [benchmark screenshot]

1150/1250: [benchmark screenshot]

1150/1650: [benchmark screenshot]

1250/1750: [benchmark screenshot]
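
As a rough way of quantifying the scaling, here's a quick Python calculation using the two data points from my earlier run (56 fps at 1200MHz core, 58 fps after the +2 bump at 1250MHz):

Code:
# How much of a core-clock increase turned into frames, using the two
# data points from my earlier run (56 fps @ 1200MHz, 58 fps @ 1250MHz).
clock_gain = 1250 / 1200 - 1   # ~4.2% more core clock
fps_gain = 58 / 56 - 1         # ~3.6% more frames
print(f"clock +{clock_gain:.1%}, fps +{fps_gain:.1%}, "
      f"scaling ratio {fps_gain / clock_gain:.2f}")
# 1.00 would mean purely core-clock limited; anything well under 1.00
# means something else (memory, CPU, engine) is in the way.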
 
Last edited:


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
:biggrin:

I would also be interested to see how the GTX 6-series and GTX 7-series do in this bench at the same settings. Anyone care to chime in?

If I am reading this right, a stock Titan is about 11 frames higher on average vs. my lowly 7950 @ 1250/1750 - of course there are other variables.

Here's one with PhysX turned on, SSAA turned off:

[benchmark screenshot: PhysX on, SSAA off]


And here's one using the settings you requested:

[benchmark screenshot: requested settings]


And finally, one with SSAA enabled and PhysX disabled:

[benchmark screenshot: SSAA on, PhysX off]


Looks like SSAA is more system intensive than PhysX. I monitored my GPU utilization, and it was above 90% for the most part. GPUs were running at 1267MHz with memory at 7.7GHz.
 
Last edited:

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Thanks for doing that. Any way you could run the benchmark with one GTX 770 @ 1080P on very high settings, including VH tessellation, with no SSAA and no PhysX? And then run two with the same settings?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Thanks for doing that. Any way you could run the benchmark with one GTX 770 @ 1080P on very high settings, including VH tessellation, with no SSAA and no PhysX? And then run two with the same settings?

I can, but I don't think the performance will be representative of a single-GPU system, because the drivers are still configured for multi-GPU at the kernel level.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I can, but I don't think the performance will be representative of a single-GPU system, because the drivers are still configured for multi-GPU at the kernel level.

Hmm... are there separate drivers for SLI configs? Can't you just disable SLI in the Nvidia control panel?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Hmm... are there separate drivers for SLI configs? Can't you just disable SLI in the Nvidia control panel?

No, what I meant was that SLI is never truly disabled unless you do a driver reinstall. Anyway, I ran the benchmarks:

SLI disabled:

[benchmark screenshot: SLI disabled]


SLI enabled:

[benchmark screenshot: SLI enabled]
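
For what it's worth, if anyone wants to see whether the second card genuinely sits idle when SLI is "disabled", here's a rough sketch that logs per-GPU utilization once a second (again assuming nvidia-smi on the PATH):

Code:
# Logs per-GPU utilization once a second, so you can see whether the
# second card is actually idle with SLI disabled. Assumes an NVIDIA
# setup with nvidia-smi available on the PATH.
import subprocess
import time

for _ in range(30):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"]).decode()
    # One line per GPU, e.g. "0, 99" and "1, 2"
    print(" | ".join(line.strip() for line in out.strip().splitlines()))
    time.sleep(1)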
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
No, what I meant was that SLI is never truly disabled unless you do a driver reinstall. Anyway, I ran the benchmarks:

SLI disabled:

[benchmark screenshot: SLI disabled]

Thanks again for doing this. Looks like, even without re-installing your display driver, our cards are about equal (overclocked) in average framerates.

I do notice your minimums are much better. My result has way more dips vs. your GTX 770 (CPU or AMD driver?), and your bench has a higher max framerate too.

1250/1750: [benchmark screenshot]
 
Last edited:

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Balla - what were your 7950 clocks at in this bench? You are about 5fps from Carfax83's GTX 770s.

[benchmark screenshot]
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
Can we salmon the conversation back upstream to where your 4670K is stable at 4.8GHz?

Realtalk, how many goats did you have to sacrifice? :p