Determining a bottleneck: an approach

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I see a lot of questions about whether something is CPU or GPU limited. While it's far more complicated than just one or the other, you can test for bottlenecks quite simply:

1) Run task manager and watch the CPU usage.
2) Run GPU-Z and log the GPU usage.
3) Run the game/benchmark.

If the GPU-Z usage is greater than 90%, then it's GPU limited; otherwise it's CPU limited. If you take the usage percentage from the CPU chart and multiply it by the number of logical cores you have (e.g. 4 for an i5 or 8 for an i7), you get roughly the number of cores the game is fully using.

I personally think it's that simple. Has anyone got a more sophisticated approach?
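A rough sketch of the arithmetic above, assuming you have already collected overall CPU % and GPU load % samples from Task Manager and a GPU-Z log; the 90% threshold and the logical-core count are just the rules of thumb from this post, not fixed rules:

[CODE]
# Minimal sketch: classify samples as GPU- or CPU-limited and estimate cores in use.
# Assumes the overall CPU % and GPU load % samples were already collected
# (e.g. from Task Manager and a GPU-Z sensor log).

LOGICAL_CORES = 8  # e.g. a quad-core i7 with Hyper-Threading

def summarize(cpu_percent_samples, gpu_load_samples):
    avg_cpu = sum(cpu_percent_samples) / len(cpu_percent_samples)
    avg_gpu = sum(gpu_load_samples) / len(gpu_load_samples)
    # Rough estimate of how many logical cores the game keeps busy.
    cores_used = avg_cpu / 100 * LOGICAL_CORES
    verdict = "GPU limited" if avg_gpu > 90 else "likely CPU limited"
    print(f"avg GPU load {avg_gpu:.0f}%, avg CPU {avg_cpu:.0f}% "
          f"(~{cores_used:.1f} logical cores) -> {verdict}")

# Hypothetical samples logged while the game was running.
summarize([35, 40, 38, 42], [95, 97, 93, 96])
[/CODE]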
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
1. At any resolution, increase the CPU speed by x%; if FPS goes up by ~x%, you are CPU bound.
2. Keep the CPU clock constant but increase the GPU clock by x%; if FPS goes up by ~x%, you are shader bound.
3. Keep the CPU clock constant but increase the memory clock by x%; if FPS goes up by ~x%, you are bandwidth constrained.
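A minimal sketch of this scaling test; the clock values, FPS numbers, and the 0.8 cutoff below are illustrative assumptions, not part of the test itself:

[CODE]
# Ratio of FPS gain to clock gain; a value near 1.0 means near-linear scaling,
# which is the signature of the component being the bottleneck.

def scaling_ratio(clock_before, clock_after, fps_before, fps_after):
    clock_gain = clock_after / clock_before - 1
    fps_gain = fps_after / fps_before - 1
    return fps_gain / clock_gain

# Example: CPU overclocked from 3.4 GHz to 3.8 GHz, FPS goes from 60 to 66.
ratio = scaling_ratio(3.4, 3.8, 60, 66)
print(f"FPS scaled at {ratio:.2f}x the clock increase")
if ratio > 0.8:
    print("Close to linear -> this component looks like the bottleneck.")
else:
    print("Weak scaling -> the bottleneck is probably elsewhere.")
[/CODE]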
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
1. At any resolution, increase the CPU speed by x%; if FPS goes up by ~x%, you are CPU bound.
2. Keep the CPU clock constant but increase the GPU clock by x%; if FPS goes up by ~x%, you are shader bound.
3. Keep the CPU clock constant but increase the memory clock by x%; if FPS goes up by ~x%, you are bandwidth constrained.

The problem with this is that games almost never scale linearly with any clock speed, so your scenarios will never be observed. Furthermore, separating out GPU clock and VRAM clock makes it even more unlikely you'll ever see the scaling you presume above.

For instance, take these scenarios:
(1) Increase GPU and VRAM clock by 10%, observe 5% higher FPS.
(2) Increase GPU and VRAM clock by 10%, observe 2% higher FPS.

So, in which scenario above are you CPU-bottlenecked? I'd argue that scenario 1 might be a case of a GPU bottleneck, and scenario 2 is probably CPU-bottlenecked, with the improvement coming only at the margins, in scenes where the CPU load drops. But it's all guesswork.

That's why I like the OP's technique better. If your GPU is running at 90% or higher most of the time, you pretty much know it's working at close to its full potential.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
@Termie
What happens when your CPU and GPU usage are both ~90%? Isn't there any bottleneck? There will always be a bottleneck. To identify it we have to separate the scenarios. But I agree with you that it isn't easy, and that is why we have profilers like PerfHUD (up to Fermi) or Nvidia Nsight to nail it down.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
When CPU bound, clock speed scaling is pretty good. Not perfect, but near perfect. If it is not, you are not completely CPU bound.

For example, in Shogun 2, 76% higher clocks resulted in 70% higher performance here.

But the GPU usage approach is easier. Load EVGA Precision or Afterburner, and if you're at 98+%, it is a hard GPU limit. If it is below, you are beginning to get CPU bound. That is not linear, though. 70% GPU usage can mean quite a nasty CPU bottleneck already.
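A minimal sketch of that usage check, assuming the GPU load percentages have already been exported from Afterburner or a similar logger into a plain list; the 97% cutoff stands in for the "98+% = hard GPU limit" rule of thumb above:

[CODE]
# Fraction of logged samples in which the GPU is effectively pegged.

def gpu_bound_fraction(gpu_load_samples, threshold=97):
    bound = sum(1 for load in gpu_load_samples if load >= threshold)
    return bound / len(gpu_load_samples)

samples = [99, 98, 72, 65, 99, 97, 80, 99]  # hypothetical log excerpt
frac = gpu_bound_fraction(samples)
print(f"GPU pegged in {frac:.0%} of samples; the rest hint at a CPU limit")
[/CODE]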
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Hmm, interesting philosophical question: if the CPU and GPU are equally taxed by a game, would you say that there is no bottleneck? Or would you say both are bottlenecked?

As an optimist by nature, I would say there is no bottleneck, because neither is holding the other back; they are balanced.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Hmm, interesting philosophical question: if the CPU and GPU are equally taxed by a game, would you say that there is no bottleneck? Or would you say both are bottlenecked?

As an optimist by nature, I would say there is no bottleneck, because neither is holding the other back; they are balanced.
I would have to go with both being bottlenecks. I say this because there is always a bottleneck. Otherwise, the computing performance would increase to infinity. Both are bottlenecks because both are determining a maximum threshold of performance.

Simply because both are equal does not mean there is no threshold. Both are defining the threshold in that case until one or both components are upgraded.
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
I would have to go with both being bottlenecks. I say this because there is always a bottleneck. Otherwise, the computing performance would increase to infinity. Both are bottlenecks because both are determining a maximum threshold of performance.

Simply because both are equal does not mean there is no threshold. Both are defining the threshold in that case until one or both components are upgraded.


When asking about bottlenecks, we are generally talking about a particular system. In that system, either the GPU or the CPU can be the bottleneck; if both are equally taxed, then there is no bottleneck in that system. Of course there is always a bottleneck, but it is pointless to point out that both components are bottlenecks, because even if you built a multi-million-dollar gaming rig there would still be a bottleneck, and pointing that out would seem a little pedantic.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Yes, well, I gave a pedantic, if true, answer to a philosophical question. I was never referring to a specific system but to the nature of bottlenecks. I suppose my answer addresses bottlenecks in general rather than "practical" bottlenecks.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
The real answer is that most games will have some percentage of frames that are CPU limited and some percentage that are GPU limited.

It's very rarely 100% CPU or 100% GPU limited.

The only way to tell for sure is to use FRAPS or a similar program and run an orthogonal experiment, adjusting the variables you're looking at (2 or 3 CPU speeds and 2 or 3 GPU speeds in this case).

Then plot ms for each frame or "instantaneous FPS" and look for changes between the different cells of your experiment.

This will tell you what areas of gameplay are CPU dependent and what areas are GPU dependent. Demos are sometimes not reliable, as they may play back recorded gameplay rather than actually running the AI. Ideally you'd do this by activating some sort of god-mode cheat before testing so you can run the loop consistently. Even though there will be small discrepancies between runs, I've had solid success in the past using this method to see the CPU-, memory- and GPU-affected areas of an individual game.

Running a 3x3 design will let you see interactions between CPU and GPU speed, but in my experience the particular areas of a game that are CPU and GPU dependent rarely show interactions, so a 2x2 matrix is quicker and easier.

To do this, you also must determine an "acceptable" FPS level so you can use it as a benchmark. In my testing, things start getting annoying to me any time the instantaneous FPS is less than half the refresh rate. In the course of performing experiments like this you actually tune yourself in to this kind of thing. It's good to know what your personal limit is, and it was interesting to me to figure out the connection between the refresh rate and my comfortable FPS. This was several years ago, well before "micro-stutter" was a mainstream concept that defines the issue well.
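A minimal sketch of comparing the cells of such an experiment, assuming each cell's FRAPS-style per-frame times are already loaded as lists of milliseconds; the ~33 ms cutoff corresponds to half of a 60 Hz refresh, and all numbers are hypothetical:

[CODE]
# Fraction of frames slower than the chosen instantaneous-FPS cutoff, per cell.

def slow_frame_fraction(frame_times_ms, cutoff_ms=1000 / 30):
    return sum(1 for t in frame_times_ms if t > cutoff_ms) / len(frame_times_ms)

cells = {
    # hypothetical per-frame times (ms) for each CPU/GPU clock combination
    ("cpu_low", "gpu_low"):   [30, 36, 41, 29, 38],
    ("cpu_low", "gpu_high"):  [28, 35, 40, 27, 37],
    ("cpu_high", "gpu_low"):  [25, 30, 34, 24, 31],
    ("cpu_high", "gpu_high"): [22, 26, 28, 21, 25],
}

for cell, times in cells.items():
    print(cell, f"{slow_frame_fraction(times):.0%} of frames below 30 FPS")
[/CODE]

If raising the CPU clock shrinks the slow-frame fraction more than raising the GPU clock does, those slow stretches are the CPU-dependent areas, and vice versa.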
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
You can’t infer limitations from load graphs; just because the % is high, it doesn’t mean it's the primary bottleneck.

A far simpler and more universal way is to increase the resolution and AA. If there’s a significant performance hit, you’re GPU limited. If there isn’t, you’re CPU limited.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
You can’t infer limitations from load graphs; just because the % is high, it doesn’t mean it's the primary bottleneck.

A far simpler and more universal way is to increase the resolution and AA. If there’s a significant performance hit, you’re GPU limited. If there isn’t, you’re CPU limited.

Resolution can affect CPU load. But I agree that just changing AA can work. The problem is that some games don't offer it, and often where they do, AA takes such a drastic performance hit that it's difficult to draw fine-grained conclusions. So you could easily shift a game from a CPU limitation to a GPU limitation, but you really won't be able to tell that the shift occurred. If the GPU was working at 85% before AA and is pegged at 99% with AA, what conclusion would we draw? The fact that there was a performance hit is not enough information, because you'd conclude that you were GPU limited when in fact you might not have been.
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
It's not always as simple as looking at a graph of usage, especially for the CPU. Many games are still badly multi-threaded and can be CPU limited even though the CPU appears lightly loaded, usually because the load isn't balanced evenly across threads and threads end up waiting for other threads to update first.

The GPU doesn't tend to have that problem; usage can reach 95-100% quite easily, so I tend to gauge bottlenecks by that and graph it with MSI's Afterburner.

The other way is to alter video settings: if you decrease your graphics settings and the load on the GPU goes down but your frame rate does not go up, you're CPU limited; likewise, if you increase your graphics settings and your frame rate does not drop, you're CPU limited.

Remember that CPU/GPU limits depend on the game and on what is happening at the time. The game can flip between being CPU and GPU limited if the demand on either component changes disproportionately. For example, in an MMO like Planetside 2 you can idle at spawn, get 100+ fps, and most likely be GPU limited, while in large battles you can end up at 30 fps and CPU limited owing to the extra players in your vicinity.
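A minimal sketch of spotting that hidden CPU limit while a game is running; it uses the third-party psutil package (pip install psutil), and the 60%/90% thresholds are just illustrative assumptions:

[CODE]
# Overall usage can look low while one thread/core is pegged, which is the
# hidden CPU bottleneck in poorly threaded games.

import psutil

per_core = psutil.cpu_percent(interval=2, percpu=True)  # sample for 2 seconds
overall = sum(per_core) / len(per_core)

print(f"Overall CPU: {overall:.0f}%  Per core: {per_core}")
if overall < 60 and max(per_core) > 90:
    print("One core is pegged despite low overall usage: the game can be "
          "CPU limited here even though Task Manager looks mostly idle.")
[/CODE]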
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
No it can't.

Yes it can, but it has more to do with the aspect ratio than with the absolute number of pixels.

Take 1280x720 vs 1024x768.
1280x720 has a wider field of view, so more objects are visible that need to be calculated by the CPU. Going from 5:4 (1280x1024) to 16:9 (1920x1080), I have seen an FPS decrease of 5-10% in CPU-limited scenes.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,799
1,525
136
Well, if we're going to be that anal, it depends on whether the game is hor+, vert-, or some combination of the two ;)
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
What do you mean?
If the game doesn't show anything or very little at the peripheral area, then performance won't change (much), yes. A good example of this is the Shogun 2 CPU benchmark. The fighting takes place only in the middle of the screen, at the edges there is little going on, thus the performance impact of 16:9 resolutions there is minimal.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,799
1,525
136
What do you mean?

I mean that not all games add to the horizontal field of view; some subtract from the vertical field of view to achieve a wider aspect ratio. A few split the difference and end up both shrinking the vertical FOV and increasing the horizontal.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
No it can't.

Yes it can, but it has more to do with the aspect ratio than with the absolute number of pixels.

Take 1280x720 vs 1024x768.
1280x720 has a wider field of view, so more objects are visible that need to be calculated by the CPU. Going from 5:4 (1280x1024) to 16:9 (1920x1080), I have seen an FPS decrease of 5-10% in CPU-limited scenes.

This is what I meant.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Yes it can, but it has more to do with the aspect ratio than with the absolute number of pixels.

Take 1280x720 vs 1024x768.
1280x720 has a wider field of view, so more objects are visible that need to be calculated by the CPU. Going from 5:4 (1280x1024) to 16:9 (1920x1080), I have seen an FPS decrease of 5-10% in CPU-limited scenes.

I never actually considered this, but it seems like a valid point.

A Vert- based game fixes the horizontal viewing angle and reduces the vertical viewing angle as you increase the ratio between width and height, i.e. the wider your aspect ratio, the less you see on the screen.

A Hor+ based game fixes the vertical viewing angle and increases the horizontal angle as you increase the ratio between width and height, i.e. the wider your aspect ratio, the more you see on the screen.

If you're decreasing the aspect ratio, you'll get the opposite effect.

If the game uses visibility calculations to create and destroy assets that require CPU time, for example characters with AI routines or particle emitters using realtime physics, then the load could change with resolution... or, more precisely, the load could change with aspect ratio.

Remember that you can dramatically change the screen resolution and maintain the same aspect ratio. For example, moving from 2560x1600 to 1280x800 you quarter the pixel count, but the aspect ratio is identical, so the viewable area in game would also be equal.
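A minimal sketch of the Hor+ relationship described above; the 60-degree vertical FOV is an arbitrary example value, and the last two resolutions share a 16:10 aspect ratio, so their horizontal FOV comes out identical:

[CODE]
import math

def horizontal_fov(vertical_fov_deg, width, height):
    # Horizontal FOV of a standard perspective projection with a fixed
    # vertical FOV (the Hor+ case): it grows with the aspect ratio.
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

for w, h in [(1280, 1024), (1280, 720), (2560, 1600), (1280, 800)]:
    print(f"{w}x{h}: horizontal FOV {horizontal_fov(60, w, h):.1f} degrees")
[/CODE]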
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Yes it can, but it rather has to do with the aspect ratio than with the absolute numbers of pixels.
That’s not the resolution doing that; it's caused by the FOV changing.

Changing nothing else except for the resolution has no effect on the CPU load.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I dunno, I tend to think of increasing the total number of pixels as being an increase in the resolution, even if it's only to change the FOV.

For example, I think of going from 1920x1080 to 1920x1200 as an increase in resolution, because there are more pixels. If the game happens to be Vert- and just so happens to increase the FOV with this change, then I guess that would be incidental in my mind to the more visceral act of increasing the pixels/resolution. I guess, usually in games, you adjust the actual resolution by changing the pixels, and the FOV is not usually something you interact with directly, but rather just arises based on the change in resolution.

But I haven't played many of the newer games. Are they letting you adjust FOV with a slider? That would be pretty cool. I remember doing some kind of hack to let me adjust FOV dynamically while playing Bioshock 1 (I was using SoftTH on triple monitors back then; the hack let you press function keys to increase or decrease horizontal FOV). It was kind of "trippy", like a rubber band, but it definitely changed what the computer displayed on screen, to where I guess the CPU had to render more or less stuff depending on the FOV. But usually, I just change the pixels/resolution, and the FOV is not something I choose.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
That’s not the resolution doing that; it's caused by the FOV changing.

Changing nothing else except for the resolution has no effect on the CPU load.
This is true for CPU load per frame.

General CPU load can increase with framerate due to a dynamic game clock (game clock calculations may adjust to the framerate).

But yes, generally the easiest way to check whether you are CPU limited is to use the lowest possible resolution and see if you get a better framerate (generally 640x480 should be faster than 1920x1080 ;)).
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
I dunno, I tend to think of increasing the total number of pixels as being an increase in the resolution, even if it's only to change the FOV.

For example, I think of going from 1920x1080 to 1920x1200 as an increase in resolution, because there are more pixels. If the game happens to be Vert- and just so happens to increase the FOV with this change, then I guess that would be incidental in my mind to the more visceral act of increasing the pixels/resolution. I guess, usually in games, you adjust the actual resolution by changing the pixels, and the FOV is not usually something you interact with directly, but rather just arises based on the change in resolution.

But I haven't played many of the newer games. Are they letting you adjust FOV with a slider? That would be pretty cool. I remember doing some kind of hack to let me adjust FOV dynamically while playing Bioshock 1 (I was using SoftTH on triple monitors back then; the hack let you press function keys to increase or decrease horizontal FOV). It was kind of "trippy", like a rubber band, but it definitely changed what the computer displayed on screen, to where I guess the CPU had to render more or less stuff depending on the FOV. But usually, I just change the pixels/resolution, and the FOV is not something I choose.

Well, you're right: FOV changes when the aspect ratio changes, so any FOV change that comes from a change in screen resolution is purely incidental.

It is worth noting that monitors obviously have fixed aspect ratios, and to make proper use of your monitor's screen space, any resolution you pick should ideally have the same aspect ratio anyway. You could argue that unless you're switching monitors or dumb enough to pick an inappropriate aspect ratio, the aspect ratio should remain the same as you increase or decrease the screen resolution.

FOV has been tweakable in many engines ever since 3D rendering began. It used to be set reasonably high for PC games back in the day, and the FOV was often exposed only through developer consoles or ini/cfg tweaks. Modern games tend to use very small FOVs because they're basically console ports, and console games tend to use low FOVs for several shitty reasons. This has led to complaints on the PC platform and, more recently, to the exposure of FOV controls in the video settings of some modern games like BF3.

Remember the CPU isn't rendering anything; it's mostly doing calculations on things like game state, AI, physics and the like. However, some game assets are often destroyed, or switched to simplified behaviour in the engine, when they leave your visibility, for performance reasons.

For example, once a car or pedestrian in GTA goes out of your visibility by some predetermined amount, it will eventually be destroyed or at least simplified. The game doesn't simulate peds and cars for the entire city, only some small subset of the total area, and that subset is highly likely to be at least partially based on the player's visibility, along with various other rules.

Increasing the FOV also brings more art assets into the scene and can cause more load on the GPU for similar reasons, but the percentage increase in load that the additional FOV places on the CPU and on the GPU is likely to differ, and in some cases it could be enough to flip the previous bottleneck from one to the other. It depends entirely on the game and the circumstances at the time.
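A purely hypothetical sketch of that kind of simulation culling; real engines are far more involved, and the radius, FOV value and function names here are made up for illustration:

[CODE]
import math

FULL_SIM_RADIUS = 100.0    # assumed distance within which AI runs fully
HALF_HFOV_DEG = 45.0       # half of the horizontal FOV

def needs_full_simulation(entity_pos, camera_pos, camera_heading_deg):
    # Entities far away or outside the horizontal FOV wedge get a cheaper
    # update path; widening the FOV pulls more of them back onto the
    # expensive path, which is one way CPU load can rise with aspect ratio.
    dx, dy = entity_pos[0] - camera_pos[0], entity_pos[1] - camera_pos[1]
    if math.hypot(dx, dy) > FULL_SIM_RADIUS:
        return False
    angle = math.degrees(math.atan2(dy, dx)) - camera_heading_deg
    angle = (angle + 180) % 360 - 180   # wrap to [-180, 180)
    return abs(angle) <= HALF_HFOV_DEG

print(needs_full_simulation((30.0, 10.0), (0.0, 0.0), 0.0))  # True
[/CODE]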
 
Last edited: