I keep hearing people say that a CPU upgrade can result in more FPS even if the GPU was at 100% load to begin with. This doesn't make any sense to me logically.
If an engine is rendering frames with the GPU at 100% load while the CPU cores are not fully loaded, that would suggest there is nothing more that extra CPU power could do for the engine.
Can someone give a technical example of how a common game engine could render more FPS after a CPU upgrade, if the GPU was already at 100% load (and no CPU core was maxed out)?