General question on CPU-GPU interaction

imported_and

Junior Member
Jan 30, 2006

Hello, a newbie question on this subject.

About the usual test to check whether you're CPU-bound: lowering the resolution and checking whether your FPS stays the same (CPU-bound) or improves (GPU-bound). Let's say you consistently get 70 FPS at 14x9 as well as 10x7.
Now what exactly is causing this - is it simply that the CPU can't keep up with the physics, AI and general computation the game requires, or is there some direct CPU-GPU interaction going on?
Does the CPU, for example, need to "oversee" some of the workings of the API?
Is there some specific function, e.g. fill rate (just throwing something out), where the CPU needs to work with the GPU?
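For what it's worth, here's the simplified frame-time model I have in mind when I do this test (all the numbers below are invented for illustration, and `fps` is just a made-up helper, not anything from a real engine): each frame, the CPU does its work (physics, AI, submitting draw calls) and the GPU does its rendering, and the slower of the two sets the frame rate.

```python
# Toy frame-time model: the frame rate is capped by whichever
# processor takes longer per frame. All numbers are made up.

def fps(cpu_ms, gpu_ms):
    """Frame rate when CPU and GPU work are pipelined per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 14.3            # physics, AI, draw-call submission: ~70 FPS worth
gpu_ms_1440x900 = 12.0   # GPU finishes before the CPU does at this res
gpu_ms_1024x768 = 7.0    # even faster at the lower res

print(round(fps(cpu_ms, gpu_ms_1440x900)))  # ~70: CPU-bound
print(round(fps(cpu_ms, gpu_ms_1024x768)))  # still ~70: lowering res changed nothing
```

In this picture, lowering the resolution only shrinks `gpu_ms`, so once the CPU is the longer pole, the FPS stops responding to resolution - which is exactly the symptom the test looks for.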

The reason I'm asking is this: let's say you have a system where the video card is somewhat more modern than your CPU, which is still decent. Playing the games you like at the IQ settings you want, you get a steady ~55 FPS with no dips, which is enough to make you happy.
You've done the usual test to see whether you're CPU-bound, and when you lower your resolution the FPS indeed does not increase above 55 - but that's just good to know, not an issue at the moment.

Then you want to buy a new, larger monitor to game at, say, 19x. You want to keep that frame rate, so you decide to add a second card and run SLI/CF, and maybe even get some better AA/AF modes as a bonus.

So with this SLI/CF setup running, let's say you're gaming at 19x in a game that doesn't tax your two graphics cards to the limit - they're keeping up with a game that would have been too hard on your old single-card setup. Is the CPU still able to give you ~55 FPS, or do the increased resolution and graphical power also mean more strain on the CPU?
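My naive mental model of the scenario looks like this (a toy sketch, assuming - and this is the assumption I'm asking about - that GPU time scales roughly with pixel count while CPU time is roughly resolution-independent; the resolutions and milliseconds are invented, and `fps` is a made-up helper):

```python
# Toy sketch of the SLI/CF question. Assumptions: GPU frame time
# scales with pixel count, CPU frame time does not, and SLI/CF
# gives ideal 2x scaling. Real games won't behave this cleanly.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 18.2        # ~55 FPS worth of CPU work (physics, AI, etc.)
gpu_ms_old = 16.0    # single card at the old resolution: CPU-bound

pixel_ratio = (1920 * 1200) / (1280 * 1024)  # illustrative old/new resolutions
gpu_ms_19x = gpu_ms_old * pixel_ratio        # single card at 19x: now GPU-bound
gpu_ms_sli = gpu_ms_19x / 2                  # ideal 2x scaling from the second card

print(round(fps(cpu_ms, gpu_ms_19x)))  # single card: well below 55
print(round(fps(cpu_ms, gpu_ms_sli)))  # SLI/CF: back to ~55, CPU is the limit again
```

If that model is right, the ~55 FPS should come back once the GPUs are fast enough again - but my question is precisely whether the higher resolution also raises the CPU side of the equation.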

Thanks for any answers.