So I'm sure most of us are familiar with the perfoverlay.drawgraph 1 console command. I understand that lower is better and that 16.6 ms = 60 fps, etc.
What I'm not exactly clear on is the difference between CPU and GPU frametimes and how they correlate with each other. For example, why is it that the CPU frametime drives the framerate? The FPS display will show 120 and a CPU frametime of 8.3 ms, which makes sense, but the GPU frametime will be 14 or 15 ms...
Wouldn't that mean the real framerate is only slightly above 60? Shouldn't the FPS display match up with the GPU frametime?
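For reference, this is the frametime-to-FPS arithmetic I'm working from (just my assumption about what the overlay numbers mean), with the values from my overlay plugged in:

```python
# Frametime (ms) to FPS conversion -- the arithmetic I'm assuming the overlay uses.
def frametime_to_fps(frametime_ms):
    return 1000.0 / frametime_ms

print(frametime_to_fps(16.6))  # ~60 fps
print(frametime_to_fps(8.3))   # ~120 fps -- matches the CPU frametime and the FPS counter
print(frametime_to_fps(14.0))  # ~71 fps -- why I'd expect the GPU to cap me near 70, not 120
```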
EDIT:
This only seems to happen with multi-GPU. If I disable SLI, the CPU and GPU frametimes are in sync and everything makes sense. Does this overlay tool show double the GPU frametime if you have two cards?
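To spell out the doubling idea (pure guess on my part): if SLI is doing alternate-frame rendering, each card only has to finish every other frame, so a per-GPU frametime of ~15 ms could still line up with ~120 fps overall. A rough sketch of that arithmetic, assuming two GPUs alternating frames:

```python
# Hypothetical: per-GPU frametime vs. effective frame rate under alternate-frame rendering (AFR).
def effective_fps(per_gpu_frametime_ms, num_gpus):
    # Each GPU renders every num_gpus-th frame, so finished frames arrive num_gpus
    # times as often as one card alone could deliver them (ignoring CPU limits and sync overhead).
    effective_interval_ms = per_gpu_frametime_ms / num_gpus
    return 1000.0 / effective_interval_ms

print(effective_fps(15.0, 2))  # ~133 fps -- roughly consistent with the 120 fps the overlay reports
print(effective_fps(15.0, 1))  # ~67 fps -- what I'd expect if the 15 ms were the whole story
```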