- Aug 15, 2004
I've noticed that I tend to get much higher fps in UMark than when actually playing through similar situations in the same level with the same image quality settings. What information normally computed by the CPU in UT2k4 or Doom 3 is obtained from the timedemo file instead? Assuming the timedemo is a recording of typical gameplay, do timedemo benchmarks tend to give the correct performance ranking and percentage performance difference for video cards and CPUs, even though the raw fps numbers are much higher than in actual gameplay? I would suspect that timedemos at least give the correct performance ranking for video cards, since even if the CPU load is lighter than normal, the video card still has to do the same work per frame. However, wouldn't differences in the CPU work pattern and load when running a timedemo make the results suspect for comparing CPUs' performance in actual gameplay?
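To make the video-card part of my suspicion concrete, here's a back-of-the-envelope sketch. All the numbers are made up and it assumes CPU and GPU work happen serially per frame, which is a simplification, but it shows why the ranking could survive while the percentage gap shrinks once real gameplay adds CPU work:

```python
# Rough sketch of my reasoning (hypothetical numbers, not measurements):
# assume the GPU cost per frame is the same in a timedemo and in real play,
# but real play adds a fixed chunk of CPU work (AI, physics, netcode) per frame.

def fps(gpu_ms, cpu_ms=0.0):
    """Frames per second if total frame time is GPU time plus CPU time (serial)."""
    return 1000.0 / (gpu_ms + cpu_ms)

card_a_gpu_ms = 10.0    # hypothetical faster card
card_b_gpu_ms = 12.5    # hypothetical slower card
real_play_cpu_ms = 5.0  # hypothetical extra CPU work per frame in real gameplay

# Timedemo (little CPU work): 100 vs 80 fps -> a 25% difference
print(fps(card_a_gpu_ms), fps(card_b_gpu_ms))

# Real gameplay (CPU work added): ~66.7 vs ~57.1 fps -> only about a 17% difference,
# but the ranking of the two cards is unchanged.
print(fps(card_a_gpu_ms, real_play_cpu_ms), fps(card_b_gpu_ms, real_play_cpu_ms))
```

If that's roughly how it works, the timedemo fps numbers would overstate the gap between cards but not flip their order, which is really what I'm asking about.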
Also, I've noticed that when using demorec in UT2k4, demoplay with sound enabled only gives me about 1/2 to 2/3 of the fps I was getting when I recorded the demo.