Originally posted by: poopa
If I see a benchmark of, let's say, UT2K3 and on a Radeon 9800 Pro they got, let's say, 150 FPS, and, let's say, my monitor's highest refresh rate is 100 Hz, does it mean I don't see all the frames?
If benchies achieve rates higher than the monitor's refresh, it's because they used a driver option to *disable* the synchronisation of frames with the monitor (so-called Vsync, off). If Vsync is on, then every 3D game always syncs with the monitor and cannot exceed its refresh rate (in my case it would be capped at 80 or 100 Hz). It's VERY unlikely that a new game with all the eye candy on hits FPS higher than my monitor's refresh anyway.
From a certain point of view your assumption is right: you don't SEE 150 complete frames (how could you?) because the monitor 'only' updates 100 times a second. If I run a benchmark getting 200 FPS, the card just pushes each frame out as soon as it's rendered, very quickly, and the monitor syncs (more slowly) on its own.
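To put rough numbers on that, here's a back-of-the-envelope sketch (my own illustration, not anything from a driver API): with Vsync off, the monitor can only show whichever frame is current at each refresh, so complete frames seen per second are capped at the refresh rate, and the rest are rendered but never shown in full.

```python
def frames_displayed(fps, refresh_hz):
    """Upper bound on complete frames actually shown per second:
    the monitor can't display more frames than it has refreshes."""
    return min(fps, refresh_hz)

def frames_never_fully_shown(fps, refresh_hz):
    """Rendered frames per second that never appear in full on screen."""
    return max(0, fps - refresh_hz)

# The example from the question: 150 FPS on a 100 Hz monitor.
print(frames_displayed(150, 100))          # at most 100 complete frames/s
print(frames_never_fully_shown(150, 100))  # 50 frames/s are "wasted"
```

(In practice Vsync-off also causes tearing, since a refresh can show parts of two different frames, but the counting above is the gist.)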
But... your question is a bit theoretical and also somewhat irrelevant in practice. At least it is not a 'problem', because 99.999% of people do NOT run their monitors at such high rates anyway...
Sigh... this question somehow reminds me of when I was in school and the teacher told us about the one guy who had to do a physics/science test.
His only question for the test was "Why is the sky blue?" Of course the answer was 30 pages long and very scientific and detailed.
