Originally posted by: allies
Originally posted by: BFG10K
Vsync shouldn't have anything to do with minimum frames
Sure it does.
Really? How so? Please enlighten me.
Also, the point of my post was not complaining about my MAX fps; it was about my MIN fps.
Originally posted by: almach1
Edit: This happens with both 100.65 and 101.70 (I'm using these now).
Edit 2: I've played at different settings too, and with stuff on low/medium and motion blur off it "only" dips into the teens. Still ridiculous.
The quickie description:
* A video card has 2 image buffers, the front buffer and the back buffer.
* Whatever's currently being displayed is from the front buffer, which leaves the back buffer free to be manipulated and drawn to while something else is showing.
* When we're ready to show the next frame, the buffers are flipped so the fresh back becomes the front and the old front is now the back.
* When v-sync is on, this flip is only done on a screen refresh, commonly 60 Hz (60 times per second) on an LCD.
* The problem is that because the buffers can only be flipped on a screen refresh, the flip may not happen the moment the back buffer is finished being drawn to.
* While the back buffer is ready and waiting to flip, the video card must stop rendering, because it has nothing to draw to until the flip happens.
* As a result, this impacts both minimum and maximum frame rates.
* For minimum frame rates: say a frame takes slightly longer than 1/60th of a second to render; we must then wait for the next flip before we can continue rendering. This hurts the minimum frame rate compared to running with v-sync off, because the time spent waiting is wasted rendering time.
* Just as an example of a worst-case scenario, I'm going to show you some math with a 50 Hz refresh rate (since the numbers don't get so funky); there's also a short simulation after this list that reproduces the numbers:
At 50 Hz, the screen refreshes every 20ms.
Now let's say we need 21ms to render each frame.
After 21ms, we're done, but we missed the screen refresh. We must wait until 40ms to actually show the image.
At 40ms, we show the image and start the next frame.
At 61ms, we're done with the next frame, but we again missed the screen refresh, so we must wait until 80ms to show it.
Etc, etc.
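If you want to check that math yourself, here's a minimal Python sketch that replays the timeline above. The constants (20ms refresh, 21ms render, double buffering) are just the numbers from the example; the function names are made up for illustration and nothing here touches a real graphics API, it's only the arithmetic.

import math

REFRESH_MS = 20.0    # 50 Hz display: one refresh every 20ms
RENDER_MS = 21.0     # assumed time to draw one frame into the back buffer
DURATION_MS = 1000.0 # simulate one second

def frames_with_vsync():
    """Double buffering with v-sync: the flip waits for a refresh, and the
    next frame can't start until the flip frees the back buffer."""
    t = 0.0
    frames = 0
    while True:
        done = t + RENDER_MS                              # back buffer finished
        flip = math.ceil(done / REFRESH_MS) * REFRESH_MS  # next refresh boundary
        if flip > DURATION_MS:
            break
        frames += 1
        t = flip  # rendering resumes only after the flip
    return frames

def frames_without_vsync():
    """V-sync off: flip the instant the frame is ready."""
    return int(DURATION_MS // RENDER_MS)

print(frames_with_vsync())     # prints 25
print(frames_without_vsync())  # prints 47

It prints 25 and 47, the same totals as the walkthrough above.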
As it turns out, in that worst-case scenario we only render 25 frames in 1 second because we had to wait for the screen refresh to flip the buffer. Had we been allowed to flip the buffer whenever we wanted, we would have been able to render 47 frames in that time period (1000ms / 21ms = 47 frames and change). In other words, by enabling v-sync we've cut our graphics performance by roughly 47%.
Granted, this is the worst-case scenario, and in the best case we lose virtually no performance, but in practice the results often land somewhere in the middle and we're still giving up a significant amount of performance. Short of using triple buffering, we need to disable v-sync in order to maximize performance and keep the video card working all the time.
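For the curious, here's the same kind of sketch showing why triple buffering sidesteps the stall: with a second back buffer the card always has somewhere to draw, so it renders flat out, and each refresh simply shows the newest finished frame. Again, the constants and function names are only illustrative assumptions, not any real driver's behavior.

REFRESH_MS = 20.0    # same 50 Hz display as before
RENDER_MS = 21.0     # same 21ms per frame
DURATION_MS = 1000.0

def gpu_frames_triple_buffered():
    """With two back buffers the card never stalls waiting for a flip."""
    return int(DURATION_MS // RENDER_MS)  # renders flat out

def frames_reaching_screen():
    """The screen still flips only on refreshes, but each refresh can grab
    the newest completed frame instead of stalling the renderer."""
    shown = 0
    last_completed = 0
    for tick in range(1, int(DURATION_MS / REFRESH_MS) + 1):
        t = tick * REFRESH_MS
        completed = int(t // RENDER_MS)  # frames finished by this refresh
        if completed > last_completed:   # a new frame is ready, flip to it
            shown += 1
            last_completed = completed
    return shown

print(gpu_frames_triple_buffered())  # prints 47
print(frames_reaching_screen())      # prints 47

The card renders 47 frames and 47 distinct frames actually reach the screen, instead of the 25 we got with plain double buffering.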