Am I missing something? (over 60fps on LCDs)


Ben90

Platinum Member
Jun 14, 2009
I guess you would be correct with the tearing thing, JAG. I've never really thought about it because I never use Vsync. All of the things I've skimmed over say tearing happens when the number of rendered frames is more than the refresh rate. This is true, but there is another cause of tearing, which is the monitor and video card not being synced like you mentioned. Unfortunately I still cannot get Vsync to work with that program.

Either way, we still have conflicting opinions, which I don't think will change. My opinion is that for most people Vsync with triple buffering is the best solution, while for die-hard performance fanatics standard double buffering with no Vsync can be better. Double buffering with Vsync should be avoided except in niche situations. Limiting the game engine to emulate Vsync is the worst option, but it can still have its place for compatibility reasons.
 

JAG87

Diamond Member
Jan 3, 2006
Ben90 said: (quoted in full above)

If you just force vsync in your control panel it should work...

We have conflicting opinions because you haven't tried playing the way I suggested. Whatever your display's refresh rate is, enable vsync and cap your fps to 1 below the refresh rate. This eliminates input lag because it basically ensures that your GPU never renders anything ahead of your refresh polls and never leaves finished frames sitting in the two back buffers. To be honest, I'm not even sure you need this trick at 120Hz, because you are playing with 8ms gaps between frames, so I don't think you would even feel input lag; but coming from no vsync and zero input lag, you might. Capping 1fps below refresh is always your best bet. The problem is that not all games have a frame rate limiting cvar, much less a developer console nowadays. If only ATI and Nvidia would get wise and give us a frame rate setting in the driver... sigh. But this is only necessary in mouse-input games; input lag is not an issue in RTS, MMO, or any game you play with a controller.
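
(For illustration, a minimal sketch of the kind of cap being described here: an app-side limiter that holds the frame rate one below refresh so the GPU never gets ahead of the display. The 120Hz refresh, the 119fps target, and render_frame() are assumptions for the example, not taken from any particular game.)

Code:
import time

REFRESH_HZ = 120              # assumed display refresh rate
CAP_FPS = REFRESH_HZ - 1      # cap 1fps below refresh, as suggested above
FRAME_BUDGET = 1.0 / CAP_FPS  # seconds each frame is allowed to take

def render_frame():
    """Hypothetical stand-in for the game's simulate-and-render step."""
    pass

def game_loop(num_frames=1000):
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        # Sleep off the rest of this frame's budget so finished frames
        # never pile up in the back buffers waiting for a vblank.
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            # Overran the budget; reset rather than bursting to catch up.
            deadline = time.perf_counter()

if __name__ == "__main__":
    game_loop()

Engine cvars like fps_max do roughly this on the game's side; the point is only that the limiter, not vsync, is what paces the renderer.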


EDIT:
Here, dude, read this article; it will explain what I am saying much better.
http://www.anandtech.com/show/2794
 

Ben90

Platinum Member
Jun 14, 2009
Actually I was going to link that article as well.
[Attached image: capturebzv.png]

Yes, with no Vsync the image quality is terrible, but for the extremists, you still shave a good 3-6ms of input delay off at the middle of the image in that example, depending on where in the frame you look.
With an easy-to-render game such as CS:S, where it's not hard to get 300-600fps, at a 120Hz refresh rate you are looking at an average difference of slightly under 4ms at the middle of the screen between turning Vsync on and off. Lower the refresh rate to 60Hz and the average benefit climbs to around 8ms. Are the gains worth it? That's for you to decide.
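
(A rough way to see where numbers in this ballpark come from, under a simplified model that is an assumption of this example rather than anything measured: with vsync, the middle of the screen shows content that is about half a refresh period old when it is scanned out; without vsync at a very high frame rate, it shows content about half a render interval old.)

Code:
# Back-of-envelope mid-screen latency under the simplified model above.
def mid_screen_age_ms(refresh_hz, fps, vsync):
    if vsync:
        # Frame was complete by the previous vblank, so by the time the
        # scanout reaches mid-screen it is ~half a refresh period old.
        return 0.5 * 1000.0 / refresh_hz
    # Tearing: mid-screen content is on average ~half a render interval old.
    return 0.5 * 1000.0 / fps

for refresh_hz in (120, 60):
    for fps in (300, 600):
        gain = (mid_screen_age_ms(refresh_hz, fps, vsync=True)
                - mid_screen_age_ms(refresh_hz, fps, vsync=False))
        print(f"{refresh_hz}Hz, {fps}fps uncapped: ~{gain:.1f}ms saved by disabling vsync")

That lands around 2.5-3.3ms at 120Hz and 6.7-7.5ms at 60Hz, the same order as the figures above; the model ignores any extra whole frame of delay a full flip queue can add, so the real gap can be larger.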

Capping your framerate via fps_max to 59 is just stupid in that game, though. First off, you are basically just playing with double-buffered Vsync; from an input-delay perspective you would be better off using triple buffering. You add ~20ms of input delay vs. standard and ~13ms vs. triple buffering. Second, and more importantly in my opinion, you are effectively limiting your cl_cmdrate to 59 (see the sketch after this list). While it's not as big of a deal nowadays with the OB netcode, that would have been a huge handicap on the old engine. Here is a list of some of the things that will be affected:
Hit registration
Quick switching
Recoil (extremely minimal)
Performing any two actions consecutively
Input delay
Air strafing
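
(Sketch for the cl_cmdrate point, under the assumption, matching the old-engine behaviour referred to above, that the client sends at most one usercommand per rendered frame; the tickrate and cvar values are illustrative.)

Code:
# Effective outgoing command rate if the client can send at most one
# usercommand per rendered frame (assumed old-engine behaviour).
def effective_cmdrate(fps_cap, cl_cmdrate, server_tickrate=100):
    return min(fps_cap, cl_cmdrate, server_tickrate)

print(effective_cmdrate(fps_cap=59, cl_cmdrate=100))    # 59: capped by fps_max
print(effective_cmdrate(fps_cap=300, cl_cmdrate=100))   # 100: full cmdrate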
 

JAG87

Diamond Member
Jan 3, 2006
Ben90 said: (quoted in full above)


The Source engine implements triple buffering, or at least a flip queue, because when you enable vsync you get intermediate frame rates between 30 and 60 rather than being locked to even divisors of the refresh rate.

There is plenty of input lag with vsync and an FPS cap of 60 (or anything higher).
Set it to 59 and the input lag disappears. Now I can play with a clean, tear-free, evenly paced stream of frames.

Just do yourself a favor and try it. 119 for your 120Hz.
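
(Here is a small simulation of why capping one below refresh behaves so differently from letting the renderer run at or above the refresh rate with vsync. The display pops one frame per refresh from a queue of finished frames; the game renders a new frame whenever its cap allows and a queue slot is free. The 60Hz refresh, the two-deep flip queue, and the cap values are assumptions for the example, not measurements of the Source engine.)

Code:
from collections import deque

REFRESH_HZ = 60       # assumed display refresh
QUEUE_DEPTH = 2       # assumed two-deep flip queue (back buffers)
SIM_SECONDS = 2.0

def avg_frame_age_ms(fps_cap):
    """Average age of the frame handed to the display at each refresh."""
    refresh_dt = 1.0 / REFRESH_HZ
    frame_dt = 1.0 / fps_cap
    queue = deque()       # timestamps of finished, not-yet-displayed frames
    cap_ready = 0.0       # earliest time the cap allows the next frame
    slot_free_at = 0.0    # time a queue slot last opened up (vsync unblock)
    ages = []
    for i in range(int(SIM_SECONDS * REFRESH_HZ)):
        t = (i + 1) * refresh_dt                  # time of this refresh
        # Render frames up to this refresh, as the cap and free slots allow.
        while len(queue) < QUEUE_DEPTH:
            render_time = max(cap_ready, slot_free_at)
            if render_time >= t:
                break
            queue.append(render_time)
            cap_ready = render_time + frame_dt
        if queue:
            ages.append((t - queue.popleft()) * 1000.0)
            slot_free_at = t
    return sum(ages) / len(ages)

for cap in (300, 59):
    print(f"vsync, fps cap {cap}: displayed frame is ~{avg_frame_age_ms(cap):.1f}ms old on average")

With the high cap the queue stays full and every displayed frame ends up roughly two refreshes old; capped at 59 the queue keeps draining and the displayed frame averages around half a refresh period old, which is the input-lag difference being described.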
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
JAG87, have you ever played CS:S without v-sync? You'll notice how much lag it was producing once you turn it off. Also, fluorescent lights running on an electronic ballast refresh at 20,000-60,000Hz, and the CCFLs in your monitor run on an electronic ballast. While incandescent bulbs do operate at 60Hz, the tungsten filament retains its glow, so the flicker isn't perceptible.

Edit: I tried your method and I still had input lag. Try moving a window in CS:S and you'll notice the delay.
 

Absolution75

Senior member
Dec 3, 2007
I can't. And I'd love to have a scientific explanation of how on a monitor that is displaying 60 images per second, you can tell the difference.

Eh, it was already explained. There are essentially duplicate frames in the monitor's framebuffer. Everything isn't synced perfectly.

The video card doesn't send a frame to the monitor for every frame of your video game. That would make your monitor's refresh rate dependent on framerate, which it obviously isn't, since it isn't variable. The monitor just reads whatever is in the framebuffer at the current time.

The same thing happens with USB devices, for the same reason.
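
(A tiny sketch of the sampling behaviour described above: the display grabs the newest completed frame at each fixed refresh tick, so frames repeat when the game runs slower than the refresh rate and get skipped when it runs faster. The 60Hz refresh and the example frame rates are illustrative assumptions.)

Code:
# Which game frame does each 60Hz refresh end up showing, if the display
# simply samples the newest completed frame at every tick?
REFRESH_HZ = 60

def frames_shown(game_fps, num_refreshes=10):
    shown = []
    for k in range(num_refreshes):
        t = k / REFRESH_HZ                  # time of this refresh
        shown.append(int(t * game_fps))     # newest frame finished by time t
    return shown

print(frames_shown(30))    # [0, 0, 1, 1, 2, 2, ...]  every frame shown twice
print(frames_shown(120))   # [0, 2, 4, 6, ...]        every other frame skipped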


Though that sucks for you since you can't tell =/