Monitor refresh rate matters?

tyl998

Senior member
Aug 30, 2010
236
0
0
I recently bought a 23-inch LCD screen. It supports up to 1920x1080 and the colors are great. Its refresh rate is only 60 Hz, however. Does that really matter? I imagine I'll be turning V-sync off for my games anyway. Will my kickass graphics setup with amazing frame rates be wasted thanks to the monitor?
 

Karl Agathon

Golden Member
Sep 30, 2010
1,081
0
0
Good question! I've always wondered why some people spend a ton of moolah on high-end GPUs just to have their FPS limited to their monitor's refresh rate. It seems like a waste. I guess people could turn off V-sync, but then wouldn't there be a moderate amount of tearing if the FPS went above their monitor's refresh rate? I guess I'm missing something?
 
Last edited:

nOOky

Diamond Member
Aug 17, 2004
3,231
2,288
136
Just play Doom 3 on it, you'll be fine :p

You could spring for a 120 Hz monitor, or an old CRT. Any extra fps above 60 are simply not displayed, and you'll probably never notice.
 

tyl998

Senior member
Aug 30, 2010
236
0
0
So regardless of V-sync, all extra FPS above 60 is null and void?

Well, I guess it's time to crank up the anti-aliasing...
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
So regardless of V-sync, all extra FPS above 60 is null and void?

Well, I guess it's time to crank up the anti-aliasing...
Even without AA, you would need a very beefy machine to stay above 60 fps all the time in some games at 1920x1080.
 

Karl Agathon

Golden Member
Sep 30, 2010
1,081
0
0
Just play Doom 3 on it, you'll be fine :p

You could spring for a 120 Hz monitor, or an old CRT. Any extra fps above 60 are simply not displayed, and you'll probably never notice.


If he got a 120 Hz monitor, I'm assuming that would mean he could turn off V-sync and not have to worry about tearing. Is that correct?
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
It doesn't matter because it's an LCD. You won't see any difference with a 120 Hz monitor. The purpose of 120 Hz, AFAIK, is 3D: shutter glasses blink each eye on and off alternately, so each eye sees a 60 Hz image.
 

Karl Agathon

Golden Member
Sep 30, 2010
1,081
0
0
^^^^ OK, but even without using the monitor's 3D capabilities, wouldn't he still get the benefits of native 120 Hz? More smoothness, etc.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
It doesn't matter because it's an LCD. You won't see any difference with a 120 Hz monitor. The purpose of 120 Hz, AFAIK, is 3D: shutter glasses blink each eye on and off alternately, so each eye sees a 60 Hz image.
No, a true 120 Hz monitor will let you actually see and feel 120 fps. As for running a game in 3D, I believe you are right about 60 fps per eye.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Vision is a very subjective thing. There are two main things that decide whether a higher refresh rate matters.

The first is the speed at which individual frames become perceived as perfectly fluid motion. I see many numbers thrown around here, all claiming to be the magic "frame rate" limit of human vision: 24, 30, 60, 72, 75, 80, 100, and even numbers above and beyond 120 Hz have been put forward as the magical barrier beyond which humans cannot perceive individual frames anymore. The simple fact of the matter is that the human eye does not see in frames per second. It is a complex mix of the concentration of photons, changes in the scene, contrast in the scene, and many other things (such as the eye containing thousands upon thousands of individual nerves) that determines when individual frames can be distinguished from an otherwise convincing illusion of motion.

Imagine watching a movie in which the scene consists solely of a slow-moving fog. It could be nearly impossible to decipher individual frames, even without motion blurring, simply because the contrast is so low and there isn't much movement. 10 frames per second of this fog could completely and utterly convince the brain of fluid movement. Now imagine a fast-paced shooter such as Quake. There is no motion blurring here, and moving the camera quickly induces extreme contrast changes; 30 frames per second looks like a slide show to most people. In the most extreme of cases, imagine floating around in space next to a completely black neutron star. Suddenly, for 1/100,000th of a second, it pulses a gamma-ray burst that includes visible white light. Our special space suit protects us from the radiation, keeping us alive to see this extreme amount of very short-lived light. Whatever magic sets an upper limit on the "frame rate" of the human eye would best be discussed in this thread.

Making things even more complicated, vision varies from person to person. What you may consider choppy, a granny could view as fluid motion; what Ryan Smith, the guy who writes most video card articles on AT, considers overkill, I could consider relatively choppy. As mentioned above, it is also dependent on game type: generally there are a lot more changes going on in a first-person shooter than in a strategy game. The number of frames per second needed to create fluid motion depends entirely on your eyes, your computer setup, and the game you are playing.

Going back to the OP's question of whether a higher refresh rate matters, the second thing that matters is how much you actually care about being able to make out individual frames. In Counter-Strike: Source, fluid motion begins for me at around 90 fps when I'm looking at the middle of the screen and not actively trying to pick out individual frames. Once I start to search for and identify individual frames, the illusion of motion goes away within the limits of my monitor. I do notice that it is much easier to discern frames when looking at the edges of the screen, and the effect is amplified with wider aspect ratios such as 16:10 or 16:9.

Back when I played games for money, those extra hertz actually mattered. "Overclocking" my FW900 CRT monitor to 170 Hz allowed me to perform to my potential. In recent years, however, my priorities have changed, and those extra 50 Hz were outweighed by the much crisper and more livable 2233RZ. On the few occasions when I do game, 120 Hz is easily enough, and anything more really doesn't matter.

It is up to you to decide whether you need more than 60 Hz or not. It may be a classic case of ignorance is bliss and you'll just get blown away by 120 Hz, or you may notice it is moderately smoother but decide it really isn't that big of a deal.

Code:
Kevin: okay but what are u actaull doing
Ben: im on anandtech writing about the human eye and refresh rates
Kevin: tell them the human eye
Kevin: is what u use to see things
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
So regardless of V-sync, all extra FPS above 60 is null and void?

Well, I guess it's time to crank up the anti-aliasing...

It's not entirely null and void. V-sync sets up a buffer that stores the most recent complete frame. Every 1/60th of a second (16.6 ms), when the monitor refresh rolls around, it shows whatever frame is stored in that buffer. So even at 60 fps, the frame in the buffer can be up to 16.6 ms old, if it was completed immediately after the last refresh and a new frame hasn't been finished yet. This shows up as a bit of input lag, because the frame is older than what's happening in the game.

If your system can do 120 fps, it still renders those frames to the V-sync buffer, so when the refresh comes around it is much more likely to have a brand-new frame in there, reducing input lag.


With V-sync off, the frame can very often get updated in the midst of a monitor refresh, so you get the top portion of the screen showing an old frame and the bottom portion showing the just-finished frame - a tearing effect, because the top and bottom don't match up.


So... yeah, it's not entirely null and void if you are very perceptive of input lag, but I would just crank up the AA. V-sync is almost always better to have on to avoid screen tearing, unless you are really, really competitive at an FPS :)
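
To put a rough number on the input-lag point, here's a tiny sketch (my own illustration of the simple buffer model above, not anything from a real driver): with V-sync, the frame the monitor grabs at each refresh finished rendering at most one render interval earlier.

Code:
# Sketch of the buffer model described above (my own illustration):
# the displayed frame finished rendering at most one render interval
# before the refresh sampled the buffer.
def worst_case_age_ms(render_fps):
    return 1000.0 / render_fps

for fps in (60, 120, 240):
    print(f"{fps:>3} fps -> buffered frame up to {worst_case_age_ms(fps):.1f} ms old")
# 60 fps -> 16.7 ms, 120 fps -> 8.3 ms, 240 fps -> 4.2 ms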
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I also completely forgot to mention that some games benefit from having the engine running fast regardless of your refresh rate, which can matter depending on the game. For example, bunnyhopping is completely impossible in the original Counter-Strike: Source when running at 60 fps or less, and things such as quick-switching get slowed down marginally as well.

I don't have much experience with newer games, however.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It doesn't matter because it's an LCD. You won't see any difference with a 120 Hz monitor.
This is untrue; a 120 Hz device will be able to display more full frames per second with V-sync, and it'll tear less without V-sync.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If he got a 120 Hz monitor, I'm assuming that would mean he could turn off V-sync and not have to worry about tearing. Is that correct?

No, tearing is always a problem. Tearing existed in CRT days as well (it was just easier to miss amid the wobbly and malformed image a CRT produces, but it has been documented to be there), and tearing is NOT mitigated by higher FPS or a higher monitor refresh rate... the ONLY thing that gets rid of tearing is V-sync (or potentially some other sync method).

So regardless of V-sync, all extra FPS above 60 is null and void?

Well, I guess it's time to crank up the anti-aliasing...

Yes, exactly! All FPS above 60 is indeed null and void; the extra frames are literally discarded without being displayed.
However, keep in mind that if it says 70 FPS, that is an average. Some games fluctuate significantly; it could swing from 40 to 100 and average out to 70 FPS, so I prefer looking at minimum FPS rather than the average. But if your minimum FPS is above 60, the excess is just wasted: frames are rendered and then discarded without ever being displayed, you still get tearing, and you waste power (your video card has AMAZING power-saving features, so the fewer calculations you make it do, the less power it consumes, the less noise its fan makes, etc.).
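
If it helps to see the bookkeeping, here's a trivial sketch (mine, ignoring tearing) of how many rendered frames a 60 Hz panel can actually show under this model:

Code:
# Trivial sketch (my own, ignoring tearing): a 60 Hz panel shows at most
# 60 distinct frames per second; anything rendered beyond that is discarded.
def shown_vs_discarded(render_fps, hz=60):
    shown = min(render_fps, hz)
    return shown, render_fps - shown

for fps in (45, 70, 120):
    shown, lost = shown_vs_discarded(fps)
    print(f"{fps} fps rendered -> {shown} shown, {lost} discarded per second")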

As for 120 Hz monitors... those are for 3D, so you can display 60 frames per second per "eye".
Hz = cycles per second; in this case, a cycle is a frame.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
No, tearing is always a problem. Tearing existed in CRT days as well, and tearing is NOT mitigated by higher FPS or a higher monitor refresh rate... the ONLY thing that gets rid of tearing is V-sync.
Like I said above, a higher refresh rate will reduce tearing. And if it's high enough - say, a 60 FPS game being displayed on a 1000 Hz display - it can theoretically cure it, provided new frames never arrive in the middle of a refresh cycle.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Like I said above, a higher refresh rate will reduce tearing. And if it's high enough - say, a 60 FPS game being displayed on a 1000 Hz display - it can theoretically cure it, provided new frames never arrive in the middle of a refresh cycle.

Tearing due to a "too high" refresh rate might look better than tearing due to a "too low" refresh rate, but it's still there...

http://hardforum.com/showthread.php?t=928593
An explanation from 5.4 years ago that perfectly explains V-sync and tearing... using a CRT as its main example (yes, amazingly, those things existed and were developed in CRT days).
NOTHING has changed since this article was written.

As for 60 fps being displayed on a 1000 Hz monitor...
1. There are no such displays, and there aren't going to be for a long time, if ever (why waste money on it when it's not needed, given the limits of human perception?)
2. There will never be such a huge disparity between GPU and monitor technology; it simply doesn't make sense to spend the money to make your monitor 1000 Hz when video cards can't come close to such a speed. It is plausible to have GPUs rendering 1000 fps paired with 1000 Hz monitors, though; the two simply need to be close together.
3. Such a ridiculously fast display will NEVER have a frame arrive neatly between refresh cycles, because the refresh cycles must be ridiculously short to reach 1000 Hz. Heck, it might have even more tearing. It can easily have MULTIPLE tear lines per single frame. (IIRC, I think I have seen cases of 2 tear lines on a single frame.)

It will have some refresh cycles without tearing, then some with.
60 fps = 1/60 s ≈ 0.0167 s on average to render a frame on the GPU.
1000 Hz = 1/1000 s = 0.001 s to display a frame on the monitor.

The monitor chugs along, polling the video card about 16 times only to receive the exact same frame every time and displaying that same frame; then on roughly the 17th poll (give or take) it will get tearing, as it polls while the video card is updating the buffer it reads from - unless there is V-sync. However, 0.001 s later the torn image will be replaced... depending on the transfer speed, the replacement might or might not have a tear as well, but even if it does, it will only last a few monitor frames. So tearing is still there; it just blips in and out within a few milliseconds rather than appearing in every monitor frame.
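
A crude simulation of that timeline (my own sketch; the 0.2 ms buffer-swap duration is an assumed figure purely for illustration) shows the tear blipping in and out exactly as described:

Code:
# Crude sketch (mine; the 0.2 ms swap duration is an assumption for
# illustration): count how many 1000 Hz refreshes catch the GPU mid-swap
# when a new frame is delivered every ~16.67 ms.
FRAME_MS = 1000 / 60    # GPU frame interval (~16.67 ms)
REFRESH_MS = 1.0        # 1000 Hz refresh interval
SWAP_MS = 0.2           # assumed buffer-swap duration

swaps = [n * FRAME_MS for n in range(1, 61)]   # one second of buffer swaps
torn = 0
for i in range(1000):                          # one second of refreshes
    start, end = i * REFRESH_MS, (i + 1) * REFRESH_MS
    # a refresh tears if any swap interval overlaps its read window
    if any(start < s + SWAP_MS and s < end for s in swaps):
        torn += 1
print(f"{torn} of 1000 refreshes torn")        # ~60; the other ~940 are clean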

In our bizarre, purely hypothetical scenario we might even get exotic things such as multiple tears per image, with or without V-sync, all depending on the transfer speeds between buffers and between monitor and GPU. I frankly can't see such an extreme case of hardware mismatch, where you have a video card only capable of 60 fps and a monitor of 1000 Hz; either we would make faster video cards or slower monitors rather than waste the money.

The question is: today, will using a 120 Hz monitor help with tearing? No, it will not.
Will using V-sync help with tearing? Yes, it will completely eliminate it, but either at the cost of FPS (double buffering) or at the cost of video card RAM (triple buffering).
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Yes, it's still possible for the buffer to be updated in the middle of a refresh with a 120 Hz or higher monitor, but the next refresh is only 8.3 ms away, and it will display the full frame, so you will not perceive the torn image as strongly as you would if it stayed up for 16.6 ms.

At 1000 Hz the tear will only be present for 1 ms and you shouldn't see it.
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
I think this entire 120 Hz thing is bogus.

I have never had issues with 60 Hz TVs or monitors (both used for heavy gaming).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
http://hardforum.com/showthread.php?t=928593
An explanation from 5.4 years ago that perfectly explains V-sync and tearing... using a CRT as its main example (yes, amazingly, those things existed and were developed in CRT days).
NOTHING has changed since this article was written.
That post adds absolutely nothing to what we're talking about. We're talking about changing the refresh rate, and the effect that has on a system running without V-sync.

1. There are no such displays, and there aren't going to be for a long time, if ever (why waste money on it when it's not needed, given the limits of human perception?)
I'm not stating 1000 Hz displays are on the horizon; I'm giving an example of how a high-refresh-rate device can not only reduce tearing but also, theoretically, eliminate it without the need for V-sync.

Heck, it might have even more tearing. It can easily have MULTIPLE tear lines per single frame. (IIRC, I think I have seen cases of 2 tear lines on a single frame.)
No, it will not have more tearing. If the refresh-cycle time divides evenly into the frame interval, it will never tear. Multiple tears can only happen in the opposite case, where the frame is updated more than once during a single refresh cycle.

Assuming the same starting point and a constant framerate, 60 FPS @ 180 Hz will never tear, because every third refresh gets a brand-new frame while the other two simply repeat the current frame.

But 180 FPS @ 60 Hz will tear twice per refresh, because each refresh gets three frames crammed into it, leading to two join points.

The question is: today, will using a 120 Hz monitor help with tearing? No, it will not.
This is absolutely false, and provably so.

A 120 Hz device has half the chance of tearing compared to a 60 Hz device, given that it's twice as likely to have a refresh cycle available when a frame arrives. In fact, this is a lot like signal theory, with the monitor sampling the framerate at fixed intervals. Here's an example of a game running at a constant 40 FPS:

  • Game 40 FPS: 25 ms, 50 ms, 75 ms, 100 ms.
  • Refresh 60 Hz: 16.67 ms, 33.33 ms, 50 ms (frame two), 66.67 ms, 83.33 ms, 100 ms (frame four).
  • Refresh 120 Hz: 8.33 ms, 16.67 ms, 25 ms (frame one), 33.33 ms, 41.67 ms, 50 ms (frame two).
At 60 Hz, only every third refresh cycle lines up with a frame (please excuse the rounding). Meanwhile, one of the other two refresh cycles will always tear, because a new frame arrives before it's finished.

At 120 Hz, there's always a refresh cycle available when a frame is ready - it's every third one. The rest of the refresh cycles simply display repeated frames and do not tear, because no update happens during their cycle.

Now, it's quite likely a game doesn't have a constant framerate, so on a 60 Hz display all the game has to do is update somewhere within a 16.67 ms cycle to break the pattern. On a 120 Hz display, however, that window of opportunity is halved: the game has to update within an 8.33 ms cycle. On a 240 Hz CRT, that drops to just 4.17 ms.

And like I said earlier, even if a tear does happen, its duration is reduced on a device with a higher refresh rate, which reduces its visible impact. A 240 Hz CRT will show a given tear for no more than 4.17 ms, compared to 16.67 ms on a 60 Hz LCD.
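
For anyone who wants to check the arithmetic, here's a small script (my own sketch, not BFG10K's; it models frame arrival as instantaneous and counts a join point whenever a frame lands strictly inside a refresh window) that replays the 40 FPS timelines above, plus the 60/180 examples from earlier in the post:

Code:
# Sketch (mine) of the sampling argument above. Exact arithmetic via
# Fraction, so frame arrivals that land exactly on a refresh boundary
# count as clean updates rather than tears.
from fractions import Fraction as F

def join_points_per_sec(fps, hz):
    frame, refresh = F(1000, fps), F(1000, hz)
    arrivals = [n * frame for n in range(1, fps + 1)]
    tears = 0
    for i in range(hz):
        start, end = i * refresh, (i + 1) * refresh
        # each frame arriving strictly mid-refresh creates one join point
        tears += sum(start < a < end for a in arrivals)
    return tears

for fps, hz in ((40, 60), (40, 120), (60, 180), (180, 60)):
    print(f"{fps} FPS @ {hz} Hz: {join_points_per_sec(fps, hz)} join points/sec")
# 40 @ 60 -> 20, 40 @ 120 -> 0, 60 @ 180 -> 0, 180 @ 60 -> 120 (two per refresh)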
 

mv2devnull

Golden Member
Apr 13, 2010
1,526
160
106
With V-sync off, the frame can very often get updated in the midst of a monitor refresh, so you get the top portion of the screen showing an old frame and the bottom portion showing the just-finished frame - a tearing effect, because the top and bottom don't match up.
What causes that "top to bottom" scan in an LCD? In CRTs it was clearly the lone gun doing its job, but I thought each pixel in an LCD could act on its own?
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
It's not complicated! Get a true 120 Hz monitor, turn on V-sync, and enjoy up to 120 frames thrown at you each second - as many as your computer can generate, up to 120, without tearing.

Worst-case scenario: the game runs at, say, 70 FPS (average) on a 120 Hz monitor; then certain frames get displayed a couple more times than others, and you don't notice it or any tearing.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Assuming the same starting point and a constant framerate, 60 FPS @ 180 Hz will never tear, because every third refresh gets a brand-new frame while the other two simply repeat the current frame.
And where is that constant frame rate coming from? From V-sync (or another theoretical sync method). Even if you use a frame cap and low settings to ensure a constant 60 fps, you can still get tearing displaying a constant 60 fps on a 60 Hz (or 120 Hz, or 180 Hz) monitor without V-sync, because the monitor can poll the GPU while it is copying the frame from the rendering buffer to the display buffer.

But 180 FPS @ 60 Hz will tear twice per refresh, because each refresh gets three frames crammed into it, leading to two join points.
If you are using V-sync, you will just render at 60 fps and have no tearing...
but if you insist on disabling it (and presumably disabling triple buffering as well), then yes, you can experience multiple tears.

A 120 Hz device has half the chance of tearing compared to a 60 Hz device, given that it's twice as likely to have a refresh cycle available when a frame arrives. In fact, this is a lot like signal theory, with the monitor sampling the framerate at fixed intervals. Here's an example of a game running at a constant 40 FPS:
Game 40 FPS: 25 ms, 50 ms, 75 ms, 100 ms.
Refresh 60 Hz: 16.67 ms, 33.33 ms, 50 ms (frame two), 66.67 ms, 83.33 ms, 100 ms (frame four).
Refresh 120 Hz: 8.33 ms, 16.67 ms, 25 ms (frame one), 33.33 ms, 41.67 ms, 50 ms (frame two).

You don't seem to take into account that data transfers are not instantaneous: the actual polling by the monitor takes a certain amount of time, and so does the process of copying the data from the render buffer to the display buffer. Also, you assume single buffering when today, realistically, it's either double or triple buffering.
It isn't a case of "render -> display"; it's "render -> copy from render buffer to display buffer -> have the display buffer polled by the monitor -> copy from the GPU's display buffer to the monitor's buffer -> have the monitor update its pixels from its own buffer". Each step takes time, a different amount of time, and without careful synchronization the steps can happen out of order, causing tearing.
Your figures do not take those issues into account.

So, for example, "Refresh 120 Hz: 8.33 ms, 16.67 ms, 25 ms (frame one), 33.33 ms, 41.67 ms, 50 ms (frame two)"
will actually be when the monitor BEGINS each polling/copy operation. That operation takes time; for a few ms the monitor is copying. If the monitor started copying at 25 ms and finished at 30 ms, it would then take a few more ms to actually change the image on the screen. That 5 ms copy period is time during which the monitor is reading the GPU's data, and it can cause a tear if the GPU changes the picture in its display buffer (and the only thing stopping it from doing so is V-sync).

The only benefit of a faster display is that it will replace the torn image with a non-torn one the next time it polls the GPU. Still noticeable, but "does nothing for it" was a poor choice of words on my part: a higher refresh rate does nothing to stop tearing from occurring, but it can make tearing less noticeable.

EDIT: According to Wikipedia: http://en.wikipedia.org/wiki/Digital_Visual_Interface#Specifications
DVI Digital has the following specs:
Minimum clock frequency: 25.175 MHz
Maximum clock by cable quality (up to 330 MHz, 7.92 Gbit/s)
Pixels per clock cycle: 1 (single link) or 2 (dual link)
Bits per pixel: 24 (single and dual link) or 48 (dual link only)

Let's say we are sending at maximum speed over a single-link DVI connection. That is 330 MHz = 330,000,000 cycles/second × 1 pixel/cycle = 330,000,000 pixels/second.
A 1920x1080 picture has 2,073,600 pixels. How long does it take to transfer? 2,073,600 pixels / (330,000,000 pixels/second) ≈ 0.00628 seconds.
It takes about 6.28 milliseconds to send a 1920x1080 image over single-link DVI.

So, for example: Refresh 120 Hz: 8.33 ms transfer begins, ~14.6 ms transfer ends; 16.67 ms transfer begins, ~23 ms transfer ends; 25 ms transfer begins, ~31.3 ms transfer ends; and so on and so forth. And this is not accounting for any other step, so things are probably a little worse, but it's getting difficult to keep track of it all.
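
The same arithmetic in script form (my own sketch; it treats the link as a raw pixel pipe at the quoted clock rate and ignores blanking intervals and protocol overhead):

Code:
# Sketch (mine) of the transfer-time arithmetic above, using the quoted
# 330 MHz maximum clock and ignoring blanking/protocol overhead.
PIXELS = 1920 * 1080                  # pixels in one 1080p frame
CLOCK_HZ = 330_000_000                # max DVI clock from the quoted specs

for name, px_per_clock in (("single-link", 1), ("dual-link", 2)):
    transfer_s = PIXELS / (CLOCK_HZ * px_per_clock)
    print(f"{name}: {transfer_s * 1000:.2f} ms per frame, "
          f"~{1 / transfer_s:.0f} frames/s max")
# single-link: 6.28 ms per frame, ~159 frames/s max
# dual-link:   3.14 ms per frame, ~318 frames/s max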
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
And where is that constant frame rate coming from? From V-sync (or another theoretical sync method). Even if you use a frame cap and low settings to ensure a constant 60 fps, you can still get tearing displaying a constant 60 fps on a 60 Hz (or 120 Hz, or 180 Hz) monitor without V-sync, because the monitor can poll the GPU while it is copying the frame from the rendering buffer to the display buffer.
Yes, you can still have tearing, but a higher refresh rate makes tearing less likely to happen.

You don't seem to take into account that data transfers are not instantaneous: the actual polling by the monitor takes a certain amount of time, and so does the process of copying the data from the render buffer to the display buffer. Also, you assume single buffering when today, realistically, it's either double or triple buffering.
Double buffering doesn't change the amount of time taken to render a frame if V-sync is not running. Neither does triple buffering, and you shouldn't be running it without V-sync anyway.

Mine was a theoretical example designed for simplicity of explanation. Yes, there are other factors that slightly add to the frametime, but I could have easily added those in and still shown the same thing.

The fact remains: a display with a higher refresh rate has a higher sampling rate, and is thus less likely to tear than a lower refresh one.

If a display has a high enough refresh rate so that a refresh cycle is always available when a frame arrives, and also finishes displaying said frame before the next frame arrives, it won’t ever tear.

will actually be when the monitor BEGINS each polling/copy operation. That operation takes time; for a few ms the monitor is copying. If the monitor started copying at 25 ms and finished at 30 ms, it would then take a few more ms to actually change the image on the screen. That 5 ms copy period is time during which the monitor is reading the GPU's data, and it can cause a tear if the GPU changes the picture in its display buffer (and the only thing stopping it from doing so is V-sync).
Using your specific example, that total 30 ms frametime (approximately 33.33 FPS) will tear on a 60 Hz device, but won't on a 100 Hz device (30 ms is exactly three 10 ms refresh cycles). The fact is, you could pick any ms figure and I could still show the same thing. Do you understand now?

It takes about 6.28 milliseconds to send a 1920x1080 image over single-link DVI.
I'll assume your calculation is correct, which means the game would have to be rendering at over ~159 FPS to generate frames quickly enough to beat the 6.28 ms transfer time. With dual-link DVI, the game would have to be rendering at over ~318 FPS.

We also do things like add AA to our image, which reduces the speed of the GPU but has no impact on the transfer speed to the display.

The only benefit of a faster display is that it will replace the torn image with a non-torn one the next time it polls the GPU.
No, this is false. A faster display is less likely to tear, tears for shorter durations when it does, and can also display more full frames per second with V-sync activated.

Also, things like 3:2 pulldown are not needed on a 120 Hz device, and RTC/motion-blur artifacts are likewise reduced at a higher refresh rate.