
30 vs 60 vs 60+ FPS: what is the truth?


Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
You're confusing multiple issues.

First of all, the human eye can see way more than 30fps, 60fps, 120fps and even higher; the benefit of displaying more unique frames to the user is indisputable.
+1 (very especially true for sample-and-hold display technologies like LCD)

Although there are gradual points of diminishing returns, see Science & References for some useful information about the human vision system.

On a slightly different but very related topic, it is true that humans can't directly detect flicker beyond a certain Hz, and it varies from person to person. Some can't see flicker directly at 60Hz; others can see it even at 120Hz. Virtually nobody can notice flicker when staring directly at a 1000Hz source. However, there are side effects even from 1000Hz flicker, such as the wagon wheel effect / phantom array effect / stroboscopic effects. There have been tests where humans were able to detect >500Hz flicker, and in some cases 10,000Hz flicker, through such an indirect effect (see these references, scroll to bottom).

People can see short strobes. A xenon camera flash is less than 1 millisecond. Frame samples on a 60Hz display are much longer than that (1/60sec = 16.7ms). Moving a camera around while taking a photograph at a 1/60sec shutter will produce a blurry photograph. Likewise, moving your eyeballs while tracking moving objects on a screen leads to motion blur on a sample-and-hold display.

The frame is continuously shining for the whole refresh. Your eyes are always moving while tracking moving objects. The static image on the screen (non-strobed, long sample of 16.7ms) gets smeared across your vision since your eyeball is in a different position at the beginning of the refresh versus the end of the refresh. Thus, a sample-and-hold display (LCD) will have lots of motion blur compared to an impulse-driven display (CRT). The short samples (flickers/strobes) eliminate the chance that the image gets blurred across your retinas.
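For readers who like numbers, the eye-tracking blur described here can be roughly estimated as object speed multiplied by how long each frame stays lit. This is only a back-of-the-envelope sketch; the 960 px/s speed and the 1.5ms strobe figure are illustrative assumptions, not measurements from this thread.

```python
# Rough sketch of eye-tracking motion blur on a sample-and-hold display.
# Assumption: perceived smear ~ object speed (px/s) x time the frame is lit.
def blur_width_px(speed_px_per_sec: float, sample_ms: float) -> float:
    """Approximate smear across the retina, in pixels, while the eye
    tracks a moving object and the frame is held for sample_ms."""
    return speed_px_per_sec * (sample_ms / 1000.0)

# Illustrative: an object moving 960 px/s on a 60 Hz sample-and-hold LCD
# (frame lit for the full 16.7 ms refresh):
print(blur_width_px(960, 16.7))  # ~16 px of smear
# Same speed with a short CRT-style strobe (~1.5 ms of phosphor glow):
print(blur_width_px(960, 1.5))   # ~1.4 px of smear
```

The tenfold shorter sample gives roughly tenfold less smear, which is the whole argument for impulse-driven displays.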

In fact, repeated frames on flicker/impulse-driven displays (e.g. CRT 30fps@60Hz, 60fps@120Hz, 100fps@200Hz, etc.) are tantamount to a double exposure at different positions: your eyes keep moving, the extra refresh lands in a different position, and moving objects get a doubled-up edge, with the separation between the doubled edges shrinking the higher the framerate goes. There have been comments about the LightBoost doubled-up-edge effect at 60fps@120Hz. People are usually most familiar with the CRT 30fps@60Hz doubled-up-edge effect, but it remains at far higher refresh rates.
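The separation of that doubled-up edge can be sketched numerically too: while the eye tracks the object, two strobes of the same frame land one refresh interval apart on the retina. The 960 px/s speed below is an illustrative assumption, not a figure from the thread.

```python
# Sketch: gap between the doubled-up edges when frames repeat on an
# impulse-driven display (e.g. 30 fps content strobed at 60 Hz).
# Assumption: the eye tracks at the object's speed, so two strobes of the
# same frame are separated by speed x refresh interval.
def double_edge_gap_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    return speed_px_per_sec / refresh_hz

print(double_edge_gap_px(960, 60))   # 16.0 px gap at 30fps@60Hz
print(double_edge_gap_px(960, 120))  # 8.0 px gap at 60fps@120Hz (halved)
```

Doubling the refresh rate halves the gap, which is why the effect persists but gets subtler at higher rates.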

Humans can tell apart photographs taken with a 1/60sec shutter and a 1/1000sec shutter; the photograph taken at 1/1000sec is much sharper. There are points of diminishing returns well above 60fps, but the curve doesn't end until well beyond 1000fps (equivalence, via interpolation and/or strobes). That's why Samsung and Sony have "Clear Motion Rate 960" and "Motionflow XR 960" in some high-end HDTVs (costing over $2000). Although somewhat gimmicky, tests have shown that the motion on these displays is pretty clear, although the input lag of interpolation makes them unsuitable for video games. LightBoost solves this problem by avoiding interpolation and strictly sticking to strobes, providing about 90% less motion blur than a regular 60 Hz LCD (a regular non-LightBoost 120 Hz LCD has only 50% less motion blur than a 60 Hz LCD).

Obviously cameras and eyeballs are not apples to apples, but the comparison helps people understand why 60fps is not the final frontier, and why your moving eyeballs cause extra motion blur: it is the same type of blurring as taking a long-exposure photograph with a moving camera. (The sample-and-hold effect forces this second cause of motion blur to happen, in addition to the motion blur caused by pixel persistence; these are two separate causes of display motion blur, as explained in academic/university papers and TV manufacturer research.)
 
Last edited:

Idocare

Junior Member
Feb 4, 2013
20
0
0
The frame is continuously shining for the whole refresh. Your eyes are always moving while tracking moving objects. The static image on the screen (non-strobed, long sample of 16.7ms) gets smeared across your vision since your eyeball is in a different position at the beginning of the refresh versus the end of the refresh. Thus, a sample-and-hold display (LCD) will have lots of motion blur compared to an impulse-driven display (CRT). The short samples (flickers/strobes) eliminate the chance that the image gets blurred across your retinas.
Yep, I get the motion blur you mean, and I must say that, put that way, it's true.

To give an idea, there is one value that is dramatically different between an LCD, even the best around, and a CRT monitor: the state-to-state transition time of a single pixel.

In a monitor, the binary data of the frame (imagine a long stripe of hex bytes, each stating an intensity value from 0 to 255 for one subpixel) has to be gathered completely before it can be displayed as a frame, and here the state-to-state time comes in. At the same screen refresh rate, a subpixel of a CRT switches in a vanishingly small amount of time compared to a subpixel of an LCD: an LCD pixel needs milliseconds to go from fully lit to fully off (or to an intermediate state, which gives the colors), while a CRT pixel needs nanoseconds. This means that at a fixed screen refresh rate, the fully drawn frame on a CRT can persist hundreds of times longer than on an LCD, giving a steadier image on the human retina. On the LCD, since the whole process takes more time due to the higher state-to-state time, the fully switched frame remains on screen for much less time, leading to what you call motion blur, which in fact is all caused by this delay.

And here we should dive into panel technology, where we would find that plasma monitors are nearly as fast as CRT ones, while in the LCD family some panels are better suited to gaming: TN panels, for example, are usually faster than IPS ones, while IPS panels have better viewing angles and an absolutely wider color gamut; digging down further we find UV2A panels, which are faster still, and so on...

Given that, it's true that a monitor suited to gaming is a different thing from a monitor suited to CAD or static 3D modeling. You don't need to rotate a 3D object you are building quickly on screen, so the state-to-state time is not compelling in that scenario; a slower panel is almost preferable there, since it accommodates, say, the slow rotation of a gearbox while you check that you haven't forgotten anything, and you will probably spend more time staring at a fixed image than moving the position of what you are viewing. It's a different matter for more dynamic uses: the less time the screen takes to switch frames, the longer each frame lasts on screen, the more consistent the message the eyes transmit to the brain through the optic nerve, and the lower the so-called motion blur, which is nothing other than a kind of "unsureness" of the eye-nerve-brain system.

But... it's not down to the screen refresh rate itself; it's just that higher-refresh screens usually use panels with lower state-to-state transition timings, which induce less of the so-called "blur", if any.
 

BFG10K

Lifer
Aug 14, 2000
21,902
995
126
Click those links at 30fps. Is it fluid?

Click those links at 120fps. Is it fluid?

Enough said.

I'm done beating a dead horse. If you haven't comprehended it by now, you never will. Ignorance is bliss. Carry on.
Did you try the links? Are you saying you didn't see a difference between 30 FPS and 60 FPS with the cube and the soccer ball?

You'd have to be blind not to see it.

Now, you could make a case against 60 FPS and 120 FPS not showing a difference, but I'll bet if you had to control those objects at some level of precision, you'd see a difference there, as is the case in many games where input is tighter and more fluid @ 120 FPS compared to 60 FPS.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
Did you try the links? Are you saying you didn't see a difference between 30 FPS and 60 FPS with the cube and the soccer ball?

You'd have to be blind not to see it.

Now, you could make a case against 60 FPS and 120 FPS not showing a difference, but I'll bet if you had to control those objects at some level of precision, you'd see a difference there, as is the case in many games where input is tighter and more fluid @ 120 FPS compared to 60 FPS.
I think pretty much everyone can see a difference between 30 and 60.
60 vs 120 is another thing. I agree that if you control it with your mouse then you'll *feel* the difference, but you'd be hard pressed to just *see* the difference with some soccer balls or squares moving across the screen at 60 or 120Hz.
But hey, if someone can spot 5 out of 5 in a blind test, I'll certainly believe that they can tell the difference.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I can see the difference between 60 and 120fps no problem.
Me too, and my eyes are pretty bad (I need thick glasses). I am genuinely jealous of anyone who cannot tell the difference, but I can, and after gaming at 120fps@120Hz for over a year it is now genuinely unpleasant to go back to 60fps.
 

lehtv

Elite Member
Dec 8, 2010
11,900
74
91
Black Octagon, you'd get used to 60fps @ 60Hz in a short time; it would feel normal again. I'd give it a few hours, tops.
 

omeds

Senior member
Dec 14, 2011
646
13
81
After using 120Hz for 2 or 3 years I find 60Hz a total mess, tbh, at least without vsync. With vsync I think it's perfectly fine, but unfortunately vsync makes online FPS games unplayable for me...
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
I can see the difference between 60 and 120fps no problem.
Me too and my eyes are pretty bad (need thick glasses). I am genuinely jealous of anyone who cannot tell the difference, but I can, and after gaming at 120fps@120hz for over a year it is now genuinely unpleasant to go back to 60fps
Yeah, sorry, I should clarify that I meant the cube and the balls that BFG10K is talking about, on a CRT.

On LCD: yes, probably.
On CRT: no, probably not, but I will test this when I get home.
 

beginner99

Diamond Member
Jun 2, 2009
4,889
1,274
136
I disagree. Vsync introduces significant input lag, and I'll take screen tearing (which I barely notice) over it any day.
This. My KDR in BC2 increased significantly after turning off vsync, and I'm not a pro gamer at all. The annoying part is that this was not really directly noticeable in-game, except that enemies seemed to die faster and I won more "duels", hence the better KDR.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Oh gosh. If you want higher than 60fps then leave vsync disabled, which looks bad, and the image tearing is nasty.

I turn vsync on, and my monitor beats every monitor out there. Try a "1ms" 75Hz IPS or whatever it is: it falls flat because it has motion blur and laggy mouse ghosting; the average LED panel is around 7ms response time, going up to 12ms...
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

I can tell the difference, but the words "diminishing returns" hit my mind-set after 60. It reminds me of image quality beyond 8x AA, where anything over that really starts the diminishing-returns ball rolling.

Thankfully, my eyes are not so sensitive as to absolutely need faster than 60 before it breaks immersion for me.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
But, I told him: for the human eye, it's impossible to feel any difference while playing over 60 FPS.
Not true. There is a clear difference playing games at 60FPS vs 120FPS. Try a 120Hz monitor. You will notice a difference.

Granted, the difference between 60-120FPS is not as dramatic as 30-60FPS.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
Imho,
I can tell the difference, but the words "diminishing returns" hit my mind-set after 60.
Due to the differences between how LCDs and CRTs work, perceived motion blur is NOT directly related to refresh rate but to the SAMPLE LENGTH of the refresh: the time a frame is actually displayed, which accounts for the black periods between samples. As a result:

Visually, when testing in PixPerAn, perceived motion blur is as follows:

LCD 60fps@60Hz = baseline (recent model, where pixel persistence is far less than a refresh)
LCD 120fps@120Hz = 50% less motion blur than LCD 60Hz
CRT 60fps@60Hz = ~90-95% less motion blur than LCD 60Hz, flickers
CRT 75fps@75Hz = ~90-95% less motion blur than LCD 60Hz, flicker less
CRT 120fps@120Hz = ~90-95% less motion blur than LCD 60Hz, flicker-free
LCD 120fps@120Hz LightBoost = 85% less motion blur than LCD 60 Hz
LCD 120fps@120Hz LightBoost(OSD 10% setting) = 92% less motion blur than LCD 60 Hz
Note: CRTs vary in phosphor decay speed (shorter or longer persistence phosphors). Notice how the fluidity of a CRT at fps=Hz is very hard to tell apart across refresh rates, except for the "solidness" and flicker-freeness of the image; this is because the phosphor decays at the same speed regardless of refresh rate.

The blur figures above correspond to the measured sample length each refresh is continuously displayed for:

LCD 60fps@60Hz = refresh displayed 16.7ms
LCD 120fps@120Hz = refresh displayed 8.33ms
CRT 60fps@60Hz = refresh displayed 1-2ms (phosphor decay)
CRT 75fps@75Hz = refresh displayed 1-2ms (phosphor decay)
CRT 120fps@120Hz = refresh displayed 1-2ms (phosphor decay)
LCD 120fps@120Hz LightBoost = refresh displayed 2.4ms (strobe backlight)
LCD 120fps@120Hz LightBoost(OSD 10% setting) = refresh displayed 1.4ms (strobe backlight)

The relationship perceived motion blur = sample length is scientifically confirmed.
You can confirm it yourself with PixPerAn and other motion tests, with a room full of displays.
It is confirmed by TV manufacturer laboratories (papers).
It is confirmed by university research (papers).
It is technology independent (LCD, CRT, OLED, SED, DLP, plasma, etc.).
Note for technologies that use multiple flickers: multiple flickers of the exact same pixels of the same frame (e.g. plasma subfields, DLP temporal dithering, PWM dimming, etc.) correspond, from the perspective of motion blur, to a unified sample length from the start of the first pulse to the end of the last pulse.
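The "% less motion blur" figures above follow directly from the sample lengths, assuming blur scales linearly with how long each refresh is displayed. A minimal sketch of that arithmetic (the 1.5ms CRT figure is an assumed midpoint of the 1-2ms range listed above):

```python
# Derive approximate blur-reduction percentages from sample lengths,
# assuming perceived blur is proportional to display time per refresh.
BASELINE_MS = 16.7  # LCD 60fps@60Hz, frame displayed the full refresh

samples_ms = {
    "LCD 120fps@120Hz": 8.33,
    "CRT (assumed 1.5ms phosphor decay)": 1.5,
    "LightBoost (2.4ms strobe)": 2.4,
    "LightBoost OSD 10% (1.4ms strobe)": 1.4,
}

for name, ms in samples_ms.items():
    reduction = (1.0 - ms / BASELINE_MS) * 100.0
    print(f"{name}: ~{reduction:.0f}% less blur than a 60 Hz LCD")
```

This reproduces the rough figures in the table: ~50% for plain 120Hz, ~91% for the CRT, and ~86% and ~92% for the two LightBoost settings.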

References:
Study up.
 
Last edited:

Cadarin

Member
Jan 14, 2013
30
0
16
If anyone had told me about that lightboost hack a few months ago, I'd have bought a 670 instead of my 7970. I really hope someone figures out how to make it work with an AMD card.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
If anyone had told me about that lightboost hack a few months ago, I'd have bought a 670 instead of my 7970. I really hope someone figures out how to make it work with an AMD card.
It works with the DVI cable swap: enable it with an nVidia card, hot-plug the cable into the AMD card, and LightBoost continues working if the 1080p timings are the same. There's some material in the HardForum.com thread that covers this (around pages 24 to 26+ of the thread; it's a very popular thread there). There have also been attempts to use softMCCS (from entechtaiwan.com) to send DDC/CI commands to the monitor to enable LightBoost, with partial success so far (only for specific BenQ monitors).

There are also certain Samsung monitors that work with AMD cards out of the box; certain models were discovered to have an undocumented, LightBoost-like strobe backlight mode that supports AMD cards (Samsung Zero Motion Blur HOWTO), but the Samsung version unfortunately has lots of input lag, unlike the BenQ.
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
Well, I'm currently keeping a 60Hz LCD TV hooked to my rig, but that thing has a 1ms UV2A panel and is satisfactory enough. Even if the LightBoost approach seems really effective (blacking out the state transition of the pixels and pulsing light through them only when they need to be seen, thus giving a steadier image), I think it simply cannot stand up to plasma technology, which is the closest to CRT in terms of responsiveness.

Even if it is somewhat effective, LightBoost is only "hiding" the slowness of the state-to-state transition (G2G or BTW, it doesn't matter much) of LCD pixels. I'd rather go with a plasma TV set and a GPU able to always keep the game at 60fps with v-sync enabled than with a 120Hz monitor and a GPU that often drops under 120fps or cannot reach that value at all.

Generally speaking, let's say that a 120Hz monitor needs to always display 120fps: if the GPU cannot deliver, the monitor has to fill in the missing frames with its own logic, taking back in input lag what it could give in image quality.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Due to the differences between how LCDs and CRTs work, perceived motion blur is NOT directly related to refresh rate but to the SAMPLE LENGTH of the refresh: the time a frame is actually displayed, which accounts for the black periods between samples. As a result:

Visually, when testing in PixPerAn, perceived motion blur is as follows:

LCD 60fps@60Hz = baseline (recent model, where pixel persistence is far less than a refresh)
LCD 120fps@120Hz = 50% less motion blur than LCD 60Hz
CRT 60fps@60Hz = ~90-95% less motion blur than LCD 60Hz, flickers
CRT 75fps@75Hz = ~90-95% less motion blur than LCD 60Hz, flicker less
CRT 120fps@120Hz = ~90-95% less motion blur than LCD 60Hz, flicker-free
LCD 120fps@120Hz LightBoost = 85% less motion blur than LCD 60 Hz
LCD 120fps@120Hz LightBoost(OSD 10% setting) = 92% less motion blur than LCD 60 Hz
Note: CRTs vary in phosphor decay speed (shorter or longer persistence phosphors). Notice how the fluidity of a CRT at fps=Hz is very hard to tell apart across refresh rates, except for the "solidness" and flicker-freeness of the image; this is because the phosphor decays at the same speed regardless of refresh rate.

The blur figures above correspond to the measured sample length each refresh is continuously displayed for:

LCD 60fps@60Hz = refresh displayed 16.7ms
LCD 120fps@120Hz = refresh displayed 8.33ms
CRT 60fps@60Hz = refresh displayed 1-2ms (phosphor decay)
CRT 75fps@75Hz = refresh displayed 1-2ms (phosphor decay)
CRT 120fps@120Hz = refresh displayed 1-2ms (phosphor decay)
LCD 120fps@120Hz LightBoost = refresh displayed 2.4ms (strobe backlight)
LCD 120fps@120Hz LightBoost(OSD 10% setting) = refresh displayed 1.4ms (strobe backlight)

The relationship perceived motion blur = sample length is scientifically confirmed.
You can confirm it yourself with PixPerAn and other motion tests, with a room full of displays.
It is confirmed by TV manufacturer laboratories (papers).
It is confirmed by university research (papers).
It is technology independent (LCD, CRT, OLED, SED, DLP, plasma, etc.).
Note for technologies that use multiple flickers: multiple flickers of the exact same pixels of the same frame (e.g. plasma subfields, DLP temporal dithering, PWM dimming, etc.) correspond, from the perspective of motion blur, to a unified sample length from the start of the first pulse to the end of the last pulse.

References:
Study up.
I have this monitor:

http://www.newegg.com/Product/Produc...82E16824236206

I can tell the difference between 120Hz and 60Hz, and with the LightBoost ability for 2D as well, but still with diminishing returns. I don't think the differences are massive (120 FPS vs 60 FPS, 120Hz vs 60Hz); they're more subtle.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Indeed, a dimming, but one can add more contrast! In a darkened room it would look okay, but I feel the trade-off is too much for me. However, the awareness and the capability are more than welcome for sensitive eyes, and improve their experience overall.

I'm all for it.
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
I have this monitor:

http://www.newegg.com/Product/Produc...82E16824236206

I can tell the difference between 120Hz and 60Hz, and with the LightBoost ability for 2D as well, but still with diminishing returns. I don't think the differences are massive (120 FPS vs 60 FPS, 120Hz vs 60Hz); they're more subtle.
Pretty nice piece of kit.

That said, LightBoost is not a reinvention of the wheel but a patented name for a variant of the long-established "scanning backlight" technology, this time GPU-driven...

...however, if you run that thing at 120Hz and feed it an 80fps output, you will most likely find yourself in a less comfortable zone than if you were feeding it a 120+fps signal: in this example your screen needs to repeat one source frame in every three refreshes it displays. Running it at 60Hz, it would have to discard one frame in four from the signal, and at 60Hz with v-sync enabled the choice of which frame to drop is left to the GPU, which manages the framebuffer to get the most out of the screen. The ideal setup would be syncing it at 120Hz with a GPU capable of running the game at higher frame rates but capped (better yet, v-synced) at 120fps. I don't know which hardware you've hooked it to, but if you can try it, it would be nice to see your impressions...
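The cadence described above can be sketched by mapping each refresh to the most recently completed source frame. This is a simplification that assumes no interpolation, just frame repeats, and the function name is hypothetical, for illustration only.

```python
# Sketch: which source frame lands on each refresh when an 80 fps feed
# drives a 120 Hz display, assuming the display simply repeats the most
# recent completed frame (no interpolation).
def frame_for_refresh(n: int, fps: float, hz: float) -> int:
    return int(n * fps / hz)

cadence = [frame_for_refresh(n, 80, 120) for n in range(9)]
print(cadence)  # [0, 0, 1, 2, 2, 3, 4, 4, 5]
# Frames 0, 2, 4 are shown twice and frames 1, 3, 5 once: one repeated
# source frame in every three refreshes, which is the uneven pacing
# that makes 80 fps on a 120 Hz panel feel less smooth.
```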
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,471
223
106
I enjoyed playing Quake 2 at 120 fps. Today's games feel different.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Pretty nice piece of kit.

That said, LightBoost is not a reinvention of the wheel but a patented name for a variant of the long-established "scanning backlight" technology, this time GPU-driven...

...however, if you run that thing at 120Hz and feed it an 80fps output, you will most likely find yourself in a less comfortable zone than if you were feeding it a 120+fps signal: in this example your screen needs to repeat one source frame in every three refreshes it displays. Running it at 60Hz, it would have to discard one frame in four from the signal, and at 60Hz with v-sync enabled the choice of which frame to drop is left to the GPU, which manages the framebuffer to get the most out of the screen. The ideal setup would be syncing it at 120Hz with a GPU capable of running the game at higher frame rates but capped (better yet, v-synced) at 120fps. I don't know which hardware you've hooked it to, but if you can try it, it would be nice to see your impressions...

Where LightBoost made sense for me was that it helped with the dimming from the 3D shutters and helped reduce crosstalk: one can lower the 3D contrast and still get a bright and vibrant experience while curbing crosstalk.
 
Feb 25, 2011
16,576
1,337
126
He says when someone plays at 60+ FPS, his character will "run faster", jump "faster" (higher !?), shoot "faster" than anyone in the game.
That would only be true if the game registered your keypress and waited to render a certain number of frames before registering the action in question.

Even if they did work that way (I have no idea), at 30fps you wouldn't be more than 1/30th of a second behind the guy running at 350fps. Human reaction time and network lag would be much bigger concerns.
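To put that 1/30th of a second in context, here is a minimal sketch comparing per-frame delay to a typical human reaction time; the ~200ms reaction figure is an assumed ballpark, not a number from the thread.

```python
# Compare worst-case frame-quantization delay to human reaction time.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"30 fps:  {frame_time_ms(30):.1f} ms per frame")   # ~33.3 ms
print(f"350 fps: {frame_time_ms(350):.1f} ms per frame")  # ~2.9 ms
# Assumed ballpark for human visual reaction time: ~200 ms.
# Even the full 33 ms frame at 30 fps is small next to that.
```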
 
