Is 30hz video really that bad?

Kippa

Senior member
Dec 12, 2011
392
1
81
The frame rate for PAL video is 25fps and I can watch that fine, so why is gaming at 30Hz, which is 5fps faster, supposed to be that bad? Can gaming on a 30Hz monitor really be that bad? I'm not expecting super silky smooth motion, just something playable. I was thinking about this with a 4K monitor in mind that is set to 30Hz.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
playback and rendering are apples and oranges.

playback = ALL frame times are delivered evenly.
rendering = NOT all frame times are delivered evenly.

30fps playback = smooth
30fps rendering = some stuttering

60fps playback = fluid
60fps rendering = smooth

90fps playback = beyond necessary
90fps rendering = fluid

120fps playback = way beyond necessary
120fps rendering = beyond necessary

this applies in general. your actual "PERCEPTION" may vary.
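To illustrate the difference, here's a minimal Python sketch; the frame times are made up for illustration, not measured from any real game:

Code:
# Toy comparison: playback delivers every frame on schedule, while a game
# renders each frame in a variable amount of time. Both lists below average
# 33.3 ms (30fps), but only one of them is evenly paced.

playback_30 = [33.3] * 10                             # ms per frame, perfectly even
render_30 = [28, 36, 30, 52, 25, 33, 39, 29, 30, 31]  # ms per frame, same average

def describe(frame_times):
    avg = sum(frame_times) / len(frame_times)
    return f"avg {avg:.1f} ms ({1000 / avg:.0f} fps), worst frame {max(frame_times):.1f} ms"

print("playback:", describe(playback_30))  # even pacing -> smooth
print("render:  ", describe(render_30))    # same average, but the 52 ms spike reads as a stutter

The average fps counter says 30 in both cases; the spikes are what you actually notice.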
 

TheThirdMan

Member
Jul 5, 2011
113
11
81
Also important to remember that video at 25fps will have motion blur which eases the transition between frames. If you displayed 25 non-motion blurred frames in one second, it would still look quite jarring. Remember, movies are at 24fps and still look fine, because of the motion blur present.
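For a rough sense of how much smear the camera bakes into each frame, here's a back-of-the-envelope Python sketch; the 180-degree shutter and the motion speed are assumptions for illustration:

Code:
# How far does a moving object smear within one exposed film frame,
# versus the instantaneous sharp sample a game engine renders?
# Assumes a 180-degree shutter (exposure = half the frame interval).

fps = 24                             # film frame rate
motion_px_s = 600                    # on-screen motion in pixels/second (assumed)

frame_interval_ms = 1000 / fps       # ~41.7 ms between frames
exposure_ms = frame_interval_ms / 2  # ~20.8 ms with the shutter open
blur_px = motion_px_s * exposure_ms / 1000

print(f"film: ~{blur_px:.1f} px of blur inside each frame eases the jump to the next")
print("game: each frame is a sharp instant, so the full jump between frames is visible")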
 

SPBHM

Diamond Member
Sep 12, 2012
5,061
414
126
Yes, motion blur is what makes video OK at low frame rates (like 24-25) but games pretty bad. Also, as mentioned, it's hard to get 100% consistent frame times in games.

A 30Hz screen is bad; even the Windows desktop will feel bad as far as I know. Video is OK, and gaming without V-Sync feels OK but with a lot of tearing.

60Hz SST 4K displays are becoming quite affordable, so skip the 30Hz stuff.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
As said, it's the motion blur. Current research shows that in some circumstances our eyes can detect changes as small as 1/1000 of a second, suggesting we could see benefits all the way up to 1000fps.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
I'm now watching the NBA playoff games at 1080p60, and it's so much better than p30. To me, 60fps is a must for sports broadcasts.
 

SAAA

Senior member
May 14, 2014
541
126
116
As said, it's the motion blur. Current research shows that in some circumstances our eyes can detect changes as small as 1/1000 of a second, suggesting we could see benefits all the way up to 1000fps.

This x100. I don't even care whether we can really detect 1/1000th of a second or not; just on the desktop I notice a huge difference at 60+ frames vs anything lower, and that's where you spend most of your time! In gaming, of course, 30Hz would be bad, especially in FPSs while looking around. Don't be fooled by "scientific facts" like "human eyes can't see faster than this" or "smaller than that". I find most of those values very inaccurate actually, same for reaction times and many other human physical feats.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
As mentioned, movies have motion blur and very evenly delivered frames, but from my experience, one of the biggest differences not yet mentioned is the interaction between the game and what you see. When you watch a video, it does not matter how long a frame took to deliver, other than the end result; but when you game, that time adds to latency and makes the game feel very sluggish. This is less noticeable with consoles, as controllers do not have the same connected feel that mice have; with a mouse, the latency from a low frame rate really stands out. For those with simulator sickness, as I have, it can even make you feel very nauseated, yet watching a movie at 25-30 FPS does not bother me one bit.
 

Sohaltang

Senior member
Apr 13, 2013
854
0
0
playback and rendering are apples and oranges.

playback = ALL frame times are delivered evenly.
rendering = NOT all frame times are delivered evenly.

30fps playback = smooth
30fps rendering = some stuttering

60fps playback = fluid
60fps rendering = smooth

90fps playback = beyond necessary
90fps rendering = fluid

120fps playback = way beyond necessary
120fps rendering = beyond necessary

this applies in general. your actual "PERCEPTION" may vary.


The difference between 60fps and 120fps is very noticeable, even in just normal browsing and OS usage. I can't imagine how bad 30fps would be.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Depends on how quickly things onscreen are moving. I've actually become more sensitive to the low framerate in movies in big action scenes, and find it very annoying.

Wish more movies would go to a higher framerate.

There's also the issue of frame time variance, which is specific to render-on-the-fly situations like gaming. 30 Hz average frame rate does not mean your game is putting out a frame exactly every thirty three and a third milliseconds. Small variances in frame time have a much larger and more noticeable impact if the framerate is lower.
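To put rough numbers on that, a short Python sketch; the 30% overshoot is an assumed figure, just to show the scaling:

Code:
# The same relative frame-time variance turns into a much larger absolute
# hitch at a low frame rate than at a high one.

overshoot = 0.30  # one frame takes 30% longer than the budget (assumed)
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    extra_ms = budget_ms * overshoot
    print(f"{fps:>3} fps: a slow frame holds the image {extra_ms:.1f} ms too long "
          f"({budget_ms:.1f} ms budget)")
# 30 fps: 10 ms of extra hold time -> a visible hitch
# 120 fps: 2.5 ms extra -> hard to notice

The identical 30% miss costs 10 ms of extra on-screen hold time at 30 fps but only 2.5 ms at 120 fps.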
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
playback and rendering are apples and oranges.

playback = ALL frame times are delivered evenly.
rendering = NOT all frame times are delivered evenly.

30fps playback = smooth
30fps rendering = some stuttering

60fps playback = fluid
60fps rendering = smooth

90fps playback = beyond necessary
90fps rendering = fluid

120fps playback = way beyond necessary
120fps rendering = beyond necessary

this applies in general. your actual "PERCEPTION" may vary.

It goes without saying the above is garbage. Studies show people can tell the difference well into the 240fps range, and like I said before, we have perception into the 1000fps range. Maybe beyond that there is no benefit, but we don't know that yet. What we do know is that the eye doesn't work in frames, so it takes a lot of fps to fool it into believing it's seeing the real world. It might be fooled into perceiving motion around 30-60fps, but that doesn't mean that's anywhere near the limit of what we can perceive. So yeah, this comment is just wrong.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Even video playback at those low framerates can look bad. You'll see movies filmed in a way that pans very, very slowly, or else you get juddering etc. due to the handicap of such a low framerate.

So keep in mind, games are much more free and dynamic, and you will continuously run into these visual issues unless you train yourself to pan very slowly and make a huge effort to move your point of view in artificially limited ways, sort of how the film industry already works within the low-framerate constraints it has to deal with. I certainly don't game that way.
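To put numbers on why films have to pan slowly, a quick Python sketch; the pan speed is an assumption:

Code:
# During a pan, everything on screen jumps a fixed distance between frames.
# The lower the frame rate, the bigger each jump, and large jumps read as judder.

pan_speed_px_s = 1920  # pan one full 1080p screen width per second (assumed)
for fps in (24, 30, 60, 120):
    jump_px = pan_speed_px_s / fps
    print(f"{fps:>3} fps: the whole scene jumps {jump_px:.0f} px between frames")
# 24 fps: 80 px jumps are why films pan gently; at 120 fps it's a 16 px step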
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
if perception in general goes well into 240fps, and for some into 1000fps, and 120fps is necessary just for browsing and OS usage,

then 30fps must be near blind if not blind. :rolleyes:

carry on - as your "perception" will vary.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
One thing that holds back our perception of high frame rates is motion blur. If your monitor causes blur between frames, it is hard to distinguish between frame rates. So with a "crappy" monitor, you will have a harder time recognizing higher frame rates. I put crappy in quotes because there is a group of people here who would not consider an IPS panel's motion blur a problem.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
It goes without saying the above is garbage. Studies show people can tell the difference well into the 240fps range, and like I said before, we have perception into the 1000fps range. Maybe beyond that there is no benefit, but we don't know that yet. What we do know is that the eye doesn't work in frames, so it takes a lot of fps to fool it into believing it's seeing the real world. It might be fooled into perceiving motion around 30-60fps, but that doesn't mean that's anywhere near the limit of what we can perceive. So yeah, this comment is just wrong.

I agree with you for the most part. The only caveat I would add is that the VR guys are saying you reach "good enough" to trick the brain into thinking it's reality somewhere between 90-120fps. They do make sure to say that a higher FPS will still yield benefits, but if you can achieve what they call "presence" at 90-120, then that must be a good enough threshold for a wide swath of the population.

For me personally, I am madly in love with 120fps monitors, and in FPS games I can see a huge difference between 60 and 120. I am facing a tough decision this year between 4K and 2560@120fps. Anyone who calls 120fps "beyond necessary" I would assume hasn't used it.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I agree with you for the most part. The only caveat I would add is that the VR guys are saying you reach "good enough" to trick the brain into thinking it's reality somewhere between 90-120fps. They do make sure to say that a higher FPS will still yield benefits, but if you can achieve what they call "presence" at 90-120, then that must be a good enough threshold for a wide swath of the population.

I'll be honest, I don't notice a difference between 120Hz and 144Hz. But I do definitely notice a difference between 60fps and 84fps, and between 36fps and 60fps. Even percentage-wise, a 20% increase in frame rate is very noticeable down in the 30-60Hz range, but around 120 it doesn't seem to be, for me. Then again, I haven't yet seen 240fps to compare; I just know the studies done on human vision say we can perceive a difference. The Oculus guys, via Carmack, said that 90Hz is basically the minimum the human brain will accept in VR glasses without simulator sickness creeping in. It's not what is necessary to make it perfect; it's that once you start moving your head around, the input-to-display latency has to be really small and the frame rate really high to trick the brain, because it's not as detached as a mouse+kb and monitor.

The ideal frame rate is a lot higher than we have currently settled on, but like all things, the gains will become more and more marginal as we push higher. Until we see it in motion, we don't really know how much impact it will have. I didn't appreciate the difference 144Hz would make before I got it; I just saw too many rave reviews of the difference to ignore it. Now I know it's important, but I haven't seen true 240Hz yet, and I want to.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
30 fps is abysmal with a mouse; just panning around is a stuttery mess. Cap your games at 30 fps and see for yourself. Better yet, really torture yourself by using half-refresh adaptive V-Sync to add more lag. lol
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
To add to the above lists:

1. Motion blur - as mentioned previously, the shutter on a 25 or 30 FPS camera doesn't open and close instantly, but rather stays open for most of the duration of the frame, which results in motion being captured as a blur in video but as objects warping from spot to spot in games.
2. Frame consistency - again, as covered earlier, video runs at a consistent frame rate, whereas games can dip well below their average frame rate, and frame-to-frame differences increase the jitteriness of the motion, where video shows motion smoothly because the frames are precisely paced.
3. Input feel - I think this is often underrated. Slow frame rates lead directly to input lag. At 20 FPS, there is 50 ms between when a frame is started and when it is sent to the buffer for output (more for some double-buffering modes). Game engines have built-in compensation for net lag, but it is hard to compensate for rendering time. If you press a button or move your mouse/joystick, it can take over 75 ms for it to appear on screen once you add up your input device polling time (up to 8 ms for USB devices), monitor scan rate (up to 16.7 ms for a 60 Hz monitor), game engine tick interval (usually up to 25 ms of delay) and render time (50 ms at 20 FPS); see the sketch after this list. Once you get close to 1/10th of a second of total input delay, your brain can tell that the reaction on screen lags way behind your inputs, and the illusion of control is broken.
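Summing those components in a quick Python sketch (the figures are the ones from the list above; worst-case assumptions, not measurements):

Code:
# Worst-case input-to-screen delay at 20 FPS on a 60 Hz monitor,
# using the numbers from the list above.

usb_polling_ms = 8.0    # USB device polling worst case
engine_tick_ms = 25.0   # game engine tick interval
render_ms = 50.0        # one frame at 20 FPS
monitor_scan_ms = 16.7  # one refresh at 60 Hz

total_ms = usb_polling_ms + engine_tick_ms + render_ms + monitor_scan_ms
print(f"worst case: {total_ms:.1f} ms from input to screen")  # ~99.7 ms, about 1/10 s
# At 60 FPS the render term drops to ~16.7 ms and the total falls to ~66 ms.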
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
4k@1000fps/hz/blarpbarts or bust.


30hz not gonna cut it for gaming for the vast majority of gamers.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
The biggest difference between 60 and 120/144Hz monitors is input lag, but I suspect we can also feel at least a small difference from the extra frames themselves.

My eyes can't see the individual pixels of my 24" 1080p monitor (except aliasing), but I can feel my eyes getting strained after an hour on it unless I keep relaxing them, while a 250ppi device I can watch for hours. There shouldn't be any difference unless the backlight (FL vs LED) is also affecting my eyes.

So here's what I'm hoping for in the future: a 4K 120Hz display to use as a 1080p display. Wonder when that'll be here. :confused:

Also, there is a big difference between controlling a character at 30fps and watching a movie at 30fps.
 

Kippa

Senior member
Dec 12, 2011
392
1
81
Thanks for the feedback. After reading this I'll definitely skip getting a 4K monitor at 30Hz and wait for a 60Hz one. :)
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
To add to the above lists:

1. Motion blur - as mentioned previously, the shutter on a 25 or 30 FPS camera doesn't open and close instantly, but rather stays open for most of the duration of the frame, which results in motion being captured as a blur in video but as objects warping from spot to spot in games.
2. Frame consistency - again, as covered earlier, video runs at a consistent frame rate, whereas games can dip well below their average frame rate, and frame-to-frame differences increase the jitteriness of the motion, where video shows motion smoothly because the frames are precisely paced.
3. Input feel - I think this is often underrated. Slow frame rates lead directly to input lag. At 20 FPS, there is 50 ms between when a frame is started and when it is sent to the buffer for output (more for some double-buffering modes). Game engines have built-in compensation for net lag, but it is hard to compensate for rendering time. If you press a button or move your mouse/joystick, it can take over 75 ms for it to appear on screen once you add up your input device polling time (up to 8 ms for USB devices), monitor scan rate (up to 16.7 ms for a 60 Hz monitor), game engine tick interval (usually up to 25 ms of delay) and render time (50 ms at 20 FPS). Once you get close to 1/10th of a second of total input delay, your brain can tell that the reaction on screen lags way behind your inputs, and the illusion of control is broken.


:thumbsup::thumbsup: