Theory of the framerate that is smooth to the eye.


rbV5

Lifer
Dec 10, 2000
12,632
0
0
What do you hate about film's 24fps? I believe most theatres run film at "48 fps," projecting each frame twice for a more solid picture.

Personally, wide pans at 24fps are barely tolerable for me, depending on what's in the scene; they can be nauseating in a large theater, especially if I sit too close.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
OK, I was thinking in terms of refresh rate, not framerate. But I don't sit close to the screen, either. :)
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Let's simply agree that different people can see different framerates. Here's an example of my view of frames changing over the course of a year. I played Q3 for Dreamcast, which runs at 30fps. It was fun and smooth. Then I got Q3 for PC and ran it at 100fps for a few years. When I tried to go back and play the DC version because the computer was in use, it looked like a slideshow. This suggests that over time, possibly up to a certain point, the human eye can learn to see higher fps. In fact, 30fps would be fine for everyone if they didn't know what fps is; just like with jaggies, 1024x768 and 640x480 would show no difference if you didn't know what jaggies are. But we have all been cursed: the instant we read and/or learned about these graphical flaws, we will always notice them till the day we die.
 

Wedge1

Senior member
Mar 22, 2003
905
0
0
Originally posted by: VIAN
Main difference between NTSC and PAL

NTSC = 30fps at 525 scanlines
PAL = 25fps at 625 scanlines



You know how at a 60Hz refresh rate you see flickering because the light goes on and off? Well, in videogames, after a frame is drawn it stays there until the next frame is drawn. Therefore we can conclude that on some level there is choppiness tied to the refresh rate. But one thing I know is that there is high contrast between light and darkness, and little contrast between one light and another. So we notice the flickering or choppiness in the refresh rate because of its high contrast, and that's only in 2D applications. I know my CS runs at 60Hz and I never noticed it until I found out, and I still don't notice it; in fast-paced play it isn't noticeable. Between frames there is no high contrast, so the motion appears smooth to us, but it is choppy. And you cannot call it totally smooth until it matches the refresh rate you are comfortable with.

I totally relate to what you are saying here. It's the same for me. In regular 2D use, i.e., surfing the web or reading the forums, 60Hz seems too bright to me and is very uncomfortable. For 2D, I simply need a minimum of 75Hz to eliminate the uncomfortable "brightness" that my eye is picking up.
But if I start up Quake3 or Counter-Strike, the 60Hz doesn't have a detrimental effect on my playing and enjoyment. In fact, for Quake, some maps are a bit dark in some spots, and I think the lower refresh rate of 60Hz kind of helps me to see better.

Now, going back to something I mentioned earlier. I used to play with com_maxfps at 125. Then I decided to up it to 150 for a while, then to 200. But I must admit, I could not tell a difference between any of these three high framerates: 125, 150, and 200. It simply made no improvement to me, although I know it indicates the raw power of the card.

But I recently turned on vsync for the very first time, and it capped my fps at 75 because my refresh rate is 75Hz. Turning on vsync made a vast improvement in the quality of the image and its smoothness. I guess it eliminated that "tearing" of textures that I have seen people complain about. But my point is, 75 fps with vsync turned on is clearly better to me than 125+ fps with vsync turned off. I'm amazed at the quality of the image with vsync on, and it makes me wonder whether the perspectives here take into account that some of us have it turned on and some have it turned off. Maybe vsync plays a part in why some people are perfectly happy with lower fps and some need higher fps, and we are not including it in our discussion of "what is smooth to me."

 

The Green Bean

Diamond Member
Jul 27, 2003
6,506
7
81
17 fps is what the human retina works at... the reason why you notice a flicker on a CRT at 60Hz is because it is shooting beams of light into your retina. On an LCD, 60 is perfectly smooth. Besides, TV runs at 18 fps. I personally don't find a difference between 25 fps and 100.

hatim
 

Jeff7

Lifer
Jan 4, 2001
41,596
20
81
Why 60fps can appear fluid, but 60Hz will not:
The screen refresh operates differently than does the framerate.
Assume a monitor with a 1000000Hz refresh rate - for all intents and purposes, this is a constant image.
Now say the video card does 30fps. Once a frame is completely drawn, it continues to be drawn on the screen repeatedly until the next frame is ready to go. With that in mind:
Try an imaginary monitor with a 30Hz refresh rate. The image does NOT stay on the screen for the full 1/30th of a second, as it does where framerate is concerned. The image is drawn, but the phosphors remain excited for only a very short time, say around 1/5000th of a second. So the screen is scanned completely in 1/30th of a second, yet each spot glows only briefly, leaving the screen nearly blank for most of that 1/30th of a second (roughly 33 ms of each cycle). That's why it'll appear to flicker.
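Jeff7's arithmetic can be sketched in a few lines; a toy calculation, assuming the ~1/5000th of a second phosphor persistence figure above (the function name is made up for illustration):

```python
# Toy calculation: at a given CRT refresh rate, a phosphor glows only
# briefly each cycle, so the screen is dark most of the time.
# The persistence value is an assumed figure, not a measurement.

def crt_dark_fraction(refresh_hz: float, persistence_s: float) -> float:
    """Fraction of each refresh cycle the phosphor is NOT glowing."""
    cycle_s = 1.0 / refresh_hz              # one full refresh period
    return max(0.0, 1.0 - persistence_s / cycle_s)

# 30 Hz refresh, ~1/5000 s of phosphor glow per cycle (assumed):
frac = crt_dark_fraction(30, 1 / 5000)
print(f"Screen dark for {frac:.1%} of each cycle")  # ~99.4%
```

The dark fraction, not the refresh rate alone, is what makes the flicker so visible.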
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Hatim, you have no idea what you're talking about. If you want to find out why, just read all the posts.

The frame is not drawn on the screen repeatedly, because how would the next frame be drawn if the processor is too busy drawing one frame repeatedly? It just stays there until the next one comes up.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Originally posted by: rbV5
There are 2 main formats, NTSC and PAL.
PAL systems are slowly moving to PAL60 for games consoles and DVD players. And some tellys, PAL ones that is, will refresh at 100Hz now.

 

WobbleWobble

Diamond Member
Jun 29, 2001
4,867
1
0
As long as the FPS doesn't dip below 30, it's fine with me, even in first-person shooters.

I agree that everyone's eyes are different and there is no defined max framerate the eye can see.

The whole TV/movie argument is flawed because they have motion blurring, so you don't see jerkiness. Computer games don't have motion blurring, so they need a higher FPS to compensate.

I don't believe V-Sync takes a hit on your FPS if it's lower than your refresh rate. It only limits your maximum FPS.

I have never noticed any tearing with V-Sync off, nor do I notice the differences in FPS at 50+

But those are just opinions with my eyes.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Wedge1

But if I start up Quake3 or Counter-Strike, the 60Hz doesn't have a detrimental effect on my playing and enjoyment. In fact, for Quake, some maps are a bit dark in some spots, and I think the lower refresh rate of 60Hz kind of helps me to see better.
Yes, 3D games are generally darker than your desktop, thus they make 60Hz more tolerable.

Vsync is a separate (but related) issue. We could go into it, but that would be digressing further from the main point of what constitutes a "smooth" framerate. A lot of people are switching between smooth in terms of looks and smooth in terms of control.


Originally posted by: hatim
17 fps is what the human retina works at... the reason why you notice a flicker on a CRT at 60Hz is because it is shooting beams of light into your retina. On an LCD, 60 is perfectly smooth. Besides, TV runs at 18 fps. I personally don't find a difference between 25 fps and 100.
This is basically all wrong, and all the necessary corrections are already in this thread. Look up. :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
er...it [vsync] only makes your MAX framerate 85fps
This is untrue. Vsync can lower your minimum and average framerate in addition to nuking your maximum framerate. The effects of vsync are much more than just a simple framerate cap.

Well, then there's some kind of software weirdness on your computer, because using vsync does NOT affect how fast the GPU/CPU/memory render frames.
That's true as far as the physical rendering speed is concerned, but in practice vsync introduces timing issues: it adds extra delay whenever the current frame misses the current refresh cycle.

Don't confuse refresh rate (screen redraw) with framerate (individual linked frames of animation).
Well, there are two sides to that coin. On one hand, both are a measure of screen updates per second, so they are essentially the same thing. On the other hand, a framerate higher than the refresh rate is still noticeable because of other factors, such as partially drawn frames and input sensitivity.
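BFG10K's timing point can be illustrated with a small sketch, under the simplifying assumption that every finished frame waits for the next refresh (plain double buffering, no render-ahead); the `vsync_fps` helper is hypothetical:

```python
import math

# Sketch of the vsync delay effect: a finished frame must wait for the
# next vertical refresh, so its display time is rounded UP to a whole
# number of refresh periods. All numbers below are illustrative.

def vsync_fps(render_ms: float, refresh_hz: float) -> float:
    """Effective framerate when every frame waits for a vertical sync."""
    period_ms = 1000.0 / refresh_hz
    cycles = math.ceil(render_ms / period_ms)  # refresh cycles consumed
    return refresh_hz / cycles                 # frames shown per second

# At 75 Hz (13.3 ms per refresh), a 14 ms frame misses the first sync
# and has to wait for the second, halving the framerate:
print(vsync_fps(14, 75))   # 37.5 -- not the ~71 fps the GPU could manage
print(vsync_fps(13, 75))   # 75.0
```

This is why vsync can hurt minimum and average framerates, not just cap the maximum: a frame that barely misses a refresh pays for a whole extra cycle.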
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
So at the risk of getting far astray from topic, is DX a separate API like OpenGL and D3D?
No, DirectX is the superset API for Windows game programming, and Direct3D is the subset of it that deals with 3D rendering.

If you got a constant, never falling below 30fps you would not notice a difference.
That isn't true at all. In fact, a constant framerate is irrelevant to the discussion if the minimum is still above the threshold of smoothness. Thus, for example, a range of 100-200 FPS is far smoother than a constant 60 FPS.

The eye can only see ~30-35 FPS so if any game can sustain above ~30FPS then you would not notice any slow downs.
That is completely false. If you think that's the case, fire up your favourite first-person shooter, cap it at 35 FPS, and play it for a week like that. Then uncap it and play it in the same places. I guarantee you'll see a massive difference (assuming, of course, your system is actually capable of going much higher than 35 FPS in the game in question).

17 fps is what the human retina works at... the reason why you notice a flicker on a CRT at 60Hz is because it is shooting beams of light into your retina. On an LCD, 60 is perfectly smooth. Besides, TV runs at 18 fps. I personally don't find a difference between 25 fps and 100.
That whole paragraph is incorrect.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
My theory on minimum frame rate is partially correct. Minimum frame rate being 85fps because of no flickering at 85Hz.
To be honest that's not a bad theory at all. It's not universal of course but it's a good starting point. If you have a minimum of 85 FPS in all situations I doubt too many people would complain of choppiness.

I myself have often pointed out the similarities between refresh rate and framerate, and the fact that 60 Hz obviously flickers because our eyes can see the individual refresh cycles; they're not fast enough to look constant. And if it's proven that you can see flickering, then it's simply not possible to turn around and claim that 24 FPS is smooth to the eye, since 60 Hz is essentially 60 "FPS".

I remember when I got my first Voodoo 2 and ran Descent II with the 3dfx Glide patch for the V2. 120FPS, liquid-smooth... I don't know about the rest of you, but I long ago cast aside the various theories ("24fps is what TV runs at, and it looks smooth, so that's all you need for smooth gaming" etc) in favor of reality.
I agree with you completely. There's smooth, and then there's liquid smooth, a status that is only achieved through a very high framerate. When you're playing a game that's liquid smooth you're gliding through it rather than merely moving through it, and it feels extremely immersive, almost unreal.

orion, I'll give you an example BFG uses: doing a 180 degree turn in one second at 30fps means 3 degrees per frame update.
Actually I tend to use 360 degrees with a subdivision of 12 degrees. :p
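The turn-speed arithmetic the thread keeps coming back to can be written out as a quick sketch (the function name is illustrative):

```python
# The per-frame "jump" for a turn of a given angular speed: fewer
# frames per second means a bigger visual jump between frames.

def degrees_per_frame(turn_deg_per_s: float, fps: float) -> float:
    """Angular jump between consecutive frames during a steady turn."""
    return turn_deg_per_s / fps

# A 360-degree turn completed in one second:
for fps in (30, 60, 120):
    print(fps, degrees_per_frame(360, fps))
# 30 fps -> 12-degree jumps (BFG10K's subdivision); 120 fps -> 3 degrees
```

The faster the turn, the higher the framerate needed to keep the jumps below what the eye picks up as choppiness.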
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Whoops, I must have been thinking of 60fps when I said 3 degrees per frame. And my apologies for fumbling the quote. ;)
 

Jeff7

Lifer
Jan 4, 2001
41,596
20
81
Originally posted by: VIAN
Hatim, you have no idea what you're talking about. If you want to find out why, just read all the posts.

The frame is not drawn on the screen repeatedly, because how would the next frame be drawn if the processor is too busy drawing one frame repeatedly? It just stays there until the next one comes up.

As I understand it (and based on what I just researched quick), this is what happens:
The video processor sends a finished frame to the frame buffer. The RAMDAC then sends this data to the monitor. As long as the data in the frame buffer (one frame) remains the same, that's what the RAMDAC will keep sending to the monitor; until the next frame arrives, the current frame continues to be redrawn on the monitor. Otherwise, if you played an intensive 3D graphics demo that only did 5fps, you'd get a brief flicker of each frame while the monitor and RAMDAC waited for the next frame to be generated. But instead of a flashing screen, you get a slide show.
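The scan-out loop described here can be sketched as a toy timeline; the rates and the `scanout` helper are illustrative, not how any real RAMDAC is programmed:

```python
# Toy timeline of scan-out: the RAMDAC keeps rescanning whatever frame
# is in the frame buffer, so a slow renderer yields a slide show
# rather than a flashing screen.

def scanout(refresh_hz: int, render_fps: int, seconds: int = 1) -> list:
    """Frame number sent to the monitor at each refresh tick."""
    shown = []
    for tick in range(refresh_hz * seconds):
        # latest frame finished by the time of this refresh (integer math)
        frame = (tick * render_fps) // refresh_hz
        shown.append(frame)  # the same frame repeats until replaced
    return shown

# 60 Hz monitor, renderer managing only 5 fps: each frame is scanned
# out 12 times before the next one lands in the buffer.
print(scanout(60, 5)[:15])  # [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1]
```

The monitor never goes blank between frames; it just shows the same frame again, which is exactly the slide-show effect described above.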
 

beatle

Diamond Member
Apr 2, 2001
5,661
5
81
Hmm, where to start...

When computing framerates, it really depends on the source. As another poster has said, a normal NTSC TV has 60 interlaced fields per second. It will first draw line 1, then 3, then 5, etc., and then it will draw 2, 4, 6, etc. This translates into 29.97 frames per second. I believe the 0.03 loss has to do with how the video is filmed; on film, I believe there is a .001 gap between frames. PAL has 50 interlaced fields for 25 fps.

Computer monitors (and some TVs) are progressive scan, which means the image is drawn from top to bottom without interlacing. This looks very sharp since there are no interlacing artifacts (which are more apparent in fast-motion scenes). Since the picture is still drawn top to bottom, the phosphors begin to fade ever so slightly before the image is redrawn; hence the slight flickering you might see, especially out of the corner of your eye.

An LCD is drawn constantly. As another poster mentioned, there is a constant backlight. The difference here is that there are no frames, per se, but times when everything on the screen is "updated." These changes all appear at the same time, so there is never any flickering, regardless of the refresh rate. This is why LCDs are much easier on your eyes than a CRT.
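The interlaced field ordering described above can be sketched with a tiny helper; the 10-line screen is purely illustrative (real NTSC uses 525 lines):

```python
# Sketch of interlacing: an NTSC field draws every other scanline,
# odd-numbered lines in one field, even-numbered lines in the next,
# so two fields together make one full frame.

def interlaced_fields(total_lines: int) -> tuple:
    """Return (odd_field_lines, even_field_lines) for one full frame."""
    odd = list(range(1, total_lines + 1, 2))   # lines 1, 3, 5, ...
    even = list(range(2, total_lines + 1, 2))  # lines 2, 4, 6, ...
    return odd, even

odd, even = interlaced_fields(10)
print(odd)    # [1, 3, 5, 7, 9]
print(even)   # [2, 4, 6, 8, 10]
# 60 fields/s -> 30 full frames/s (29.97 in practice for NTSC color).
```

A progressive display simply draws lines 1 through N in order every refresh, which is why it avoids the combing artifacts interlacing shows on fast motion.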
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Originally posted by: beatle
An LCD is drawn constantly. As another poster mentioned, there is a constant backlight. The difference here is that there are no frames, per se, but times when everything on the screen is "updated." These changes all appear at the same time, so there is never any flickering, regardless of the refresh rate. This is why LCDs are much easier on your eyes than a CRT.
Slightly off topic, but am I correct in thinking that TFTs only "refresh" the pixels that have changed, and not the whole screen?

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
That should be the end effect either way. Even if LCDs (of which TFT is one type, I believe) refresh the info for each pixel, if it's the same value as in the last frame, then the (sub)pixels won't change.

Unless we're talking about some (all?) 16ms displays, which, when they can't show an exact color using their 16-bit palette, alternate between two similar colors to approximate the intended one.
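The alternating-color trick described here is temporal dithering (often called FRC on LCD panels). A minimal sketch, assuming a simple error-accumulation scheme; the `dither_sequence` function and the level values are illustrative, not any panel's actual algorithm:

```python
# Sketch of temporal dithering: a panel with a limited palette flips
# between two nearby levels so the average over time approximates the
# intended in-between color.

def dither_sequence(target: float, lo: int, hi: int, frames: int) -> list:
    """Pick lo or hi each frame so the running sum tracks the target."""
    out, err = [], 0.0
    for _ in range(frames):
        err += target            # brightness "owed" so far
        if err >= hi:
            out.append(hi)       # emit the brighter level
            err -= hi
        else:
            out.append(lo)       # emit the darker level
            err -= lo
    return out

# A target of 5.5 between levels 5 and 6 alternates evenly:
seq = dither_sequence(5.5, 5, 6, 8)
print(seq)                       # [5, 6, 5, 6, 5, 6, 5, 6]
print(sum(seq) / len(seq))       # 5.5
```

Flipped fast enough, the eye averages the two levels into the intended shade, which is exactly the approximation described above.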