- Jul 22, 2000
> The most common "frame rate" on a television set is 24......

Televisions certainly do render frames. Frames are made of fields, and each field consists of every other scan line: one field of the odd scan lines, one field of the even scan lines. Each field is displayed (or rendered) every 1/60th of a second for NTSC and every 1/50th of a second for PAL. This results in a frame rate of 30 fps for NTSC and 25 fps for PAL, with corresponding refresh rates of 60 Hz for NTSC and 50 Hz for PAL. This is commonly referred to as "interlaced video", as in, each frame consists of two fields "laced" together. It is a constant frame rate, certainly not a "range" as you suggest.
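To make the field/frame arithmetic above concrete, here's a minimal sketch in plain Python. The 8-line "screen" and the line labels are invented purely for illustration:

```python
# Sketch: weaving two NTSC-style fields into one interlaced frame.
# The 8-line "screen" and its contents are hypothetical.

LINES = 8  # assumed display height for the example

# Field 1 carries the odd scan lines, field 2 the even ones.
odd_field = {n: f"odd line {n}" for n in range(1, LINES + 1, 2)}
even_field = {n: f"even line {n}" for n in range(2, LINES + 1, 2)}

# Each field is drawn every 1/60 s (NTSC), so a full frame
# (both fields) takes 2/60 s -> 30 frames per second.
FIELD_RATE = 60
frame_rate = FIELD_RATE / 2  # 30.0

# "Lace" the two fields together into one complete frame.
frame = [(odd_field | even_field)[n] for n in range(1, LINES + 1)]
```

Swapping `FIELD_RATE` to 50 gives the PAL numbers (25 fps) the same way.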
> you can't notice the flicker of a 60hz monitor looking head on

Actually, I can easily see flicker at 60 Hz on my monitor looking straight ahead.
> Current LCDs don't have higher than 30-40hz refresh rates, but they're progressive. So things may look blurry, but they won't be choppy

You are mixing up your terms. In video display, "progressive" refers to how the video is displayed: all of the lines in a frame are drawn in one pass from top to bottom before the next frame appears. This is how all modern CRT monitors and DTV display video, as opposed to analog broadcast television, which is displayed as I describe above. LCDs in digital mode, however, don't actually "refresh" in the traditional sense; instead, each pixel changes only when a different instruction is sent, and otherwise just stays the same.
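The progressive-vs-interlaced distinction boils down to the order lines are painted in one pass. A tiny sketch (the 6-line display is assumed for illustration):

```python
# Sketch: the order scan lines are painted during one refresh pass.
# A 6-line display is assumed purely for illustration.

LINES = 6

# Progressive: every line, top to bottom, in a single pass.
progressive_order = list(range(1, LINES + 1))

# Interlaced: the odd lines in one field, then the even lines in the next.
interlaced_order = list(range(1, LINES + 1, 2)) + list(range(2, LINES + 1, 2))

# Same set of lines either way -- only the drawing order differs.
assert sorted(interlaced_order) == progressive_order
```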
> Try this: find an older video card (or underclock what you currently have) that will only run 1600x1200 at 40fps without any special features. Chances are it'll look smoother because you don't have anisotropic filtering enabled

I'm sorry, but 40 fps is 40 fps whether you have AF enabled or not.
> you can't notice the flicker of a 60hz monitor looking head on

Geez, I can see flicker at 75 Hz! I use 1280x1024 as my gaming resolution; my monitor can support an 85 Hz refresh. I'd prefer higher, but oh well.
> Current LCDs don't have higher than 30-40hz refresh rates

LCDs don't even have refresh rates, from what I've read. The issue with LCDs is pixel response time: how fast the pixels can change their colors or brightnesses. Otherwise you get the streaking or ghosting of older LCDs.
> Your rates drop from 150 down to 40. Naturally, it looks like crap. Why? Several reasons

If the lowest is 40, it should still be smooth. That's my experience, anyway. At 30 fps or lower, then it can get bad.
> Furthermore, eye sensitivity is different throughout (you can't notice the flicker of a 60hz monitor looking head on, but it's quite obvious when gazing sideways)

I can notice flickering at 75 Hz head on, much less 60 Hz. I take your point, though: eyes differ in sensitivity depending on how you look at things.
> A lot of people despise vertical sync, as this caps your maximum frame rate to that of your monitor's refresh rate (assuming we're talking CRT).

Vsync does much more than this.
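One example of the "much more": with plain double buffering, a frame that misses a refresh has to wait for the next one, so vsync doesn't just cap the frame rate, it quantizes it to integer divisors of the refresh rate. A rough sketch (the 60 Hz refresh and the example frame rates are assumptions, and real drivers with triple buffering behave differently):

```python
import math

def vsync_fps(raw_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate with vsync + double buffering:
    each finished frame must wait for the next refresh boundary."""
    refresh_interval = 1.0 / refresh_hz
    frame_time = 1.0 / raw_fps
    # A frame taking frame_time seconds occupies this many refresh
    # intervals; the small epsilon keeps exact multiples from rounding up.
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / intervals

# A card that could render 50 fps gets held to 30 fps at 60 Hz,
# because each 20 ms frame spills into a second 16.7 ms refresh.
```

So at 60 Hz the only reachable rates are 60, 30, 20, 15, ... which is why frame rates can fall off a cliff with vsync on.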
> Chances are it'll look smoother because you don't have anisotropic filtering enabled.

I'm not sure I agree with you on this. It's possible that very low frame rates don't look as bad at low detail levels as they do at high detail levels, but I've never really tested this.
> More importantly, however, is adaptation. If you've played games at 40fps all your life and haven't seen otherwise, it'll look pretty good to you.

For most people, yes. However, some people aren't satisfied even if they've never seen anything better, like me when I first started 3D gaming. In those days everyone was happy with 25-30 fps, but I hated it.
> I bet if you play at 250fps (possible, but only with older games at low resolutions) for a couple weeks and then go down to 125, you will see a difference.

Agreed.
> If the lowest is 40, it should still be smooth.

40 fps is well below the required threshold for smoothness. Generally speaking, I start to notice significant slowdowns when the frame rate dips below 60 fps.
> Current LCDs don't have higher than 30-40hz refresh rates, but they're progressive. So things may look blurry, but they won't be choppy

OK, that was bad wording. But don't they have 30-40 Hz pixel refresh rates?
> so why do action stills look blurry on a TV set then?

Basically, because the action in the scene moves during the exposure while filming. When you play the frames together, the motion looks smooth because the blurred action "blends" together.
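A rough numeric sketch of that "blending": treat one filmed frame as the average of several sub-positions the object occupied during the exposure. All the numbers here (10-pixel strip, 4 subsamples, one pixel of motion per subsample) are invented for illustration:

```python
# Sketch: motion blur as averaging sub-exposures along a 1-D strip of pixels.
# The recorded brightness at each pixel is the fraction of the exposure
# time the moving object spent there. All parameters are illustrative.

WIDTH = 10       # hypothetical strip of pixels
SUBSAMPLES = 4   # positions sampled during one frame's exposure

def exposed_frame(start: int) -> list[float]:
    """One blurred frame: the object moves one pixel per subsample."""
    frame = [0.0] * WIDTH
    for t in range(SUBSAMPLES):
        frame[(start + t) % WIDTH] += 1.0 / SUBSAMPLES
    return frame

frame = exposed_frame(0)
# The still shows a smear: 4 pixels at 25% brightness each, instead of a
# sharp object at one pixel. Played in sequence, successive smears overlap
# and the eye reads the motion as smooth.
```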