Originally posted by: Cerb
I usually find VSync off to be more bothersome at low framerates than high ones. When you're getting over 85 FPS, who cares? The differences between frames are minimal. But when you're getting 40, that half-light, half-dark frame burns into your mind and makes it seem more like a slide show. Very similar to using a strobe light.
*** Really? I never saw a half-light, half-dark frame even when I was using a Cyrix CPU with a Monster3D (Voodoo) card, and I was getting like 15 fps normally, and one frame every 3-10 seconds during tough scenes. That's probably because I can't see anything that takes less than 1/60th of a second, like a normal person. (But there was definitely a slide-show effect!) Vertical sync did nothing either way, and never has, as many times as I've tried it with different video cards. I got the old Monster3D out of the junk box and tried it for kicks. Same. I think what you are seeing must be an artifact introduced by bad game design.
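For reference, the "half-light, half-dark frame" described above is what's usually called tearing: with VSync off, the buffer swap can land while the monitor is partway through scanning out the screen, so the top of the visible image comes from one rendered frame and the bottom from the next. A rough Python sketch of where such a seam would land, using assumed numbers for the refresh rate, line count, and frame rate:

# Illustrative sketch only: where does a mid-scanout buffer swap land?
# All numbers below are assumed for the example, not measurements.
REFRESH_HZ = 60          # monitor refresh rate
LINES = 1200             # visible scanlines per refresh
FPS = 40                 # frame rate with VSync off, as in the quote above

scanout_period = 1.0 / REFRESH_HZ   # time for the beam to draw one full screen
frame_period = 1.0 / FPS            # time between buffer swaps

for swap in range(1, 6):
    t = swap * frame_period
    # How far down the screen is the scanout when the swap happens?
    phase = (t % scanout_period) / scanout_period   # 0.0 = top, 1.0 = bottom
    tear_line = int(phase * LINES)
    print(f"swap {swap}: seam at scanline {tear_line} "
          f"({phase:.0%} of the way down the screen)")

With VSync on, the swap is held until the vertical blanking interval, so the seam never falls inside the visible picture; whether the seam is noticeable at a given frame rate is a separate, perceptual question.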
There is a setting in many video games to sync to the vertical refresh, but it is generally considered undesirable because it forces transfers to/from the video card to occur only during a small part of the refresh cycle, drastically restricting the effective AGP bandwidth.
No, it doesn't. It has nothing to do with AGP bandwidth. The work there is being done between the primary framebuffer and the VGA output.
*** I was under the impression it allowed transfer over the system bus only during the vertical sync interval. That drastically confines the transfer time and therefore the net bandwidth. I believe this predated accelerated 3D cards, for the purpose of eliminating the tearing that occurred when opening windows and such. Some of the old 2D-only cards evidently could not output proper video while handling transfers over the bus simultaneously, I guess, but I never had one like that. Whatever the technology, they always tell you to have it off for better speed.
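As an aside, the speed penalty usually attributed to sync-to-vblank isn't about bus bandwidth at all: with plain double buffering, a finished frame has to wait for the next vertical blank before it can be displayed, so the effective frame rate drops to the refresh rate divided by a whole number. A small sketch of that staircase effect, assuming a 60Hz refresh and simple double buffering (no triple buffering):

import math

REFRESH_HZ = 60                          # assumed monitor refresh rate
refresh_period_ms = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

# Hypothetical per-frame render times, in milliseconds.
for render_ms in (10, 17, 25, 40):
    # Double-buffered VSync: the finished frame waits for the next vertical
    # blank, so each frame occupies a whole number of refresh intervals.
    intervals = math.ceil(render_ms / refresh_period_ms)
    vsync_fps = REFRESH_HZ / intervals
    free_fps = 1000.0 / render_ms        # VSync off: swap immediately (may tear)
    print(f"render {render_ms:>2} ms -> VSync off ~{free_fps:5.1f} FPS, "
          f"VSync on {vsync_fps:4.1f} FPS")

That 60/30/20 FPS staircase, rather than any AGP transfer window, is the likely reason the tweak guides say to turn it off for speed.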
BTW, the higher the rates, the larger the bandwidth required to transfer a clean, unblurred signal. So the higher the resolution you set, and the higher the refresh rate, the worse the signal quality becomes.
Wrong. No CRT monitor has any issues with bandwidth. It's an analog device. The blurriness at higher resolutions has to do with the quality and design of the components used between the RAMDAC and the monitor's input, and sometimes in the monitor itself. It's LCDs using a DVI-D input where bandwidth becomes an issue.
*** CRTs have no issues with bandwidth because they are analog devices? All analog devices have a bandwidth. Any amplifier has a bandwidth. Any CRT tube has a bandwidth. Any real physical device, even a wire, has bandwidth. (They are going to fiber optics to increase bandwidth.) Bandwidth is always intentionally limited in analog (and digital) electronic circuits as much as possible, to (1) reduce noise and (2) reduce oscillation due to in-phase feedback at some frequency. Bandwidth must also be carefully limited in any A/D or D/A transfer to avoid aliasing. It looks like the computer guys have used bandwidth in a new sense and don't know what it had always meant before, just like they seem oblivious to the fact that mega still means million (10^6).
This is an iron law of physics. I don't know why "experts" decline to mention this. I rather doubt that any reasonably priced home monitor has the bandwidth to cleanly do 120Hz at 1600x1200. I don't care for blur, and I am immune to the supposed headaches which deranged loonies relentlessly claim are caused by low refresh rates, so I set my refresh rate to 60. 60Hz is about twice what a normal human being can see as flicker. (They can perceive "something" at higher rates, though.)
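As a rough sanity check on that 1600x1200 at 120Hz figure, the required pixel clock can be estimated from the visible pixel count per second plus an allowance for blanking. The sketch below assumes a ~30% blanking overhead, which is in the right ballpark for typical CRT timings but not an exact standard figure:

# Rough pixel-clock estimate for a CRT video mode. The 1.3 factor is an
# assumed allowance for horizontal/vertical blanking, not a measured value.
def pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.3):
    return width * height * refresh_hz * blanking_factor / 1e6

for hz in (60, 85, 120):
    print(f"1600x1200 @ {hz} Hz: ~{pixel_clock_mhz(1600, 1200, hz):.0f} MHz pixel clock")

Consumer RAMDACs of that era were typically rated in the 350-400 MHz range, so a ~300 MHz mode is within spec on paper; in practice the analog path (cable, connectors, and the monitor's own video amplifiers) is usually what softens the picture first, which is roughly where the two posts above meet.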
Tell that to people who have also had issues with older fluorescent lights that do 60Hz and flicker like mad.
*** I have some old fluorescent lights, the ones with a big iron ballast inductor and a starter, which I like a lot. They don't flicker. Only psychologically suggestible individuals have been conned into believing they do. 60Hz is faster than the persistence of vision (actually 120Hz, since the tubes conduct in both polarities). The phosphors also have a persistence which cuts into possible flicker. Sure, when the tubes start to go bad, they do flicker.
The frame rate of theater movies is only 24, as I recall. It has not been a technological problem to do higher rates for quite a while; when it has been tried, movie viewers dislike it "because it looks like TV." The initial frame rate of movies in the silent era, I believe, was 16, because that was well into the perception of continuous motion. They moved it to 24 to eliminate the perception of flicker.
FPS != refresh rate. There is no "flicker" in movies; even at 1 FPS, flicker is not involved.
*** I don't know what you are trying to say. Silent-era movies definitely flickered. That is where the nickname that lives to this day originated: flickers, or flicks. If you have a light go on and off faster and faster, there is a speed at which it appears steady. Before that, you see flicker. That's what flicker is.
On framerate, READ, PEOPLE!
This has gone on many a time. Watch some MTV. Watch some racing.
Watch an action movie from the 80s.
Difference? Shutter speed.
How do you get the same effect in games? Lots of frames per second, so that your eye perceives motion blur.
*** I'm not sure what you are saying. Motion blur is something desirable that they introduce into digitally generated frames to make them look more like real life. It's not flicker, or the lack of it. Basically, without motion blur you can more easily see that the video is somehow unlike the appearance of the real thing, probably because it is sharper than any real moving object could appear to the human eye.
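To illustrate the point about motion blur in generated frames: one common way it is approximated is to accumulate several sub-frame samples over the interval a single displayed frame covers and average them, so a fast-moving object smears across the pixels it crossed, much as it would during a camera's open shutter. A toy Python sketch with a 1-D "scene" (all the numbers here are assumed):

import numpy as np

WIDTH = 40          # width of a toy 1-D "screen" (assumed)
SUBSAMPLES = 8      # temporal samples accumulated per displayed frame (assumed)

def render(position):
    """Render a bright 3-pixel-wide object at the given position (no blur)."""
    frame = np.zeros(WIDTH)
    start = int(position)
    frame[start:start + 3] = 1.0
    return frame

def motion_blurred_frame(t0, t1, velocity):
    """Average several sub-frame renders spread across one frame's interval."""
    times = np.linspace(t0, t1, SUBSAMPLES, endpoint=False)
    return np.mean([render(t * velocity) for t in times], axis=0)

# An object moving 300 px/s, drawn for one frame at 30 FPS: the blurred frame
# smears it across the ~10 pixels it crossed during that 1/30 s.
print(np.round(motion_blurred_frame(0.0, 1.0 / 30, velocity=300.0), 2))

At high enough frame rates the eye does some of this averaging on its own, which is presumably what the quoted post is getting at.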
*** And another thing: it is only relatively recently that the terminology "refresh rate" has been used for video, and I can't see any good reason for it. The standard terminology had always been frame rate. As usual, the computer people introduce confusion and then try to explain that everyone else is confused. One complete still picture is a frame; then you go to the next frame. The frame rate of a movie is 24. By extension, the frame rate of a TV picture is 30Hz. TV draws every second line (a field) in one vertical pass and then interlaces the rest on the second pass; the field rate is 60Hz. They've done this since black-and-white TV in 1950. (The color signal standard offset the frame rate slightly.) 60Hz fields put the flicker so far into imperceptibility that no one who grew up watching TV endlessly can believe that anyone (but a suggestible oddball) could get a headache from it. (That is not to say that people cannot tell the image is somehow not identical to a continuous one.) Maybe people are getting eye fatigue from focusing continuously on small things, like clerical workers do. Maybe they can't tolerate the glare of lights washing out the video on the display. Maybe fluorescent lights are strobing against the CRT rate at a perceptible beat frequency. Maybe the micro eye-scanning movements are producing a beat effect.
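For the parenthetical about the color standard: NTSC color moved the field rate from an even 60Hz down to 60000/1001 Hz, which is where the familiar 29.97 frames per second comes from. The arithmetic:

from fractions import Fraction

# NTSC color timing: the field rate was shifted from an even 60 Hz
# to 60000/1001 Hz when color was added.
field_rate = Fraction(60000, 1001)   # interlaced fields per second
frame_rate = field_rate / 2          # two fields make one complete frame

print(f"field rate: {float(field_rate):.4f} Hz")   # ~59.9401
print(f"frame rate: {float(frame_rate):.4f} Hz")   # ~29.9700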
*** I used to get a kick out of this as a kid watching TV: wave your finger fast in front of a bright CRT and you will see multiple finger shadows. You will see them about as well at a 120Hz refresh rate; they don't go away. Therefore I don't think this effect could be the source of the claimed refresh-rate headaches. I can believe that the fast-fade phosphors used in high-refresh-rate monitors accentuate the effect at low refresh rates.