Relation between refresh rate and fps?

ThatWasFat

Member
Dec 15, 2001
93
0
0
I've always been kind of confused about this.

Like... for instance, let's say that I have my desktop running at 1024x768 @ 85Hz. Then let's assume that I go play CS, and I play at 1024x768. But my FPS is at 100.0.

100.0 is higher than 85. What's the relationship here between fps and refresh rate? Why would I use vsync?

Also, another thought - My monitor has a problem where the screen kind of "scrunches up" from the left and right sides. It only does it when I'm in Windows. If I'm in the BIOS, or on the Windows loading screen, my screen looks fine. Then I get to my desktop, and the screen goes all wacky. THEN, if I get into a game, it MIGHT fix itself, depending on what resolution I play at. I can play CS at 800x600 and the screen won't scrunch up. But if I go higher, who knows.

Why 800x600? I would get really high FPS at 800x600.

Blah =/
 

Avalon

Diamond Member
Jul 16, 2001
7,569
172
106
You're getting 100 FPS in CS despite an 85Hz refresh rate because vsync is off. Vertical syncing... I believe it's when the video card only draws the game as fast as your monitor can display it, or something like that. I don't quite know the technical definition. You don't need to use vsync unless you're experiencing texture warping or anything else really funny during games.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Your monitor only replaces the image 85 times a second.
The CPU and video card are still drawing frames the whole time, though.
What ends up happening, though you apparently haven't noticed it, is tearing: the monitor gets part of one frame and part of another, since the scanout isn't synced with the framebuffer updates. In firefights this often shows itself: half the screen will appear dark and the other half light, just for an instant.
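If it helps to picture it, here's a rough toy model (just my own illustration, nothing to do with how any real driver works) of how one 85Hz refresh ends up stitched from two frames when the card is finishing 100 a second:

```python
# Toy model: with vsync off, scanout reads whatever frame the renderer
# has most recently finished, so a single refresh can contain pieces of
# two rendered frames, with a visible "tear" where they meet.
REFRESH_HZ = 85     # monitor refresh rate
RENDER_FPS = 100    # frames the card finishes per second (assumed constant)
LINES = 768         # visible scanlines at 1024x768

refresh_period = 1.0 / REFRESH_HZ
line_time = refresh_period / LINES

def refresh_contents(n):
    """Which rendered frames appear in refresh n, and where the tear lands."""
    start = n * refresh_period
    top = int(start * RENDER_FPS)                        # newest frame when scanout starts
    bottom = int((start + refresh_period) * RENDER_FPS)  # newest frame when scanout ends
    if top == bottom:
        return top, bottom, None                         # whole refresh from one frame
    tear_time = (top + 1) / RENDER_FPS                   # moment the next frame completes
    tear_line = int((tear_time - start) / line_time)
    return top, bottom, tear_line

for n in range(4):
    print(f"refresh {n}: (top frame, bottom frame, tear line) = {refresh_contents(n)}")
```

With vsync on, the swap would be held until the start of the next refresh, so the top and bottom of the screen would always come from the same frame.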
 

DoctaZ

Member
May 28, 2004
35
0
0
You should only get tearing if your monitor couldn't process 100 fps.

If you simply have it set at 85Hz but it could hit 100Hz at that resolution, and you have vsync off, you should not get tearing or any distortion.

I love my monitor :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Regardless of whether one's monitor "couldn't process 100 fps" or not, tearing will occur unless the framerate is synced with the refresh rate.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: DoctaZ
You should only get tearing if your monitor couldn't process 100 fps.

If you simply have it set at 85Hz but it could hit 100Hz at that resolution, and you have vsync off, you should not get tearing or any distortion.

I love my monitor :)
Er, no. If the monitor is set at 85Hz, you only get 85 frames each second. But if you are rendering 100 frames per second, on average you're getting about 1/6th of one frame and 5/6 of another on each refresh.
If you have Vsync on, you will get no more than your refresh rate, but experience no tearing. If Vsync is off, you will get tearing, even if your brain tells you it's just more motion.
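Putting rough numbers on it (just the arithmetic, assuming a steady 100 fps being rendered against an 85Hz scanout):

```python
# Rough arithmetic behind the "1/6th of one frame, 5/6 of another" estimate.
REFRESH_HZ = 85
RENDER_FPS = 100

frames_per_refresh = RENDER_FPS / REFRESH_HZ   # ~1.18 rendered frames land in each refresh
overshoot = frames_per_refresh - 1             # ~0.18, i.e. roughly 1/6 of a frame
vsync_cap = min(RENDER_FPS, REFRESH_HZ)        # with vsync on, never above the refresh rate

print(f"{frames_per_refresh:.3f} frames per refresh, overshoot {overshoot:.3f}")
print(f"with vsync on: at most {vsync_cap} fps, but no tearing")
```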
 

KF

Golden Member
Dec 3, 1999
1,371
0
0
>My monitor has a problem where the screen kind of "scrunches up" from the left and right sides.
This should not be a problem. All the monitors I've had for many years (and they are not high-end ones) keep track of several resolution/refresh-rate combinations, at least 4, before they drop the earlier ones. When the monitor's CPU figures out the rate, it automatically switches to the screen-size adjustments you have set for that mode. I thought they were all like that. If you don't adjust the size, it will not have a correct setting to switch to, of course. So I set the boot-up size while looking at the BIOS, the Windows size while Windows is up, etc. There are some cards that keep track of this on board (Matrox?) and you use a utility to set them.

One person I know is of the opinion that adjusting anything is a mistake, because the one it comes set to "is the best." Therefore his screen is scrunched.

Why the difference in size? The scan of a line or a frame takes a different amount of time depending on the rates. Between the sync pulses, the non-blanked (visible) video section is a different proportion of the total time. The net effect is that the size on screen differs depending on all the rates.
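As a rough comparison (the timing totals below are approximate VESA-style numbers quoted from memory, so treat them as illustrative only), the visible fraction of each scanline differs between modes:

```python
# Active pixels vs. total pixels per scanline for two modes.
# Approximate VESA-style totals, quoted from memory -- illustrative only.
modes = {
    "800x600@85":  {"h_active": 800,  "h_total": 1048},
    "1024x768@85": {"h_active": 1024, "h_total": 1376},
}

for name, m in modes.items():
    visible = m["h_active"] / m["h_total"]  # fraction of the line time that is visible video
    print(f"{name}: {visible:.1%} of each line is visible video")
```

Different fractions of the line are live video, so unless the monitor stores a size adjustment per mode, the picture width changes when you switch modes.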

I've never seen the explanation of how the video card frame rate is accommodated to the video output rate. I assume the video output logic just reads (with perhaps one line buffered) what is in the video card frame buffer at the video output rate. The GPU concurrently does its work totally independent of that. In that case, at some spots you will see the top part of one frame stitched to the bottom part of the next, if the card frame rate is faster and your eyeballs are fast enough. I know mine aren't. Perhaps in the old days some video cards put out glitches at these spots, but I have never seen it. I don't think any decent modern video card has a problem with this.

There is a setting in many video games to sync to the vertical frame rate but it is generally considered undesirable because it forces transfers to/from the video card to only occur during a small part of the time, drastically restricting the effective AGP bandwidth.

BTW, the higher the rates, the larger the bandwidth required to transfer a clean, unblurred signal. So the higher the resolution you set, and the higher the refresh rate, the worse the signal quality becomes. This is an iron law of physics. I don't know why "experts" decline to mention this. I rather doubt that any reasonably priced home monitor has the bandwidth to cleanly do 120Hz at 1600x1200. I don't care for blur, and I am immune to the supposed headaches which deranged loonies relentlessly claim are caused by low refresh rates, so I set my refresh rate to 60. 60 is about twice the rate at which a normal human being can see flicker. (They can perceive "something" at higher rates, though.) The frame rate of theater movies is only 24, as I recall. It has not been a technological problem to do higher rates for quite a while. When it has been tried, movie viewers dislike it "because it looks like TV." The initial frame rate of movies (in the silent era) was 16, I believe, because that was well into the perception of continuous motion. They moved it to 24 to eliminate the perception of flicker.
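Back-of-the-envelope, the pixel rate for 1600x1200 at 120Hz works out like this (the ~30% blanking overhead is a rough guess on my part, not a measured figure):

```python
# Rough pixel-clock estimate for 1600x1200 @ 120Hz.
H, V, HZ = 1600, 1200, 120
BLANKING_OVERHEAD = 1.30   # assume ~30% of the total timing is blanking (rough guess)

active_rate = H * V * HZ                    # visible pixels per second
pixel_clock = active_rate * BLANKING_OVERHEAD
print(f"active pixel rate: {active_rate / 1e6:.1f} Mpixel/s")
print(f"approx. pixel clock: {pixel_clock / 1e6:.0f} MHz")
```

That comes out around 300 MHz, which is the kind of figure I'm talking about when I say bandwidth.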
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: KF
I've never seen the explanation of how the video card frame rate is accomodated to the video output rate. I assume the video output logic just reads (with perhaps one line buffered) what is in the video card frame buffer at the video output rate. The GPU concurrently does its work totally independant of that. In that case, at some spots you will see the top part of one frame stitched to the bottom part of the next, if the card frame rate is faster, if your eyeballs are fast enough. I know mine aren't. Perhaps in the old days some video cards put out glitches at these spots, but I have never seen it. I don't think any decent modern video card has a problem with this
I usually find VSync off to be more bothersome at low framerates than high ones. When you're getting over 85 FPS, who cares? The differences between frames are minimal. But when you're getting 40, that half-light, half-dark frame burns into your mind and makes it seem more like a slide show. Very similar to using a strobe light.
There is a setting in many video games to sync to the vertical frame rate but it is generally considered undesirable because it forces transfers to/from the video card to only occur during a small part of the time, drastically restricting the effective AGP bandwidth.
No, it doesn't. It has nothing to do with AGP bandwidth. The work there is being done between the primary framebuffer and the VGA output.
BTW, the higher the rates, the larger the bandwidth required to transfer a clean, unblurred signal. So the higher the resolution you set, and the higher the refresh rate, the worse the signal quality becomes.
Wrong. No CRT monitor has any issues with bandwidth. It's an analog device. The blurriness at higher resolutions has to do with the quality and design of the components used between the RAMDAC and the monitor's input, and sometimes in the monitor itself. It's LCDs using a DVI-D input where bandwidth becomes an issue.
This is an iron law of physics. I don't know why "experts" decline to mention this. I rather doubt that any reasonably priced home monitor has the bandwidth to cleanly do 120Hz at 1600x1200. I don't care for blur, and I am immune to the supposed headaches which deranged loonies relentlessly claim are caused by low refresh rates, so I set my refresh rate to 60. 60 is about twice the rate at which a normal human being can see flicker. (They can perceive "something" at higher rates, though.)
Tell that to people who have also had issues with older fluorescent lights that do 60Hz and flicker like mad.
The frame rate of theater movies is only 24, as I recall. It has not been a technological problem to do higher rates for quite a while. When it has been tried, movie viewers dislike it "because it looks like TV." The initial frame rate of movies (in the silent era) was 16, I believe, because that was well into the perception of continuous motion. They moved it to 24 to eliminate the perception of flicker.
FPS != refresh rate. There is no "flicker" in movies; even at 1 FPS, flicker is not involved.
On framerate, READ, PEOPLE!
This has gone on many a time. Watch some MTV. Watch some racing.
Watch an action movie from the 80s.
Difference? Shutter speed.
How do you get the same effect in games? Lots of frames per second, so that your eye perceives motion blur.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
100.0 is higher than 85. What's the relationship here between fps and refresh rate?
None, unless vsync is on, in which case it will never go above 85.

Also, another thought - My monitor has a problem where the screen kind of "scrunches up" from the left and right sides.
Use your monitor's controls to fix it.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
The relationship between the monitor refresh rate and the game fps is that they are the same kind of thing - they are both frames per second. The difference is that a refresh fades from the screen over time because it is not static, while a rendered game frame stays static until the next one replaces it.

In order to experience 100fps in a game, you would have to have your monitor refresh at 100Hz.

Vsync only displays a new game frame when the monitor starts a new refresh. This way, they change at the same time.

With Vsync off, the graphics card will draw frames regardless of the monitor's refresh rate. So the monitor can potentially show half of one frame and half of the previous one, which is kinda annoying.

One thing about vsync: although it looks best and is easier on the eyes, it does eat some framerate. With an 85Hz refresh rate and vsync on, you will only be able to get 85fps, 42.5fps, 28.3fps, 21.25fps and so on; you find these numbers by dividing the refresh rate by 1, then 2, then 3, then 4. This happens based on how many refreshes go by while the video card is drawing a frame. If the video card is still drawing a frame when a refresh comes around, it has to wait and present on the second refresh, if not then on the 3rd, if not then on the 4th, each time cutting your frame rate.
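Here's a quick sketch of that stepping, under the simplifying assumption of double buffering and a constant render time per frame:

```python
import math

REFRESH_HZ = 85

def vsync_fps(raw_fps):
    """Effective framerate with vsync on and double buffering (simplified model)."""
    refresh_period = 1.0 / REFRESH_HZ
    render_time = 1.0 / raw_fps
    # a finished frame can only be shown at the next vertical retrace,
    # so each frame occupies a whole number of refresh periods
    refreshes_per_frame = math.ceil(render_time / refresh_period)
    return REFRESH_HZ / refreshes_per_frame

for raw in (200, 85, 80, 50, 30, 22):
    print(f"{raw:>3} fps without vsync -> {vsync_fps(raw):.1f} fps with vsync")
```

Triple buffering, where a game or driver offers it, avoids most of this stepping, since the card can keep rendering into a spare buffer instead of waiting.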
 

KF

Golden Member
Dec 3, 1999
1,371
0
0
Well, I knew someone would get irritated if I rubbed it in concerning my pet peeve about mythological flicker, but I did it anyway. No one likes getting their favorite "old wives' tale" debunked.

Originally posted by: Cerb
I usually find VSync off to be more bothersome at low framerates than high ones. When you're getting over 85 FPS, who cares? The differences between frames are minimal. But when you're getting 40, that half-light, half-dark frame burns into your mind and makes it seem more like a slide show. Very similar to using a strobe light.

*** Really? I never saw a half-light, half-dark frame even when I was using a Cyrix CPU with a Monster3D (Voodoo) card, and I was getting like 15 fps normally, and one frame every 3-10 seconds during tough scenes. That's probably because I can't see anything that takes less than 1/60th of a second, like a normal person. (But there was definitely a slide-show effect!) Vertical sync did nothing either way, and never has, as many times as I've tried it with different video cards. I got the old Monster3D out of the junk box and tried it for kicks. Same. I think what you are seeing must be an artifact introduced by bad game design.

There is a setting in many video games to sync to the vertical frame rate but it is generally considered undesirable because it forces transfers to/from the video card to only occur during a small part of the time, drastically restricting the effective AGP bandwidth.
No, it doesn't. It has nothing to do with AGP bandwidth. The work there is being done between the primary framebuffer and the VGA output.

*** I was under the impression it allowed transfers over the system bus only during the vertical sync interval. That drastically confines the transfer time and therefore the net bandwidth. I believe this predated accelerated 3D cards, for the purpose of eliminating tearing that occurred when opening windows and such. Some of the old 2D-only cards evidently could not output proper video while handling transfers over the bus simultaneously, I guess, but I never had one like that. Whatever the technology, they always tell you to have it off for better speed.

BTW, the higher the rates, the larger the bandwidth required to transfer a clean, unblurred signal. So the higher the resolution you set, and the higher the refresh rate, the worse the signal quality becomes.
Wrong. No CRT monitor has any issues with bandwidth. It's an analog device. The blurriness at higher resolutions has to do with the quality and design of the components used between the RAMDAC and the monitor's input, and sometimes in the monitor itself. It's LCDs using a DVI-D input where bandwidth becomes an issue.

*** CRTs have no issues with bandwidth because they are analog devices? All analog devices have a bandwidth. Any amplifier has a bandwidth. Any CRT tube has a bandwidth. Any real physical device, even a wire, has bandwidth. (They are going to fiber optics to increase bandwidth.) Bandwidth is always intentionally limited in analog (and digital) electronic circuits as much as possible to 1) reduce noise 2) reduce oscillation due to in-phase feedback at some frequency. Bandwidth must also be carefully limited in any A/D or D/A transfer to avoid aliasing. It looks like the computer guys have used bandwidth in a new sense, and don't know what it had always meant before; just like they seem oblivious to the fact mega still means million (10 to the 6th).

This is an iron law of physics. I don't know why "experts" decline to mention this. I rather doubt that any reasonably priced home monitor has the bandwidth to cleanly do 120Hz at 1600x1200. I don't care for blur, and I am immune to the supposed headaches which deranged loonies relentlessly claim are caused by low refresh rates, so I set my refresh rate to 60. 60 is about twice the rate at which a normal human being can see flicker. (They can perceive "something" at higher rates, though.)
Tell that to people who have also had issues with older fluorescent lights that do 60Hz and flicker like mad.

*** I have some old fluorescent lights, the ones with a big iron ballast inductor and a starter, which I like a lot. They don't flicker. Only psychologically suggestible individuals have been conned into believing they do. 60Hz is faster than the persistence of vision. (Actually 120Hz, since the tubes conduct in both polarities.) The phosphors also have a persistence which cuts into possible flicker. Sure, when the tubes start to go bad, they do flicker.

The frame rate of theater movies is only 24, as I recall. It has not been a technological problem to do higher rates for quite a while. When it has been tried, movie viewers dislike it "because it looks like TV." The initial frame rate of movies (in the silent era) was 16, I believe, because that was well into the perception of continuous motion. They moved it to 24 to eliminate the perception of flicker.
FPS != refresh rate. There is no "flicker" in movies; even at 1 FPS, flicker is not involved.

*** I don't know what you are trying to say. Silent-era movies definitely flickered. That is where the nickname that lives to this day originated: flickers, or flicks. If you have a light go on and off faster and faster, there is a speed at which it appears steady. Below that you see flicker. That's what flicker is.


On framerate, READ, PEOPLE!
This has gone on many a time. Watch some MTV. Watch some racing.
Watch an action movie from the 80s.
Difference? Shutter speed.
How do you get the same effect in games? Lots of frames per second, so that your eye perceives motion blur.

*** I'm not sure what you are saying. Motion blur is something desirable they introduce into digitally generated frames to make them look more like real life. It's not flicker, or the lack of it. Basically, without motion blur you can see more easily that the video is somehow unlike the appearance of the real thing, probably because it is sharper than any real moving object could appear to the human eye.

*** And another thing: it is relatively recently that the terminology "refresh rate" has been used for video, and I can't see any good reason for it. The standard terminology had always been frame rate. As usual, the computer people introduce confusion, and then try to explain that everyone else is confused. One complete still picture is a frame. Then you go to the next frame. The frame rate of a movie is 24. By extension, the frame rate of a TV picture is 30Hz. TV does every second line (a field) in one vertical pass and then interlaces the rest on the second pass. The field rate is 60Hz. They've done this since black-and-white TV in 1950. (The color signal standard offset the frame rate slightly.) 60Hz fields put the flicker so far into imperceptibility that no one growing up watching TV endlessly can believe that anyone (but a suggestible oddball) could get a headache from it. (That is not to say that people cannot tell the image is somehow not identical to a continuous one.) Maybe people are getting eye fatigue from focusing continuously on small things, like clerical workers do. Maybe they can't tolerate the glare of lights washing out the video on the display. Maybe fluorescent lights are strobing against the CRT rate at a perceptible beat frequency. Maybe the micro eye-scanning movements are producing a beat effect.

*** I used to get a kick out of this as a kid watching TV: wave your finger fast in front of a bright CRT. You will see multiple finger shadows. You will see them about as well at a 120Hz refresh rate. They don't go away. Therefore I don't think this effect could be the source of claimed refresh-rate headaches. I can believe that the fast-fade phosphors used in high-refresh-rate monitors accentuate the effect at low refresh rates.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
*** Really? I never saw a half-light, half-dark frame even when I was using a Cyrix CPU with a Monster3D (Voodoo) card, and I was getting like 15 fps normally, and one frame every 3-10 seconds during tough scenes. That's probably because I can't see anything that takes less than 1/60th of a second, like a normal person. (But there was definitely a slide-show effect!) Vertical sync did nothing either way, and never has, as many times as I've tried it with different video cards. I got the old Monster3D out of the junk box and tried it for kicks. Same. I think what you are seeing must be an artifact introduced by bad game design.
I've seen half-light, half-dark frames: with vsync off, go play Unreal somewhere a light is flickering in-game and you will see half the screen dark and half of it light. Anyway, you can notice more than 85fps. Tests I have done show me that 85fps is very comfortable to view and, in terms of viewing, is the smoothest; however, if you increase the frames per second beyond a 1:1 ratio with the monitor's refresh rate, you will not visually notice a difference, but you will notice that the input latency is lower. Your mouse moves faster and feels smoother, meaning you can tell the extra frames are there.
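The latency part is easy to put rough numbers on; this is only the frame-time floor, ignoring the mouse, the game loop, the monitor and everything else in the chain:

```python
# Frame time alone sets a rough floor on how stale the displayed image can be.
for fps in (85, 100, 150, 300):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> the frame being scanned out is at most ~{frame_time_ms:.1f} ms old")
```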

*** CRTs have no issues with bandwidth because they are analog devices? All analog devices have a bandwidth. Any amplifier has a bandwidth. Any CRT tube has a bandwidth. Any real physical device, even a wire, has bandwidth. (They are going to fiber optics to increase bandwidth.) Bandwidth is always intentionally limited in analog (and digital) electronic circuits as much as possible to 1) reduce noise 2) reduce oscillation due to in-phase feedback at some frequency. Bandwidth must also be carefully limited in any A/D or D/A transfer to avoid aliasing. It looks like the computer guys have used bandwidth in a new sense, and don't know what it had always meant before; just like they seem oblivious to the fact mega still means million (10 to the 6th).
Good answer.

*** I don't know what you are trying to say. Silent era movies definately flickered. That is where the nickname that lives to this day originated. Flickers, or flicks. If you have a light go on and off faster and faster, there is a speed at which it appears steady. Before that you see flicker. That's what flicker is.
Nice info about where "flicks" came from.

so that your eye perceives motion blur.
Motion blur has to do with depth and focus.
 

somekid617

Member
Mar 27, 2004
143
0
0
My LCD won't go past 75Hz, or I am warned the hardware will break / the monitor will be unusable.... I get a lot of motion blur, like playing in vaseline, and jumpy graphics. For now I have a 5200 PCI card, and I'm getting an X800 XT... should I turn on vsync with the X800 XT?
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
My LCD won't go past 75Hz, or I am warned the hardware will break / the monitor will be unusable.... I get a lot of motion blur and jumpy graphics. For now I have a 5200 PCI card, and I'm getting an X800 XT... should I turn on vsync with the X800 XT?
If you don't use the native resolution and refresh rate, you won't get the crispest picture.

I like vsync.
 

somekid617

Member
Mar 27, 2004
143
0
0
it's prolly because my LCD is one of the first HP models produced, about 4 years old, maybe 5.... SUPER BLURRY GAMING, and it was 1 grand..... my parents wanted it though for some reason...
 

Ages120

Senior member
May 28, 2004
218
0
0
I like vsync when it gets rid of texture tearing. If there's no tearing, there's no sense in holding my reflexes back.