Updated Theory of Smooth Frame Rate

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
60Hz@60fps - a refresh happens every 16.7ms, as does a new frame. If you notice choppiness in one area, then there is no doubt choppiness in the other area, because both of them cycle every 16.7ms. The reason that we see the choppiness in the refresh rate and not in the framerate is contrast. If, after a frame was drawn, everything turned black until another frame came up, you would notice it too. But it's not like that: it just draws the frame on top of another frame with very similar light, so you do not see flickering. On the refresh rate side, though, there is that contrast, and you see a flicker. In my opinion something should always be tested at extremes so that when you go to less extreme cases it will work properly, and this is that. The extreme is the high contrast (flicker) versus something less extreme such as very low contrast. So given a refresh rate where there is little flicker, such as 85Hz, we can conclude that we need 85fps. Meaning that both will cycle every 11.8ms, and therefore 85fps is just as smooth as 85Hz. 85fps@85Hz is my recommendation for the optimal gaming experience.
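[A quick sketch of the interval arithmetic above, in Python purely for illustration; `interval_ms` is a made-up helper, not anything from a real API:]

```python
def interval_ms(rate_hz: float) -> float:
    """Time between successive refreshes (or frames) at a given rate."""
    return 1000.0 / rate_hz

# 60 Hz / 60 fps: both cycle every ~16.7 ms
print(round(interval_ms(60), 1))  # 16.7
# 85 Hz / 85 fps: both cycle every ~11.8 ms
print(round(interval_ms(85), 1))  # 11.8
```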
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Thank you professor.

Oh wait. This is your second thread about the exact topic, and you posted this exact paragraph in both.

Your thoughts are not organized; it's hard to tell what you are trying to explain. Try using paragraphs.

You fail to consider the effects of VSYNC, and/or what happens if the framerate drops below 85fps with VSYNC on. If you're not sync'd, then I'm not sure I get the point.

Actually, not to be rude, but this was the most annoying paragraph I've had the displeasure to read and I want my 30 seconds back.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Oh my god!!! Give it up already!

You know what's smooth? A framerate that is CONSTANT. Choppy is (large) fluctuation in fps.
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
LOL. Refresh rate and frame rate are separate things. 150fps@60Hz is smoother than 60fps@60Hz. You should spend more time playing the games instead of coming up with a flawed theory.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
VIAN, why not stick to the same thread so people who reach it later on by searching can follow the development of the conversation?
 

AgaBoogaBoo

Lifer
Feb 16, 2003
26,108
5
81
Originally posted by: Viper96720
LOL. Refresh rate and frame rate are separate things. 150fps@60Hz is smoother than 60fps@60Hz. You should spend more time playing the games instead of coming up with a flawed theory.

Are you sure about that? I think that is more opinion than fact, but I think you guys should pay a little more attention to what he is saying. I think he means that it's better to run at higher refresh rates with the same frame rate for a smoother display. I'm not really sure exactly, but VIAN, maybe you should do further research on this and conduct an experiment of some sort comparing them.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
OK Students (credit to merlocka), let me try to answer all questions.

merlocka - Yes, this is the second thread, and I have removed the message from the other one so people will only respond to this one. I wrote this in a hurry; I'm not handing in an English paper, and I would expect you to understand it. The only thing I find potentially confusing is the term contrast: it means the difference between one extreme and another, light and dark. Why would you not play with VSYNC on? Free added picture quality. Nice funny comment at the end there.

jiffylube1024 - A constant framerate of 5fps would look like crap to me. Choppy is not just large fluctuation; I've heard that called lagging, similar to what happens online. Many people, including me, think that 30fps is choppy, even while it is satisfactory for play, usually because we don't have money to upgrade. We deal with it, let's say.

Viper96720 - They are separate and work somewhat differently, but they are closely related. And why don't you use VSYNC? 150fps@60Hz wouldn't be possible with VSYNC enabled. I don't think the theory is totally flawed; I just need time and money to experiment with it. Before you do the actual physical work, it's good to have the math behind it.

Pete - I'm sorry for not sticking to the same thread. I wanted to catch more attention of the serious responders and I thought that this was the best way. Apparently I was mistaken. I apologize. Here is the link to the first thread.

AgaBooga - My intention was to prove that even 60fps is still choppy, and also to offer a new solution, which is 85fps. Now how could I come up with this number? The refresh rate is where I got it from: whichever refresh rate you deem non-flickering is the framerate that you would find smoothest. This is of course very difficult to find out on LCDs because of the backlight. So I proposed that 85Hz, which I think is the smoothest and at which I notice no flickering at all, should be the new standard for the smoothest gameplay. My target would have been 100Hz for everybody's satisfaction, but bandwidth limitations would degrade picture quality.

You were the one who understood the most. It's good to know I'm reaching someone. As far as experimenting, I'm trying to with my currently outdated TNT2, playing CS. I recently posted a new topic on how to change the refresh rate in CS and I hope to try their suggestions soon.
 

lazybum131

Senior member
Apr 4, 2003
231
0
76
If your video card can't pump out enough FPS, then it doesn't matter how high the refresh rate is, because it'll still be choppy; but your eyes won't get strained by flicker if the refresh rate is high enough for your eyes.

85Hz as a refresh rate is good enough for my eyes, heck I think my eyes handle 60Hz fine.

FPS you should try to get as high as you can, because that means that with newer and more hardware-stressing games the drop in FPS won't be as great; but of course the maximum that you can see on the screen is capped at the refresh rate.

30000 FPS shouldn't look any better at 85Hz than 85FPS at the same refresh rate. In fact it could look worse if the two are out of sync too much (tearing, is it?). If you notice an improvement with 85+ FPS at 85Hz, then there are probably other factors involved.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
BFG10K -

To be honest that's not a bad theory at all. It's not universal of course but it's a good starting point. If you have a minimum of 85 FPS in all situations I doubt too many people would complain of choppiness.

I myself have often pointed out the similarities between refresh rate and framerate and the fact that 60 Hz obviously flickers because our eyes can see the individual refresh cycles as they're not fast enough to look constant. And if it's proven that you can see flickering then it's simply not possible to then turn around and claim that 24 FPS is smooth to the eye since 60 Hz is essentially 60 "FPS".

- might better clarify what I'm trying to prove.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
You need to understand that you are never going to prove anything regarding this topic. You might as well try to prove which speaker sounds the best or which car handles the road the best. This ends up being an analog system, and each person's perception is different. There is no "unified theory of frame".

Secondly, you are stating that because 85Hz doesn't flicker, and because the "flicker" represents the worst case "contrast change" between frames, that 85fps is the lowest FPS that would be required to be "smooth".

If this is really your theory, there are so many assumptions here that it really isn't worth commenting on.

Such as...

1) VSYNC vs no VSYNC. Of course VSYNC looks better and performs well if your FPS > refresh, but when FPS < refresh there are framerate consequences due to the timing between the frame buffer and VSYNC, which can cause dramatic drops in displayed framerate.

2) Are you talking about average framerate, minimum, or instantaneous?

3) What about LCDs, which do not refresh the screen when they re-draw?


It has and always will boil down to personal preference. Some will state that they set their display to 640x480 to ensure hundreds of FPS because they can tell the difference. Some will turn on all the details to make the IQ great because they don't mind 40-60fps.

There is no right answer for everyone, the sure sign of this is that it is argued weekly on these forums.


 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
It sort of makes sense when you don't want to go above 85Hz: 85fps will be your maximum speed, and you need a kick-ass graphics card to support a constant fps. It really disappoints me that these new graphics cards play these new games so terribly. I don't understand your LCD question. There is no straight answer for everyone, but nobody told 3dfx that. Everybody always aims for what 3dfx set down, 60fps. That should be the minimum and 85 should be the maximum, but I'd rather it be constant.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Yeesh. I've been following the last couple of these threads, and the big problem here is that there are too many issues being discussed at once. I think if discussion were broken up like this, we'd all be a lot happier:

1) Refresh rate and LCDs. Since LCD screens don't redraw the same way that CRT monitors do, they don't have a "refresh rate" but rather a per-pixel "response time", which is how long it takes for a pixel to go from black to white and back (or vice versa). While LCD monitors have (generally) a worse "refresh rate" than CRTs (the best LCDs have a 16ms *average* response time, or just over 60Hz, and most have 25-35ms -- a bit over 30Hz), they *do not flicker at all*. Some people are very annoyed by CRT flicker at 60Hz, a problem which is compounded under fluorescent lighting (which also flickers at 60Hz). I notice 60Hz flicker if I look for it (look at the screen out of the corner of your eye and you'll see it) or if I work in front of one for a long time, though 75Hz and higher doesn't seem to bother me. Rather than flickering, low-speed LCD displays will blur whenever the image on the screen is changing rapidly (as when watching DVD movies or playing games). Tolerance for blur relative to flicker seems to be an individual preference.

2) VSYNC and/or high frame rates. VSYNC makes the graphics card only redraw the screen between refresh cycles. Without VSYNC, you can (and will) get situations where the monitor redraws half the screen, but then the video card changes to the next frame -- so you get part of one frame and part of another. This creates visible artifacts ('texture tearing' in particular) during fast motion, and I don't like it. Other people don't notice it as much, and AA can help to cover it up by making the transition less sharp. Whether or not VSYNC is on, *there is no advantage to having an fps count higher than your monitor's refresh rate/response time* (as long as the minimum never drops below the refresh rate, that is). 150fps@60Hz looks exactly like 100fps@60Hz and 60fps@60Hz (except that without VSYNC you may get even worse tearing due to multiple frame buffer redraws during each monitor refresh). Ideally, you'd have a system capable of running at or higher than your refresh rate at all times, and run with VSYNC enabled (which would lock you to running at, say, 75fps@75Hz).
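[The "extra frames are never seen" point reduces to one line of arithmetic; a Python sketch for illustration, where `visible_fps` is an invented helper rather than any real driver API, and it assumes the framerate never dips below the refresh rate:]

```python
def visible_fps(render_fps: float, refresh_hz: float) -> float:
    # The monitor shows at most one new frame per refresh cycle, so any
    # rendering beyond the refresh rate is simply never displayed.
    return min(render_fps, refresh_hz)

print(visible_fps(150, 60))  # 60 -- visually identical to...
print(visible_fps(60, 60))   # 60
```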

3) "Choppiness" / "Jitter" / "Lag". There are two different problems being addressed by at least three different names here. One is having a "low" frame rate (with low being a relative term) all the time. The other is having a low frame rate some of the time, or having one that bounces rapidly back and forth between, say, 10-20fps and 60fps (often seen in FPS games where there are extreme peaks and valleys in the number of polygons/textures/effects being rendered). I reserve the term "lag" for network transmission delays, so I would suggest that the first issue be called "choppiness" (or, alternatively, "your graphics card blows") and the second "jitter" or "stutter" (or "your processor/memory/HD blows"/"your graphics card only kind of blows"), just for some sort of cohesiveness in any further discussion. Some people don't mind a fairly low framerate as long as it's constant, while others just can't stand anything below a certain fps number.
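[One way to make the "choppiness" vs "jitter" split concrete: two frame-time sequences can report the same average fps while one of them stutters badly. A Python sketch with invented numbers, purely for illustration:]

```python
from statistics import mean, pstdev

# Frame times in milliseconds: both lists average 25 ms (i.e. 40 fps)...
steady  = [25.0] * 30         # constant 40 fps
jittery = [10.0, 40.0] * 15   # alternating 100 fps / 25 fps frames

assert mean(steady) == mean(jittery) == 25.0

# ...but the frame-to-frame variation is what you actually feel
print(pstdev(steady))   # 0.0
print(pstdev(jittery))  # 15.0
```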

4) How many fps are necessary? We've all heard the argument that you can't discern more than 24 FPS. It's crap. Movies and TV run at 24fps (for the most part -- cheap Saturday morning cartoons are usually down in the teens somewhere, for example), but they benefit greatly from the free "antialiasing" (ie, blur) provided by television screens. You ever watch TV (like a football game or something) on a computer monitor? It looks terrible. DVD movies look better because of the increased resolution and the fact that you're not usually getting the same sort of rapid full-screen motion that games give. I guarantee that if you had someone watch, side-by-side, something running (a game loop, for instance) at 20, 30, and 60 fps, they'd be able to tell them apart and would prefer the faster ones. It would be an interesting perceptual psychology experiment, for anyone in college... That said, most people seem to be fine with framerates in the 20s-30s, as long as they don't dip below that too much (which usually means a "resting"/average framerate up in the 50-60s at least).

I hope that helped. :)

 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Yeesh. I've been following the last couple of these threads, and the big problem here is that there are too many issues being discussed at once

You should have probably left out #4 then because it concerns framerate as it pertains to the display of video, which only confuses the discussion of framerate as it pertains to the output of a game engine. It is also full of inaccuracies.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
I could've been a little clearer there. I was trying to say that comparing "TV" FPS to "computer" FPS is not really possible, since televisions are much lower-quality than computer monitors. Fast-changing 24FPS video looks far worse on a CRT/LCD screen than on a television.

Beyond that, I'd like to hear about the inaccuracies... though I must say I have no hard evidence (other than my own experience) that people can differentiate between 20 and 60 fps on a monitor. But I'd put money on it. :)

 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I could've been a little clearer there. I was trying to say that comparing "TV" FPS to "computer" FPS is not really possible, since televisions are much lower-quality than computer monitors. Fast-changing 24FPS video looks far worse on a CRT/LCD screen than on a television.

Beyond that, I'd like to hear about the inaccuracies... though I must say I have no hard evidence (other than my own experience) that people can differentiate between 20 and 60 fps on a monitor. But I'd put money on it.
It's just that once you introduce apples (video display) into an oranges (graphics engine output) discussion of framerates, refresh, etc., it seriously convolutes the discussion. Movies and television aren't broadcast at 24FPS, for example. Film, however, is projected at 24FPS (most commonly today, anyway). Any deeper discussion of video display requires a separate thread IMHO, as it is a complex subject that is misunderstood by most people even though it's the medium they are most familiar with.
 

Jeff7

Lifer
Jan 4, 2001
41,596
20
81
Originally posted by: VIAN
BFG10K -

To be honest that's not a bad theory at all. It's not universal of course but it's a good starting point. If you have a minimum of 85 FPS in all situations I doubt too many people would complain of choppiness.

I myself have often pointed out the similarities between refresh rate and framerate and the fact that 60 Hz obviously flickers because our eyes can see the individual refresh cycles as they're not fast enough to look constant. And if it's proven that you can see flickering then it's simply not possible to then turn around and claim that 24 FPS is smooth to the eye since 60 Hz is essentially 60 "FPS".

- might better clarify what I'm trying to prove.

BFG10K hasn't posted in this thread. You're getting your own threads confused, VIAN!
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I really enjoyed reading your response Matthias99. Now here is mine.

1. LCDs do have a refresh rate. And a refresh rate is a refresh rate and nothing more. If a CRT were constantly backlit, we wouldn't see any flickering either. But it isn't, and this is why I used the CRT to prove that 60fps is choppy. Because 60fps is choppy, we must move up in refresh rate, assuming VSYNC is on, to get a better framerate. 85Hz is probably the best refresh rate, since it shows no flickering and any more would probably degrade image quality. So 85fps@85Hz. It can be done on LCDs too and will work the same way, because a refresh rate is still a refresh rate. If you own an LCD you'll also need to take the response time into consideration. I don't know too much about LCDs, so excuse my unfamiliarity with their limitations.

2. We assume that VSYNC is on for better picture quality.

3. The only problem with 85fps@85Hz is that many people's "graphics card blows" and they aren't able to reach that consistency. But it may not be their fault. I don't think that you should buy a graphics card for the future; buy one for now and deal with the future. All these people have been buying graphics cards before D3 and HL2 come out. Next year or this year when those games come out, who knows, the cards will no longer have the performance they once had and people will need to buy new ones. So, I recommend that you buy the graphics card that is released after the game you want comes out, for better performance. I say this because the 9800XT can barely get 50 on Halo, and they were both released at the same time. But then you have to wait, which sucks too. And there might be a lot of games that you want, so you're just gonna have to choose your most anticipated.

4. Theaters run at 24fps, which I notice and hate. TV runs at 30fps. The reason it looks better on TV than on your PC monitor is the high resolution of your monitor. TV runs at 30fps at 550 scan lines; at 1024x768, you have a higher resolution than that. DVDs look good to me when running in windowed mode; when I go full screen the resolution is stretched and it looks like crap.

I still need to do some testing. I will receive my temporary Voodoo5 5500 PCI and test this theory out on Counter-Strike. Also, I'll post some nice benchies.

And these are just theories, they aren't proven or perfect. Most of everything isn't proven.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
4. Theaters run at 24fps, which I notice and hate. TV runs at 30fps. The reason it looks better on TV than on your PC monitor is the high resolution of your monitor. TV runs at 30fps at 550 scan lines; at 1024x768, you have a higher resolution than that. DVDs look good to me when running in windowed mode; when I go full screen the resolution is stretched and it looks like crap.

Resolution has little to do with it, actually. People tend to sit much closer to their PC and notice how poor broadcast TV really is. Sit a foot away from your TV set... how does it look now? Interlaced broadcasts also need some serious processing in order not to introduce horrible artifacts when shown on a progressive-scan display like a computer monitor. How well your video card handles de-interlacing directly affects the quality of the displayed video. Choppy pans are also symptoms of de-interlacing video; same with scaling. If you have a card that is particularly good at scaling the source video, and a display that can properly show it, the difference can be dramatic. DVDs output from my HTPC look better at 1080i than at 480p native resolution.
 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
I just read in a Doom3 update that Doom3 will be locked at 60FPS max... LINK

"The framerate taps out at 60 frames per second - it won't ever run faster than that. We believe that everybody will be able to enjoy the experience at 30 frames per second."

That's straight from iD software so apparently they feel 30-60 FPS is smooth enough.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
"The framerate taps out at 60 frames per second - it won't ever run faster than that. We believe that everybody will be able to enjoy the experience at 30 frames per second."
I wonder how Anand got the FX5900 Ultra to run at 104 FPS... interesting.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I'll just say that it would be best to have 85fps@85Hz, but I'll settle for 30fps min.
 

Johnbear007

Diamond Member
Jul 1, 2002
4,570
0
0
Originally posted by: merlocka
Thank you professor.

Oh wait. This is your second thread about the exact topic, and you posted this exact paragraph in both.

Your thoughts are not organized, it's hard to tell what you are trying to explain. Try using paragraphs.

You fail to consider the effects of VSYNC, and/or what happens if the framerate drops below 85fps with VSYNC on. If you're not sync'd, then I'm not sure I get the point.

Actually, not to be rude, but this was the most annoying paragraph I've had the displeasure to read and I want my 30 seconds back.

I understood exactly what he was saying. Work on your reading comprehension.

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: VIAN
I really enjoyed reading your response Matthias99. Now here is mine.

1. LCD's do have a refresh rate. And a refresh rate is a refresh rate and nothing more. If a CRT was constantly back lit, we wouldn't see any flickering either. But it isn't and this is why I used the CRT to prove that 60fps is choppy. Because 60fps is choppy, we must move up steps in refresh rate, assuming VSYNC is on, to get a better framerate. 85Hz is probably the best refresh rate since it shows no flickering and any more would probably degrade image quality. So 85fps@85Hz. It can be done in LCD's too and will work the same way because a refresh rate is still a refresh rate. If you own an LCD you'll also need to take into consideration the response time. I don't know too much about LCD's so excuse me because of LCD's limitations.

Technically, yes, LCD monitors *do* have a hardware refresh rate, but it tends to be overshadowed by the incredibly poor response time. Essentially, refresh rate is how often the monitor *tries* to change what it's showing, and response time is how long it takes the monitor to actually switch the pixels. CRT monitors have a fairly low refresh rate but essentially zero response time, whereas LCD monitors have (usually) high refresh rates and a really long response time. And if a CRT was constantly backlit, it wouldn't flicker, but you might get tired of looking at a solid white display after a while. ;)

I also fully agree that discussions of video/tv stuff don't really belong in this thread, but other people had been bringing them up. The more important point I was trying to make was that even a constant 24fps is not really enough for action games (although it may be perfectly acceptable for TV/DVD viewing, due to the inherent properties of the media), and moving to a higher framerate will improve your gaming experience (to a point).

 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
The more important point I was trying to make was that even a constant 24fps is not really enough for action games (although it may be perfectly acceptable for TV/DVD viewing, due to the inherent properties of the media),

Film is projected at 24FPS... not movies, not DVDs, not television. Film projectors ;)
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
VIAN,

1. Again, you're bundling flickering and a smooth framerate together. They're separate issues, and it would be very helpful if you'd treat them as such.

2. Why are you assuming vsync is enabled? I'd only do so if you also assume triple buffering is enabled, to ensure that when your video card's framerate dips below the monitor's refresh rate you're not halving your visible framerate.

4. As I've said, theatres run film at 24fps@48Hz (progressive). Do you hate the low 24 frames per second "framerate" or the low 48Hz refresh rate? NTSC TV is ~30fps@60Hz (interlaced). I don't understand the last three sentences of your fourth point. DVD is about 720 "scanlines," so it should look close to perfect at 1024x768. As for your card looking worse in fullscreen mode, that's probably because it has a poor-quality upsampling method. Upsampling 720 to 768 requires some sort of blurring or smoothing, which will be closer to the TV's blurring, so I'm not sure why you prefer DVD downsampled to TV, but not DVD upsampled to 1024x768. I'm ignoring TV's interlacing for the sake of simplicity.

I'll say it would be best to have 120fps@120Hz, but not all video cards or monitors will cooperate. Thus the desire for either disabling vsync or enabling triple buffering. And how does 85fps@85Hz help in newer games like Doom 3 or HL2, that average well below 85fps? With vsync on, you'd be seeing 42.5fps (or worse) avg, rather than the 50+fps some cards are capable of with vsync off.
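[The 42.5fps figure above falls out of how double-buffered vsync quantizes the displayed rate. A rough Python model, illustrative only; `displayed_fps` is an invented function, not any real driver API, and it ignores triple-buffering latency details:]

```python
import math

def displayed_fps(render_fps: float, refresh_hz: float,
                  triple_buffered: bool = False) -> float:
    """Rough model of the rate that reaches the screen with vsync on."""
    if triple_buffered:
        # A third buffer lets the card keep rendering; the newest
        # finished frame is shown at each refresh.
        return min(render_fps, refresh_hz)
    # Double-buffered: a frame that misses a refresh waits for the next
    # one, so the rate snaps down to refresh/2, refresh/3, ...
    return refresh_hz / max(1, math.ceil(refresh_hz / render_fps))

print(displayed_fps(50, 85))                        # 42.5
print(displayed_fps(50, 85, triple_buffered=True))  # 50
```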