# Who's Right?

#### allies

##### Platinum Member
Alright, well I'm having an argument with someone about how many FPS the human eye can see. He first said that after 37fps, the human eye couldn't tell a difference. I have always thought that ~60 was where the eye could not see any more. Now he says that you can't tell a difference above about 27fps, and that this is roughly the rate at which movies are played. Can anyone tell me which number it is, and if there is a link somewhere to prove to him that he is wrong? Help me win this battle plz

#### imported_zenwhen

##### Senior member

He is...

...right by the way.

#### allies

##### Platinum Member
Few questions then... why is 60hz straining on your eyes, and why are you able to see flickering through peripheral vision at this refresh rate? Also, when setting fps_max to 30 in HL and then setting it to 60, I can notice a difference in how smoothly the scenes are rendered. I'm just wondering here, and inquiring minds need to know

#### merlocka

##### Platinum Member
Oh dear... here we go.

You might want to read the FAQ on this topic.

In addition, clarify what you mean by "FPS"

Are you talking about TV, computer games, normal vision response, etc...?

To save everyone some time...

guy#1) 30FPS is enough, people who say they need more are just trying to justify their \$400 video cards.

guy#2) you idiot, if you think 30FPS is enough, set your monitor refresh to 30hz and see how much flicker you can see!

guy#1) stfu, TV is at like 27FPS and it looks fine.

guy#3) no, TV is at twice that, and they interlace every other line

guy#4) dood, you need at least 100fps for games or else it's a slideshow

guy#1) ok, then 60fps is enough.

guy#4) no way, with 60fps you have minimums in the 30's which is too slow

etc... etc... etc...

#### allies

##### Platinum Member
By FPS I'm referring to the refresh rate of a computer monitor, or simply fps in a game. The person I was talking with started using references to TV/theatre, and obviously these are different scenarios than computers. I'll read up on the FAQ, and if I need any more help, feel free to post it here, thx.

#### merlocka

##### Platinum Member
Originally posted by: allies
By FPS I'm referring to a refresh rate of a computer monitor or simply fps in a game.

Well, most people agree that 60Hz refresh rate on a monitor is bad enough to melt their eyes.

Refresh rates and fps are not the same, and many people play with Vsync disabled anyway.

#### Insidious

##### Diamond Member
Originally posted by: allies
Few questions then... why is 60hz straining on your eyes, and why are you able to see flickering through peripheral vision at this refresh rate? Also when setting fps_max to 30 in HL then setting it to 60 I can notice a difference in how smooth the scenes are rendered? I'm just wondering here, and inquiring minds need to know

Because it syncs up slightly out of phase with the lights in your room which are turning on and off at 60Hz (not to mention the power to your monitor). The "flickering" is when you move from leading to in-phase to lagging. Even the 50Hz stuff in some "other more backwards lands" is close enough to get this effect.

In office spaces, the electricians have to take care to wire some lights on each of the 3 phases of power available (instead of all on a single phase) to avoid giving the entire room a massive headache.

It is NOT your brain, it is the power company. If you were able to power your system and lighting from a 400Hz supply, you wouldn't have this problem.
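Insidious's phase idea can be sketched with a little arithmetic: two flicker sources drift in and out of phase at their difference ("beat") frequency. This is a toy illustration; the function name and example rates are my own assumptions, not anything from the thread.

```python
# Toy sketch of the "leading to in-phase to lagging" effect described above:
# two flicker sources drift in and out of phase at their beat frequency.
# Function name and example rates are illustrative assumptions.
def beat_frequency(refresh_hz: float, light_hz: float) -> float:
    """Slow beat frequency (Hz) between two flicker sources."""
    return abs(refresh_hz - light_hz)

# A 60Hz monitor under 60Hz-flickering lights beats at ~0Hz: the phase
# relationship changes very slowly, which reads as a visible pulsing.
print(beat_frequency(60.0, 60.0))
# Raising the refresh to 85Hz moves the beat to 25Hz, much harder to notice.
print(beat_frequency(85.0, 60.0))
```

The same arithmetic explains why the 50Hz "other lands" case is close enough to show the effect too.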

#### allies

##### Platinum Member

So it's a complicated issue, and when computers, TV, and other technology are involved it's really hard to say what your eye can and cannot see?

Oh, and that does make sense insidious

Originally posted by: merlocka

Well, most people agree that 60Hz refresh rate on a monitor is bad enough to melt their eyes.

Refresh rates and fps are not the same, and many people play with Vsync disabled anyway.

My 17" 5 year old monitor is running at 1024x768 @ 60Hz, yes it does suck. I'll take donations for a new monitor.

#### ProviaFan

##### Lifer
The definitions of "FPS" shouldn't be lumped together like that; they should be separated, so I'm going to break this up into separate paragraphs...

First, on monitor refresh rate: The flicker you see is not directly related to how many Hz your video card is pumping out of its HD15 connector (just look at a good LCD running off the analog input at 60Hz). On CRTs, you see the flicker at 60Hz (and some people see it up until 80Hz or so) because the phosphors will only stay "activated" for a certain [very short] period of time after being struck by the electron guns. If your video card isn't having your monitor refresh the screen fast enough, the phosphors will have time to fade, and this causes the effect we all affectionately know as flicker. Whether or not you can stand this is a function of your eyes - some people can handle 60Hz all day on an old CRT; personally, I can't take anything below 85Hz for an extended period of time.
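The phosphor-fade explanation above can be put into a rough numeric sketch. This assumes a simple exponential decay with a made-up time constant; real CRT phosphors vary widely by type, so treat the numbers as illustrative only.

```python
import math

# Assumed decay time constant; purely illustrative, real phosphors differ.
PHOSPHOR_TAU_MS = 5.0

def flicker_depth(refresh_hz: float, tau_ms: float = PHOSPHOR_TAU_MS) -> float:
    """Fraction of peak brightness lost just before the next electron-gun pass."""
    gap_ms = 1000.0 / refresh_hz          # time between refreshes
    return 1.0 - math.exp(-gap_ms / tau_ms)

for hz in (60, 75, 85, 100):
    print(f"{hz} Hz: fades {flicker_depth(hz):.0%} between refreshes")
```

The higher the refresh rate, the less the phosphor fades before being re-struck, which lines up with why some eyes are fine at 60Hz while others want 85Hz and up.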

Now, on to "FPS" in games: This one is probably a lot more complex than I am going to make it sound, but basically it depends on how much motion there is in the game. While flight sims and similar games get by well with lower frames per second (maybe 30 or 40), your favorite "FPS" (this time meaning First Person Shooter) games will probably need at least 60 FPS to make them free of the appearance of jumpiness, since there is usually much more motion.

Well, I'm getting tired now, but hopefully this helps a bit

#### allies

##### Platinum Member
Thx...I should try paragraphs from now on, they have an aura of organizideness (lol) to them.

#### cmdrdredd

##### Lifer
The way I always think of it is this:

It doesn't matter what you can see... even if you can only see 60fps and say 60 is plenty, when the framerate drops you SEE that drop to 30fps for that second of movement as lag. It's noticeable because going from, say, 80 down to 30 is a huge difference.

To make it simple: higher fps = smoother gameplay, regardless of what you think you can see or feel. What you want is a higher minimum. That way your lowest fps will always stay up high and you'll never feel or see the lag.
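The point about minimums is easy to see in frame times; a quick sketch (the specific numbers are just examples):

```python
# Frame time is the inverse of fps: a one-second dip from 80fps to 30fps
# nearly triples how long each frame sits on screen - that jump is the lag.
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame stays on screen at a given framerate."""
    return 1000.0 / fps

print(frame_time_ms(80.0))  # 12.5 ms per frame
print(frame_time_ms(30.0))  # ~33.3 ms per frame
```

This is why a high minimum matters more than a high average: the eye notices the sudden change in frame time, not the average.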

#### Gonad the Barbarian

##### Lifer
Didn't 3DFX release a program before they went under that ran an image at 30fps side by side with the same image running at 60fps? That should have put this damn argument to rest right then and there.

#### PliotronX

##### Diamond Member
Originally posted by: Gonad the Barbarian
Didn't 3DFX release a program before they went under that ran an image at 30fps side by side with the same image running at 60fps? That should have put this damn argument to rest right then and there.
Yes they did; the only problem was that it only ran under Glide, which no video cards other than Voodoo cards will support (unless you have a compatible Glide wrapper). There is a similar program which will run on any 3D card; you can get it here.

#### jiffylube1024

##### Diamond Member
A lot of good points were brought up here. Personally, I think a CONSTANT 60fps is all anyone would ever need (for PC gaming anyways), but unfortunately it never stays constant, and the sharp changes in FPS are what we notice as jerkiness. Even a drop from 120 to 60 fps can be noticeable, just because of the change.

As for movies, they're played at 24 fps. The reason they seem smooth is because they're played in a darkened room (the image stays burned on your retina longer) and because they use motion blur between frames.
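The motion-blur point can be sketched numerically: film exposes each 24fps frame over a finite shutter time, so a moving object smears across every position it occupied while the shutter was open. The function name, the 180-degree shutter fraction, and the speeds below are illustrative assumptions, not anything from the thread.

```python
# Toy 1-D model: how far a moving object travels while the shutter is open
# during one film frame. That travel is the motion-blur streak on the frame.
def shutter_smear_px(speed_px_per_s: float, fps: float,
                     shutter_fraction: float = 0.5) -> float:
    """Width of the blur streak left on one frame by a moving object."""
    open_time_s = shutter_fraction / fps   # 180-degree shutter = half the frame
    return speed_px_per_s * open_time_s

# At 24fps, a 480 px/s pan smears roughly 10px per frame. The same pan drawn
# as crisp, unblurred game frames would read as discrete jumps instead.
print(shutter_smear_px(480.0, 24.0))
```

That smear is a big part of why 24fps film looks smooth while an unblurred 24fps game does not.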

#### AnAndAustin

##### Platinum Member
Hi, sorry I caught this thread late. Doesn't Vsync being enabled ensure the FPS never exceeds the refresh rate of the monitor? After all, if the monitor is set to 75Hz then it can't physically update more than 75 times per second, can it? Doesn't disabling it just result in tearing (random flashing horizontal lines across the screen), and isn't it only of use to reviewers to avoid many cards giving the same score?

#### allies

##### Platinum Member
As mentioned before, my monitor supports 1024x768 @ 60hz, so whenever I run games I have vsync disabled and I never have had any problems with the tearing you mentioned, although I have heard of it as well.

On a side note, I'm looking for a new monitor, prolly 19". I'm thinking about the 19" Compaq P910 mentioned in the Hot Deals section of the forums. Right now I have a 5 year old IBM 17". Do you think that this upgrade would be worth my \$240?

#### AnAndAustin

##### Platinum Member
Just to reiterate, here is some very sage advice on FPS (Frames Per Second) and what is perceptible to the human eye and brain:

http://amo.net/NT/02-21-01FPS.html

I have never seen any card of any generation not experience tearing when Vsync is disabled; it is most likely just like the jaggies, which nobody noticed until AA came on the scene!

As for the monitor, for features and quality the Iiyama Vision Master Pro 454, featuring the Diamond M2 tube, is unsurpassed IMHO. It offers a High Brightness mode which increases brightness and contrast to TV levels, making games and movies look excellent; it also gives games a perf-free AA effect. Of course, this is from reviews I've read, and expect to pay a bit more than \$240 for one. The 17" VMpro 1413 also sports the same excellent quality and high brightness technology, and may be closer to your budget, but would you want to give up the extra 2" though?

Iiyama

#### BFG10K

##### Lifer
Generally speaking my target is a minimum of 60 FPS (or higher) at all times and I like to have an average framerate of around 90 FPS - 120 FPS in game timedemos. Of course that's not to say that I can't see more or that I don't want anything better.

I can easily see the difference between a constant 60 FPS and 120 FPS in the ball demo and a constant 75 FPS and 150 FPS in a game when doing mouse movements, for example.

In reality there is no such thing as having too much FPS, just like there's no such thing as having too much MHz or too much RAM.

Refresh rates and fps are not the same,
Essentially they are the same thing. Each is a measurement of the number of screen updates per second so telling someone to try a 60 Hz monitor if they think 60 FPS is enough is a perfectly valid statement and is something I often do.

doesn't Vsync being enabled ensure the FPS never exceed the refresh of the monitor,
Yes, although in reality it does much more than this. It often lowers the average framerate to close to half of your monitor's current refresh rate.

after all if the monitor is set to 75Hz then it can't physically update more than 75 times per second, can it?
You can't see more fully drawn frames but you can still see the effects of partially drawn frames which is why the statement "you can't see any more frames than your monitor can display" is false.

#### AnAndAustin

##### Platinum Member
QUOTE BFG10K: "In reality there is no such thing as having too much FPS, just like there's no such thing as having too much MHz or too much RAM."

Of course that's right, but in reality things aren't quite so clear cut; as usual it is a matter of compromises and personal preferences. Do you want 60FPS at 1600x1200x32, 1024x768x32 2xAA, or 800x600x32 4xAA 4tap/L2xAniso? Or do you prefer 100FPS at 1280x960x32 or 800x600x32 2xAA?

Of course this also relates to MHz/GHz and RAM too. For example, a gamer can often be torn between 2.5GHz & GF2/Rad7500, 2.0GHz & GF3/Rad8500, and 1.6GHz & GF4TI. Clearly 2.5GHz & GF2 is not a smart move, while 1.6GHz won't be getting the most out of a GF4TI; for most of us it always comes down to a compromise, if only in the shorter term.

RAM-wise, should you go for 256MB RIMM-PC1066, 512MB DDR-PC2700, or 1GB SDR-PC133? Well, of course, the hit will be nasty if that 'small' amount of super-fast RIMM gets used up, and as for PC133 - capacity is nothing without speed. 512MB DDR-PC2700 seems the best compromise for most, but then you could get more RIMM and cheap out a little on the gfx card and CPU...

QUOTE BFG10K: "Q: Doesn't Vsync being enabled ensure the FPS never exceed the refresh of the monitor? A: Yes although in reality it does much more than this. It often lowers the average framerate to close to half of your monitor's current refresh rate."

I would have expected that if your card is capable of 100FPS at a given res, detail, and refresh rate, then with Vsync on and a 75Hz refresh you would stay up at 75FPS for the vast majority of the time. Anybody have any hard links to info on Vsync?

QUOTE BFG10K: "You can't see more fully drawn frames but you can still see the effects of partially drawn frames which is why the statement "you can't see any more frames than your monitor can display" is false."

Isn't the tearing caused by partially drawn frames a significant impact on quality? I know from asking non-technical people's opinions that most didn't see much, if any, difference between 1024x768 and 800x600, BUT they did see a big difference with Vsync disabled, as the tearing was VERY visible to them.

I am not intending any disrespect BFG10K; I respect your info, opinion, and preferences. It is just a very grey area which I think comes down very much to personal tastes and preferences... just as many people don't know or notice the diff between 60Hz and 85Hz, while many people NEED 100Hz.

#### BFG10K

##### Lifer
as usual it is a matter of compromises and personal preferences.
And the higher the framerate, the less compromise you have to make with respect to image quality. If you're getting 500 FPS at 640 x 480 then you should easily be able to run at 1600 x 1200 without a slowdown. OTOH somebody that claims they only need 60 FPS @ 640 x 480 will never be able to run at 1600 x 1200.

I would have expected that if your card is capable of 100FPS at a given res, detail and refresh rate that with Vsync on and a 75Hz refresh that the you would stay up at 75FPS for the vast majority of the time.
False; in fact this almost never happens. If the current frame is just a little bit late getting rendered for the current monitor refresh cycle then it has to wait for the next cycle to be drawn. When that happens your current framerate is effectively reduced to half of your current refresh rate.

Enabling triple buffering can help avoid this, but it in turn causes an additional set of problems, and for this reason it's never a good idea to employ any kind of framerate capping.
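The halving effect described above can be sketched with a little arithmetic. This is a simplified double-buffered model (the function and numbers are my own illustration, not from any actual driver): every frame must wait for a vblank boundary, so a render that just misses one refresh interval ends up occupying two.

```python
# Simplified double-buffered vsync model: each finished frame is held until
# the next vblank, so its cost rounds UP to a whole number of refresh intervals.
def vsync_fps(render_ms: float, refresh_hz: float) -> float:
    """Effective framerate when every frame lands on a vblank boundary."""
    interval_ms = 1000.0 / refresh_hz
    intervals = -(-render_ms // interval_ms)   # ceiling division
    return 1000.0 / (intervals * interval_ms)

# A card that renders in 14 ms (~71 fps raw) just misses the 13.3 ms window
# at 75 Hz, so every frame takes two intervals: half the refresh rate.
print(vsync_fps(14.0, 75.0))
# Render in 11 ms and each frame fits in one interval: the full 75 fps.
print(vsync_fps(11.0, 75.0))
```

In this model a card "capable of 100FPS" only holds the full refresh rate as long as every single frame finishes inside one interval, which is why the average so often sags toward half the refresh.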

Isn't the tearing caused by partially drawn frames a significant impact on quality?
Well it depends on how high the framerate is and how high the refresh rate is. I like to have a minimum 85 Hz refresh rate and I hardly ever notice tearing during gameplay because I'm moving and turning so quickly and I'm focusing on other things. At worst I'll see the odd line tear and that's about it.

The only time I can consistently see tearing is if I'm standing still in a room with a light that is flickering at random intervals. But even in that totally unrealistic gaming scenario the tearing still doesn't bother me at all.