Frame rates and refresh rates


Mingon

Diamond Member
Apr 2, 2000
3,012
Wow, and now it gets better: first you said that people can't tell the difference beyond 25 fps, and now you decide to backtrack. To say the whole question of the eye's visual range is a minor detail is wrong; perhaps when you have visited these forums for longer than a couple of months you will realise that this subject gets more flames than any other. As for my education, don't bother, I worked in the defense industry for 9 years as a project engineer before going into teaching :p
Unless motion blur is used, 25 or 30 fps is highly visible to the human eye; even with motion blur, 99% of the population can see flicker in panning scenes. The advised minimum for flicker rate is 50Hz, and anything less is likely to cause eye strain. If you think that 12-15 fps can fool the brain into seeing continuous motion, you should try the 3D glasses that are available, which shutter the two LCD eyepieces alternately. These cannot be used much below 100Hz (50Hz per eye) for very long; 120Hz is OK but 140Hz is the most comfortable.
 

Boogak

Diamond Member
Feb 2, 2000
3,302
But I advised, and still do, that for the MAJORITY of people a higher res with eye candy and 50-80 FPS is much preferable to lower res and 100+ FPS.
I agree with you 100% here; for me (i.e. my personal opinion) anything over 60 fps is just a bonus.

and since the human eye can't tell much difference beyond 30 FPS
I totally disagree with you here, I can tell a BIG difference between 30fps and 60fps in electronic gaming.
Here's a nice article I bookmarked for occasions like this, when the topic of how many fps the human eye can see crops up.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
Thanks Boogak for expressing your opinions like a grown-up, unlike Mingon, who I think is more likely a school student than a teacher. LOL!

QUOTE Mingon, "Unless motion blur is used, 25 or 30 fps is highly visible to the human eye; even with motion blur, 99% of the population can see flicker in panning scenes. The advised minimum for flicker rate is 50Hz, and anything less is likely to cause eye strain"

;) Mingon, you really need to decide whether you are talking about refresh rates or FPS, or don't you know there is a difference? In any case I haven't suggested 25Hz, 50Hz or 60Hz are good refresh rates, I was just making a point with reference to TV sets. In terms of FPS I have never suggested that people play games at 25 or 30 FPS, just that there is little point, in my opinion, for the VAST MAJORITY (not everybody, as you keep implying) in running 100+ FPS, because 60ish AVERAGE FPS will give as much benefit, and more when you consider better detail, eye candy or increased res. I simply said that, scientifically speaking, the human eye can't tell MUCH difference when going beyond 30 FPS. Remember we're talking AVERAGE FPS (AND MOST, NOT ALL, PEOPLE): at 60 FPS average a game is likely to drop to around 30 FPS when things get complex, and peak at around 90 FPS. If they tried playing at 30 FPS average, NOT THAT I SUGGESTED THIS ANYWAY, the minimum would be around 12-15 FPS (just enough for the brain to interpret continuous motion; not smooth, not pleasant, but motion rather than individual frames)! That is all I said, and that is what you jumped on me for! In reference to refresh rates, I simply said that 75Hz is considered 'flicker free' for the majority of people (THE MAJORITY, NOT EVERYONE), whereas you seem to think 50Hz is (?).
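For illustration, here is a rough Python sketch of that rule of thumb (the 0.5 and 1.5 factors are just the example numbers from the post above, minimum roughly half the average and peak roughly 1.5x the average; they are not measured data):

    # Rough illustration of the average/minimum/peak FPS rule of thumb
    # quoted above. The 0.5 and 1.5 factors are assumptions taken from
    # the post, not measurements.
    def fps_spread(average_fps):
        minimum = average_fps * 0.5   # heavy scenes roughly halve the frame rate
        peak = average_fps * 1.5      # simple scenes run roughly 50% faster
        return minimum, peak

    for avg in (30, 60, 80):
        lo, hi = fps_spread(avg)
        print(f"{avg} fps average -> roughly {lo:.0f} fps minimum, {hi:.0f} fps peak")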

:D Let me just quote what I said, and then what you said, when all of this began, without worrying about all of the irrelevant information and opinion that came after ...
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
It started with a debate over Vsync, with minor reference to FPS and Refresh Rates ...

QUOTE AUSTIN:

Vsync is disabled to allow FPS to exceed the refresh rate of the monitor. Vsync on will avoid the tearing (random horizontal lines flickering across the screen), and since the human eye can't tell much difference beyond 30 FPS it makes little sense to have 100+ FPS; you're unlikely to see any diff other than those annoying tearing effects. Also, since the screen is only being updated 85 times a second at 85Hz, it is impossible to actually see more frames than the monitor can show (IIRC). People (mostly reviewers) disable Vsync in order to avoid having all cards giving FPS equal to refresh rates until the card can no longer keep up. That is all Vsync does; it doesn't have anything to do with aging the monitor or damaging anything, it simply avoids the tearing.
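For illustration, here is a rough Python sketch of why Vsync caps what you actually see: the simulated "card" can render at 200 fps, but a Vsync-style wait only lets one frame per refresh interval reach the screen (all numbers are made up for the example):

    # Rough illustration only: a 'card' that renders in 5 ms (200 fps)
    # but, with Vsync on, waits for the next 85Hz refresh before
    # presenting, so no more than ~85 frames/sec are ever displayed.
    import time

    REFRESH_HZ = 85
    REFRESH_INTERVAL = 1.0 / REFRESH_HZ
    RENDER_TIME = 0.005               # pretend the GPU needs 5 ms per frame

    start = time.time()
    displayed = 0
    next_refresh = start
    while time.time() - start < 1.0:      # simulate one second
        time.sleep(RENDER_TIME)           # "render" the frame
        next_refresh += REFRESH_INTERVAL  # wait for the next refresh tick
        sleep_for = next_refresh - time.time()
        if sleep_for > 0:
            time.sleep(sleep_for)         # Vsync: hold the frame until the refresh
        displayed += 1

    print(f"Frames actually shown in one second: {displayed}")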

Refresh rate is measured in Hz. 75Hz is considered 'flicker free', and the majority of people will be able to read and watch the screen comfortably at close quarters. The higher the Hz, the more stable and generally better quality the displayed image. As the screen resolution increases, the maximum refresh rate the monitor can use decreases.

In any case, really the maximum FPS you want is around 60-80; this should ensure that at the minimum FPS (when there is lots to render) you are still above 25-30, which is how many FPS TV sets use. So it is usually a matter of cranking up detail and resolution until your card gets to about 60-80 FPS. I can tell you 100% that 640x480x32 (or even 800x600x32) at 200 FPS will look substantially worse than 1024x768x32 with AA at 60 FPS.
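To see that trade-off in rough numbers (purely illustrative arithmetic, ignoring AA and per-frame overhead): a card pushing 200 fps at 640x480 is moving roughly as many pixels per second as one pushing 60 fps at 1024x768, so the higher resolution is usually within reach of the same hardware:

    # Purely illustrative: pixel throughput of the combinations above.
    settings = [
        ("640x480  @ 200 fps", 640 * 480, 200),
        ("800x600  @ 200 fps", 800 * 600, 200),
        ("1024x768 @  60 fps", 1024 * 768, 60),
    ]
    for name, pixels, fps in settings:
        print(f"{name}: {pixels * fps / 1e6:.0f} million pixels/sec")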

Try out the refresh rates yourself. Right-click the desktop and select 'Properties', then 'Settings', 'Advanced', 'Monitor'. Ensure that your correct manufacturer and model number are shown (and not 'Default Monitor'), or else select something like 'Default Monitors: 1600x1200'. Now at the bottom of the Monitor tab you will see 'Screen Refresh Rate', and a drop-down list allows you to select higher numbers. Go up gradually, and if the screen goes all weird, your monitor can't handle that refresh rate at that resolution; just don't press anything and Windows will restore the previous setting. If only 60Hz or 'default' are shown then select a lower res (eg 800x600) and more should be available. You should aim for 75Hz minimum for day-to-day Windows use. WinXP has a box that says 'Hide modes the monitor can't display'; it's always a good idea to uncheck this and find the higher settings yourself.
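If you'd rather see the list programmatically, here is a small sketch that enumerates the display modes Windows reports for the primary monitor; it assumes the third-party pywin32 package is installed and is only a rough illustration:

    # Rough sketch: list the display modes Windows reports for the
    # primary monitor. Requires the third-party pywin32 package.
    import win32api

    modes = set()
    i = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(None, i)
        except win32api.error:
            break                     # no more modes reported
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.BitsPerPel, dm.DisplayFrequency))
        i += 1

    for w, h, bpp, hz in sorted(modes):
        print(f"{w}x{h} {bpp}-bit @ {hz}Hz")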

In a rush but HTH!
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
QUOTE ant80:

Thanks for ur reply AnAndAustin. Clarified things for me. Now, a few more issues.

How do you find the fps of the graphics card when playing some game? Does it say in the game menu or does it appear on the windows system? Also, how do you increase the detail? I know how to increase the resolution, but the detail foxed me.

It seems that we can increase the resolution and detail to some extent to get the fps to around 60 or 80. Now, there is a tradeoff between resolution and detail. As technology improves, both the resolution and detail achievable at a particular fps will increase. Is there a particular (hard and fast) rule that defines this point with current technology? Or is it just down to our own judgement? Thanks.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
Confused suggests FRAPS for displaying the FPS in-game.

QUOTE AUSTIN:

Each game will have its own settings in the gfx menu. With modern cards like GF3, GF4, Rad8500 and Parhelia you really want everything on (or very near) maximum. If there's an option for AA then choose 2xAA, QxAA (GF3 & GF4) or FxAA (Parh), and where possible enable a low level of Aniso (maybe even Vsync too). Then start with a res of 800x600x32 and see how it plays. If it's jerky then disable AA/Aniso or reduce the details; you don't want to play any game lower than 800x600x32. If it's smooth then try the next res, prob 1024x768x32, and so on. Finding out the actual FPS is always nice, but you can't beat how the game feels. At the end of the day it's all down to your particular preferences. Some people like high res and no AA & Ani, others like lower res with everything on max, and some are in between.
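If you do want an actual number rather than feel, an FPS counter is really just frame timing; here is a minimal sketch of the idea (illustrative only, not how FRAPS itself is implemented):

    # Minimal illustration of an FPS counter: count frames and report
    # the rate once per second. The sleep stands in for real rendering.
    import time

    def render_frame():
        time.sleep(0.016)   # pretend one frame takes ~16 ms (~60 fps)

    frames, t0 = 0, time.time()
    for _ in range(300):    # simulate ~5 seconds of frames
        render_frame()
        frames += 1
        now = time.time()
        if now - t0 >= 1.0:
            print(f"{frames / (now - t0):.1f} fps")
            frames, t0 = 0, now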

Some settings can be forced from software, tweak-type tools or the gfx card control panel. This is useful to force a game to use Ani, a certain type of AA, Vsync etc. Older cards like Rad7500 and GF2 cards hate AA, and medium details would be better, so res of 800x600x32 or 1024x768 are playable. Another factor is the amount of gfx RAM. Games like Max Payne tell you in the pre-game settings what effect and requirements a particular setting has. Generally 32MB will stutter with full details and medium res, 64MB should handle most games without a prob on max detail (providing the card is fast enough, eg GF2MX NO, GF4TI YES). 128MB means you don't have to worry about textures and details, for about 6-12 months anyway! LOL. This is all generalisation, as it is almost impossible to comment on every card and every game. Just remember that a GF2TI 32MB will kick the ass of a GF2MX 64MB, and a GF4TI4200-64MB will kick the ass of a GF3TI200-128MB; memory is nothing without the bandwidth and speed of the gfx card as a whole unit.
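To get a feel for why those memory sizes matter, here is some purely illustrative arithmetic; the texture budget is an assumption made up for the example, and compression and driver overhead are ignored:

    # Purely illustrative: rough video-memory budget at 1024x768x32.
    # The texture budget below is an assumption, not a real game's numbers.
    width, height, bytes_per_pixel = 1024, 768, 4

    front = back = depth = width * height * bytes_per_pixel   # three full-screen buffers
    buffers_mb = (front + back + depth) / 2**20

    texture_budget_mb = 40        # hypothetical "high detail" texture set

    print(f"Frame/depth buffers: ~{buffers_mb:.0f} MB")
    print(f"Textures (assumed):  ~{texture_budget_mb} MB")
    print(f"Total:               ~{buffers_mb + texture_budget_mb:.0f} MB "
          "-> tight on a 32 MB card, comfortable on 64 MB")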

Some more info on AA:

When it comes to AntiAliasing (removal of 'jaggies' / pixel step effect), ATI (and GF2) use SuperSampling (rendering more pixels than used in the final output), which takes a larger perf hit and is quite old and inefficient. GF3 & GF4 use MultiSampling, which involves more guesswork; faster but slightly blurrier results. The blurriness can be overcome by use of Anisotropic Filtering, which sharpens the textures and results in an image easily as good as Supersampling but is much faster. Aniso is only officially for OpenGL, but the new 4xS mode works in DirectX to use AA & still keep the detail. Matrox Parhelia-512 uses FragmentxAA: it works out which parts of the image need to be antialiased and only performs AA on those select parts, and does so with only a small perf hit. Unfortunately it misses some jaggies, results in erratic frame rates and can't always be used, at which point 4xAA (SS) is the only other option and has a big perf hit.
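To make the SuperSampling idea concrete, here is a tiny sketch of the basic operation (render at twice the target resolution, then average each 2x2 block down to one output pixel); it only illustrates the principle, not any card's actual implementation:

    # Tiny illustration of SuperSampling AA: render at 2x resolution,
    # then average each 2x2 block down to one output pixel.
    import numpy as np

    def downsample_2x(image):
        """Average 2x2 blocks of a (2H, 2W) image down to (H, W)."""
        h, w = image.shape[0] // 2, image.shape[1] // 2
        return image.reshape(h, 2, w, 2).mean(axis=(1, 3))

    # A hard diagonal edge 'rendered' at 2x resolution ...
    hi_res = np.tril(np.ones((8, 8)))

    # ... averages to intermediate grey values along the edge at 1x,
    # which is exactly the smoothing (anti-aliasing) effect.
    print(downsample_2x(hi_res))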

(Then I cover detail settings like 16-bit vs 32-bit colour and say, "Of course beauty is in the eye of the beholder and it all really comes down to personal preference.")
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
Quote cdub:

Thanks AnAndAustin!

Quick question: are you saying that the mysterious 4xS FSAA setting is Quincunx AA with some preset level of Aniso, or is it something else entirely? I am mainly wondering in terms of the performance hit...
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
QUOTE AUSTIN:

4xS AA is 4x MS-AA, but it uses some SS-AA techniques (probably reminiscent of the Fragment AA that Parhelia uses). This gives it the appearance of 4xAA & Ani with a similar hit, BUT you can't officially use Ani in D3D, hence 4xS AA is the D3D equivalent of the OpenGL 4xAA & Ani combination, with a similar perf hit. However, QxAA (2xAA hit on GF4, in between 2x & 4x on GF3) with Ani offers the looks of 4xAA & Ani / 4xS AA with far less perf hit. So in essence I would rec QxAA & Ani as by far the best quality and perf combination. Then see how far you can up the res with full details without things getting jerky.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
Quote AUSTIN from about 20 posts ago, "and since the human eye can't tell much difference beyond 30 FPS"

MINGON:

Wrong, wrong, wrong, wrong and wrong.
Most people can see the difference between 30 and 60 fps. I personally can tell the difference between 75 and 100 easily.

Have a look at this here, and next time try not to spout off rubbish that causes more flames than any other subject.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
QUOTE Mingon, "more flames than any other subject"

So without considering world poverty or legalising cannabis etc etc, even if we concentrate on computer stuff, don't you think Intel vs AMD, SktA vs Skt478, RIMM vs DDR, nVidia vs ATI vs Matrox, nForce audio vs Audigy vs Live are more hotly debated than what the average human eye can and can't easily differentiate between?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
QUOTE from Boogak's very informative link:

"anything over 100 fps will give exceptionally minimal improvements in visual quality or the "suspension of reality" effect that higher framerates give."

Okay so the link suggests that my rec of 60-80 AVERAGE FPS is not far wrong. Gee, most of what I said is backed up by the link though ...

QUOTE AUSTIN:

"Vsync is disabled to allow FPS to exceed the refresh rate of the monitor. Vsync on will avoid the tearing (random horizontal lines flickering across the screen), and since the human eye can't tell much difference beyond 30 FPS it makes little sense to having 100+ FPS, you're unlikely to see any diff other than those annoying tearing effects. Also since the screen is only being updated 85 times a second at 85hz it is impossible to actually see more frames than the monitor can show (IIRC). People (mostly reviewers) disable Vsync in order to avoid having all cards giving FPS equal to refresh rates until the card can no longer keep up. That is all Vsync does, it doesn't have anything to do with aging the monitor ao damaging anything, it simply avoids the tearing."

Although I didn't specify that the 30 FPS figure applied to a computer monitor, it is a fair assumption to make. However, that wasn't in any way a key point I was making. THERE IS LITTLE POINT IN RUNNING 100+ FPS, that was my key point! Since I NEVER suggested anybody run at 30 FPS, I don't think this is worth jumping on me for in the manner in which you did. Correct me on the eye thing with evidence in a mature way, as Boogak did, but all of the crap that you came out with after was irrelevant and childish.