Frame rates and refresh rates

ant80

Senior member
Dec 4, 2001
411
0
0
Currently, monitor refresh rates are mostly between 60 and 85Hz. Some people may have up to 100Hz. In this case, what is the necessity of having frame rates exceeding 100fps? The monitor is just gonna ignore some of the frames generated by the GPU, right? Or am I missing something? Thanks
ant
 

AMDfreak

Senior member
Aug 12, 2000
909
0
71
Frame rates are tied to refresh rates only when the vsync setting is on. With vsync off, it is entirely possible to have 200fps on a monitor with a 100Hz refresh rate.
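If it helps to picture why, here's a rough sketch. This isn't real driver code, just a toy simulation with made-up timings (5ms per rendered frame, 100Hz refresh) showing how waiting for the vertical blank caps the presented frame rate:

```cpp
// Conceptual sketch of vsync (not real driver code): with vsync on, a
// finished frame waits for the next vertical blank before being shown,
// so the game can never outpace the monitor. All timings are illustrative.
#include <cstdio>

int main() {
    const double refresh_hz    = 100.0;          // monitor refresh rate
    const double vblank_period = 1.0 / refresh_hz;
    const double render_time   = 0.005;          // GPU takes 5 ms/frame (200 fps raw)
    bool vsync = true;

    double t = 0.0;
    int frames_shown = 0;
    while (t < 1.0) {                            // simulate one second
        t += render_time;                        // GPU finishes a frame
        if (vsync) {
            // wait for the next vertical blank before presenting
            double next_vblank = vblank_period * (int(t / vblank_period) + 1);
            t = next_vblank;
        }
        ++frames_shown;                          // with vsync off the frame is
                                                 // pushed immediately (may tear)
    }
    printf("frames presented in 1s: %d\n", frames_shown);  // ~100 on, ~200 off
}
```

With vsync on you get ~100 presented frames a second; flip the flag off and all ~200 rendered frames get pushed to the screen, some arriving mid-refresh (that's the tearing).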
 

cdub

Senior member
May 31, 2002
254
0
0
I am curious why people do not use the vsync restriction. I haven't done too much testing, but in GTA3, when I use the option to turn Vsync OFF, everything seems really choppy. Same with turning the Frame Limiter off.

People mention that turning vsync off helps in FPS games... what is the advantage? Thanks!
 

SpiffyGuy

Member
Jun 4, 2002
71
0
0
I am kinda stupid, so what exactly does the refresh rate do then? Provide a clearer picture on the monitor of what is being sent from the vid card? Or does it mainly have to do with eyestrain?
 

ant80

Senior member
Dec 4, 2001
411
0
0
With vsync off, it is entirely possible to have 200fps on a monitor with a 100Hz refresh rate.

Er... just for the newbies, what is vsync (vertical sync???)? How do I turn it off?

What are the side effects of turning off vsync? Does it shorten the monitor's life? And how does vsync work? Thanks.
ant
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Vsync is disabled to allow FPS to exceed the refresh rate of the monitor. Vsync on will avoid the tearing (random horizontal lines flickering across the screen), and since the human eye can't tell much difference beyond 30 FPS it makes little sense to have 100+ FPS; you're unlikely to see any diff other than those annoying tearing effects. Also, since the screen is only being updated 85 times a second at 85Hz, it is impossible to actually see more frames than the monitor can show (IIRC). People (mostly reviewers) disable Vsync in order to avoid having all cards give FPS equal to the refresh rate until the card can no longer keep up. That is all Vsync does; it doesn't have anything to do with ageing the monitor or damaging anything, it simply avoids the tearing.

Refresh rate is measured in Hz. 75Hz is considered 'flicker free', and the majority of people will be able to read and watch the screen comfortably at close quarters. The higher the Hz, the more stable and generally better quality the displayed image. As the screen resolution increases, the maximum refresh rate the monitor can use decreases.
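The reason for that trade-off is the monitor's fixed video bandwidth: every extra pixel per frame eats into how many frames per second the tube can draw. A back-of-envelope sketch; the pixel clock, scan limit and blanking figures are typical-looking numbers I've assumed for a decent 2002-era CRT, not specs for any particular model:

```cpp
// Back-of-envelope: why higher resolutions force lower refresh rates.
// A CRT has a fixed pixel clock (max pixels/sec) and a max horizontal
// scan rate; all four constants below are assumed, plausible values.
#include <algorithm>
#include <cstdio>

int main() {
    const double pixel_clock_hz = 230e6;   // assumed max pixels per second
    const double h_scan_hz      = 110e3;   // assumed max scanlines per second
    const double h_blank        = 1.32;    // ~32% of each line is blanking (assumed)
    const double v_blank        = 1.05;    // ~5% of each frame is blanking (assumed)

    const int modes[][2] = { {800, 600}, {1024, 768}, {1280, 1024}, {1600, 1200} };
    for (const auto& m : modes) {
        double by_pixels = pixel_clock_hz / (m[0] * m[1] * h_blank * v_blank);
        double by_scan   = h_scan_hz / (m[1] * v_blank);
        printf("%4dx%-4d -> max ~%.0f Hz\n", m[0], m[1],
               std::min(by_pixels, by_scan));   // whichever limit bites first
    }
}
```

On those assumed numbers you get roughly 175Hz at 800x600 but only ~86Hz at 1600x1200, which matches the pattern you see in a real monitor's mode list.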

In any case, the maximum FPS you really want is around 60-80 FPS; this should ensure that at the minimum FPS (when there is lots to render) you are still above the 25-30 FPS that TV sets use. So it is usually a matter of cranking up detail and resolution until your card gets to about 60-80 FPS. I can tell you 100% that 640x480x32 (or even 800x600x32) at 200 FPS will look substantially worse than 1024x768x32xAA at 60 FPS.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D Try out the refresh rates yourself. Right-click the desktop and select 'Properties', then 'Settings', 'Advanced', 'Monitor'. Ensure that your correct manufacturer and model number are shown (and not 'default monitor'), or else select something like 'Default Monitors: 1600x1200'. Now at the bottom of the Monitor tab you will see 'Screen Refresh Rate', and a drop-down list allows you to select higher numbers. Go up gradually, and if the screen goes all weird, your monitor can't handle that refresh rate at that resolution; just don't press anything and Windows will restore the previous setting. If only 60Hz or 'default' are shown then select a lower res (eg 800x600) and more should be available. You should aim for 75Hz minimum for day-to-day Windows use. WinXP has a checkbox that says 'Hide modes that this monitor cannot display'; it's always a good idea to uncheck this and find the higher settings yourself.
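If you'd rather see the whole list programmatically, the same mode table that drop-down reads is exposed by the Win32 EnumDisplaySettings call. A minimal console sketch (link against user32):

```cpp
// Lists every display mode Windows knows about for the primary monitor --
// the same information behind the 'Screen Refresh Rate' drop-down.
#include <windows.h>
#include <cstdio>

int main() {
    for (DWORD i = 0; ; ++i) {
        DEVMODE dm = {};
        dm.dmSize = sizeof(dm);            // required before calling
        if (!EnumDisplaySettings(NULL, i, &dm))
            break;                         // no more modes
        printf("%4lux%-4lu %2lu-bit @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
    }
}
```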

In a rush but HTH!
 

ant80

Senior member
Dec 4, 2001
411
0
0
it is usually a matter of cranking up detail and resolution until your card gets to about 60-80 FPS


Thanks for your reply AnAndAustin. Clarified things for me. Now, a few more issues.

How do you find the fps of the graphics card when playing a game? Does it say in the game menu, or does it appear somewhere in Windows? Also, how do you increase the detail? I know how to increase the resolution, but the detail foxed me.

It seems we can increase the resolution and detail to some extent to get the fps to around 60 or 80. Now, there is a tradeoff between resolution and detail. As technology improves, both the resolution and detail you can run at a particular fps will increase. Is there a particular (hard and fast) rule that defines this point with current technology? Or is it just down to perception? Thanks.
ant
 

Confused

Elite Member
Nov 13, 2000
14,166
0
0
FRAPS will show your frame rate (FPS) in any OpenGL or DirectX program/game (almost everything now is OGL or DX) :)
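Under the hood a counter like that is simple: hook the frame presents and count how many happen each second. A toy sketch of the idea (render_frame here is a stand-in that just sleeps ~8ms, a pretend game running at roughly 125 FPS):

```cpp
// How an FPS counter like FRAPS works in principle: count the frames
// presented in each one-second window.
#include <chrono>
#include <cstdio>
#include <thread>

static void render_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(8));  // fake frame
}

int main() {
    using clock = std::chrono::steady_clock;
    auto window_start = clock::now();
    int frames = 0;
    for (int reports = 0; reports < 3; ) {       // print three readings
        render_frame();
        ++frames;
        if (clock::now() - window_start >= std::chrono::seconds(1)) {
            printf("%d FPS\n", frames);          // FRAPS overlays this on the game
            frames = 0;
            window_start = clock::now();
            ++reports;
        }
    }
}
```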

Confused
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Each game will have its own settings in the gfx menu. With modern cards like GF3, GF4, Rad8500 and Parhelia you really want everything on (or very near) maximum. If there's an option for AA then choose 2xAA, QxAA (GF3 & GF4) or FxAA (Parhelia), and where possible enable a low level of Aniso (maybe even Vsync too). Then start with a res of 800x600x32 and see how it plays. If it's jerky then disable AA/Aniso or reduce the details; you don't want to play any game lower than 800x600x32. If it's smooth then try the next res, prob 1024x768x32, and so on. Finding out the actual FPS is always nice, but you can't beat how the game feels. At the end of the day it's all down to your particular preferences. Some people like high res and no AA & Ani, others like lower res with everything on max, and some are in between.

;) Some settings can be forced from software, tweak-type tools or the gfx card control panel. This is useful to force a game to use Ani, a certain type of AA, Vsync etc. Older cards like Rad7500 and GF2 cards hate AA, and medium details would be better, so res of 800x600x32 or 1024x768 are playable. Another factor is the amount of gfx RAM. Games like Max Payne tell you in the pre-game settings what effect and requirements a particular setting has. Generally 32MB will stutter with full details and medium res; 64MB should handle most games without a prob on max detail (providing the card is fast enough, eg GF2MX NO, GF4TI YES). 128MB means you don't have to worry about textures and details, for about 6-12 months anyway! LOL. These are all generalisations, as it is almost impossible to comment on every card and every game. Just remember that a GF2TI 32MB will kick the ass of a GF2MX 64MB, and a GF4TI4200-64MB will kick the ass of a GF3TI200-128MB; memory is nothing without the bandwidth and speed of the gfx card as a whole unit.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Some more info on AA:

;) When it comes to AntiAliasing (removal of 'jaggies' / the pixel step effect), ATI (and GF2) use SuperSampling (rendering more pixels than are used in the final output), which takes a larger perf hit and is quite old and inefficient. GF3 & GF4 use MultiSampling, which involves more guesswork; faster but slightly blurrier results. The blurriness can be overcome by use of Anisotropic Filtering, which sharpens the textures and results in an image easily as good as Supersampling but much faster. Aniso is only officially for OpenGL, but the new 4xS mode works in DirectX to use AA & still keep the detail. Matrox Parhelia-512 uses Fragment AA: it works out which parts of the image need to be antialiased, only performs AA on those select parts, and does so with only a small perf hit. Unfortunately it misses some jaggies, results in erratic frame rates and can't always be used, at which point 4xAA (SS) is the only other option and has a big perf hit.
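To make the supersampling idea concrete, the final step is just averaging: render at double the width and height, then collapse each 2x2 block into one output pixel. A sketch of the general technique (greyscale only for brevity; this is not any particular card's pipeline):

```cpp
// Supersampling AA boiled down: render at 2x the target width and height,
// then average each 2x2 block into one output pixel (a box filter).
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<uint8_t> downsample2x(const std::vector<uint8_t>& src, int w, int h) {
    std::vector<uint8_t> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y)
        for (int x = 0; x < w / 2; ++x) {
            int sum = src[2 * y * w + 2 * x]       + src[2 * y * w + 2 * x + 1]
                    + src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = uint8_t(sum / 4);   // the smoothed pixel
        }
    return dst;
}

int main() {
    // A diagonal black/white edge rendered at 4x4, downsampled to 2x2:
    std::vector<uint8_t> hi = {   0,   0,   0, 255,
                                  0,   0, 255, 255,
                                  0, 255, 255, 255,
                                255, 255, 255, 255 };
    auto lo = downsample2x(hi, 4, 4);
    for (int i = 0; i < 4; ++i)
        printf("%d ", lo[i]);   // 0 191 191 255 -- the edge gains in-between shades
}
```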
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
Some info on 16bit COLOUR vs 32bit COLOUR (eg 1024x768x16 vs 1024x768x32):

:) Slower cards of the TNT2 & 3dfx generations, esp with 16MB RAM, really benefit from 16bit colour. For modern gfx cards like GF3, GF4TI & Radeon8500 you only want to consider 32bit colour, and prob AA as well!

:( Of course actual perf diffs do depend upon the card and CPU used. With 'slower' CPUs and higher end gfx cards you really want to max out gfx settings with AA, Aniso and of course 32bit colour in order to use the full GPU potential that the CPU may not tap.

3Dmark2001 using a mid-range AthlonXP and default res of 1024x768:

Voodoo4 32 = 1600 & 9.5FPS (Car Chase High Detail)
Voodoo4 16 = 2250 & 14.5FPS

GF2 GTS/Pro/TI 32 = 6000 & 41.5
GF2 GTS/Pro/TI 16 = 6100 & 40

GF3 32 = 8800 & 51
GF3 16 = 7500 & 44 (Yes, slower; I double-checked)

GF4TI4200 32 = 10500 & 55
GF4TI4200 16 = 9500 & 51 (Again slower!)

You would expect most of the benefit to come at higher resolutions, where bandwidth is more of a limit (eg 1600x1200):

GF3 32 = 5000 & 39
GF3 16 = 6400 & 47.5

Or with AA enabled for the same reason (1024x768):

GF3 32 = 5200 & 38
GF3 16 = 5500 & 40

;) Obviously you get fewer comparable results when shifting from the default 1024x768x32 as fewer people run or submit them but this should still be quite accurate.

:D If it's perf you're after then lowering the resolution or detail settings a little may be more pleasurable than dropping to 16bit colour; do bear in mind that on GF3 & GF4TI cards you may actually be slowing things down by switching to 16bit! Of course beauty is in the eye of the beholder and it all really comes down to personal preference ;).

:) On 16bit colour: 16bit uses 2 to the power of 16, ie 65,536 colours, or more accurately 'shades of colour'. 32bit actually uses 24bit colour (the other 8 bits are reserved), which gives 2 to the power of 24, ie 16,777,216 shades (16.8 million). It is difficult for most people to distinguish between 16bit and 32bit colour, but with the way modern gfx cards have evolved, 32bit is now very nearly as fast as 16bit, and as seen above performance can actually be worse at 16bit. This is most probably because any relatively modern game actually uses 32bit samples for its textures and detail, which means they can be used without translation at 32bit. By switching to 16bit you free up bandwidth (fewer colours means less data to be processed), but the 32bit data then has to be converted down to 16bit before processing. This 'scaling' of colour often leads to distortion and 'banding' (where shades of colour should blend smoothly but visibly different colour bands can be seen).
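You can see where the banding comes from with a few lines of code: packing 8-bit-per-channel colour into 16bit (5-6-5 bits) throws away the low bits, so nearby shades collapse into the same value. A quick sketch:

```cpp
// Why 16bit colour bands: converting 8-bit channels down to 5-6-5 discards
// the low bits, so runs of nearby shades become one flat colour.
#include <cstdint>
#include <cstdio>

uint16_t to_rgb565(uint8_t r, uint8_t g, uint8_t b) {
    return uint16_t(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main() {
    // Eight slightly different greys collapse to just two 16bit values --
    // the visible 'banding' you get in smooth gradients.
    for (int v = 100; v < 108; ++v)
        printf("24bit grey %3d -> 16bit 0x%04X\n", v, to_rgb565(v, v, v));
}
```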

;) IMHO, if you have a modern gfx card (GF3 or Radeon8500 or higher) then you really should use 32bit as the default and alter detail levels or resolutions in order to make the game smoother. With so many detail options and gfx card capabilities it does take some experimenting to find the balance between high performance and high quality, but it really comes down to what is personally acceptable and preferable to you.
 

cdub

Senior member
May 31, 2002
254
0
0
This has turned into a good lecture on how to maximize the use of your video card! Thanks AnAndAustin!

Quick question: are you saying that mysterious 4xS FSAA setting is Quincunx AA with some preset level of Aniso, or is it something else entirely? I am mainly wondering in terms of the performance hit...
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) 4xS AA is 4x MS-AA, but it uses some SS-AA techniques (probably reminiscent of the Fragment AA that Parhelia uses). This gives it the appearance of 4xAA & Ani with a similar hit, BUT you can't officially use Ani in D3D, hence 4xS AA is the D3D equivalent of the OpenGL 4xAA & Ani combination, with a similar perf hit. However QxAA (2xAA hit on GF4, in between 2x & 4x on GF3) with Ani offers the looks of 4xAA & Ani / 4xS AA with far less perf hit. So in essence I would recommend QxAA & Ani as by far the best quality and perf combination. Then see how far you can up the res with full details without things getting jerky.
 

cdub

Senior member
May 31, 2002
254
0
0
No way! It's a good lecture! ;) Lecture is kind of a loaded word though... let's say instead, droppin' da bomb knowledge! So I just tried a few combinations of AA and Aniso and got a first-hand idea of the pluses and minuses of each. Thanks a lot, you've really helped my understanding of this stuff.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
and since the human eye can't tell much difference beyond 30 FPS
Wrong, wrong, wrong, wrong and wrong :p
Most people can see the difference between 30 and 60 fps. I personally can tell the difference between 75 and 100 easily.

Have a look at this here, and next time try not to spout off rubbish that causes more flames than any other subject
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Thanks for the thanks, guys. No need for those couple of idiots to be so critical over such a small point though, like they've never been wrong. However I don't believe I was wrong.

QUOTE AnAndAustin: "and since the human eye can't tell much difference beyond 30 FPS it makes little sense to have 100+ FPS"

;) The AVERAGE human eye only needs 12-15 FPS to fool the brain into seeing continuous movement instead of individual frames. Regarding TV sets, which most would say show very good motion, NTSC displays 30 FPS and PAL 25 FPS. Note, though, the words MUCH and LITTLE (you twits). Try a blind test on a random sample of 10 people. I doubt many would be able to distinguish between 50 FPS (with a likely minimum of 30 FPS) and 100 FPS (beyond the usual probability of lucky guesses), but I would guarantee that most people could tell the difference between 1024x768xAA at 50 FPS and 800x600 at 100 FPS. THAT was the point I was making, and it was hardly a crucial one. If you bothered to read any of my posts you would notice I say "it all really comes down to personal preference". There are always a handful of people who can distinguish very 'small' differences that the majority of people can't, usually enthusiasts or experts, whether that be 100 FPS vs 50 FPS, 85Hz vs 75Hz refresh rates, Audigy vs nForce vs Live etc.

:D Anyway to give the full quote:

"Vsync is disabled to allow FPS to exceed the refresh rate of the monitor. Vsync on will avoid the tearing (random horizontal lines flickering across the screen), and since the human eye can't tell much difference beyond 30 FPS it makes little sense to having 100+ FPS, you're unlikely to see any diff other than those annoying tearing effects. Also since the screen is only being updated 85 times a second at 85hz it is impossible to actually see more frames than the monitor can show (IIRC). People (mostly reviewers) disable Vsync in order to avoid having all cards giving FPS equal to refresh rates until the card can no longer keep up. That is all Vsync does, it doesn't have anything to do with aging the monitor ao damaging anything, it simply avoids the tearing."
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
No need for those couple of idiots to be so critical over such a small point though, like they've never been wrong. However I don't believe I was wrong.
:rolleyes:


The AVERAGE human eye only needs 12-15 FPS to fool the brain into seeing continuous movement

WOW, dug yourself an even bigger hole there. Firstly, let's not bother mentioning interlacing on a TV, because I am sure a person who is never wrong knows this anyway. As for 12-15fps for constant motion, what an utter load of garbage; I can't believe you have said that. Most people need an absolute minimum of 50fps for a nice constant motion picture; I and a lot of people I know need 75 as a minimum. Did you actually try that program at all? At 12-15 fps, was it smooth? I think not. I think the only way for you to learn is to sit in front of a monitor with a 25hz refresh rate for about 10 minutes, and then hopefully your brain will be erased and we can start again :D
 

pulse8

Lifer
May 3, 2000
20,860
1
81
Originally posted by: Mingon
No need for those couple of idiots to be so critical over such a small point though, like they've never been wrong. However I don't believe I was wrong.
:rolleyes:


The AVERAGE human eye only needs 12-15 FPS to fool the brain into seeing continuous movement

WOW, dug yourself an even bigger hole there. Firstly, let's not bother mentioning interlacing on a TV, because I am sure a person who is never wrong knows this anyway. As for 12-15fps for constant motion, what an utter load of garbage; I can't believe you have said that. Most people need an absolute minimum of 50fps for a nice constant motion picture; I and a lot of people I know need 75 as a minimum. Did you actually try that program at all? At 12-15 fps, was it smooth? I think not. I think the only way for you to learn is to sit in front of a monitor with a 25hz refresh rate for about 10 minutes, and then hopefully your brain will be erased and we can start again :D

I'd kill myself if my games played at 12-15fps.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
LOL you guys take things so seriously ... lighten up.

:D Anyway to address your points,

QUOTE Mingon: "Firstly, let's not bother mentioning interlacing on a TV, because I am sure a person who is never wrong knows this anyway."

;) I didn't say I was never wrong; what I said was that with how critical you are being over such a minuscule, unimportant detail, it would seem as though YOU believe you are never wrong. However, I didn't and don't believe I was wrong on the point I made: most people would be better off playing at 50-80 FPS with enhanced res or eye candy than running at 100+ FPS where the refresh of the monitor simply isn't showing each frame anyway. A lot of people either use the highest res their monitor can handle or else use default refresh rates; in both cases they are likely to have only a 60Hz refresh rate, and hence frame rates beyond 60 are not going to give any effect other than tearing.

QUOTE Mingon: "As for 12-15fps for constant motion what an utter load of garbage I cant believe you have said that."

;) Mingon, I NEVER SAID THAT! I said "The AVERAGE human eye only needs 12-15 FPS to fool the brain into seeing continuous movement". NOTE: seeing continuous movement, nothing about SMOOTH or RECOMMENDED. I'm talking science here, or did you not finish the basic level of education?

QUOTE Mingon "I think the only way for you to learn is to site in front of a monitor with a 25hz refresh rate for about 10 minutes and then hopefully your brain will be erased and we can start again "

:D I never suggested 25Hz, or even 25 FPS! I think people like you are very sad, contributing nothing to forums other than criticism on irrelevant points. Look at what people say, not what you think they say; learn to read, and if you can't say anything constructive then don't say anything at all.

QUOTE pulse8: "I'd kill myself if my games played at 12-15fps"

;) Yeah, I know what you mean; I wouldn't play games anywhere near that, and I never suggested anybody should. What I did suggest was that people should try settings out and see what they prefer. But I advised, and still do, that for the MAJORITY of people a higher res with eye candy at 50-80 FPS is much preferable to a lower res at 100+ FPS.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D Anyway, since you seem to like banging on about the irrelevant: TV interlacing.

;) It came about because early TV technology and signals couldn't cope with providing the whole picture in one go. So what they did was send all of the odd horizontal lines followed by all of the even horizontal lines to make up just 1 complete frame. As TVs developed this was no longer required (even though little has truly changed in the methodology of the CRT), but interlacing was kept for compatibility and ease. Interlacing does lead to more headaches and eyestrain, particularly close up. Generally a standard TV set uses either 60Hz (NTSC) or 50Hz (PAL) interlaced, which equates more to 30Hz and 25Hz in reference to computer monitors, which are not interlaced. However, since TV sets rarely show still text or fine detail and are rarely viewed close up, they get away with it.
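In code terms the trick is tiny: one frame is just split into two half-frames by scanline parity. A toy illustration (10 lines standing in for PAL's 576 visible lines):

```cpp
// Interlacing in miniature: one full frame is transmitted as two fields,
// first the odd-numbered scanlines, then the even ones. Two fields at
// 50 Hz (PAL) = one complete frame refreshed 25 times per second.
#include <cstdio>

int main() {
    const int lines = 10;  // tiny stand-in for the 576 visible lines of PAL
    printf("field 1 (odd lines): ");
    for (int y = 1; y <= lines; y += 2) printf("%d ", y);
    printf("\nfield 2 (even lines): ");
    for (int y = 2; y <= lines; y += 2) printf("%d ", y);
    printf("\n");
}
```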