What is considered a "smooth framerate"?!

Dance123

Senior member
Jun 10, 2003
387
0
0
Hi,

1/ First of all, what do most people consider a "smooth framerate"? Is it true that this is 70fps? What do you think?

2/ Second, is it true that you can get certain problems if your framerate is the same as your refresh rate, or something like that? I believe I once read something along those lines. Does anybody know more about this?

3/ I believe a 75Hz refresh rate is good enough, but if you can select higher, like 100Hz or above, should you do that? Are there any benefits to going higher than 75Hz, and what are the possible disadvantages of going (too) high? What are the guidelines here? Can a higher refresh rate lower performance or something? Anything else I should know?

I hope you can help me with these questions! Thanks!!
 

imhungry

Golden Member
Jul 30, 2005
1,740
0
0
"Smooth" is subjective. But usually 50 and up seems 'smooth' enough for me, unless it's a really competetive and intense game. Then I need my 100 FPS. :p.

Your framerate (I THINK) can exceed your refresh rate, but it leads to graphic tearing, which doesn't look so good. This is where vsync comes in. It synchronizes your frames to 'align' with the refresh, which greatly reduces tearing. It's still there for me every once in a while if I look really hard.

And if you can set it to 100Hz without damaging your monitor, I'd do it, unless your eyes don't like it. 100Hz means your monitor will refresh 100 times a second as opposed to 75. I THINK there'd be more tearing without vsync on at 75Hz, and if you have vsync enabled at 100Hz, you can get up to 100 fps.
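To picture the tearing part, here's a toy model (my simplification, not how any particular driver works) of why swapping in a new frame partway through a screen refresh produces a visible tear line:

    # Toy model of tearing: the monitor scans out top to bottom; if a new
    # frame is swapped in mid-scan, rows above the swap point show the old
    # frame and rows below show the new one.
    refresh_hz = 75
    scanout_ms = 1000 / refresh_hz        # ~13.3 ms to draw one screen
    swap_at_ms = 5.0                      # hypothetical mid-scan buffer swap
    rows = 768                            # e.g. a 1024x768 screen
    tear_row = int(rows * swap_at_ms / scanout_ms)
    print(f"tear line near row {tear_row} of {rows}")  # -> row 288

With vsync on, the swap simply waits until the scan finishes, which is exactly why it removes the tear.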

 

tvdang7

Platinum Member
Jun 4, 2005
2,242
5
81
I think 40+ frames... 30 is sluggish... I am pointing at COD2!! I'd prefer 50+ usually... BF2... (CSS is a different story)
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Alright, here are some guidelines.
1-10 is crap for everybody.
10-24 is no fun. You can adapt and infer movement and motion from these framerates, but it's not fun or nice, and things will look jerky.
24-35 is decently okay, though it depends. For full-motion video that comes from a TV or movie source it's great, since motion blur makes it look much more convincing. In real-time strategy games, 30 is decent. For first-person shooters, 30 is pretty much the minimum framerate you'd want so as not to lose the immersion.
35-60 is pretty good.
60 and above is pretty convincing.
120 is pretty much full motion. Your brain will blend the images together into full motion.

In general, in first-person shooters, 60 average is good. That way it won't drop too low in firefights, and you aren't sacrificing too much in visual quality.

For refresh rates, set it as high as you can. Lower rates make the screen look like it's flickering; the higher the rate, the more comfortable it is to stare at the screen for long periods (reduces eye strain). Always set the refresh rate as high as you can for the particular resolution.
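A handy way to read those framerate ranges is as frame times, i.e. how long each frame sits on screen; a quick sketch:

    # Frame time at each framerate: the time budget one frame is on screen.
    for fps in (10, 24, 30, 60, 120):
        print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

The drop from 100 ms per frame at 10fps to about 8 ms at 120fps is why the low end looks jerky and the high end blends together.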
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
This is all of course my own opinion and others will feel differently.

For single-player games 40+ is decent. Below that you can really start to notice the screen lagging. For online, definitely try to have above 60 FPS at minimum, mostly because other players are going for high FPS over image quality, so they will have the competitive advantage, especially in high-action firefights.
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: JBT
This is all of course my own opinion and others will feel differently.

For single-player games 40+ is decent. Below that you can really start to notice the screen lagging. For online, definitely try to have above 60 FPS at minimum, mostly because other players are going for high FPS over image quality, so they will have the competitive advantage, especially in high-action firefights.

Well, that's if you are really hardcore about your online gameplay, like if you're in the CPL. I think all CPL players play at 800x600 so they can get those max framerates.
 

CKXP

Senior member
Nov 20, 2005
926
0
0
Frames per second    Gameplay
<30 FPS              very limited gameplay
30-40 FPS            average yet playable
40-60 FPS            good gameplay
>60 FPS              best possible gameplay

that's taken from Guru3d.com
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
60 is good for me in most games, but as a minimum, not an average, so I try to get the average somewhere around 100.

If your monitor supports 100Hz at the resolution you're using, you might as well switch to that. If nothing else, you'll get better performance in some cases with vsync on.
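A quick sketch of why that can be (assuming plain double-buffered vsync, where the displayed rate snaps to refresh/1, refresh/2, refresh/3, ...): a higher refresh gives you gentler fallback steps when you can't hold the full rate.

    # Displayed framerates available under double-buffered vsync (an
    # assumption about the vsync mode; triple buffering behaves differently).
    for hz in (75, 100):
        steps = [round(hz / n, 1) for n in (1, 2, 3, 4)]
        print(f"{hz}Hz -> {steps}")
    # 75Hz  -> [75.0, 37.5, 25.0, 18.8]
    # 100Hz -> [100.0, 50.0, 33.3, 25.0]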
 

Nirach

Senior member
Jul 18, 2005
415
0
0
People can't see above 21 fps.

I find 30 is more than enough :shrug:

Having played FEAR at 30 and 60, I've not really seen a difference.
 

KeepItRed

Senior member
Jul 19, 2005
811
0
0
60+ is what the eye perceives as smooth. Anything over 60 cannot be seen or noted by the eye. Anything under 60 and you will notice it becoming a slide-show.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I guess it's a law of nature that there have to be some "the human eye can't notice beyond x fps" posts in any thread on framerates. :p I can tell differences up to 115 and it's different for every person.
 
Jun 14, 2003
10,442
0
0
I usually see 40fps as smooth... and even in UT2004, down to 20fps is totally fine, though with my system I can't say I've experienced such lows in a while.

If your FPS is higher than your refresh rate you get tearing... the image looks like it's being torn into two or more sections, usually horizontally.

If you have a consistently high FPS, say 80-100fps for the majority of a game, and your refresh rate is 75Hz, then you will get some tearing, but in these conditions you can use V-Sync, which will match the fps to the refresh rate.

I think there are problems with vsync though: if you use it when your frames mainly lie in the 40-50fps range it doesn't work too well. I think it runs at the nearest lower fraction of your refresh rate so it can keep a sync.
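That "nearest lower fraction" behavior is easy to work out. A minimal sketch, assuming classic double-buffered vsync where a frame that misses a refresh waits for the next one:

    import math

    # Effective displayed framerate under double-buffered vsync: a frame
    # that isn't ready at a refresh waits a whole extra refresh interval,
    # so the rate snaps down to refresh/1, refresh/2, refresh/3, ...
    def vsync_fps(raw_fps, refresh_hz=75):
        intervals = math.ceil(refresh_hz / raw_fps)
        return refresh_hz / intervals

    for raw in (80, 74, 45, 36):
        print(f"{raw} fps rendered -> {vsync_fps(raw):.1f} fps displayed at 75Hz")

So rendering 40-50fps at 75Hz gets quantized down to 37.5fps displayed, which matches the "doesn't work too well" experience; triple buffering is the usual way around this.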
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: CP5670
I guess it's a law of nature that there have to be some "the human eye can't notice beyond x fps" posts in any thread on framerates. :p I can tell differences up to 115 and it's different for every person.

Because it's biological fact. Your eye/brain cannot see those 115fps, but due to the way objects are rendered and presented on screens, you will feel it as being smoother - but not because your eye/brain can register each of the 115 frames per second. Otherwise you could distinguish between a slow-motion scene at 50fps and 100fps, which you can't. It's the fast-moving scenes where it's at.

That doesn't mean higher fps is bad or wasted. And yes, people have different preferences, as well as different screens, which also affect how we perceive motion in games. :)

 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Pretty much what mwmorph said.

The target should be 60fps, where, for 90% of people, there is no perceivable difference between that and anything above it. However, there are games where it doesn't matter much and you can go as low as 15fps because they're not action oriented: puzzle games, some RTSes, adventure games, etc. You'd still like a 30fps minimum in these games, but it won't matter as much. In games that are action oriented, like platformers, action games, FPSes, etc., you'd want a 40fps minimum and hopefully 60fps or higher.
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
60+ for silky smooth; it's subtle but it's there. One can get by with less, but not by choice lol :)
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Since I crave graphical quality, most games I play are hovering around 40. CoD2 dips to 20, although 30-40 is the norm, and it's fine for me. HL2 I need about 40-50 for smoothness, FEAR plays fine for me around 30-40, and other games are right around those numbers.

It's a joke that people think there is a competitive edge to having 100fps over 50fps. You're not rendering the same number of frames at half the speed, or else it would look like a slow-motion video; the game world advances in real time either way, and the framerate just decides how many snapshots of it you see. It's illogical that there would be a gaming edge.
Now run a demo loop (like Quake 4 demo recordings, or Half-Life 2 demo recordings), and there is a set number of frames. So on some systems some scenes may look like slow motion while others look like fast-forward; it all depends on the system playing the demo file as well as the system that recorded it. Games are not played in this manner, again because either 50fps would look like slow motion next to a 100fps system, or the 100fps system would look like the game is on fast-forward compared to the 50fps rig.

If you are getting 40-60fps, there is likely no loss of competitive edge compared to a 100+fps rig. Under 30, probably.
At 40 and above, the only source of on-screen character positioning discrepancies is going to be networking hardware.
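To make the real-time point concrete, here's a minimal sketch under my own simplifying assumptions (no particular engine): each frame advances the world by the real time elapsed, so framerate changes how often you sample the world, not how fast it runs.

    # Both machines reach the same world time after one second of play;
    # the faster one just renders more snapshots along the way.
    def play_one_second(fps):
        dt = 1.0 / fps                 # real time between rendered frames
        world_time = 0.0
        for _ in range(fps):           # one second's worth of frames
            world_time += dt           # world advances by elapsed real time
        return world_time, fps

    print(play_one_second(50))   # -> (~1.0 s of world time, 50 frames)
    print(play_one_second(100))  # -> (~1.0 s of world time, 100 frames)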
 

Conky

Lifer
May 9, 2001
10,709
0
0
Ahh, the old framerate argument. :D

The key to framerates is not the average framerate but the minimum framerate. If you never dropped below 60fps you wouldn't be able to tell the difference between 60fps, 150fps or 5000fps... the human eye/brain cannot process more than 60 frames per second. Period.

Guys who "see" a difference at an average of 100fps versus 120fps are simply noticing the dips below 60, which are detectable. :p

:beer:
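Whatever the true perceptual ceiling is, the average-vs-minimum distinction is easy to demonstrate with hypothetical frame times:

    # Hypothetical second of gameplay: 90 fast frames plus one 100 ms hitch.
    frame_times_ms = [10] * 90 + [100]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)
    min_fps = 1000 / max(frame_times_ms)
    print(f"average: {avg_fps:.0f} fps, worst moment: {min_fps:.0f} fps")
    # -> average: 91 fps, worst moment: 10 fps

The average never shows the stutter; only the worst frame time does.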
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
Originally posted by: destrekor
Since I crave graphical quality, most games I play are hovering around 40. CoD2 dips to 20, although 30-40 is the norm, and it's fine for me. HL2 I need about 40-50 for smoothness, FEAR plays fine for me around 30-40, and other games are right around those numbers.

It's a joke that people think there is a competitive edge to having 100fps over 50fps. You're not rendering the same number of frames at half the speed, or else it would look like a slow-motion video; the game world advances in real time either way, and the framerate just decides how many snapshots of it you see. It's illogical that there would be a gaming edge.
Now run a demo loop (like Quake 4 demo recordings, or Half-Life 2 demo recordings), and there is a set number of frames. So on some systems some scenes may look like slow motion while others look like fast-forward; it all depends on the system playing the demo file as well as the system that recorded it. Games are not played in this manner, again because either 50fps would look like slow motion next to a 100fps system, or the 100fps system would look like the game is on fast-forward compared to the 50fps rig.

If you are getting 40-60fps, there is likely no loss of competitive edge compared to a 100+fps rig. Under 30, probably.
At 40 and above, the only source of on-screen character positioning discrepancies is going to be networking hardware.

Naw, just think of a 180 or 360 degree spin in one second; that's the kind of speed some of us move at in FPS games on occasion. A smaller jump in degrees between each frame is good for accuracy. It's very subtle at that level, but you might miss it once you got used to it, I figure.
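The arithmetic behind that spin argument, for concreteness:

    # Degrees swept between consecutive frames during a 360-degree-per-second
    # flick at various framerates: the gap each frame has to jump over.
    for fps in (30, 50, 60, 100):
        print(f"{fps:>3} fps -> {360 / fps:4.1f} degrees between frames")

At 100fps the view only jumps 3.6 degrees per frame during that flick, versus 12 degrees at 30fps; that per-frame gap is the accuracy difference being claimed.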
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Nanoseconds and a billionth of a degree will not add to or subtract from a competitive edge.
One, the differences never keep adding up, so it's always only going to result in maybe the most minute difference in character placement on the screen compared to a guy with a better framerate. That will not affect your chance of hitting him. Sure, maybe if you had horribly bad aim and were going to hit him on the furthest edge of the hitbox, and that most minute difference between the two systems in character placement relative to the bullet path came into play, then sure, you might miss. But then you aren't good enough to be in a competitive playing field if that's the case. ;)
Seriously though, even if hitboxes were perfect, the difference is so minute it's really not measurable. Again, the only time a character's position on your display is going to differ from his correct position is when there is network lag. If the machine can display a solid 40fps the entire time and you are on a LAN with good hardware, there's no excuse not to kill the other guy. :D
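Back-of-envelope numbers for "so minute", using hypothetical values picked purely for illustration:

    # Worst-case staleness of what you see: the newest frame can be up to one
    # frame interval old, so a moving target is drawn slightly behind itself.
    target_speed_mps = 5.0                # hypothetical sprinting character
    for fps in (40, 100):
        staleness_s = 1.0 / fps           # frame is at most this old
        lag_cm = target_speed_mps * staleness_s * 100
        print(f"{fps} fps -> target drawn at most {lag_cm:.1f} cm behind")

The 40fps-vs-100fps gap works out to about 7.5 cm on a body-sized hitbox, the edge-of-the-hitbox case described above, and network latency typically dwarfs it.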
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Yeah, playing competitively with almost-cheating fast settings is like benchmarking your graphics card with High Performance settings via 3DMark. There's no point. You'd want High Quality for an actual test.