Acceptable FPS


tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Laminator
Originally posted by: tuteja1986
Play something like Battlefield 2 on:
9600 Pro
512MB RAM
P4 2.8GHz

You will find the frame rate is playable on the ground and in a tank, but when you get into a jet fighter you will find your FPS is the major reason you lose every time in a duel! You can't turn fast enough, and there's major hardware lag. That is when you want a frame rate of at least 120+ FPS :) because as soon as you try a quick turn you can lose about 40 FPS in a second.

This is what I used to play Battlefield 2 on:

Athlon XP 2600+
512MB PC3200 RAM
Radeon 9600 Pro EZ (slower memory clock than 9600 Pro)

I was playing at all medium settings, 1024x768. I got 21 frames per second in the tank's chase view and it was perfectly playable! At those settings, flying the F-35s was also quite playable.


You want to play online? ;) I will bet you can never take me out in a dogfight!

You may think it's your skills, but I will tell you it's your hardware!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I generally like a minimum of 60 FPS in twitch shooters and 45 FPS in slower shooters.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Console animation seems fluid but slow. Since consoles don't have to worry about resolutions larger than 2Mpixels (1920x1080 = 2.07MP), how do twitch shooters look compared to their counterparts on the PC?

Console hardware is slower than an 8800GTS (save, perhaps, the PS3's processing capability), so you'd think that if a game's FPS is acceptable on a PS3, an Xbox 360, or a Wii for Christ's sake, it could work on an 8800GTS 320. But benchmarks for Lost Planet, Call of Juarez, etc. that I have seen (predominantly on FS) show that the 320 card flops at 1920x1200 or even 1600x1200.

Do game developers employ any tricks to make motion fluid while accepting low frame rates?
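
One trick I can imagine is running the game simulation at a fixed rate and interpolating what gets drawn between simulation states, so motion stays even when the render rate dips. A rough sketch of the idea in C++ (my own toy code, not from any real engine):

    #include <chrono>
    #include <cstdio>

    // Stand-in game state: one object moving at a constant speed.
    // In a real engine these would be your entity update and draw calls.
    static double pos_prev = 0.0, pos_curr = 0.0;

    void simulate(double dt) {                  // advance the state by one fixed step
        pos_prev = pos_curr;
        pos_curr += 100.0 * dt;                 // 100 units per second
    }

    void render(double alpha) {                 // draw a state blended between steps
        double drawn = pos_prev + (pos_curr - pos_prev) * alpha;
        std::printf("drawn at %.2f\n", drawn);
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const double dt = 1.0 / 30.0;           // simulation runs at a fixed 30Hz
        double accumulator = 0.0;
        auto previous = clock::now();

        for (int frame = 0; frame < 300; ++frame) {   // stand-in for the real game loop
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            while (accumulator >= dt) {         // catch the simulation up in fixed steps
                simulate(dt);
                accumulator -= dt;
            }
            render(accumulator / dt);           // interpolate so motion looks continuous
        }                                       // even when render rate != sim rate
    }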
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: tuteja1986
Originally posted by: Laminator
Originally posted by: tuteja1986
Play something like Battlefield 2 on:
9600 Pro
512MB RAM
P4 2.8GHz

You will find the frame rate is playable on the ground and in a tank, but when you get into a jet fighter you will find your FPS is the major reason you lose every time in a duel! You can't turn fast enough, and there's major hardware lag. That is when you want a frame rate of at least 120+ FPS :) because as soon as you try a quick turn you can lose about 40 FPS in a second.

This is what I used to play Battlefield 2 on:

Athlon XP 2600+
512MB PC3200 RAM
Radeon 9600 Pro EZ (slower memory clock than 9600 Pro)

I was playing at all medium settings, 1024x768. I got 21 frames per second in the tank's chase view and it was perfectly playable! At those settings, flying the F-35s was also quite playable.


You want to play online? ;) I will bet you can never take me out in a dogfight!

You may think it's your skills, but I will tell you it's your hardware!

I'd really love to see Warhawk come to the PC. A coworker in the next cube plays it and thinks it gives him a lot of options: switching from a tank to stalking with a sniper rifle to something else, all with graphics better than BF2.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: speckedhoncho
Console animation seems fluid but slow. Since consoles don't have to worry about resolutions larger than 2Mpixels (1920x1080 = 2.07MP), how do twitch shooters look compared to their counterparts on the PC?

Console hardware is slower than an 8800GTS (save, perhaps, the PS3's processing capability), so you'd think that if a game's FPS is acceptable on a PS3, an Xbox 360, or a Wii for Christ's sake, it could work on an 8800GTS 320. But benchmarks for Lost Planet, Call of Juarez, etc. that I have seen (predominantly on FS) show that the 320 card flops at 1920x1200 or even 1600x1200.

Do game developers employ any tricks to make motion fluid while accepting low frame rates?

The difference, however, between a console (PS3, Xbox, Wii, etc.) and a PC is that a console is *always* going to be a set configuration of hardware. It will never change. A PC, on the other hand, could be made of millions of different combinations of hardware, with overhead from the OS, antivirus, email, etc. on top. So programmers of PC versions have to take into account that the end user may be running anything from an Intel integrated video chipset with a Celeron to a high-end ATI/NVIDIA video card with a quad-core CPU.

Plus, as I mentioned before, the OS on a PC has to take many other things into account (USB devices, printer drivers, network access, etc.), whereas the OS on a console has a set, limited number of things it needs to do (on a set hardware configuration).

PCs will always need higher-end hardware to do the same as a console...

Look at PGR2 on the Xbox. Do you think a PC could run the same game with a GeForce3 and a Celeron/PIII 700MHz?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: speckedhoncho
Console animation seems fluid but slow. Since consoles don't have to worry about resolutions larger than 2Mpixels (1920x1080 = 2.07MP), how do twitch shooters look compared to their counterparts on the PC?

Console hardware is slower than an 8800GTS (save, perhaps, the PS3's processing capability), so you'd think that if a game's FPS is acceptable on a PS3, an Xbox 360, or a Wii for Christ's sake, it could work on an 8800GTS 320. But benchmarks for Lost Planet, Call of Juarez, etc. that I have seen (predominantly on FS) show that the 320 card flops at 1920x1200 or even 1600x1200.

Do game developers employ any tricks to make motion fluid while accepting low frame rates?

That's poor development. They try to port code over from a console to the PC and expect it to run well. It runs, but it runs slowly because it is not optimized for a PC's strengths; a game fully developed with the PC in mind will run much better than what you saw in Lost Planet. Other DX10 titles, including BioShock, only add certain features on top of DX9 code and aren't written very efficiently from what I can tell. Everything I've read about DX10 says it's supposed to allow a smoother gameplay experience, with faster frame rates than DX9 can deliver for the same effects, because it renders more efficiently. Whether that's true I can't say; we will probably need to wait for a full DX10 game like Crysis before we can say for sure.
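
To give one toy example of what "optimized for a PC's strengths" can look like on the CPU side (my own illustrative C++, not anything from D3D or a shipping engine): sort the render queue by material, so expensive driver state changes happen once per group instead of once per draw.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Hypothetical draw-call record; illustrative only, not a real API.
    struct DrawCall {
        int material_id;   // the shader + texture state this draw needs
        int mesh_id;
    };

    int main() {
        std::vector<DrawCall> queue = {
            {2, 10}, {0, 11}, {2, 12}, {1, 13}, {0, 14}, {1, 15},
        };

        // Sort by material so identical state is bound once per group,
        // not once per draw; state changes are expensive CPU/driver work.
        std::sort(queue.begin(), queue.end(),
                  [](const DrawCall& a, const DrawCall& b) {
                      return a.material_id < b.material_id;
                  });

        int bound_material = -1;
        for (const DrawCall& d : queue) {
            if (d.material_id != bound_material) {      // only rebind on a change
                std::printf("bind material %d\n", d.material_id);
                bound_material = d.material_id;
            }
            std::printf("  draw mesh %d\n", d.mesh_id);
        }
    }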
 

Arc337

Junior Member
Sep 1, 2007
21
0
0
IMO there should be a double-vsync option in games that caps your FPS at twice your monitor's refresh rate (this would be great for Half-Life 2 based games, in which you need over 100 FPS to send and receive updates from fast game servers). It would also make games look MUCH smoother :D
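
The cap itself is simple to sketch. A minimal frame limiter in C++ (my own toy code; it assumes a 60Hz monitor so the cap lands at 120 FPS, where a real game would query the actual refresh rate):

    #include <chrono>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        const int refresh_hz = 60;                   // assumed monitor refresh rate
        const auto frame_budget =
            std::chrono::duration<double>(1.0 / (2 * refresh_hz));  // cap at 2x refresh

        auto next_frame = clock::now();
        for (int frame = 0; frame < 600; ++frame) {  // stand-in for the game loop
            // ... simulate and render one frame here ...

            next_frame += std::chrono::duration_cast<clock::duration>(frame_budget);
            std::this_thread::sleep_until(next_frame);  // never start a frame early
        }
    }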
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: cmdrdredd
Originally posted by: speckedhoncho
Console animation seems fluid but slow. Since consoles don't have to worry about resolutions larger than 2Mpixels (1920x1080 = 2.07MP), how do twitch shooters look compared to their counterparts on the PC?

Console hardware is slower than an 8800GTS (save, perhaps, the PS3's processing capability), so you'd think that if a game's FPS is acceptable on a PS3, an Xbox 360, or a Wii for Christ's sake, it could work on an 8800GTS 320. But benchmarks for Lost Planet, Call of Juarez, etc. that I have seen (predominantly on FS) show that the 320 card flops at 1920x1200 or even 1600x1200.

Do game developers employ any tricks to make motion fluid while accepting low frame rates?

That's poor development. They try to port code over from a console to the PC and expect it to run well. It runs, but it runs slowly because it is not optimized for a PC's strengths; a game fully developed with the PC in mind will run much better than what you saw in Lost Planet. Other DX10 titles, including BioShock, only add certain features on top of DX9 code and aren't written very efficiently from what I can tell. Everything I've read about DX10 says it's supposed to allow a smoother gameplay experience, with faster frame rates than DX9 can deliver for the same effects, because it renders more efficiently. Whether that's true I can't say; we will probably need to wait for a full DX10 game like Crysis before we can say for sure.

We can only hope that game developers trust the way GPU architecture is going. The real money is still made on consoles, tragically. The number of customers transitioning to Vista will also be watched, to gauge how much money can be made designing a game around DX10.

Does anyone know if there is a performance tool from NVIDIA or ATI that can track how many SPs are in use during parts of a game, i.e. a register/counter in the GPU that can be read at some point?
 

TheOtherRizzo

Member
Jun 4, 2007
69
0
0
Short answer: movies (24fps) are smooth because of motion blur. TV (sports, etc.) is smooth because it delivers 50 or 60 fields per second.

A very common misconception is that interlacing means that 2 fields = 1 frame. This is only true for film content. For natively interlaced content, each field is a distinct moment in time, so a field can be treated as a frame in terms of motion smoothness; 1080i60 video, for example, samples motion 60 times a second even though complete frames only come together at 30Hz.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nullpointerus
Above a certain minimum level (i.e. 24fps?), consistency is more important than absolute fps numbers for perceiving something as "fluid." For example, video encoded at 24fps will feel much smoother than a game's action fluctuating between 25-35fps. I've been enabling v-sync and dropping resolution and settings only as needed to maintain close to 60fps even in game scenes with lots of action; personally, I find this to be a better (i.e. more "console-like") gaming experience even though it will not yield the best screenshots.

I do the same thing... I am more interested in smooth gameplay than in a choppy experience.
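
The frame-time arithmetic shows why. A quick C++ sketch of the numbers from the quote above:

    #include <cstdio>

    int main() {
        // Frame time in milliseconds for a given frame rate.
        auto ms = [](double fps) { return 1000.0 / fps; };

        std::printf("steady 24 fps: every frame takes %.1f ms\n", ms(24.0));
        std::printf("25-35 fps:     frames take %.1f to %.1f ms\n", ms(35.0), ms(25.0));
        // Steady 24fps delivers a new image every 41.7 ms like clockwork;
        // the fluctuating case jitters between 28.6 and 40.0 ms, and that
        // irregular cadence is what reads as "choppy".
    }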
 

NickelPlate

Senior member
Nov 9, 2006
652
13
81
Originally posted by: A554SS1N
Anything over 20fps is OK to me :/ Some people have far too much money to spend chasing silly frame rates (I'm talking about the anything-over-60fps crowd here).

Just because 20fps is OK to you does not make it "silly" for other people to want more. Neither does having a lot of money to spend on such hardware.

NP

 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: NickelPlate
Originally posted by: A554SS1N
Anything over 20fps is OK to me :/ Some people have far too much money to spend chasing silly frame rates (I'm talking about the anything-over-60fps crowd here).

Just because 20fps is OK to you does not make it "silly" for other people to want more. Neither does having a lot of money to spend on such hardware.

NP

Some games do not benefit from higher fps. Look at Oblivion at 30fps: drop settings down and get 70fps, but is the experience better?
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Projections and blur in games need to emulate the eye better by taking the point of focus into account: keep the area of focus sharp and blur progressively away from it.

Some games use this technique (COD2, at pivotal moments in a campaign, will slow-mo and blur outward from the point of focus); it's just not a permanent feature.

Emulating the senses is what puts you closer to the character's predicament, and taking away some control over what you can see could become part of non-twitch shooters that are more exploration and change-of-pace oriented.
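
Renderers usually fake that focus effect with depth of field: blur each pixel in proportion to how far its depth sits from the focus plane. A toy version of the per-pixel blur-radius rule (my own C++, not from COD2 or any real engine):

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Toy "circle of confusion": blur radius grows with distance from the
    // focus plane, clamped to a maximum. Real depth of field derives this
    // from lens parameters, but the shape of the idea is the same.
    double blur_radius(double depth, double focus_depth,
                       double strength, double max_radius) {
        return std::min(max_radius, strength * std::fabs(depth - focus_depth));
    }

    int main() {
        const double focus = 10.0;    // metres: where the "eye" is looking
        for (double depth : {2.0, 8.0, 10.0, 15.0, 40.0}) {
            std::printf("depth %5.1f m -> blur radius %.1f px\n",
                        depth, blur_radius(depth, focus, 0.5, 8.0));
        }
    }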

BTW, the first result for "frame rate of the eye" in a Google search is

http://www.newton.dep.anl.gov/...sci/gen01/gen01025.htm

It answers somebody's question as to why people can see a difference in movement between 30-40fps and 60-80fps.

 

Laminator

Senior member
Jan 31, 2007
852
2
91
Bring it on, tuteja1986! I'll pWn you with a 1GHz Pentium III and a GeForce2 MX! :p

Actually, I did manage to shoot down some people using the F-35 when playing on the 9600 Pro EZ rig, probably because it was when the demo first came out and everyone was a n00b so they didn't know how to use the countermeasures. ;)

Back on topic: I'm sure many of us here have been "good-enough" gamers at some point, or even now. Even recently, when I sold off my 7900GT and was using an X850 XT while waiting for a new card, I learned to live with low frame rates again in certain games. We can live with low frame rates if we're forced to.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: Laminator
Bring it on, tuteja1986! I'll pWn you with a 1GHz Pentium III and a GeForce2 MX! :p

Actually, I did manage to shoot down some people using the F-35 when playing on the 9600 Pro EZ rig, probably because it was when the demo first came out and everyone was a n00b so they didn't know how to use the countermeasures. ;)

Back on topic: I'm sure many of us here have been "good-enough" gamers at some point, or even now. Even recently, when I sold off my 7900GT and was using an X850 XT while waiting for a new card, I learned to live with low frame rates again in certain games. We can live with low frame rates if we're forced to.

Now if I can only loosen the spring in my joystick, I can keep playing HL2 until Orange Box comes out.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: NickelPlate
http://en.wikipedia.org/wiki/Frames_per_second


Personally, motion becomes "fluid" for me at 60fps, where I can't discern one frame from the next, although I can still detect flicker on a monitor at 60Hz. I'm not sure what the upper limit really is for the human eye/brain, but I'm sure there is one. As far as games are concerned, if I can stay at or above 60fps I'm happy. But yeah, watching movies at 24fps is sometimes painful for me, especially during fast panning shots.

NP

A practical limit, sure... but not an absolute one. Your eyes are not digital; they don't capture frames the way a movie camera does.
 

NickelPlate

Senior member
Nov 9, 2006
652
13
81
Originally posted by: cmdrdredd
Originally posted by: NickelPlate
Originally posted by: A554SS1N
Anything over 20fps is OK to me :/ Some people have far too much money to spend chasing silly frame rates (I'm talking about the anything-over-60fps crowd here).

Just because 20fps is OK to you does not make it "silly" for other people to want more. Neither does having a lot of money to spend on such hardware.

NP

Some games do not benefit from higher fps. Look at Oblivion at 30fps: drop settings down and get 70fps, but is the experience better?


In Oblivion, or any FPS game for that matter, I can see a huge difference between 30fps and 60fps. Most definitely better. I can't see any improvement past 60fps myself, but other people say they can. My point to Assasin was that people are different.

NP

 

PingSpike

Lifer
Feb 25, 2004
21,757
600
126
Originally posted by: JBT
Originally posted by: biostud
How fast can you do a 180-degree turn in an FPS?

If it takes 0.1 sec and the framerate is 30, your turn will consist of 3 frames, while at 60fps you'll get 6 frames and therefore a smoother turn. The reason it looks smooth in movies is motion blur, and probably some other things.

That's what I was going to say. Motion blur is a big part of it.

Yep. Since movies have a linear, known progression, they can use motion blur to simulate more frames without actually needing them. However, a game is interactive and cannot rely on this device, so you can perceive choppiness.

Myself, anything above 45fps is usually pretty good for playing. I can see the difference between 45 and 60, and even some above that, but past that point I think I'm just imagining things. To me, the minimum framerate is the most important value, and I'm glad hardware reviews are starting to include it. If a game averages 75fps but drops to 20 all the time and runs at 125 the rest of the time, that's much worse than if the game played at 50 most of the time and only dropped to 35 occasionally.

They did tests on air force pilots, and I believe they could perceive changes at up to 300fps. Now, that's not to say they could really tell the difference between 200 and 300fps; it's just that they were capable of seeing those changes.
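
biostud's turn math, worked out in a quick C++ sketch:

    #include <cstdio>

    int main() {
        const double turn_degrees = 180.0;
        const double turn_seconds = 0.1;   // a fast flick turn

        for (double fps : {30.0, 60.0, 120.0}) {
            double frames = fps * turn_seconds;    // frames shown during the turn
            double step = turn_degrees / frames;   // how far the view jumps each frame
            std::printf("%5.0f fps: %4.0f frames, %5.1f degrees per frame\n",
                        fps, frames, step);
        }
        // At 30fps the camera jumps 60 degrees between frames; without
        // motion blur, that is why a fast turn looks like a slideshow.
    }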
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: PingSpike
Originally posted by: JBT
Originally posted by: biostud
How fast can you do a 180-degree turn in an FPS?

If it takes 0.1 sec and the framerate is 30, your turn will consist of 3 frames, while at 60fps you'll get 6 frames and therefore a smoother turn. The reason it looks smooth in movies is motion blur, and probably some other things.

That's what I was going to say. Motion blur is a big part of it.

Yep. Since movies have a linear, known progression, they can use motion blur to simulate more frames without actually needing them. However, a game is interactive and cannot rely on this device, so you can perceive choppiness.

Myself, anything above 45fps is usually pretty good for playing. I can see the difference between 45 and 60, and even some above that, but past that point I think I'm just imagining things. To me, the minimum framerate is the most important value, and I'm glad hardware reviews are starting to include it. If a game averages 75fps but drops to 20 all the time and runs at 125 the rest of the time, that's much worse than if the game played at 50 most of the time and only dropped to 35 occasionally.

They did tests on air force pilots, and I believe they could perceive changes at up to 300fps. Now, that's not to say they could really tell the difference between 200 and 300fps; it's just that they were capable of seeing those changes.

They don't exactly "use motion blur" intentionally; motion blur just happens when you film fast-moving objects. The camera captures light while the shutter is open, so if things move during that time, each image is blurred, but the frames seem to fit together seamlessly.
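
That shutter integration can be modeled in a few lines: instead of sampling the scene at one instant per frame, average it over the time the shutter is open. A toy C++ sketch (the half-frame "180-degree shutter" is a common film convention, assumed here; games that fake motion blur average sub-frame renders the same way):

    #include <cstdio>

    // Position of a moving object at time t (100 units/sec along one axis).
    double position(double t) { return 100.0 * t; }

    // Simulate a camera shutter: average the object's position over the
    // interval the shutter is open, instead of sampling a single instant.
    double blurred_position(double frame_start, double shutter_open, int samples) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i)
            sum += position(frame_start + shutter_open * i / (samples - 1));
        return sum / samples;
    }

    int main() {
        const double fps = 24.0;
        const double shutter = 0.5 / fps;  // shutter open for half of each frame
        for (int f = 0; f < 4; ++f) {
            double t = f / fps;
            std::printf("frame %d: sharp %.1f, smear %.1f..%.1f, blurred centre %.1f\n",
                        f, position(t), position(t), position(t + shutter),
                        blurred_position(t, shutter, 8));
        }
        // On film the whole smear is exposed in each frame, and that
        // overlap between frames is what hides the 24fps judder.
    }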