
Playing some older games recently reminded me how *smooth* a blistering FPS feels :)

I guess everyone's perception is unique. I've always been skeptical of those folks who claim 50, 60, 70+ FPS is 'so much better.' I personally can't tell the difference between 35 FPS and 100 FPS, yet the flickering of normal fluorescent lighting drives me crazy. Perhaps it's a combination of a particular wavelength and frequency.
 
To me there is no particular frame rate that is smooth. Every game is different, and even those built with the same engine can feel completely different at the exact same frame rate. Some games feel stuttery and sluggish at 50-60 fps while others are just fine at 30 fps. Some games look and feel like crap just when moving the mouse around, regardless of frame rate. I have only played a few games on the 360, but they felt perfectly fine and smooth to me.
 
The only games where I absolutely need 50-60 fps are racing games. I won't play any PC racing game with dips to 30-35 or even 40 fps. Also, fast FPS games such as Quake 3 Arena or Unreal Tournament cannot be enjoyed at just 30 fps.
 
Alternatively, you could just lower your settings on newer games. I was going to write this in a different thread but then decided not to; it was about Crysis's "Ultra" settings, so it may start a little off topic.

I for one believe in having unplayable settings at the time of a game's release. People moaned and groaned, and continue to do so, because Crysis is "unoptimized." The fact is, the game is just stressful on a system. Subsequent patches "optimized" the game, but actually just lowered the amount of things going on.

Having unplayable settings such as infinite draw distance removes the stigma attached to playing with slightly-under-max settings. How many people strain their system just to know that everything is maxed out? While some may genuinely get a better experience from playing with an extra four feet of draw distance while reducing the frame rate to something noticeably stuttery, I believe the user gets an overall better experience from a much smoother frame rate. The OP seems quite impressed with replaying these older games at the speed they were meant to be played. There are diminishing returns on both visuals and frame rate, yet I believe the urge to max out game settings lowers the returns on visuals.

Having unreasonable max settings allows a person to play a game at whatever settings fit them best without their e-peen getting in the way. However, there are some major disadvantages to this method. The first one I can think of is that it requires more tweaking from the end user, which definitely discourages novice end users and casual gamers. A more accurate system to automatically judge system performance and enforce default settings would go a long way toward alleviating this; however, there needs to be some input from the user as well. Everyone has a different definition of what smooth is: perhaps a desired frame rate, with a threshold for expected drops/slowdowns.
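That last idea can be sketched in code. The function below is purely hypothetical (the name, presets, and benchmark numbers are all invented for illustration): benchmark each preset briefly, then pick the heaviest preset whose slowest frame still clears the user's drop threshold.

```python
def pick_preset(frame_times_by_preset, target_fps=60, drop_threshold=0.9):
    """frame_times_by_preset maps a preset name to a list of measured
    frame times (seconds) from a short benchmark run. Returns the most
    demanding preset whose worst-case fps stays above
    drop_threshold * target_fps, or None if nothing qualifies."""
    acceptable = []
    for preset, times in frame_times_by_preset.items():
        worst_fps = 1.0 / max(times)  # slowest frame in the run
        if worst_fps >= drop_threshold * target_fps:
            acceptable.append((preset, worst_fps))
    if not acceptable:
        return None
    # The heaviest settings that still hold the line: among acceptable
    # presets, the one with the lowest worst-case fps.
    return min(acceptable, key=lambda p: p[1])[0]

# Invented benchmark numbers for a 60 fps target with 10% slack:
samples = {
    "low":    [1/120, 1/110, 1/100],
    "medium": [1/80, 1/70, 1/65],
    "high":   [1/50, 1/45, 1/40],
}
print(pick_preset(samples))  # → medium
```

This captures the two user inputs the post asks for: a desired frame rate and a tolerance for drops below it.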
 
Good post Ben90, and as you mentioned, tinkering with individual settings yields good results, but can be extremely frustrating when individual settings are locked (like in Metro 2033).
I usually just start with the recommended settings, then turn on things that don't hurt fps too much but add considerable eye candy. Tinkering with individual settings is one of the reasons I love PC gaming so much; however, this can also turn a lot of people off PC gaming.

When I bought an Xbox 360, I was under the impression that console games run at very fast frame rates with no slowdowns, and I was very wrong about that. As was mentioned earlier in this thread, I think that a lot of gamers don't know any better and don't notice things like aliasing and slow frame rates. When I played PS3 games on my monitor, I found the low resolution really took away from the experience, and I had to use it with my TV so that I could sit far enough away that the picture didn't look terrible.

I've actually shown a few of my friends who were Xbox360/PS3 fanatics some of my games on a paltry GTX260 and they were so blown away that they went out and bought gaming PCs and never looked back again. Like someone mentioned, you don't know what you're missing until you see it for yourself. Ignorance is really bliss in this case.
 

I've seen commercials for Xbox 360 games where you can actually see the FPS slowdowns during the gameplay clips. I always think, "why in Hell would anyone want to buy that?"
 
Yep, and even on the PS3. I've seen the PS3 marketed as the "powerhouse" console, and I remember playing FEAR 2 on it and the game literally chugged through most sections. If I had to guess, I'd say it was dipping down to 15FPS.
 

That's pretty funny, since my first reaction to playing FEAR 2 on PC was that the frame rate was so high for a modern game. I've seen CoD MW2 slow down on PS3 whenever there was a lot of smoke on screen, and I was pretty stunned since I heard that game was supposed to run at a constant 60 fps on consoles, hence the reason the resolution was toned down to 600p. Again, most people wouldn't notice the slowdown. I've even talked to people who didn't notice the horrid slowdown in GTA4 on Xbox, which made the game near unplayable at times. I had it for Xbox, but chose to buy it again on PC just so it was more playable for me. Maybe it's me, but I seem to remember console games running a lot faster back in the Xbox/PS2/GameCube days. Maybe there's such an emphasis on graphics over frame rate in the mainstream community that developers latched onto this idea. Or I could have forgotten what gaming in the Xbox/PS2/GameCube days was really like.
 

Nah, what you see with consoles now is pretty much the staple: poor frame rates, low resolutions, low texture details, selective or no AA, and AF is a pipe dream. Maybe you remember older consoles handling better because they didn't look as good as PC games, so there was no comparison (that, and back then it was rare to see a console game come to PC and vice versa).

It's funny how they balance things on the console. They sacrifice resolution for lighting (Halo 3), they sacrifice frame rate for graphics (Halo Reach) and they sacrifice everything for no pop-in (GTA IV on PS3.)

The PS3 exclusives seem to run better than the 360 exclusives, at a higher resolution to boot. That's where the "powerhouse" comments come from, I take it.
 

Ironically, I prefer the exact opposite of these three tactics. I prefer native res over any effect, fast frame rates over graphics, and faster frame rates over pop-in (except in some first-person shooters). I wouldn't mind the inability to change settings as much if the games ran well in the first place, but so many games run poorly on console these days. I'll give credit where credit is due to Uncharted 2, which looks very good and holds a good frame rate. When my Xbox 360 died after close to four years of use, I didn't bother replacing it because I was so disappointed with the console as a whole. It had very few redeeming qualities to me, and then it died. I've had almost every mainstream console since the SNES, and the 360 was my least favorite of the group. It definitely had a lot of good games, but I don't have time to play games as much now that I've gotten older. What I love about PC gaming is that you can pick a game or two and get really good at it. I only game a few hours a week at most, but when I do, I want an amazing gaming experience with what little time I have, and nothing has provided me with a better experience than my PC.
 
Consoles suck. Just try playing a FPS with a gamepad. PC gaming is so much better as we all know.

I know people who have told me that they find a gamepad easier to use than a mouse. I think this might be because of the absurd amount of autoaim on console FPS. When I played MW2 on PS3 I was amazed that I could kind of guess where someone was near cover and I would often hit them. I've played rounds of CoD 4 on PC where I would just barely miss someone with all thirty rounds of ammunition because there's no autoaim to pull my crosshair toward my target. I can't play a FPS that uses so much autoaim that you can tell you didn't deserve the kill.
 
Playing older games is much easier on the wallet. Since I've taken a hiatus from PS3 gaming, I've been having a blast with $5 and bundled/free games. By the time I'm done with them, this year's blockbusters will probably be $5 as well.
 
Some of you guys are so full of BS. Most console games are 30 fps and no one is screaming

this game is so jerky it's making my eyes bleed.

Minimums are always the problem. So long as the game never drops below 30 fps, "how consoles are," most of you would never know in a double-blind playing test.
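The point about minimums can be made concrete. A small sketch (frame times invented for illustration) of why an average frame rate can hide exactly the stutter being argued about:

```python
# Two runs: one steady, one mostly faster but with occasional hitches.
frame_times_smooth = [1/35] * 100              # steady 35 fps
frame_times_spiky = [1/45] * 95 + [1/12] * 5   # fast, with a few hitches

def avg_fps(times):
    return len(times) / sum(times)

def min_fps(times):
    return 1.0 / max(times)  # the single slowest frame

for name, times in [("smooth", frame_times_smooth),
                    ("spiky", frame_times_spiky)]:
    print(f"{name}: avg {avg_fps(times):.0f} fps, min {min_fps(times):.0f} fps")
```

The spiky run has the higher average (~40 fps vs. 35) but a far worse minimum (12 fps), and the minimum is the number a double-blind playing test would actually expose.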

Unless the console is locking your turn speed at 30 degrees per second, 30fps is NOT enough.

FPS isn't a problem when standing still. It's moving the mouse that will expose your 30fps for the crap it is. If you had 360 frames per second, you'd still have to limit your turn speed to 360 degrees per second to see a flawless experience and I can move my mouse a hell of a lot faster than that.

You fail at the entire concept of why fps is important.
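The turn-speed argument can be put in numbers. A quick sketch (the 180°/s flick is chosen purely for illustration) of how far the camera jumps between consecutive frames at a given frame rate:

```python
def degrees_per_frame(turn_speed_deg_per_s, fps):
    # Angular distance the view moves between two consecutive frames;
    # bigger per-frame jumps read as choppiness when panning.
    return turn_speed_deg_per_s / fps

for fps in (30, 60, 120):
    step = degrees_per_frame(180, fps)  # a moderate 180 deg/s mouse flick
    print(f"{fps} fps: {step:.1f} degrees per frame")
```

At 30 fps that flick jumps the view 6° per frame; standing still, none of this shows up, which is why panning is where low frame rates give themselves away.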
 

30fps with a joystick and 30fps with a mouse are two entirely different animals. Mouse is pure hand-eye. With a joystick you're not so dependent on visual cues because you can substitute timing.
 

http://www.rahulsood.com/2010/07/console-gamers-get-killed-against-pc.html

"There was a project that got killed at Microsoft. This project was designed to allow console gamers and PC gamers to interact and battle over a connected environment. Personally I wish it would have stayed the course. I've heard from reliable sources that during the development they brought together the best console gamers to play mediocre PC gamers at the same game... and guess what happened? They pitted console gamers with their "console" controller, against PC gamers with their keyboard and mouse.

The console players got destroyed every time. So much so that it would be embarrassing to the XBOX team in general had Microsoft launched this initiative. Is this why the project was killed? Who knows, but I'd love to hear from anyone involved --- what happened?

Those of us who have been in the gaming business for over a decade know the real deal. You simply don't get the same level of detail or control as you do with a PC over a console. It's a real shame that Microsoft killed this -- because had they kept it alive it might have actually increased the desire of game developers and gamers alike to continue developing and playing rich experiences on the PC which would trickle down to the console as it has in the past."
 

I always wanted some games to be cross-platform, but maybe it would have knocked out dedicated servers, or given consoles dedicated servers. Maybe it'll be a reality one day, and console gamers can choose whether they want to play against other consoles or against PCs.
 

You are the only one that's full of BS. Consoles are so slow it's OBVIOUS that the frame rates are low. I've played GTA IV on both PC and PS3, and the PS3 gets so choppy sometimes while driving a fast car/bike down the long main streets that other cars and people will drop out of thin air right in front of you because the game is running at such a low FPS. Fallout 3, same thing: I played it on both console and PC, and I noticed the performance drop on the PS3 when there was a lot going on the screen and lots of fire/explosions. If you think consoles never drop below 30, you are insane.

And 30 FPS is not enough for most games; anyone is going to be able to tell the difference between 30 and 60 FPS. 30 looks noticeably choppy in most games, 60 does not.

We allow cussing in P&N and OT, not in the tech forums.

Moderator Idontcare
 
@BFG: SSAA seems to take out most of the shimmering for me, at least in FarCry. Is that right? Everything seems *sharp*.
Yup; unfortunately Crysis is far too slow to run full-scene SSAA, even with my reduced details @ 1920x1200. It'll be years before I can run SSAA in Crysis.

Far Cry 2 isn't quite fast enough at my settings either, though a “GTX580” will probably manage it. It should be nice for play-through #3. :thumbsup:
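The cost is easy to see on paper. A back-of-the-envelope sketch (idealized, assuming a purely shader-bound game; real titles won't scale this cleanly) of what full-scene SSAA does to shading work and frame rate:

```python
def ssaa_samples(width, height, n):
    # NxN ordered-grid SSAA shades n*n samples per output pixel.
    return width * height * n * n

def estimated_fps(base_fps, n):
    # Idealized: frame time scales linearly with shaded samples.
    return base_fps / (n * n)

native = 1920 * 1200
print(ssaa_samples(1920, 1200, 2) / native)  # 4.0x the shading work
print(estimated_fps(40, 2))                  # a 40 fps game drops toward 10.0
```

With 2x2 SSAA roughly quadrupling the shading load, a game that barely holds 40 fps at native resolution lands around 10 fps, which is why a demanding title needs years of hardware headroom before full-scene SSAA becomes practical.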
 
I agree to a point. I think some of the vegetation looks better in FC2, but character models, explosions, and other physics (including physical interactions) are done better in Crysis.
I’ve played both games several times from start to finish, and the Dunia engine is far superior overall. Physics and character models are comparable in both games, and in fact Dunia has physics CryEngine doesn’t do, like fire propagation, and pieces falling off vehicles as they take damage (e.g. the bonnet or bumper). Also vehicles get dirty as you use them.

Dunia also has a fully dynamic day/night cycle (as opposed to CryEngine's implementation, which appears to be scripted), plus it has larger draw distances, much bigger maps with more variety (desert, jungle, savannah, swamp, lake, etc.), and much faster loading times.

And of course the visuals look superior in Far Cry 2 without being the massive shimmer-fest that Crysis is, and performance is much better too.

I can run Far Cry 2 at 2560x1600 with 2xTrMS and maxed in-game details, and still get better performance than Crysis at 1920x1200 with 2xTrMS with reduced in-game details (with many set to medium or low).

The visuals in Crysis are vastly overrated.
 

I'm the same way, and I tried to test it by gauging the frame rate without a frame counter -- guess, then check -- and I was never close. I'm pretty good with lower numbers, 40 and down, and I was one of the first posters to raise the point of micro-stuttering before it was named.

I hear views that 240 or more frames per second are needed for perfection. I'm glad that stuff doesn't bother me... I'd rather use that frame rate to pour on image quality and immersion.
 
You can definitely tell the difference on a 120 Hz monitor when gaming. I just got my Acer GD235HZ in today and played some TF2, and you can definitely see the difference. Obviously that's a game where you can easily get over 60 fps, and the difference in smoothness is huge. I also played some Left 4 Dead 2 and Borderlands, and the difference was likewise huge. I'll play some Crysis to see how it is, but since my system can't pull awesome fps I don't expect the difference to be that great at all.
 
Zerocool, I finally got to use a 120 Hz Asus monitor, and it definitely looks different. Even on the desktop, mouse movements look more fluid. As far as games go, I didn't see a difference in the majority of games I was playing; however, I don't think the frame rates were high enough to make a difference.
 