Why do pc games look terrible at lower framerates (~30) compared to console games?

Xenphor

Member
Sep 26, 2007
In an effort to not have to upgrade my video card every new cycle, I would like to know why PC games look so bad at lower frame rates. I would be perfectly happy if, for example, Metro 2033 ran at a locked 30 fps but without all of the motion judder/ghosting/tearing that happens on a PC.

In comparison, even some of the worst-performing N64 games (Perfect Dark, Banjo-Tooie) still look at least consistent in motion compared to moderately performing PC games. Neither Banjo-Tooie nor Perfect Dark has tearing or stuttering camera movement, even at their lower frame rates. I've noticed that games that use motion blur, like Crysis, tend to look better in motion at lower frame rates, but they still don't look as consistent as a console game locked at 30 fps.

I don't really need all my games to run at 60 fps, nor do I want to spend the money required to keep doing so as newer games come out (I currently have a GTX 680). All I want is a consistent image at lower frame rates, so that I could settle for lower-end cards, because I don't really need a GTX 680 for most things.
 

cl-scott

ASUS Support
Jul 5, 2012
This is in no way a comprehensive answer, but a lot of it has to do with the kind of optimization you can do when you have a fixed hardware platform like a console.

Without getting into a PC vs. console debate, the upgrade treadmill of PC gaming is why I went to console gaming. Yes, the graphics may not always be as pretty, and yes, FPS games suck on consoles pretty much universally, and there are probably many other things people could come up with. I understand all of that, and for me it's a perfectly fair tradeoff if I can buy one console and be guaranteed that every game for that console will work. Even if you consider the $600 initial price tag for the PS3, that can be a bargain if you consider it might last like 6 years now. If you add up all the hardware upgrades for a gaming PC over that period of time, I can pretty much guarantee it'll be more than $600.

Anyway, that's my semi-off-topic rant.

Now, since someone is going to say it sooner or later: Delete your post.....
 

Craig234

Lifer
May 1, 2006
As I understand it, people can only see up to about 30FPS, so I don't think that's it.
 

Xenphor

Member
Sep 26, 2007
I didn't really even think to mention it, but this is in no way a console vs. PC thread at all. I don't really see how anything I said could be construed that way. Like I said, I have a GTX 680, and I can play most games right now at 60 fps, so it's not that big of an issue. I'm thinking into the future. Prior to getting a 680, I also had a 7770, a 550 Ti, a 6850, and a 670. I thought it was total horseshit that while all of these cards could run modern games fine, they could not provide a pleasing visual aesthetic in motion, because aside from the 670, none of them were able to maintain a constant 60 fps in the most demanding games like Metro 2033 and Crysis 2. Once the frame rate dropped from 60, the visual fluidity decreased dramatically. I really don't think it should be necessary to spend $400+ on a video card just to achieve a stutter/tear/ghost-free image.

As I said, I don't care at all about high frame rates, and I think it's a waste of time and energy on both the consumer and developer end that the situation is as it is. I think it's sad that, with all the power even lower-end PCs have, a console port, while having greatly diminished IQ and texture detail, can appear to run more consistently in motion.

I would be more than happy with 30 fps, as long as that 30 fps was as consistent and stable in motion as it is on consoles. Obviously there are tons of console games that also have problems with tearing and run at even lower frame rates than 30, but that's not the issue here.

I'm well aware that there are plenty of people out there who simply run with VSync off and don't even notice anything I'm describing. That's great, but then this thread isn't for you.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
The main reason you can see a difference is that a mouse is capable of faster horizontal and vertical movements in game. You can move a mouse fast enough to actually "outrun" the frame rate and make things stop looking smooth. Since console controllers can't move at such rapid rates, those missing frames from a quick movement don't get noticed.
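To put rough numbers on that: a 180° flick done in a fifth of a second at 30 fps spans only six frames, so the camera jumps 30° between consecutive frames. A thumbstick pan at, say, 90°/s moves only 3° per frame at the same frame rate, which is far easier for the eye to read as continuous motion.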
 

Xenphor

Member
Sep 26, 2007
The main reason you can see a difference is that a mouse is capable of faster horizontal and vertical movements in game. You can move a mouse fast enough to actually "outrun" the frame rate and make things stop looking smooth. Since console controllers can't move at such rapid rates, those missing frames from a quick movement don't get noticed.

Wow, I actually think the opposite. IMO, using a mouse makes inconsistent motion more tolerable because you eliminate the slow panning of the camera you would get with a joystick. If you can just lock onto a target with one swipe of the mouse, you actually make the tearing/stuttering less perceptible.
 

Absolution75

Senior member
Dec 3, 2007
I was having a technical discussion about controllers with someone knowledgeable. The conclusion was that there is basically a ton of input smoothing on consoles and hardly any in PC games. There are other techniques as well: the Unreal Engine uses some form of frame rate smoothing (and there are a ton of games based on Unreal).
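Out of curiosity, I mocked up what that kind of frame rate smoothing might look like. This is just a generic exponential moving average over the frame delta (not Unreal's actual implementation); the idea is that the simulation is fed a smoothed delta instead of the raw frame time, so one slow frame doesn't turn into a visible hitch:

Code:
/* Sketch of generic delta-time smoothing (NOT Unreal's actual code). */
#include <stdio.h>

int main(void) {
    /* Raw frame times in seconds, with a spike at the third frame. */
    double raw_dt[] = { 0.033, 0.033, 0.050, 0.016, 0.033, 0.033 };
    double smoothed = 0.033;   /* start at the 30 fps target */
    const double alpha = 0.2;  /* smoothing factor: lower = smoother */

    for (int i = 0; i < 6; i++) {
        /* Exponential moving average damps frame time spikes. */
        smoothed += alpha * (raw_dt[i] - smoothed);
        printf("raw %.3f s -> smoothed %.3f s\n", raw_dt[i], smoothed);
    }
    return 0;
}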
 

Xenphor

Member
Sep 26, 2007
I was having a technical discussion about controllers with someone knowledgeable. The conclusion was that there is basically a ton of input smoothing on consoles and hardly any in PC games. There are other techniques as well: the Unreal Engine uses some form of frame rate smoothing (and there are a ton of games based on Unreal).

Well, the issue has existed since well before the current gen. I remember first noticing it with Quake 2. In comparison, any console that did 3D early on performed just about as consistently as the consoles of today, obviously with the reduced visuals of the time.
 

shortylickens

No Lifer
Jul 15, 2003
As I understand it, people can only see up to about 30FPS, so I don't think that's it.
You understand it wrong.

The human eye doesn't see in frames, and the brain doesn't process images that way anyhow.
And when watching video, many tests have shown people can tell the difference between frame rates up to 100 or so.

Also, pilots have been shown silhouettes of planes for 1/200th of a second and not only been able to ID the type, but often the model.
 

BFG10K

Lifer
Aug 14, 2000
As I understand it, people can only see up to about 30FPS, so I don't think that's it.
LOL, no.

The main reason you can see a difference is that a mouse is capable of faster horizontal and vertical movements in game. You can move a mouse fast enough to actually "outrun" the frame rate and make things stop looking smooth. Since console controllers can't move at such rapid rates, those missing frames from a quick movement don't get noticed.
That's pretty much it. Mouse input is much more direct, so it's more likely to expose a laggy or choppy response.
 

Veliko

Diamond Member
Feb 16, 2011
The main reason you can see a difference is that a mouse is capable of faster horizontal and vertical movements in game. You can move a mouse fast enough to actually "outrun" the frame rate and make things stop looking smooth. Since console controllers can't move at such rapid rates, those missing frames from a quick movement don't get noticed.

This is a fair point actually; I never thought of it like that.
 

BrightCandle

Diamond Member
Mar 15, 2007
My theory is that images delivered at a reliable, even cadence fool the eye into seeing motion much better than a higher refresh rate where delivery deviates from that ideal smooth cadence. That is, 45 fps with VSync is much worse than a consistent 30 fps.

When you look at console reviews, the frame rates are remarkably stable. When you look at PC reviews, they often judder around all over the place. So, based on my theory, I would say you could get a console-like experience with good frame limiting.
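Something like this is all a frame limiter really does; a minimal sketch assuming a POSIX system (an external tool like RTSS, or an in-engine cap, does the same job with higher-precision timing):

Code:
/* Minimal 30 fps frame limiter sketch (POSIX timers). Every frame is
   padded out to the same 33.3 ms budget, trading peak frame rate for
   even frame pacing. */
#include <stdio.h>
#include <time.h>

static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    const double budget = 1.0 / 30.0;  /* 33.3 ms per frame */
    for (int frame = 0; frame < 90; frame++) {
        double start = now_seconds();

        /* ... game update + render would happen here ... */

        /* Sleep off the rest of the budget so each frame takes the
           same wall-clock time, not just 33 ms on average. */
        double remaining = budget - (now_seconds() - start);
        if (remaining > 0) {
            struct timespec nap = { 0, (long)(remaining * 1e9) };
            nanosleep(&nap, NULL);
        }
        printf("frame %d: %.1f ms\n", frame, (now_seconds() - start) * 1e3);
    }
    return 0;
}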
 

BenSkywalker

Diamond Member
Oct 9, 1999
This thread makes me think of an episode of Top Gear.

The test driver had just driven the Veyron to 254 MPH. He was coming into the pits and went to open the door before the car had come to a complete stop; he was still going 70 MPH.

Movies that we watch in theaters are a rock solid 24 fps. Our European counterparts enjoyed 25 fps for their main TV broadcasts for many years.

You can get adjusted to most frame rate situations in a relatively short period of time. Going from 100 fps to 22 fps is actually going to seem a *lot* worse than going from 24 fps to 20 fps.

Another factor: on the PC side, most people tend to shut VSync off to raise the ceiling on their frame rate; on consoles, it is only ever shut off to bring up the floor.
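To make the ceiling/floor point concrete: a PC gamer turns VSync off so that an 80 fps average isn't pinned down to 60, accepting tearing at the top end. A console developer only drops VSync when the game can't hold its 30 fps target, accepting tearing to keep the floor from falling even lower.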
 

reallyscrued

Platinum Member
Jul 28, 2004
This is absolutely a legitimate question. I never understood it. The only explanation I can think of is that there is some magical filter consoles use to smooth out movement.

Scenario:

I have GoldenEye for the Wii. It runs at 30 fps internally. Playing the game on the Wii, you can't notice the rigidness of the frames being drawn. It's "smooth," for lack of a better term. Smooth, but not fluid the way many 60 fps Wii games are.

I use Dolphin (an emulator) and run GoldenEye on my computer. It's locked at 30 fps (because of the game code, it cannot be changed), the same fps as on the console, but it's rigid from frame to frame. It's neither smooth nor fluid. It's like trying to play HL2 with a 30 fps cap.

Same television, same resolution, same frame rate, but the Wii 'feels' smoother.


As I understand it, people can only see up to about 30FPS, so I don't think that's it.

You keep thinking that.

 

Craig234

Lifer
May 1, 2006
On the one hand, my answer was partly incorrect; I understand that computer graphics can 'look better' above 30 FPS for whatever reason.

But TV and movies look smooth to the human eye despite being at (or under) 30 fps; try to actually notice one frame compared to another.

I know some TVs now offer higher frame rates, but the benefit is debated; it's primarily claimed for action scenes, and I didn't notice an improvement.
 

Fenixgoon

Lifer
Jun 30, 2003
On the one hand, my answer was partly incorrect; I understand that computer graphics can 'look better' above 30 FPS for whatever reason.

But TV and movies look smooth to the human eye despite being at (or under) 30 fps; try to actually notice one frame compared to another.

I know some TVs now offer higher frame rates, but the benefit is debated; it's primarily claimed for action scenes, and I didn't notice an improvement.

Constant 30 fps is different from 30 fps average.
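A quick worked example: a steady 30 fps is one frame every 33.3 ms. Frames alternating between 16.7 ms and 50 ms also average out to 30 fps, but that version visibly stutters, because the eye responds to the spacing between frames, not the per-second average.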
 

AVP

Senior member
Jan 19, 2005
How many frames long is the white oval at the top right of the movie screen (the reel-change cue mark)? It's really easy to pick out, and I'm pretty sure it's only there for one frame.
 

zokudu

Diamond Member
Nov 11, 2009
Might have something to do with the distance you sit from a monitor compared to how far away you are from your TV? I'm just grasping at straws here though.
 

PieIsAwesome

Diamond Member
Feb 11, 2007
Yeah, what gives? I was playing OOT on an emulator before and it was running at 18 FPS, but it looked fine. A PC game at that frame rate would be choppy as hell.
 

shortylickens

No Lifer
Jul 15, 2003
Yeah, what gives? I was playing OOT on an emulator before and it was running at 18 FPS, but it looked fine. A PC game at that frame rate would be choppy as hell.

Well, that depends. Was it a platformer or a 3D shooter? Cuz that really does matter. Ditto RTSes.
 

Xenphor

Member
Sep 26, 2007
Well, thanks for some good replies. I would be interested to have someone who has a 120 Hz monitor weigh in on the topic. Cap any game like Half-Life, Half-Life 2, Quake, etc. to around 30 fps and see if it looks any better than it does on a 60 Hz monitor (assuming you have both). I suppose you could use MSI Afterburner, or for Half-Life use fps_max 30 in the console. You could try other lower frame rates as well.
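(For anyone trying this: fps_max is the cap cvar in the GoldSrc/Source console, and if I remember right the Quake 3 engine uses com_maxfps and the Quake 2 era used cl_maxfps. For games with no built-in cap, an external limiter like RTSS should do the job.)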
 

shortylickens

No Lifer
Jul 15, 2003
I had a big-ass CRT that could do 640x480 at 120 Hz. I really didn't notice any improvement. I would much rather game at higher resolutions.

It also did 2048x1536 at 85 Hz. That was pretty good. But when I switched to 1920x1200 @ 60 Hz I was fine with it. No big loss.
 

PieIsAwesome

Diamond Member
Feb 11, 2007
Well, that depends. Was it a platformer or a 3D shooter? Cuz that really does matter. Ditto RTSes.

The game I was talking about was Ocarina of Time, a third-person action game on the N64. Lots of camera movement.

I can't imagine any PC game running at 18 FPS not looking like shit. Even Crysis with its epic frame-blurring would look awful.