Why aren't developers locking down framerates more often?

Rakehellion

Lifer
Jan 15, 2013
Fact is, if it wasn't for people delving into the technical aspects we wouldn't have a lot of the improvements that games have seen.

Like what improvements?

No, the fact is, the majority of computer graphics work is stripping out things people won't notice to improve performance. If you aren't a programmer, you don't need specialized tools to analyze the scene because you're playing with your eyes and a controller.
 

cmdrdredd

Lifer
Dec 12, 2001
Rakehellion said:
Like what improvements?

No, the fact is, the majority of computer graphics work is stripping out things people won't notice to improve performance. If you aren't a programmer, you don't need specialized tools to analyze the scene because you're playing with your eyes and a controller.

There are a lot of developers who toss everything at the scene to make it look the best but forget that it has to be playable by people who don't have $1,000 in graphics cards in their PC. Consoles don't have the same issue since they have a locked spec, and it's much easier to pick and choose which effects to add or reduce to achieve the proper performance. It's all a trade-off: if you reduce some graphical effects you get better framerates; if you add effects you reduce the framerate. For the most part, every developer has a target and has to try to hit it.
 

Rakehellion

Lifer
Jan 15, 2013
^That has nothing at all to do with what I said. It doesn't matter whether you're playing Crysis 3 on a GTX Titan or Tetris on your TI-84, the programmer had to strip out details to make it playable on your hardware. And which details are best to remove? Ones the user won't know are missing.

LOD and texture compression reduce visual quality but come with such a huge increase in performance that whether to implement them isn't even a question. Every modern game uses these features.
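
To make the LOD point concrete, here's a minimal sketch of distance-based LOD selection; the structure, names, and thresholds are made up for illustration, not taken from any particular engine:

```cpp
#include <cstddef>

// Hypothetical LOD table: levels ordered from most detailed (near) to
// cheapest (far), each active once the camera is at least minDistance away.
struct LodLevel {
    float minDistance;   // camera distance at which this level kicks in
    int   triangleCount; // illustrative cost of the mesh at this level
};

// Pick the cheapest level whose distance threshold the camera has passed.
std::size_t selectLod(const LodLevel* levels, std::size_t count,
                      float cameraDistance) {
    std::size_t chosen = 0;
    for (std::size_t i = 0; i < count; ++i)
        if (cameraDistance >= levels[i].minDistance)
            chosen = i; // farther away -> coarser mesh
    return chosen;
}
```

With cutoffs at, say, 0, 50, and 200 units, a camera 120 units out gets the mid-detail mesh, and nobody notices the missing triangles.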

As far as framerate goes, a slower-paced game like Civilization can get away with a lower framerate than a shooter like Battlefield, so the former is more likely to implement triple buffering and the like to improve frame consistency.
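
And since triple buffering keeps coming up in this thread, a bare-bones sketch of the idea (assumed logic, not any real graphics API; the class and method names are mine): the renderer always has a spare buffer to draw into, so it never blocks waiting for the display.

```cpp
#include <array>
#include <utility>

struct Frame { /* pixel data would live here */ };

class TripleBuffer {
    std::array<Frame, 3> buffers;
    int drawing = 0;          // buffer the renderer is filling
    int ready   = 1;          // most recently completed frame, if any
    int shown   = 2;          // buffer the display is scanning out
    bool hasNewFrame = false;
public:
    // Renderer finished a frame: publish it as "ready" and keep drawing
    // into the buffer that held the (now stale) previous ready frame.
    Frame& finishFrame() {
        std::swap(drawing, ready);
        hasNewFrame = true;
        return buffers[drawing];
    }
    // On vblank the display flips to the newest completed frame; if the
    // renderer hasn't finished one in time, the current image repeats.
    const Frame& onVBlank() {
        if (hasNewFrame) {
            std::swap(ready, shown);
            hasNewFrame = false;
        }
        return buffers[shown];
    }
};
```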
 

cmdrdredd

Lifer
Dec 12, 2001
Rakehellion said:
^That has nothing at all to do with what I said. It doesn't matter whether you're playing Crysis 3 on a GTX Titan or Tetris on your TI-84, the programmer had to strip out details to make it playable on your hardware. And which details are best to remove? Ones the user won't know are missing.

LOD and texture compression reduce visual quality but come with such a huge increase in performance that whether to implement them isn't even a question. Every modern game uses these features.

As far as framerate goes, a slower-paced game like Civilization can get away with a lower framerate than a shooter like Battlefield, so the former is more likely to implement triple buffering and the like to improve frame consistency.

Over on crydev I remember looking at Crysis 3 shots where people pointed out areas that had been tessellated but didn't need to be. They didn't strip it out; they left it in. And to this day people claim Crysis 2 applied tessellation to geometry the player never even saw.

You said developers take stuff out to hit the proper performance level; I'm saying not every developer does this all the time.
 

Fulle

Senior member
Aug 18, 2008
On consoles we've seen problems with "judder" from the FPS exceeding the screen's refresh rate. It just happened on Xbox One with UK users on 50 Hz TVs: if your refresh rate is 50 Hz and the game goes over 50 FPS, it'll cause "judder" from the increased frame latency. A fix for that was released in the early Xbox One firmware patches.

You'll see this sort of thing on the PC when a person's video card is able to run the game at over 60 FPS. Sometimes people need to set a frame limit of 59, too. Typically the problem doesn't exist when the frames are capped at the monitor's refresh rate, and it also isn't there when you have V-sync or triple buffering.
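
For anyone wondering what a frame limit actually does: it just sleeps off whatever is left of each frame's time budget so frames come out at a steady cadence. A rough sketch (the function and the stubbed-out game work are hypothetical, not from any real engine):

```cpp
#include <chrono>
#include <thread>

void runCapped(double targetFps) {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));
    auto nextDeadline = clock::now() + frameBudget;
    for (;;) {
        // updateAndRender();                        // game work goes here
        std::this_thread::sleep_until(nextDeadline); // burn the leftover budget
        nextDeadline += frameBudget;                 // fixed cadence, no drift
    }
}
```

Call it with runCapped(59.0) and you get the "limit to 59" trick people use on 60 Hz monitors.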

In a console game running on a display with a refresh rate of 50, 60, or 120 Hz, you're not going to see increased frame latency by not capping the FPS to 30. You might see that on a 30 Hz TV.

You might see judder while panning horizontally without triple buffering or V-sync, but it wouldn't be any worse than if the frames were capped at 30.

DF has you misled, OP.
 

futurefields

Diamond Member
Jun 2, 2012
Judder can happen any time the framerate is changing, because of the inconsistent frame times that directly result.

Another example is Max Payne 3 on 360, which has an unlocked framerate that tops out at 35. Why not cap it to 30? Those extra 5 frames just cause inconsistent performance depending on how much is being rendered on screen.
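
The arithmetic behind that, as a quick sanity check: 30 FPS is ~33.3 ms per frame and 35 FPS is ~28.6 ms, so an uncapped game bouncing between them swings frame delivery by almost 5 ms, while a hard 30 cap delivers a constant 33.3 ms:

```cpp
#include <cstdio>

int main() {
    const double msAt30 = 1000.0 / 30.0; // ~33.33 ms per frame
    const double msAt35 = 1000.0 / 35.0; // ~28.57 ms per frame
    // An unlocked 30-35 FPS game can swing by this much frame to frame;
    // a locked 30 FPS delivers a constant 33.33 ms.
    std::printf("frame time swing: %.2f ms\n", msAt30 - msAt35); // ~4.76 ms
    return 0;
}
```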
 

Fulle

Senior member
Aug 18, 2008
futurefields said:
Judder can happen any time the framerate is changing, because of the inconsistent frame times that directly result.

Another example is Max Payne 3 on 360, which has an unlocked framerate that tops out at 35. Why not cap it to 30? Those extra 5 frames just cause inconsistent performance depending on how much is being rendered on screen.

Only there wouldn't be a noticeable increase in frame time latency unless the FPS goes past the display's refresh rate. For example, if you got 62 FPS in a game when your display was at 60 Hz.

When the FPS is under the display's maximum refresh rate, increased FPS is pretty much universally good.

There's no actual explanation for the "judder" described by DF. It's a bullshit comment, made to downplay the PS4's performance advantage over the Xbox One, and you're reading it as an actual thing. It's NOT a thing. There's practically no benefit to capping the FPS to something so far below the display's refresh rate.

I say almost no benefit, because capping to 30 FPS would make the thermal ceiling easier to manage, and maybe the game's devs would be able to use the extra performance they left on the floor to make improvements elsewhere...

But in terms of this particular argument, it would do nothing to improve perceived smoothness of the gameplay.
 

futurefields

Diamond Member
Jun 2, 2012
There is judder under 60 FPS... play Far Cry 3 and set the settings so the framerate bounces between 30 and 40 FPS, and you will notice a kind of micro-stuttering judder, most noticeable if you look at the ground textures rather than into the distance.

Why would DF try to downplay the PS4 versus the Xbone? They have been shitting on the Xbone since day 1.
 

Rakehellion

Lifer
Jan 15, 2013
cmdrdredd said:
Over on crydev I remember looking at Crysis 3 shots where people pointed out areas that had been tessellated but didn't need to be. They didn't strip it out; they left it in. And to this day people claim Crysis 2 applied tessellation to geometry the player never even saw.

You said developers take stuff out to hit the proper performance level; I'm saying not every developer does this all the time.

Yes, every developer does this all the time. Are you familiar with what CryEngine does at all, or why people spend so much money on it?

A bug in culling doesn't mean they didn't optimize the game. That's just silly.
 

cmdrdredd

Lifer
Dec 12, 2001
Keep telling yourself that. Crytek always leaves stuff in that other devs would have taken out.
 

Fulle

Senior member
Aug 18, 2008
futurefields said:
There is judder under 60 FPS... play Far Cry 3 and set the settings so the framerate bounces between 30 and 40 FPS, and you will notice a kind of micro-stuttering judder, most noticeable if you look at the ground textures rather than into the distance.

Why would DF try to downplay the PS4 versus the Xbone? They have been shitting on the Xbone since day 1.


When you're under the monitor's refresh rate, any increase in FPS actually REDUCES the frame variance. It doesn't cause additional "judder". It's my opinion that DF has exaggerated the negative impact of variable FPS on certain PS4 games so that it doesn't seem like all they're doing lately is putting out graphs condemning the Xbox One. I think it's meant to sound more neutral, but it's clearly confused some people who don't understand the data.

The "micro-stuttering" on ground textures you've seen in Far Cry 3 might help illustrate my point. I think you're talking about the same kind of "judder" I am, and it's affected far more by the lack of proper V-sync and triple buffering than by the framerate cap; a locked FPS would have zero effect on the situation. If you force triple buffering and V-sync in Far Cry 3, the "judder" is significantly reduced.

Which is why I think the comments on Second Son are disingenuous in regard to the "judder". I think the author felt he was coming across as overly positive and felt compelled to question the variable framerate, which is clearly using triple buffering, as we can tell from the ZERO screen tearing with a variable FPS... which would mean that the framerate going over 30 would only REDUCE frame variance and REDUCE judder.

I'm calling BS on a BS comment, is all. They made this same mistake in the Tomb Raider analysis, and I'm actually pretty annoyed with DF for it. They need to be more careful to stick to the facts, since small comments they make seem to confuse the living hell out of some people. No offense.
 

gorcorps

aka Brandon
Jul 18, 2004
Just saw that the next Infamous update will include an option to lock at 30 FPS if you prefer. Now at least I'll be able to compare one vs. the other and see if I have a preference.