
Why 30 or 60 FPS only?

Madia

Senior member
It seems that all console games are either 30fps or 60fps. Is there any reason why a game can't be a different number such as 40fps if the console can handle it? I understand why some games have a locked framerate but don't know why they only have 30 or 60 fps.
 
This is the easiest answer I've found.

Your screen, most likely, refreshes every 1/60th of a second. If you provide it 60 frames of video, each frame lasts 1/60th of a second. If you provide it 30 frames of video each frame lasts 2/60ths of a second. If you provide it 45 frames of video, 30 of them will last 1/60th of a second, while 15 of them will last 2/60ths of a second (typically).

Having some frames last longer than others makes the video stutter slightly and it won't look very smooth.

Though some games like God of War ran at uneven frame rates that hovered around 40fps.
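The uneven cadence described above can be sketched in a few lines of Python. This is a rough illustration (not from the thread): it assumes v-sync and perfectly even render times, and counts how many 1/60 s refresh intervals each game frame stays on screen.

```python
# Sketch: map a v-synced game running at `fps` onto a display's refresh
# slots. Each frame stays on screen until the next one is ready, so at
# rates that don't divide the refresh rate, some frames last longer.
from collections import Counter

def frame_hold_counts(fps, refresh_hz=60):
    """Count how many refresh intervals each game frame is shown for
    over one second (assumes v-sync and evenly timed renders)."""
    shown = [int((tick / refresh_hz) * fps) for tick in range(refresh_hz)]
    return sorted(Counter(shown).values())
```

For 45 fps on a 60 Hz display this gives 30 frames held for one refresh and 15 held for two, matching the example above; for 30 or 60 fps every frame is held the same length, which is why those rates look smooth.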
 
Historical reasons, mostly. When the NTSC standard was first developed, the frame rate was set to match the mains frequency. This was done to reduce the interference that AC power systems cause on analogue sets.

Analogue TV works by interlacing images: a single frame is made up of two fields of alternating lines. Fields are drawn at 60 Hz, so full frames are drawn at 30 Hz. The same rates were carried into the digital age for backwards compatibility with old video.

Displays can only repeat frames evenly when the frame rate divides into their native refresh rate a whole number of times. So a 60 Hz monitor can natively display video at, for example, 15, 20, 30, or 60 fps by repeating frames. 120 and 240 Hz displays came about partly to natively display film's 24 fps rate.
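The divisor rule above can be made concrete with a small sketch (my own illustration, not from the thread): a display can show a frame rate with a constant repeat count only if that rate divides evenly into the refresh rate.

```python
# Sketch: frame rates a display can show by repeating every frame the
# same whole number of times -- exactly the whole-number divisors of
# its refresh rate.

def native_rates(refresh_hz):
    """Return all frame rates that fit refresh_hz with a constant
    per-frame repeat count, largest first."""
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]
```

Note that 24 appears in `native_rates(120)` but not in `native_rates(60)`, which is the argument for 120 Hz panels and film playback.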

For non-standard frame rates, displays employ pulldown. Back in the old days, films converted for TV used 3:2 pulldown for NTSC: each film frame was shown for alternately three fields and two fields, which adds judder to the image. I'm not 100% sure how digital displays handle this, but I assume it's something similar: repeating frames, or holding certain frames longer, which likewise introduces judder.
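Classic 3:2 pulldown can be sketched as follows (a rough illustration under the NTSC assumptions above): successive film frames are held for three fields, then two, so 24 film frames fill the 60 interlaced fields of one NTSC second.

```python
# Sketch of 3:2 pulldown for film -> NTSC: alternate film frames are
# held for 3 and 2 interlaced fields.

def pulldown_32(film_frames):
    """Return the field sequence: which film frame each field shows."""
    fields = []
    for i in range(film_frames):
        fields.extend([i] * (3 if i % 2 == 0 else 2))
    return fields
```

The uneven 3-field/2-field cadence in the output is exactly the judder mentioned above.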

If a game's frame rate is faster than the native refresh rate of the display, you can get screen tearing. This is because the computer is outputting more image data than the display can handle. Using V-Sync clears that up.

IIRC, a lot of modern digital TVs are compatible with PAL signals as well, so they can display video at 25 and 50 fps.
 
This is the easiest answer I've found.

That's pretty much it. The frame rate is either a multiple or an even divisor of a standard television refresh rate (60 Hz). 40 Hz is fine if your TV can handle 120 Hz (actual, not interpolated), but if not, you'll need to perform 2:3 pulldown to create the frames to match (the same as you do with 23.976 ("24 Hz") -> 29.97 ("30 Hz")).
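The check implied here can be sketched in one function (my own illustration): 40 fps fits a true 120 Hz panel with a clean 3x repeat, but on a 60 Hz panel there is no constant repeat count, so an uneven pulldown cadence is needed instead.

```python
# Sketch: constant per-frame repeat count for a given fps/refresh pair,
# or None when pulldown (uneven frame times) would be required.

def repeat_count(fps, refresh_hz):
    """Return refresh_hz // fps if fps divides the refresh rate
    evenly, else None."""
    return refresh_hz // fps if refresh_hz % fps == 0 else None
```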

If a game's frame rate is faster than the native refresh rate of the display, you can get screen tearing. This is because the computer is outputting more image data than the display can handle. Using V-Sync clears that up.

Your explanation of screen tearing seems a little... odd. While it does occur when the framerate exceeds the refresh rate, it isn't any sort of issue with the monitor being able to handle it. Tearing occurs when the display receives a new frame while it's currently drawing one. If there's little or no change between frames, you'll probably never see anything.

I don't know if my eyes just suck, but I've barely ever noticed tearing in my games. I even use a G-Sync monitor (ROG Swift) and I still don't notice the "amazing fluidity" that everyone always talks about. I'm defective. :'(
 
Thanks for the replies. Is there a difference with computer monitors, since most people (including myself) tweak the graphics settings to get the highest fps they can?
 
Thanks for the replies. Is there a difference with computer monitors, since most people (including myself) tweak the graphics settings to get the highest fps they can?

Well, no, they work the same, but some people have monitors with actual refresh rates of 120 Hz or 144 Hz. On a PC you can also turn off v-sync and let the framerate go to whatever it wants. Some people prefer this because there's lower input lag, since the system is not waiting for the monitor to refresh the signal.

Now when you start getting into gsync and freesync, you don't need your fps to lock to 60 at 60hz for a good experience.
 
Your explanation of screen tearing seems a little... odd. While it does occur when the framerate exceeds the refresh rate, it isn't any sort of issue with the monitor being able to handle it. Tearing occurs when the display receives a new frame while it's currently drawing one. If there's little or no change between frames, you'll probably never see anything.

Cut me some slack jack. Anything I post after midnight should definitely not be read into too deeply. 😉
 