Does letterboxing introduce additional input latency?

andeedoo

Junior Member
Sep 23, 2017
I'm thinking about buying a 25" ultrawide primarily for PS4 use, and possibly some PC use in the future.

Just wondering if anyone knows or can test whether letterboxing 16:9 on an ultrawide will cause extra input latency due to the internal scaler being thrown into the mix, or whether it just blacks out the unused edges with minimal lag?

I know that when running a non-native resolution on an LCD there is typically a cost in input latency, but I'm not sure about letterboxing. And unfortunately, most monitor review sites and input-latency databases don't account for the additional lag of running a non-native resolution.

The LG 25UM58 I'm looking at has an average input latency of 9 ms.

Uncharted 4 MP is rendered at 900p and then upscaled or stretched to fill (not sure which), so there is probably already some additional latency introduced as a result.

This is all on top of the fact that most console games already have an inherent 5-10 frames or more of input latency because of vsync and the game engine, so I'm trying to limit any additional sources of lag.

If letterboxing is going to cost more than a few ms of latency I think I'll just go with a 21.5" or 24" 1080p native monitor.

Of those two, I'd prefer the 21.5" because it's slightly sharper at 102 ppi. The 25" ultrawide has the advantage of 111 ppi, as well as the fact that the black bars can actually mask a lot of the backlight bleed of edge-lit IPS panels. I've used mainly IPS panels in the past, and the backlight bleed + IPS glow can be pretty distracting in darker scenes.
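
For reference, those ppi figures check out (a quick Python sanity check; ppi is just the diagonal pixel count divided by the diagonal size in inches):

    import math

    def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
        # Pixels per inch: diagonal resolution over diagonal size.
        return math.hypot(w_px, h_px) / diagonal_in

    print(round(ppi(1920, 1080, 21.5)))  # -> 102 for a 21.5" 1080p panel
    print(round(ppi(2560, 1080, 25.0)))  # -> 111 for a 25" 2560x1080 ultrawide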
 

lefenzy

Senior member
Nov 30, 2004
This discussion is academic. I've never noticed increased latency from running a non-native resolution on a monitor. So any increase is probably insignificant relative to the inherent input lag of the device.

If you're talking about console games like Uncharted 4 (singleplayer is 1080p 30 FPS last time I checked), then you'll have no problems. They designed the game to play on laggy TVs, so any computer monitor will probably do.
 

Ichinisan

Lifer
Oct 9, 2002
A wider-than-16:9 TV would never letterbox 16:9 content. It would add pillarboxes (black bars at the sides, not the top and bottom).

Displaying 1920x1080 on a 2560x1080 display, stretching should require more image processing than pillarboxing. Because you can't stretch a pixel to 1.33333... times its original size, the image-processing algorithm has to make some source pixels twice as wide while others remain only one pixel wide. To minimize the strange appearance, it analyzes the colors of the surrounding pixels and blends them.

So resizing the picture to stretch/fill would typically require more work for the image processor.
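
To illustrate the uneven widths (a minimal Python sketch assuming a plain nearest-neighbour mapping; real scalers use smarter filters, but the underlying unevenness is the same):

    from collections import Counter

    SRC, DST = 1920, 2560  # 4:3 horizontal stretch

    # For each output column, find the source column it samples from,
    # then count how many output columns each source column covers.
    widths = Counter(dst * SRC // DST for dst in range(DST))
    print([widths[x] for x in range(6)])  # -> [2, 1, 1, 2, 1, 1]

Every third source pixel ends up twice as wide, and that repeating 2-1-1 pattern is exactly what the scaler has to blend away.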
 

CZroe

Lifer
Jun 24, 2001
This discussion is academic. I've never noticed increased latency from running a non-native resolution on a monitor. So any increase is probably insignificant relative to the inherent input lag of the device.

If you're talking about console games like Uncharted 4 (singleplayer is 1080p 30 FPS last time I checked), then you'll have no problems. They designed the game to play on laggy TVs, so any computer monitor will probably do.
I certainly have. Repeatedly. Never been wrong about it either.

FYI: I install UltraHDMI and Hi-Def NES, so I'm particularly discriminating.
 

lefenzy

Senior member
Nov 30, 2004
I certainly have. Repeatedly. Never been wrong about it either.

FYI: I install UltraHDMI and Hi-Def NES, so I'm particularly discriminating.

Ultimately, it would require a measurement to determine whether or not non-native resolutions are slower than native resolutions, and of course it would be monitor-dependent. In my experience, I have not felt increased lag on the monitors I own.

For what it's worth, professional CS players use a variety of non-native resolutions such as 1024x768 on high refresh rate monitors. I don't think they would do so if input lag were an issue.
 

CZroe

Lifer
Jun 24, 2001
Ultimately, it would require a measurement to determine whether or not non-native resolutions are slower than native resolutions, and of course it would be monitor-dependent. In my experience, I have not felt increased lag on the monitors I own.

For what it's worth, professional CS players use a variety of non-native resolutions such as 1024x768 on high refresh rate monitors. I don't think they would do so if input lag were an issue.
They are general rules, but they apply almost universally.

Integer scaling is faster because it doesn't require sampling multiple pixels to generate new pixels. It's just line/column doubling. It's why scaling from 720p to 4K is faster than scaling from 1080p. It's a pretty solid and predictable rule.
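
Here's what integer scaling amounts to (a minimal Python sketch using NumPy repetition as a stand-in for what the hardware scaler does):

    import numpy as np

    def integer_scale(img: np.ndarray, n: int) -> np.ndarray:
        # Repeat every row and every column n times: pure duplication,
        # with no sampling of neighbouring pixels and no made-up detail.
        return np.repeat(np.repeat(img, n, axis=0), n, axis=1)

    frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)  # dummy 720p frame
    print(integer_scale(frame_720p, 3).shape)  # -> (2160, 3840, 3): exactly 4K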

The main rules:

#1: disable any overscan simulation. If you can't, and your display forces it, then virtually nothing else is going to matter.

#2: enable "game mode" or similar and disable any other mode that processes the image (SmoothMotion, for example).

#3: set an output resolution that is either the native resolution or an integer divisor of the display's native horizontal and vertical resolution.
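
As a quick illustration of rule #3 (a minimal Python sketch; is_integer_scale is a hypothetical helper, not something your display exposes):

    def is_integer_scale(native_w: int, native_h: int, out_w: int, out_h: int) -> bool:
        # The output must divide evenly into the panel on both axes,
        # and by the same factor, or the scaler has to interpolate.
        return (native_w % out_w == 0 and native_h % out_h == 0
                and native_w // out_w == native_h // out_h)

    print(is_integer_scale(3840, 2160, 1280, 720))   # True: 3x on both axes
    print(is_integer_scale(2560, 1080, 1920, 1080))  # False: 4:3 horizontal, 1:1 vertical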

I have seen crap displays that can't even disable simulated overscan, like a "1080p" Insignia 50" from 2007 that you couldn't actually use natively with a 1080p signal. Yeah, uh, don't use those.

PC displays typically do not have overscan simulation. In fact, some TVs only disable overscan simulation when you lie to them by labeling the input "PC" (how unintuitive is that?!). Some TVs give the setting a marketing name like "JustScan" or "Full Pixel"; others just say something like "1:1."
 

lefenzy

Senior member
Nov 30, 2004
They are general rules, but they apply almost universally.

Integer scaling is faster because it doesn't require sampling multiple pixels to generate new pixels. It's just line/column doubling. It's why scaling from 720p to 4K is faster than scaling from 1080p. It's a pretty solid and predictable rule.

The main rules:

#1: disable any overscan simulation. If you can't, and your display forces it, then virtually nothing else is going to matter.

#2: enable "game mode" or similar and disable any other mode that processes the image (SmoothMotion, for example).

#3: set an output resolution that is either the native resolution or an integer divisor of the display's native horizontal and vertical resolution.

I have seen crap displays that can't even disable simulated overscan, like a "1080p" Insignia 50" from 2007 that you couldn't actually use natively with a 1080p signal. Yeah, uh, don't use those.

PC displays typically do not have overscan simulation. In fact, some TVs only disable overscan simulation when you lie to them by labeling the input "PC" (how unintuitive is that?!). Some TVs give the setting a marketing name like "JustScan" or "Full Pixel"; others just say something like "1:1."

I thought we were talking about computer monitors, not TVs. I have no experience with non-native resolutions on televisions, and it may be that TVs are slower at non-native resolutions.

I was also under the impression that non-native resolutions that evenly divide the native resolution are not treated any differently from any other non-native resolution. That is, a 4K display running at 1080p will not use exactly four display pixels for each signal pixel; the same interpolation algorithms are used regardless. I often read reviews of 4K displays stating that 1080p content is not as clear as it would be on a native 1080p display.
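
That matches how a generic bilinear scaler behaves: even at an exact 2x ratio it blends neighbouring pixels instead of duplicating them. A minimal 1-D Python sketch of resampling a hard black/white edge (assuming the common half-pixel-centre sample mapping; actual scaler firmware varies):

    src = [0, 0, 255, 255]  # a hard black/white edge, 4 pixels wide
    dst = []
    for i in range(len(src) * 2):  # upscale 2x
        x = (i + 0.5) / 2 - 0.5               # output sample centre in source space
        x = min(max(x, 0.0), len(src) - 1.0)  # clamp to the valid range
        x0 = min(int(x), len(src) - 2)        # left of the two source neighbours
        t = x - x0                            # blend weight between them
        dst.append(round(src[x0] * (1 - t) + src[x0 + 1] * t))
    print(dst)  # -> [0, 0, 0, 64, 191, 255, 255, 255]

Nearest-neighbour doubling would have produced [0, 0, 0, 0, 255, 255, 255, 255], a still-sharp edge; the bilinear version smears it into greys, which reads as blur on screen.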
 

CZroe

Lifer
Jun 24, 2001
I thought we were talking about computer monitors, not TVs. I have no experience with non-native resolutions on televisions, and it may be that TVs are slower at non-native resolutions.

I was also under the impression that non-native resolutions that evenly divide the native resolution are not treated any differently from any other non-native resolution. That is, a 4K display running at 1080p will not use exactly four display pixels for each signal pixel; the same interpolation algorithms are used regardless. I often read reviews of 4K displays stating that 1080p content is not as clear as it would be on a native 1080p display.
We are, but the rules are relevant to both. Heck, even my old Dell 2005FPW and Dell 2007WFP worked as televisions and thus would overscan certain inputs. I still use two Hannspree 1080p TVs in my workshop that actually use 16:10 1200p LCD panels, which I assume were intended for PCs (and I do use them with PCs). All of your experience with non-native resolutions applies to both.

1080p cannot be integer scaled to 4K, so it will need to interpolate detail or create unevenly scaled pixels. 720p can be integer scaled to 4K, and every 4K TV I've checked has handled it perfectly. For example, if I scale a 240p image to 1200p (5x integer) and then send that to a 4K TV, I can distinctly see hazy edges around the 240p pixels where the scaler sampled from two different lines/columns to interpolate detail. If, instead, I scale the 240p image to 720p (3x integer) and then send that to a 4K TV, the hazy edges disappear. That's because every pixel of the 4K image was produced by simple line/column repetition from a solid source pixel of the original 240p image; neither scaling step sampled surrounding pixels to interpolate detail, thanks to integer scaling.
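
The arithmetic behind those two chains (a quick Python check on the vertical resolutions; a step is lossless integer scaling only when the factor comes out whole):

    from fractions import Fraction

    # Per-step scale factors for each upscaling chain (vertical resolution only).
    for chain in [(240, 1200, 2160), (240, 720, 2160)]:
        steps = [Fraction(b, a) for a, b in zip(chain, chain[1:])]
        print(chain, "->", steps)
    # (240, 1200, 2160) -> [Fraction(5, 1), Fraction(9, 5)]  non-integer 1.8x step: hazy
    # (240, 720, 2160)  -> [Fraction(3, 1), Fraction(3, 1)]  integer at every step: clean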

Yes, all non-native resolutions get scaled, but an integer scale is faster and sharper with no made-up detail.
 

lefenzy

Senior member
Nov 30, 2004
1080p cannot be integer scaled to 4K, so it will need to interpolate detail or create unevenly scaled pixels.

(2 x 1920) x (2 x 1080) = 3840 x 2160 = 4K?

I've tried playing games at 1280x720 on my 2560x1440 display. Things are blurred.