At what point do pixels become too small to make visible difference?

amenx

Diamond Member
Dec 17, 2004
3,972
2,199
136
I just roll my eyes at some 4K products being released these days. I mean, 4K on a 15" notebook? Is it pure marketing, or can someone really make out a visible difference between 4K and 1440p on a 15" screen? At what point would one begin to say, "OK, I can't see it, so I'm not going to bother"?

Also, I notice a lot of 27/28" 4K screens on the market. For those who have sampled both 1440p and 4K, do you think the resolution jump is worth it for 27/28" screens? 1080p vs. 4K at that size is a no-brainer, but at what point/screen size do you think 4K would have a definite edge over 1440/1600p in visual impact to be worth going for? Some folks here seem impressed with these BenQ 32" 1440p displays and the pixels don't seem to bother them at that size, but I can see 4K as a good contender at that size. Who went from 1440/1600p to 4K at under 30" and was impressed by the results? Would be interesting to know your thoughts.
 

amenx

Diamond Member
Dec 17, 2004
3,972
2,199
136
Yes, but that's about smartphones. While some general points may apply, I don't think it carries over the same way to large screens. A whole set of other variables comes into play, including TFT tech, brightness, viewing distance, etc. I'd like to hear on a practical and first-hand basis (from those who went from 1440p to 4K under 30") how it applies to larger screens, and their impressions.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
14
81
As a pretty good rule of thumb, the human eye (20/20 vision) can see pixels as small as 1 minute of arc (about 0.017 degrees), or about 0.6 minutes of arc (20/12 vision).

Apple's iPhone "retina" display is designed to have a pixel pitch of 1 minute when held at 12 inches.

Broadly speaking, you can apply the same criterion to larger displays without significant alteration.
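As a quick sanity check of that rule of thumb, here's a minimal Python sketch (the function name is my own; the 12-inch distance and acuity figures are just the ones quoted above):

```python
import math

def ppi_for_angular_pitch(arcminutes, viewing_distance_in):
    """PPI needed so that one pixel subtends the given angle at the given distance."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(arcminutes / 60))
    return 1 / pixel_pitch_in

# One pixel per 1 arcminute at a 12-inch viewing distance (the "retina" criterion above)
print(round(ppi_for_angular_pitch(1.0, 12)))   # ~286 PPI, close to the iPhone's ~300 PPI
# The sharper 0.6-arcminute (20/12) figure
print(round(ppi_for_angular_pitch(0.6, 12)))   # ~477 PPI
```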
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
As a pretty good rule of thumb, the human eye (20/20 vision) can see pixels as small as 1 minute of arc (about 0.017 degrees), or about 0.6 minutes of arc (20/12 vision).

Apple's iPhone "retina" display is designed to have a pixel pitch of 1 minute when held at 12 inches.

Broadly speaking, you can apply the same criterion to larger displays without significant alteration.

This is not entirely correct. The human eye can distinguish line pairs (not pixels) at 1/0.6 arcminutes (20/20 and 20/12 vision respectively). To reproduce a line pair (a black line and a white line) you need 2 pixels, so the pixel pitch needs to correspond to half of the above, i.e. 0.5/0.3 arcminutes.

At 12 inches this corresponds to 573/955 PPI, so a 2560x1440 5-inch display would reach the lower bound, and a 4K screen of the same size would get relatively close to the second bound.

Of course, the above is defined using line pairs, which are very high contrast (black versus white), and a lot of content isn't nearly that high contrast (movies, games, etc.), although text is generally high contrast.

Also, this doesn't even start to touch on the subject of vernier acuity (which comes into play with aliasing/anti-aliasing), which can be as low as 0.13 arcminutes for 20/12, corresponding to 2203 PPI, or a resolution of about 9600x5400 (5 times full HD) on a 5-inch display.
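For anyone who wants to check those figures, here's a rough Python sketch of the same arithmetic (the function names are mine; the acuity values, 12-inch distance, and panel sizes are the ones quoted above):

```python
import math

def ppi_at(arcmin_per_pixel, distance_in):
    """PPI at which one pixel subtends the given angle at the given viewing distance."""
    return 1 / (distance_in * math.tan(math.radians(arcmin_per_pixel / 60)))

def panel_ppi(width_px, height_px, diagonal_in):
    """Pixel density of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Line-pair limits: 1/0.6 arcmin per pair -> 0.5/0.3 arcmin per pixel, at 12 inches
print(round(ppi_at(0.5, 12)))    # ~573 PPI (20/20)
print(round(ppi_at(0.3, 12)))    # ~955 PPI (20/12)

# 5-inch panels for comparison
print(round(panel_ppi(2560, 1440, 5)))   # ~587 PPI -- just past the 20/20 bound
print(round(panel_ppi(3840, 2160, 5)))   # ~881 PPI -- approaching the 20/12 bound

# Vernier acuity: ~0.13 arcmin at 12 inches
print(round(ppi_at(0.13, 12)))   # ~2204 PPI, in line with the ~2203 figure above
```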
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
4K is impressive on a desktop display, but it is not retina. At 32" you need something like a 12K display just to match an iPhone 6. That's roughly nine times as many pixels as 4K just to get to where current smartphones are.
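For anyone curious about the ballpark, here's a rough sketch of that arithmetic (the helper is my own; 326 and 401 PPI are the published figures for the iPhone 6 and 6 Plus). Tripling the linear pixel density of 4K means about nine times the pixel count:

```python
import math

def resolution_for_ppi(target_ppi, diagonal_in, aspect=(16, 9)):
    """Approximate resolution needed to hit a target PPI at a given diagonal (16:9 by default)."""
    w, h = aspect
    diagonal_px = target_ppi * diagonal_in
    width_px = diagonal_px * w / math.hypot(w, h)
    return round(width_px), round(width_px * h / w)

print(round(math.hypot(3840, 2160) / 32))   # ~138 PPI for a 32" 4K panel
print(resolution_for_ppi(326, 32))          # iPhone 6 (326 PPI): roughly 9100x5100 at 32"
print(resolution_for_ppi(401, 32))          # iPhone 6 Plus (401 PPI): roughly 11200x6300, i.e. ~12K
```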
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
4K is impressive on a desktop display, but it is not retina. At 32" you need something like a 12K display just to match an iPhone 6. That's roughly nine times as many pixels as 4K just to get to where current smartphones are.

What you're overlooking is that a desktop display is at least twice as far away from the user's face as a smartphone display, so you don't need quite the same DPI on a desktop monitor to get the same experience as on a smartphone or tablet. In practice, 200 DPI on a desktop or laptop monitor provides about the same experience as 300-400 DPI on a portable device.
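That distance scaling is easy to express: what matters is the angle a pixel subtends, so equivalent DPI scales with viewing distance. A small illustrative sketch (my own function; the distances are typical examples rather than measurements):

```python
def equivalent_ppi(ppi, distance_in, reference_distance_in=12):
    """PPI that would subtend the same angle per pixel at the reference distance.

    Angular pixel pitch is roughly proportional to 1 / (PPI * distance),
    so the 'felt' PPI at the reference distance scales with distance / reference.
    """
    return ppi * distance_in / reference_distance_in

# A 200 PPI desktop panel viewed at ~24" looks about like a 400 PPI phone held at 12"
print(equivalent_ppi(200, 24))   # 400.0
# A ~140 PPI 32" 4K panel at a ~28" desk distance
print(equivalent_ppi(140, 28))   # ~327, roughly iPhone-6-class sharpness
```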
 

Pia

Golden Member
Feb 28, 2008
1,563
0
0
What you're overlooking is that a desktop display is at least twice as far away from the user's face as a smartphone display, so you don't need quite the same DPI on a desktop monitor to get the same experience as on a smartphone or tablet. In practice, 200 DPI on a desktop or laptop monitor provides about the same experience as 300-400 DPI on a portable device.

And the ~140 PPI of a 32" 4K display is already very nice - about 30% higher than what a run-of-the-mill 27" 1440p display offers. Move it a bit farther back on the desk than usual, and it's "retina".
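To put a number on "a bit farther back", here's a quick sketch using the 1-arcminute-per-pixel criterion from earlier in the thread (my own function name, same math as above):

```python
import math

def retina_distance_in(ppi, arcmin_per_pixel=1.0):
    """Viewing distance beyond which one pixel subtends less than the given angle."""
    return 1 / (ppi * math.tan(math.radians(arcmin_per_pixel / 60)))

ppi_32in_4k = math.hypot(3840, 2160) / 32    # ~138 PPI
print(round(retina_distance_in(ppi_32in_4k)))        # ~25 inches for 20/20 (1 arcmin)
print(round(retina_distance_in(ppi_32in_4k, 0.6)))   # ~42 inches for 20/12 (0.6 arcmin)
```

So at a typical 25-30" desk distance, a 32" 4K panel is already at or past the 1-arcminute threshold.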
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
For my vision, 27" 2560x1440 is as low as pixel pitch can go. It was already very noticeable going from 24" 1920x1200. For 4k, I would not consider anything less than 32". And it's not limited to Windows text scaling problems, in plenty of games it results in disastrous UI sizing issues.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I just roll my eyes at some 4K products being released these days. I mean, 4K on a 15" notebook? Is it pure marketing, or can someone really make out a visible difference between 4K and 1440p on a 15" screen?
Easily. Look at letters like "l" and "a" in common fonts.

At what point would one begin to say, "OK, I can't see it, so I'm not going to bother"?
For what kind of use? On a TV, I can't tell a good 720p upscale of a movie from native 1080p. But that's many feet away, with content that isn't particular about pixels. OTOH, I can't see how people manage gaming at hideous non-native resolutions. Even after the additional quality losses on places like YT, they look like crap compared to the correct res, and in person it's physically uncomfortable to see. I also can't stand monitors set to incorrect resolutions, except for basic text console use.

With computer-generated content, drawn per pixel, the needs are going to be much higher than for captured content. Just composing this, I notice how ClearType is screwing up every lowercase a, c, and s. That wouldn't happen with much higher DPI. Zooming into photos, videos, screenshots, etc. will also be nicer with higher DPI. Aliasing artifacts are one of the main reasons higher-DPI displays look better: the artifacts are much smaller, and with more pixels to fill in a given shape, the number of noticeable ones is much reduced.

The trouble is that since we've hovered around 100 DPI for so long, software hasn't sufficiently gone vector, despite cries for it year after year, so DPI directly affects UI appearance.
 

NTMBK

Lifer
Nov 14, 2011
10,246
5,037
136
Pfft, you crazy people going higher than 800x600. How do you expect to get 300FPS otherwise?!
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
This is not entirely correct. The human eye can distinguish line pairs (not pixels) at 1/0.6 arcminutes (20/20 and 20/12 vision respectively). To reproduce a line pair (a black line and a white line) you need 2 pixels, so the pixel pitch needs to correspond to half of the above, i.e. 0.5/0.3 arcminutes.

I am having trouble wrapping my head around this. Can you explain it like I'm 5 years old, or is there somewhere I can read further on this? Specifically, I don't understand the justification for dividing the pitch by 2 just because there are 2 lines.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You mainly see contrast between two points, and those two points will be best referenced by their difference in angle, not distance.
 

amenx

Diamond Member
Dec 17, 2004
3,972
2,199
136
Easily. Look at letters like "l" and "a" in common fonts.
Yes, text pixelation is rather obvious and noticeable on 24" 1080p screens. At 27" 1440p it's gone from about 1 1/2 to 2 ft away (for me at least). If I eyeball the screen from about 8-12" I can just barely make it out, so distance is obviously the variable with the greatest impact, whether for smartphones, PC monitors, or TVs.

For what kind of use? On a TV, I can't tell a good 720p upscale of a movie from native 1080p. But that's many feet away, with content that isn't particular about pixels.
Agreed. But what content is about pixels beyond that? Watching a Blu-ray HD movie on a 60" HDTV from about 15-18', the detail is so fine I just can't see how much more can be done there. I'm sure the content will be there, but to what extent, and how big of a screen would you need to take it all in?

Anyway, back to a specific point of my original post: how does 4K fare vs. 1440p at 27/28"? Anyone with first-hand experience with both of those screen sizes?
 

Mark R

Diamond Member
Oct 9, 1999
8,513
14
81
I am having trouble wrapping my head around this. Can you explain it like I'm 5 years old, or is there somewhere I can read further on this? Specifically, I don't understand the justification for dividing the pitch by 2 just because there are 2 lines.

The technicalities (which I lazily didn't go into - or, as I like to tell myself, left out for reasons of simplicity) are that resolution (perceptibility) is measured using "line pairs": in other words, dark, light, dark (or light, dark, light - but there are complicating factors with that). The limiting resolution is the point at which the two lines no longer look like two lines, but merge into one.

The justification is that when analysing resolution as a frequency (as is normal in scientific work), you measure the frequency by the distance between 2 transitions - this is for consistency with the normal way of measuring frequency: if you have a sine wave, 1 cycle is the distance from crest-to-crest; in other words, crest, trough, crest.

The other issue is that you need 2 pixels per line pair (in the limiting case of a striped pattern).


The final issue raised, that of vernier acuity, is (I think) a bit of a red herring. Vernier acuity refers to the fact that humans can detect a "step" in a line which is smaller than the resolution they can see; for example, if the smallest line pair a person can detect is 0.6 arcminutes, they could still tell that a line is "stepped" when the step is only 0.1 arcminutes.

If you look at a line that has been rendered with anti-aliasing, you might see that it is 2 pixels wide, but one pixel is brighter than the other. If the line is at coordinate 5.3, then you might see pixel 5 show 70% of the line's color and pixel 6 show 30% of it.

Vernier acuity works because the brain can reverse the anti-aliasing process to estimate the position of the line. Because this is already an algorithmic process, it still works even if the image of the stepped line is artificially blurred. So, as long as you have accurate anti-aliased rendering, you don't actually need your screen to have 10x the resolution to get this effect.
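As a toy illustration of that 70/30 split, here's a minimal coverage-based anti-aliasing sketch (my own code, not any particular renderer's algorithm): a 1-pixel-wide vertical line whose left edge sits at a fractional x coordinate gets its colour spread across two pixel columns in proportion to how much of each it covers.

```python
def line_coverage(line_x, line_width=1.0, num_pixels=10):
    """Fraction of each pixel column covered by a line spanning [line_x, line_x + line_width)."""
    coverage = []
    for px in range(num_pixels):
        # Overlap of the line with the pixel column [px, px + 1)
        overlap = max(0.0, min(px + 1, line_x + line_width) - max(px, line_x))
        coverage.append(round(overlap, 3))
    return coverage

# A line at coordinate 5.3: pixel 5 gets 70% of the line's colour, pixel 6 gets 30%
print(line_coverage(5.3))   # [0.0, 0.0, 0.0, 0.0, 0.0, 0.7, 0.3, 0.0, 0.0, 0.0]
```

That 70/30 split is exactly what the brain (or any downstream algorithm) can invert to recover the 0.3-pixel offset, which is why vernier acuity doesn't demand 10x the physical resolution.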
 
Last edited:

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
^What kind of anti-aliasing, and how much? Even at 2560x1440, SGSSAA above 2x starts to take a real toll real quick. I imagine at 4K and beyond it's an even more violent case of diminishing returns.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
^What kind of anti-aliasing, and how much? Even at 2560x1440, SGSSAA above 2x starts to take a real toll real quick. I imagine at 4K and beyond it's an even more violent case of diminishing returns.

The above-mentioned value of 0.1 arcmin for vernier acuity only applies to high-contrast images (i.e. black versus white), which doesn't really apply to games. It is really only an issue with text, but luckily we have things like subpixel rendering to help there, which is vastly more efficient than anti-aliasing.

How much anti-aliasing is required for a given game at 2560x1440 is impossible to answer objectively, since it depends on the game and the person playing.
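For what it's worth, the core idea of subpixel rendering on an RGB-stripe panel is simple enough to sketch (a deliberately naive illustration of my own; real implementations such as ClearType also filter the result to limit colour fringing): render text at 3x horizontal resolution, then map each group of three horizontal samples onto the R, G and B subpixels of one physical pixel.

```python
def subpixel_pack(samples):
    """Pack a 3x-horizontal-resolution grayscale row into naive RGB subpixel triplets."""
    samples = samples + [0.0] * (-len(samples) % 3)   # pad to a multiple of 3
    return [tuple(samples[i:i + 3]) for i in range(0, len(samples), 3)]

# A vertical stroke covering samples 4-6 of a 12-sample (4-pixel) row
row = [0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0]
print(subpixel_pack(row))
# [(0, 0, 0), (0, 1, 1), (1, 0, 0), (0, 0, 0)] -- the stroke's edges land on individual subpixels
```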
 

OlyAR15

Senior member
Oct 23, 2014
982
242
116
A lot depends on what you use the computer for, your eyesight, the distance to the monitor, and the OS/software. On a 30" 2560x1600 and a 27" 2560x1440 monitor, the text is reasonably sharp and the size is fine at 100% text scaling in Win 8.1. I would imagine that for someone using CAD or other vector drawing programs, a 4K monitor would be a godsend. For someone working primarily with text, it can be useful as long as the programs they use are able to scale text properly.

My iPad with its retina display is noticeably sharper than my computer monitors, so while the 1440/1600 resolution is adequate, it is not quite at the point of diminishing returns. On the other hand, there is no way I would go to a 4K display, since I play a lot of games and would rather not decrease the framerate.