What resolution does the human eye see in?

Discussion in 'Highly Technical' started by DannyBoy, Nov 18, 2009.

  1. DannyBoy

    DannyBoy Diamond Member

    Joined:
    Nov 27, 2002
    Messages:
    8,816
    Likes Received:
    0
    I read somewhere that there's no point in increasing video resolution past 1080p (1920x1080) because the human eye won't notice the difference, and that future display advancements are therefore focusing on colour reproduction / depth instead.

    Is this true?

    -D
     

  3. bobdole369

    bobdole369 Diamond Member

    Joined:
    Dec 15, 2004
    Messages:
    4,504
    Likes Received:
    0
    Interesting reading.

    http://www.clarkvision.com/imagedetail/eye-resolution.html

    They say 74 MP for a 20x14 print viewed at 20". I don't have time to do the math, but know ye that 4K and 8K projectors and cameras exist today; lots of 4K theater installs have already been sold, and films are being shot in both. I think there is a long way to go until we hit the resolution wall.
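
    Here's a quick back-of-the-envelope check of that figure. The acuity value and print size are assumptions pulled from my reading of the article, so treat this as a sketch, not gospel:

    [code]
    import math

    # Rough reproduction of the clarkvision.com estimate. The ~0.3 arc-minute
    # per pixel acuity and the 20 x 13.3" print viewed at 20" are assumptions
    # taken from the article, not measured values.
    acuity_arcmin = 0.3                     # arc-minutes per pixel (assumed)
    distance_in = 20.0                      # viewing distance, inches
    print_w_in, print_h_in = 20.0, 13.3     # print size used in the article

    pixel_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    ppi = 1.0 / pixel_in                    # ~573 ppi; the article rounds to ~530
    megapixels = (print_w_in * ppi) * (print_h_in * ppi) / 1e6
    print(f"{ppi:.0f} ppi -> {megapixels:.0f} MP")
    # prints ~87 MP; with the article's rounded 530 ppi it works out to ~74 MP
    [/code]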
     
  4. exdeath

    exdeath Lifer

    Joined:
    Jan 29, 2004
    Messages:
    13,683
    Likes Received:
    4
    DPI is what matters. Pixel counts alone are pretty meaningless without knowing the size of the display: 1080p at 13" and 1080p at 106" are two entirely different things (see the quick sketch below).

    If you play back 1080p in a cinema you can clearly observe it isn't "high enough".
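
    A minimal illustration of the point, assuming 16:9 panels (the diagonal sizes are just examples):

    [code]
    import math

    # Same 1920x1080 pixel grid, very different pixel density depending on
    # the physical size of the panel.
    def ppi(h_px, v_px, diagonal_in):
        """Pixels per inch for a display with this pixel grid and diagonal."""
        return math.hypot(h_px, v_px) / diagonal_in

    for diag in (13, 106):
        print(f'1080p at {diag}": {ppi(1920, 1080, diag):.0f} ppi')
    # 1080p at 13": 169 ppi
    # 1080p at 106": 21 ppi
    [/code]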
     
    #3 exdeath, Nov 18, 2009
    Last edited: Nov 18, 2009
  5. CycloWizard

    CycloWizard Lifer

    Joined:
    Sep 10, 2001
    Messages:
    12,352
    Likes Received:
    0
    The human eye has about 100,000,000 photoreceptors feeding into its optic nerve. Each can be thought of, more or less, as a pixel of a digital camera, though some transmit only grayscale intensity information (rods) while others transmit color/intensity information (cones). I don't recall the exact rod:cone distribution, but there are many more rods than cones, perhaps by a factor of 20. The brain then does a lot of fancy image processing to further improve the apparent resolution of what your eye sees; a large fraction of your brain is devoted to this, so it's pretty serious and not necessarily well understood (at least by me).

    So the simple answer is that you'd need on the order of 10^8 pixels covering your entire visual field to correspond to the 10^8 receptors. You can use simple geometry to compute what fraction of your visual field is occupied by the display, then estimate the number of receptors in that region (see the sketch below). Unfortunately, this will be inaccurate because receptor density is much higher in the center of your vision and lower peripherally. The receptors are also distributed more laterally than vertically, which is why your eye likes widescreen. Does that answer your question?
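
    For the curious, here is that geometry as a sketch. Every number in it is an assumption (the receptor count, the field of view, the example TV), and it deliberately ignores the foveal concentration just mentioned:

    [code]
    import math

    # Back-of-the-envelope version of the geometry argument. Assumes ~1e8
    # photoreceptors, a ~180 x 130 degree visual field, and uniform receptor
    # density (which is wrong -- the fovea is far denser -- so this is only
    # a lower bound on the detail the center of vision can pick up).
    RECEPTORS = 1e8
    FIELD_W_DEG, FIELD_H_DEG = 180.0, 130.0

    def subtended_deg(size_in, distance_in):
        """Angle (degrees) subtended by one dimension of a flat screen."""
        return 2.0 * math.degrees(math.atan(size_in / (2.0 * distance_in)))

    # Example: a 50" 16:9 TV (about 43.6 x 24.5 inches) viewed from 8 feet.
    w_deg = subtended_deg(43.6, 96.0)
    h_deg = subtended_deg(24.5, 96.0)
    fraction = (w_deg * h_deg) / (FIELD_W_DEG * FIELD_H_DEG)
    print(f"screen covers ~{100*fraction:.1f}% of the field "
          f"=> ~{fraction*RECEPTORS/1e6:.1f}M receptors (uniform-density guess)")
    [/code]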
     
  6. 0roo0roo

    0roo0roo No Lifer

    Joined:
    Sep 21, 2002
    Messages:
    64,736
    Likes Received:
    35
    Plus, you don't stare straight ahead. Your eyes constantly and unconsciously flit the high-resolution center of focus from place to place while you watch something. That's why scientists can study people with eye tracking while making them watch stuff.
     
  7. silverpig

    silverpig Lifer

    Joined:
    Jul 29, 2001
    Messages:
    27,710
    Likes Received:
    0
    It's not necessarily about resolution, though. Even when your eyes can't detect individual pixels, they still transmit information that you can pick up on.
     
  8. BarkingGhostar

    BarkingGhostar Diamond Member

    Joined:
    Nov 20, 2009
    Messages:
    4,206
    Likes Received:
    58
    This is interesting in that a) the human eye is said to be more sensitive to vertical resolution than horizontal, and b) some metal-dye manufacturers have researched the naked eye's ability to discern dyed cracks down to about 1,100 line pairs.

    A line pair is a line of information adjacent to a line of non-information; the pairing is what allows two adjacent information lines to be identified separately. Also keep in mind (no pun intended) that rod and cone density is anything but uniform across the retina.

    Have a look at this. You can see that 'resolution' isn't a monolithic concept: there is pixel resolution and also color resolution. Is it any wonder that the current ATSC specification runs near the ~1,100 line pairs of the human eye? :biggrin:

    Now, the better question might be: at what resolution, both pixel and color, does the human mind stop being able to tell virtual from reality?
     
  9. Fallen Kell

    Fallen Kell Diamond Member

    Joined:
    Oct 9, 1999
    Messages:
    4,983
    Likes Received:
    8
    Also, don't forget that the eye's "resolution" is angular, measured in fractions of a degree of arc. The retina is a curved surface, so the DPI that is discernible changes with how close or far the object is from the eye: the angle separating two points gets smaller the farther away the object is, and larger the closer it is.

    So if you are trying to determine the optimal screen resolution to produce a picture indistinguishable from real life, the resolution chosen will be limited by the viewing distance (rough sketch below).
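
    As a sketch of that fall-off, assuming the commonly cited ~1 arc-minute per line pair acuity (an assumption, not a measured constant):

    [code]
    import math

    # With a fixed angular acuity, the pixel density a viewer can resolve
    # falls off linearly with viewing distance.
    ACUITY_ARCMIN = 1.0   # arc-minutes per resolvable line pair (assumed)

    def max_discernible_ppi(distance_in):
        """Highest pixel density (ppi) still resolvable at this distance."""
        pair_in = distance_in * math.tan(math.radians(ACUITY_ARCMIN / 60.0))
        return 2.0 / pair_in   # two pixels per line pair

    for d in (12, 24, 60, 120):   # inches
        print(f'{d:>4}" -> {max_discernible_ppi(d):.0f} ppi')
    # 12" -> 573 ppi, 24" -> 286 ppi, 60" -> 115 ppi, 120" -> 57 ppi
    [/code]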

    The next issue with making a realistic display has to do with focus. Until we develop a 3D display that allows the eye to change its focus between individual objects in the Z plane, we will still easily detect that the display is a "display" and not reality.
     
    #8 Fallen Kell, Nov 23, 2009
    Last edited: Nov 23, 2009
  10. DannyBoy

    DannyBoy Diamond Member

    Joined:
    Nov 27, 2002
    Messages:
    8,816
    Likes Received:
    0
    So in theory, a photograph or television of any resolution and DPI would, from a far enough distance, eventually saturate the available resolving capacity of the human eye?
     
  11. CycloWizard

    CycloWizard Lifer

    Joined:
    Sep 10, 2001
    Messages:
    12,352
    Likes Received:
    0
    Yes. Even an old SDTV will look visually "perfect" if viewed from far enough away. It's just that the required distance is very, very far (rough numbers below).
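
    To put a rough number on "very, very far", again assuming ~1 arc-minute per line pair and a 27" 4:3 set (both assumptions):

    [code]
    import math

    # Distance at which a 480-line SD picture shrinks to the acuity limit.
    ACUITY_ARCMIN = 1.0   # arc-minutes per line pair (assumed)

    def saturation_distance_in(screen_h_in, lines):
        """Distance at which adjacent line pairs hit the acuity limit."""
        pair_in = 2.0 * screen_h_in / lines          # one light + one dark line
        return pair_in / math.tan(math.radians(ACUITY_ARCMIN / 60.0))

    # A 27" 4:3 set is about 16.2" tall; 480 visible scan lines assumed.
    d = saturation_distance_in(16.2, 480)
    print(f"~{d/12:.0f} feet")   # ~19 feet; farther still for a bigger screen
    [/code]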
     
  12. BarkingGhostar

    BarkingGhostar Diamond Member

    Joined:
    Nov 20, 2009
    Messages:
    4,206
    Likes Received:
    58
    I read somewhere that the limiting arc is about one arc minute, i.e. º/60. Knowing this angle, the distance between the retina and lens, and the density of cones (or rods) within the retina's region of highest density works out to approximately 2,200 lines.

    But because resolving detail comes down to telling a line of information from an adjacent line of non-information, lines are counted in pairs, hence the term line pairs. In the human eye's case, that is approximately 1,100 line pairs.

    Of course, one must recognize that the human eye is more sensitive to vertical resolution (e.g. lines and line pairs) than horizontal resolution (e.g. bars). This is why HD content originally captured at 1440x1080 was stretched to 1920x1080.

    And the above is only useful if a) you have 20/20 vision (corrected or natural), and b) you are looking at something of similar resolution within its viewing range.
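
    The arithmetic, as a sketch (the 1 arc-minute acuity and the ~36º field are assumptions chosen for illustration):

    [code]
    # If the eye resolves one line per ~1 arc-minute (assumed), a field of
    # N degrees holds about N*60 distinguishable lines, i.e. N*30 line pairs.
    ACUITY_ARCMIN_PER_LINE = 1.0   # assumption, not a measured constant

    def lines_and_pairs(field_deg):
        lines = field_deg * 60.0 / ACUITY_ARCMIN_PER_LINE
        return lines, lines / 2.0

    # Illustration: a ~36 degree field lands close to the figures above.
    lines, pairs = lines_and_pairs(36.0)
    print(f"{lines:.0f} lines ~= {pairs:.0f} line pairs")   # 2160 / 1080
    [/code]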
     
  13. CycloWizard

    CycloWizard Lifer

    Joined:
    Sep 10, 2001
    Messages:
    12,352
    Likes Received:
    0
    This is a good point that I forgot to mention. In any person's eye, the real limit of resolution is generally set by the optics rather than by the retina.
     
  14. lyssword

    lyssword Diamond Member

    Joined:
    Dec 15, 2005
    Messages:
    5,740
    Likes Received:
    4
    If I sit 2-3 feet away from a 20" monitor, I can easily see artifacts in 1080p video.
     
  15. fleabag

    fleabag Banned

    Joined:
    Oct 1, 2007
    Messages:
    2,451
    Likes Received:
    0
    Those are compression artifacts... They have absolutely nothing to do with the resolution and everything to do with how the video itself was encoded. That's why Comcast HD looks like garbage while OTA HD looks amazingly good...
     
  16. Pandamonium

    Pandamonium Golden Member

    Joined:
    Aug 19, 2001
    Messages:
    1,628
    Likes Received:
    0
    This reply is far from scientific or technical. I was wondering this very question years ago, and I stopped after reading a source that seemed somewhat trustworthy, which stated that 4000 dpi was close to the upper limit for the average person's eye. I don't know where to begin looking it up again, but that's what I found. I'll take that as the truth until I take neurology and can back it up with textbook/literature references.
     
  17. Knavish

    Knavish Senior member

    Joined:
    May 17, 2002
    Messages:
    908
    Likes Received:
    3
  18. William Gaatjes

    Joined:
    May 11, 2008
    Messages:
    15,149
    Likes Received:
    132
    When looking at the picture...
    Isn't it amazing that the light has to travel through various cells before it reaches the actual cell that registers it?

    According to some people, if we had the eyes of an octopus we would have spectacular vision.

    http://pandasthumb.org/archives/2006/11/denton-vs-squid.html


    On a side note, in my opinion the denser packing of cones at the eye's focal point also has an automatic advantage: this distribution of cells amounts to a hardware compression of information. What you focus on gets the highest information density, while the surroundings are seen at a lower resolution.

    I wonder if a CMOS or CCD sensor has ever been built with a layout similar to that of the human eye. I think some "simple" addition and/or averaging logic would be sufficient to process the surrounding view, reducing the amount of information to store, while at the same time using the averaged signal, combined with some local memory holding previous results, to detect movement. Something like the toy sketch below.
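
    A toy sketch of that foveated readout (every parameter here is invented for illustration):

    [code]
    import numpy as np

    # Keep full resolution in a central window and average 2x2 blocks over
    # the rest of the frame, cutting the data stored for the periphery.
    def foveated_readout(frame, fovea_frac=0.25):
        """frame: 2D grayscale array. Returns (fovea, periphery_averaged)."""
        h, w = frame.shape
        fh, fw = int(h * fovea_frac), int(w * fovea_frac)
        top, left = (h - fh) // 2, (w - fw) // 2
        fovea = frame[top:top + fh, left:left + fw]      # full resolution
        # 2x2 block averaging stands in for the peripheral summing logic.
        periphery = frame[:h - h % 2, :w - w % 2].reshape(
            h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return fovea, periphery

    frame = np.random.rand(480, 640)
    fovea, periphery = foveated_readout(frame)
    print(fovea.shape, periphery.shape)   # (120, 160) (240, 320)
    [/code]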

    I also wonder how birds' vision works. I hear some eagles and hawks have amazing eyesight.
     
    #17 William Gaatjes, Dec 12, 2009
    Last edited: Dec 12, 2009
  19. William Gaatjes

    Joined:
    May 11, 2008
    Messages:
    15,149
    Likes Received:
    132
    The mantis shrimp seems to have the best eyes; it may even be able to detect the polarization of light.

    http://en.wikipedia.org/wiki/Mantis_shrimp#The_eyes

    From Wikipedia.

    Another link :

    http://www.blueboard.com/mantis/bio/vision.htm


    It is interesting; for example, through polarization you can tell whether light comes directly from the sun or from a reflective surface.
     
    #18 William Gaatjes, Dec 12, 2009
    Last edited: Dec 12, 2009
  20. tommo123

    tommo123 Platinum Member

    Joined:
    Sep 25, 2005
    Messages:
    2,345
    Likes Received:
    1
    I wonder if our brains could handle it if our eyes could pick up that much information (genetically engineered eyes?), or whether our brains would have to be altered too. Hmmm.
     
  21. William Gaatjes

    Joined:
    May 11, 2008
    Messages:
    15,149
    Likes Received:
    132
    I am wondering about that too. When it comes to robotic limbs and artificial prostheses, the brain, at least in our primate cousins, seems adaptive to an almost scary degree.

    The monkeys only needed a few days of practice...

    http://www.nytimes.com/2008/05/29/science/29brain.html?_r=1
    http://www.youtube.com/watch?v=TK1WBA9Xl3c

    And I read somewhere, long ago, that our brains can adapt quickly to new forms of motor control. Scientists glued electrodes onto the skin of a subject's head (EEG or ECG, I'm not sure; I'd have to look it up).
    The scientists then let a computer analyze the electric fields given off and used that information to control some robotics.
    After some practice, the brain had learned that a certain thought, like thinking of apple juice, could trigger movement from the robot. After some more practice, just thinking about moving the robot was enough: the brain had learned that the neuron patterns for apple juice and for robotic movement were the same.

    To come back to the eyes :

    http://babylon.acad.cai.cam.ac.uk/people/rhsc/oculo.html

    I would really not be surprised if our brains could adapt to new sensors automatically.

    A good example is tactile processing for blind people. A blind person received a device built from a grid of spikes resting on the skin, each spike pushing out with a tiny force. That force depends on the light intensity received by a camera: more light means more force. The camera's pixels are arranged in a grid too.
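
    The mapping is simple enough to sketch (the grid size and force range are invented for illustration):

    [code]
    import numpy as np

    # Downsample a grayscale camera image onto the actuator grid; brighter
    # regions drive the corresponding spike with proportionally more force.
    def camera_to_forces(image, grid=(20, 20), max_force_n=0.5):
        h, w = image.shape
        gh, gw = grid
        # Average each region of the image onto one actuator cell.
        cells = image[:h - h % gh, :w - w % gw].reshape(
            gh, h // gh, gw, w // gw).mean(axis=(1, 3))
        return cells / cells.max() * max_force_n   # newtons per spike

    forces = camera_to_forces(np.random.rand(200, 200))
    print(forces.shape, forces.max())   # (20, 20) 0.5
    [/code]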

    Here is some info on a system using the tongue.

    http://www.4to40.com/health/index.asp?id=339


    A bionic eye: only sixteen pixels, but enough to see movement, light intensity and shapes.

    http://www.guardian.co.uk/science/2007/feb/17/medicineandhealth.uknews

    I think the brain can actually use it.

    In my opinion, some psychiatric diseases may arise from the brain's appetite for sensory input. Maybe in some of these people the brain lowers its threshold for information too much, creating phantom data.

    I read something about a method of inducing hallucination. If you lie in a bath of water at the same temperature as your skin, with no sound, no light (totally dark) and no smell, then within minutes you will start to feel things, see things, hear things and even smell things. Things that are not there. Your brain is turning up its amplifiers and lowering the thresholds of its neurons, because it needs sensory input to function.

    If no data >> seek data :)

    I do think you would have to use some overlay principle, like a rattlesnake uses its infrared pit organs: it maps the infrared data over its visual data.

    http://en.wikipedia.org/wiki/Infrared_sensing_in_snakes

    It seems the brain just wants data and has some central input stage, a main station where all the data is gathered. If that is the case, then the brain could use any form of sensor, as long as the data is delivered in the right format.

    Visual thinkers, for example, would have it easy. Their brains already process data faster than they can speak or hear it; speech is simply too slow.
    "A picture is worth a thousand words" is a commonly used phrase, and it is very true for visual thinkers. A lot of people do not think in words or language but in images, and only translate those into words.
    I think the next renaissance in humanity will occur when we can connect to each other, in a way similar to the internet.

    Robot controlled by cultivated rat neurons.

    http://www.youtube.com/watch?v=1-0eZytv6Qk&feature=related
     
    #20 William Gaatjes, Dec 12, 2009
    Last edited: Dec 12, 2009
  22. JF060392

    JF060392 Senior member

    Joined:
    Apr 2, 2005
    Messages:
    348
    Likes Received:
    0
  23. 0roo0roo

    0roo0roo No Lifer

    Joined:
    Sep 21, 2002
    Messages:
    64,736
    Likes Received:
    35
    Nope, else you wouldn't be able to tell whether a print was cr@p or came from a sweet laser printer or whatever.
    LCD pixel density is sh*t compared to human vision.
    Sure, the density of receptors is concentrated mostly at the center, but the way human eyes work is to unconsciously flit from area of interest to area of interest. Resolving the entire screen at once is not important; what matters is that the area in focus has detail.

    The future development focus on color alone is because HD resolution is the most popular, and you don't get any bonus for putting out a 2048p screen :p Plus, increasing the resolution is far harder: more dead-pixel defects become possible.
     
    #22 0roo0roo, Dec 26, 2009
    Last edited: Dec 26, 2009
  24. Cogman

    Cogman Lifer

    Joined:
    Sep 19, 2000
    Messages:
    10,117
    Likes Received:
    19
    Before we start talking about higher resolutions, I would rather talk about better compression methods. Many companies are using crappy media compression settings, which leads to poor-looking quality even at HD. A better compression system would produce cleaner pictures at roughly the same bandwidth.

    I've been watching DirecTV recently, and while the compression artifacts aren't terrible, they are noticeable. Once we can move those to the realm of not noticeable, then I'll be willing to discuss moving up to higher resolutions.
     
  25. yhelothar

    yhelothar Lifer

    Joined:
    Dec 11, 2002
    Messages:
    18,326
    Likes Received:
    15
  26. Modelworks

    Modelworks Lifer

    Joined:
    Feb 22, 2007
    Messages:
    16,237
    Likes Received:
    0
    I wonder how much of the human ability to tell the difference between images is down to the resolution of the image, and how much is the brain's interpretation of the image.

    I could take a gigapixel image and change one pixel, and people would not detect it; but change that one pixel over time and in different areas, and people can see it. So I think it also has a lot to do with the brain, not just the 'sensor'.