Forget Anti-Aliasing - Where is PPI?


chucky2

Lifer
Dec 9, 1999
Imho,

More PPI is welcome, but I'd like more dimensional innovation in displays -- if one desires that eventual HoloDeck holy grail!

That's the key: there are so many desires, tastes, and tolerances among individuals. Where do the major players place their focus and resources?

Flexible 4K (and 8K in a decade or so) OLED installed on a curved frame should do the trick, both for PC use and TV viewing.

Chuck
 

BenSkywalker

Diamond Member
Oct 9, 1999
Quote:
I want a 30" 4k (or higher) AMOLED, 120hz, and 5ms response time display w/ a .15 inch bezel.

That would be about 1000X slower than the slowest AMOLED display. 5ms response time? That's like waiting for the next ice age ;)
 

cmdrdredd

Lifer
Dec 12, 2001
I really hate what Apple's marketing for the 'Retina Display' has done.

There have been more threads like this now than ever before.

That's all it really is: marketing. Retina isn't some new technology or fancy new display; it's just a buzzword for high-resolution/high-PPI displays. Funnily enough, the Nexus 10 (around 300 PPI) has a higher PPI than Apple's Retina iPad (264 PPI). Irrelevant, though.

If we could have higher-polygon models, sharper textures, ray tracing, and other new effects in our games at 1080p, with AA methods that eliminate all forms of aliasing, we wouldn't necessarily be having this discussion. There is a lot of room left to improve image quality without pushing the resolution up yet. GPUs can't do it yet; a lot of games push the limits even at 1080p. This is all in regard to gaming -- I have no doubt that higher PPI helps text sharpness and the like. Depends on your usage and goals, I guess.
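
To put rough numbers on that trade-off, here's a back-of-the-envelope sketch, assuming fragment-shading cost scales with pixel count (a first-order assumption; real workloads vary):

```python
# Relative pixel counts: a rough proxy for how much GPU budget each
# resolution step eats compared to spending it on effects at 1080p.
base = 1920 * 1080  # 1080p pixel count

for name, w, h in [("1440p", 2560, 1440), ("1600p", 2560, 1600), ("4K", 3840, 2160)]:
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
# 1440p ≈ 1.78x, 1600p ≈ 1.98x, 4K = 4.00x
```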
 

BrightCandle

Diamond Member
Mar 15, 2007
If monitors really did switch their pixels in 5ms, we would all be happy: 60Hz is 16.7ms per frame, 120Hz is 8.3ms, and 5ms corresponds to 200Hz. But the problem, as always, is that the quoted figures are inflated, atypical, and don't represent the actual performance of the monitor.
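
The arithmetic is easy to verify; a minimal sketch (pure unit conversion, no monitor specifics assumed):

```python
# Refresh rate <-> frame time: period (ms) = 1000 / rate (Hz).
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

def refresh_hz(ms: float) -> float:
    return 1000.0 / ms

print(f"{frame_time_ms(60):.1f} ms per frame at 60 Hz")   # 16.7 ms
print(f"{frame_time_ms(120):.1f} ms per frame at 120 Hz") # 8.3 ms
print(f"{refresh_hz(5):.0f} Hz implied by a true 5 ms")   # 200 Hz
```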
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
So much bad information in this thread and I only got to page 2.

The premise is sound: a higher PPI (more pixels per inch) is better than using any form of anti-aliasing, and with a high enough PPI your eyes would stop being able to discern aliasing in a scene.

However, the reality of the situation is that in the short term we're not likely to be able to increase PPI sufficiently to eliminate the need for AA entirely.

As a performance trade-off, MSAA and other optimized forms of AA can yield faster results than simply increasing the PPI -- but then you'd expect that: increasing the PPI leaves you with a perfect picture, whereas AA, especially shader-based AA, leaves you with a blurred scene and a non-perfect result.
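
A simplified cost model of why that trade-off favors MSAA (illustrative numbers and assumptions, not benchmarks):

```python
# 4x supersampling (or an equivalent PPI increase) shades every sample;
# 4x MSAA shades roughly once per pixel per triangle but still rasterizes
# and resolves 4 coverage samples, so bandwidth rises while shading doesn't.
pixels = 1920 * 1080

ssaa4_shader_invocations = pixels * 4
msaa4_shader_invocations = pixels
msaa4_coverage_samples = pixels * 4

print(f"4x SSAA shading work: {ssaa4_shader_invocations:,} invocations")
print(f"4x MSAA shading work: {msaa4_shader_invocations:,} invocations "
      f"(+ {msaa4_coverage_samples:,} coverage samples)")
```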

We have anti-aliasing to stop aliasing, and we have aliasing because we have imperfect displays. Our ultimate goal should always be to create perfect displays -- that is to say, displays so good our eyes cannot distinguish between reality and the display.

In the meantime we need to get past this idea of keeping monitors at ~100 PPI or less. It would be really nice to see something like 2560x1600 shrunk down to maybe 22-24". The PPI on the 2560x1440 @ 27" and 2560x1600 @ 30" displays is already an improvement over typical smaller 1080p monitors (not because they are bigger, but because their resolution scaled up faster than their size), and on my 2560x1600 @ 30" display I can already see diminishing returns, with 8xAA not giving a noticeable benefit over 4xAA.
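
For concreteness, PPI is just the diagonal pixel count divided by the diagonal size in inches; a quick sketch covering the displays mentioned above (the 24" panel is the hypothetical shrink):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ('1920x1080 @ 24in', 1920, 1080, 24.0),
    ('2560x1440 @ 27in', 2560, 1440, 27.0),
    ('2560x1600 @ 30in', 2560, 1600, 30.0),
    ('2560x1600 @ 24in (hypothetical)', 2560, 1600, 24.0),
]
for name, w, h, d in panels:
    print(f'{name}: {ppi(w, h, d):.1f} PPI')
# ≈ 91.8, 108.8, 100.6 and 125.8 PPI respectively
```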

One thing is for sure: with the gaming scene shifting towards a console-dominated market, we're going to stop seeing a steady increase in graphical fidelity over time and instead only see major jumps every 6-7 years. Meanwhile, as graphics power continues to increase, we need something to "spend" it on, and higher-PPI monitors are a great way to put that extra horsepower to use.
 

SirPauly

Diamond Member
Apr 28, 2009
PrincessFrosty said:
We have anti-aliasing to stop aliasing, and we have aliasing because we have imperfect displays. Our ultimate goal should always be to create perfect displays -- that is to say, displays so good our eyes cannot distinguish between reality and the display.

Personally, I desire more dimensions and a move away from a 2D plane of existence -- holograms! Then the discussion may be about imperfect holograms! :)
 

Fx1

Golden Member
Aug 22, 2012
PrincessFrosty said:
So much bad information in this thread and I only got to page 2. [...]

That's exactly what I said.

Mass Effect 3 has scenes from the console version where the aliasing is horrible; back at 1920x1200 it's gone. So my point stands. I also don't think the PPI needs to be that high, and I bet current-gen tech can make it work.
 

Fx1

Golden Member
Aug 22, 2012
I understand PPI just fine. It's a moot point because 2560x1600 will not exist on a 23-inch panel any time soon; pretty sure I've stated this 10 times. There is no demand for a desktop panel that isn't usable in Windows. Perhaps iOS/OSX could make that work, but even Apple's current iMac only goes as far as a 27-inch 2560x1440 display.
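
To put a number on the "usable in Windows" complaint -- a quick sketch, assuming the era's standard 96 DPI Windows baseline:

```python
import math

# The hypothetical panel from the post: 2560x1600 squeezed into 23 inches.
ppi_23in = math.hypot(2560, 1600) / 23
print(f'{ppi_23in:.0f} PPI')  # ≈ 131 PPI

# Unscaled UI elements would render at roughly 96/131 of their intended
# physical size, hence the usability complaint without good DPI scaling.
print(f'{96 / ppi_23in:.0%} of intended UI size')  # ≈ 73%
```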

http://www.theverge.com/2013/1/9/3855420/lg-display-30-inch-4k-tv-at-ces-2013

HAHA, 4K in a 30" monitor. Fastest disproof ever?