Forget Anti-Aliasing - Where is PPI?


DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
View this on your iPhone and be forever ruined by dat graphical quality:
untitled.png


You're right though, it's the lack of monitors that's...

Seriously, you're such an idiot I can't...

Glad you're happy.

Jesus Christ.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
http://www.popsci.com/technology/article/2012-12/researchers-plotting-death-pixel-five-years

Reposting this because no one read it before. Long story short: they are working on a type of post-processing that converts pixel-based images into vector-based images. This would theoretically let you render an image at a tiny resolution with extremely high performance, then use the vector-based output to scale it up to any resolution you can imagine. Since they say this is still about 5 years away I can't comment on what the performance hit of this type of post-processing would be, but it could solve the performance problems of extremely high-PPI displays for any type of media.

Exciting stuff if you ask me.
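To make the idea concrete, here's a minimal sketch in Python. It is emphatically not the researchers' technique; the hard research part (recovering vectors from an already-rendered pixel image) is not shown. It only illustrates the payoff they're after: once a frame is described as resolution-independent geometry, the same description can be rasterized at whatever resolution the display demands.

```python
# Minimal sketch of why vector output scales to any resolution. The hard part
# of the research -- converting a rendered pixel image back into vectors --
# is NOT shown here; this only illustrates the payoff once you have vectors.

def rasterize_triangle(tri, width, height):
    """Trivial point-in-triangle rasterizer; tri is three (x, y) points in 0..1 space."""
    (x0, y0), (x1, y1), (x2, y2) = tri

    def edge(ax, ay, bx, by, px, py):
        return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

    image = [[0] * width for _ in range(height)]
    for j in range(height):
        for i in range(width):
            px, py = (i + 0.5) / width, (j + 0.5) / height
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            # Inside the triangle if all edge functions agree in sign.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                image[j][i] = 1
    return image

triangle = [(0.1, 0.9), (0.5, 0.1), (0.9, 0.9)]        # resolution-independent description
thumbnail = rasterize_triangle(triangle, 64, 36)        # cheap low-resolution pass
full_res  = rasterize_triangle(triangle, 1920, 1080)    # same vectors, any target resolution
```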
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
http://www.popsci.com/technology/article/2012-12/researchers-plotting-death-pixel-five-years

Reposting this because no one read it before. Long story short: they are working on a type of post-processing that converts pixel-based images into vector-based images. This would theoretically let you render an image at a tiny resolution with extremely high performance, then use the vector-based output to scale it up to any resolution you can imagine. Since they say this is still about 5 years away I can't comment on what the performance hit of this type of post-processing would be, but it could solve the performance problems of extremely high-PPI displays for any type of media.

Exciting stuff if you ask me.

Damn right it is. A new era of PC gaming is coming and it's bringing huge changes with it. Just when everyone is talking about console-driven stagnation in PC games, it appears we have lots of reasons to be optimistic. I know I am.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
So you can indirectly insult someone on AT but not directly?

I'll keep that in mind. Thanks.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Just bear in mind that you can't directly compare pixels per inch on laptops and desktops. Because they have different viewing distances, a laptop needs higher PPI to achieve the same result.

The right metric, independent of viewing distance, is pixels per degree (PPD), which is just PPI converted using the viewing distance.

When you use PPD, for example, you find the first retina-quality screens were the 720p 32" TVs from the early HD era, around when the Xbox was released. HD TVs viewed at around 6 feet or so are already retina quality.

The issue for laptops was that their PPD was very low because they used desktop PPI. Now laptops are finally getting much higher PPI, but it's mostly bringing them into parity with HD TVs in terms of PPD. The shame is that on PCs we have had good-enough-but-not-quite-retina for a while. It's either going to arrive soon or never. Desktops are in decline, as are separate monitors, so I suspect we will yet again get TV tech on the desk and not 2560x monitors at 24".
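For anyone who wants to check the numbers, here is a quick sketch of that PPI-to-PPD conversion. The viewing distances below are illustrative assumptions, not measurements, and roughly 60 PPD (one pixel per arcminute) is the usual "retina" threshold.

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from the mode and the panel's diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def ppd(ppi_value, viewing_distance_in):
    """Pixels per degree: one degree of visual angle spans ~2*d*tan(0.5 deg) inches."""
    return ppi_value * 2 * viewing_distance_in * math.tan(math.radians(0.5))

# Illustrative viewing distances (assumed, not measured):
examples = [
    ('32" 720p TV at 6 ft',          ppd(ppi(1280, 720, 32.0), 72)),   # ~58 PPD
    ('24" 1920x1200 monitor at 30"', ppd(ppi(1920, 1200, 24.0), 30)),  # ~49 PPD
    ('15.6" 1366x768 laptop at 24"', ppd(ppi(1366, 768, 15.6), 24)),   # ~42 PPD
]
for name, value in examples:
    print(f"{name}: {value:.0f} PPD")
```

On those assumed distances, the old 720p living-room TV already sits near the ~60 PPD mark while typical desktop and laptop panels fall well below it, which is the point being made above.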
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Just bear in mind that you can't directly compare pixels per inch on laptops and desktops. Because they have different viewing distances, a laptop needs higher PPI to achieve the same result.

The right metric, independent of viewing distance, is pixels per degree (PPD), which is just PPI converted using the viewing distance.

When you use PPD, for example, you find the first retina-quality screens were the 720p 32" TVs from the early HD era, around when the Xbox was released. HD TVs viewed at around 6 feet or so are already retina quality.

The issue for laptops was that their PPD was very low because they used desktop PPI. Now laptops are finally getting much higher PPI, but it's mostly bringing them into parity with HD TVs in terms of PPD. The shame is that on PCs we have had good-enough-but-not-quite-retina for a while. It's either going to arrive soon or never. Desktops are in decline, as are separate monitors, so I suspect we will yet again get TV tech on the desk and not 2560x monitors at 24".

Blackened apparently wants his 32" monitor to have high resolution now, even though he spent the last five pages pooh-poohing the idea of high-PPI monitors.

I don't know about you, but I would have thought a 24-27" monitor would serve the biggest market. It's not like we all have a command center in our mom's basement with a desk long enough to make a 32" screen work.

I think Apple will drive the panels in their 27" iMac line, and that will be the start of the high-PPI monitor revolution.
 

kache

Senior member
Nov 10, 2012
486
0
71
http://www.popsci.com/technology/article/2012-12/researchers-plotting-death-pixel-five-years

Reposting this because no one read it before. Long story short: they are working on a type of post-processing that converts pixel-based images into vector-based images. This would theoretically let you render an image at a tiny resolution with extremely high performance, then use the vector-based output to scale it up to any resolution you can imagine. Since they say this is still about 5 years away I can't comment on what the performance hit of this type of post-processing would be, but it could solve the performance problems of extremely high-PPI displays for any type of media.

Exciting stuff if you ask me.
Wow, seems awesome.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Never mind the fact that 99% of users don't have a GPU capable of supporting 2560x1600.

Yes they do; the HD 4000 can drive 2560x1600 fairly easily for 2D applications. Using Word or checking your email doesn't require much GPU power. Demanding 3D games, obviously not, but as a whole very few people with a computer play demanding games on them.

Note: yes, the rMBP does seem to have some problems with the HD 4000 keeping the screen smooth, but there are two things to be aware of. First, OS X renders at a higher resolution and then scales down; for the 1080p mode the entire image is rendered at 3840x2160 and scaled down to 1080p. Second, especially in the sub-60-fps web-browsing scenarios, part of the problem is the software itself: the CPU process displaying the web page uses only a single core, which is often maxed out and bottlenecks the machine, creating stuttering. A software fix should eliminate that.

Resolution makes a huge difference with regard to AA. I have two 15.6" laptops, one 1366x768 and the other 1080p. The 1366x768 one NEEDED AA if you were going to game, to get rid of the jagged lines. The 1080p one doesn't really need AA; sure, if you look for it you can see jaggies, but you have to be fairly close to the screen and actively looking for them. 2x AA at 1080p is easily equivalent to 8x AA at 1366x768. Anyone running a game at native res on the 15" rMBP doesn't need AA, as the pixels are so small you can't effectively see them.

And yes, text is smaller, but set font scaling to 125% in Windows and everything is fixed.

[Screenshot: Skyrim at minimum settings at 1080p (textures high)]

[Screenshot: Skyrim ultra (shadows med) at 1280x720]

Ultra quality at 1280x720 looks better (mainly because of AF) but gets significantly worse frame rates. (Note: both screenshots were uploaded at the same resolution, because otherwise one would be much bigger than the other and defeat the PPI comparison.) Ultra at 720p gets 56 fps. Low with high textures at 1080p gets 60 fps at 75-80% GPU usage; without v-sync this would probably be around 75 fps. Adding AF (which costs almost nothing but would clear up the muddy ground) and a few other settings would likely make the two identical, or very close, at the same frame rate. (AF makes the two almost identical, with higher fps for the 1080p one.) Disregarding texture quality, there are slightly more jaggies in the 1080p low screenshot.

Running everything maxed at 1080p including shadows gets 35 fps for that scene.
Running everything maxed at 720p including shadows gets 48 fps for that scene.
(Note: no FXAA, because it looks horrible.)

The relationship between resolution and fps is not linear.

1080p is 1920x1080, about 2.1 Mpixels; 720p is 1280x720, about 0.92 Mpixels, so 1080p has about 2.25x the pixels. Yet at 720p you get 48 fps instead of the expected ~79 (35 x 2.25).
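Written out as a quick check (the fps figures are just the Skyrim measurements above, not new data):

```python
# Pixel-count ratio vs. measured fps ratio for the maxed-settings scene above.
pixels_1080p = 1920 * 1080                     # 2,073,600
pixels_720p  = 1280 * 720                      #   921,600
pixel_ratio  = pixels_1080p / pixels_720p      # 2.25x

fps_1080p, fps_720p = 35, 48                   # measured, everything maxed
expected_720p = fps_1080p * pixel_ratio        # ~79 fps if scaling were linear
efficiency = (fps_720p / fps_1080p) / pixel_ratio

print(f"{pixel_ratio:.2f}x the pixels, expected ~{expected_720p:.0f} fps, got {fps_720p}")
print(f"-> only {efficiency:.0%} of linear scaling")
```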

http://www.tomshardware.com/reviews/digital-storm-x17-radeon-hd-7970m-enduro,3345-6.html

Despite 1080p having about 2.25x the pixels of 720p, you will not see a single instance in which 720p gets twice the fps of 1080p at the same settings.


1080p with 2x AA gets 40 fps (roughly comparable to 720p with 8x AA with regard to jaggies).
1080p without any AA gets 45 fps.

So 720p with 8x AA gets roughly the same fps as 1080p with no AA at the same settings, at least for Skyrim.

http://www.tomshardware.com/reviews/digital-storm-x17-radeon-hd-7970m-enduro,3345-7.html
Supporting my data, there is very little variation in frame rate between resolutions at the ultra setting.

I have only tested one game with a static scene; more data points are needed for a real analysis. Personally, I would prefer to play at 1080p on medium over 720p on ultra.

Computer specs

Lenovo Y580
8 GB RAM
i7-3630QM
GTX 660M (runs at 1085/2500)
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
The relationship between resolution and fps is not linear.

1080p is 1920x1080, about 2.1 Mpixels; 720p is 1280x720, about 0.92 Mpixels, so 1080p has about 2.25x the pixels. Yet at 720p you get 48 fps instead of the expected ~79 (35 x 2.25).

Yes, this has been true for almost a decade now. Back when GPUs first started out, they were typically fill-rate limited, and performance scaled predictably as you raised the resolution.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
High resolution and current parts not scaling.

The GeForce2 GTS had a peak theoretical pixel fill rate of 800 MPixels/s.

The GeForce GTX 680 has a peak theoretical pixel fill rate of 32,000 MPixels/s.

That's a 40x increase.

The GeForce2 GTS packed 25 million transistors; the GTX 680 packs 3.5 billion.

That's a 140x increase.

GPU vendors could, very easily, make a GPU that could handle 4K displays without a hint of a problem; it would simply require a reallocation of resources. The good thing, hypothetically, is that 1080p is a non-scaled resolution on a 4K display (it maps 4:1 perfectly).

GPU vendors haven't been driving fill rate because we are at the 'good enough' point with the terrible displays we have on the market. It wouldn't be a challenge for them to push 100 GPixel/s GPUs out the door if the market were calling for them; we just aren't.
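Rough arithmetic behind that claim; the overdraw factor below is an assumption for illustration, and peak pixel fill rate ignores the shading and bandwidth costs that also scale with resolution, which is presumably where the "reallocation of resources" would go.

```python
# Back-of-the-envelope pixel fill rate needed for 4K at 60 Hz, versus the
# GTX 680's quoted ~32 GPixels/s peak. The 4x overdraw is an assumed figure.
pixels_per_frame = 3840 * 2160                 # ~8.3 Mpixels
fps, overdraw = 60, 4

required = pixels_per_frame * fps * overdraw   # ~2.0 GPixels/s
peak_gtx680 = 32_000_000_000                   # ~32 GPixels/s

print(f"needed:  {required / 1e9:.1f} GPixels/s")
print(f"GTX 680: {peak_gtx680 / 1e9:.0f} GPixels/s ({peak_gtx680 / required:.0f}x headroom)")
```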
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yes they do; the HD 4000 can drive 2560x1600 fairly easily for 2D applications. Using Word or checking your email doesn't require much GPU power. Demanding 3D games, obviously not, but as a whole very few people with a computer play demanding games on them.

What I meant was running games well at that resolution - in a single-GPU configuration even the best GPUs struggle at 2560x1600 in demanding games. The topic had been centered on gaming, so that's the context I meant.

Pretty much the only reason I use SLI is 2560x1600. Quite a few games are sluggish at that resolution with even the best GPUs, and others cannot be maxed out. You are correct, for just 2D purposes nearly everything "supports" 2560x1600 - if I didn't enjoy gaming I would be pretty content with the HD 4000.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
When I game on machines that can't handle the native resolution of whatever monitor they're connected to with AA enabled, I prefer a lower resolution combined with AA; it gives me better image quality than a higher resolution without any AA. At least with CS:S and a 7300 Go this was the case.

Going on the PPI argument: I have a phone with a 4.5" screen that runs at 720p, and playing games on it I notice jaggies even in benchmarks that run at the screen's native resolution. So even at "retina" levels of PPI there is still a need for AA. Resolution is not everything; it does reduce the need for AA, I will admit, but some level of AA is still necessary. There will always be some crawling jaggies at any resolution and PPI.
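For reference, running that phone through the same pixels-per-degree math from earlier in the thread (the ~12" viewing distance is an assumption):

```python
import math

# 4.5" 720p phone, as described above; the 12" viewing distance is assumed.
diag_in, h_px, v_px = 4.5, 1280, 720
viewing_distance_in = 12

ppi = math.hypot(h_px, v_px) / diag_in                              # ~326 PPI
ppd = ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))   # ~68 PPD

print(f"{ppi:.0f} PPI, ~{ppd:.0f} pixels per degree")
```

That lands only a little past the usual ~60 PPD "retina" mark, which fits with jaggies still being visible up close and in motion.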
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
What I meant was running games well at that resolution - in a single-GPU configuration even the best GPUs struggle at 2560x1600 in demanding games. The topic had been centered on gaming, so that's the context I meant.

Pretty much the only reason I use SLI is 2560x1600. Quite a few games are sluggish at that resolution with even the best GPUs, and others cannot be maxed out. You are correct, for just 2D purposes nearly everything "supports" 2560x1600 - if I didn't enjoy gaming I would be pretty content with the HD 4000.

I get that; I just mean that lack of GPU power is not the main reason higher-resolution screens have not become more common.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
What I meant was running games well at that resolution - in a single-GPU configuration even the best GPUs struggle at 2560x1600 in demanding games. The topic had been centered on gaming, so that's the context I meant.

Pretty much the only reason I use SLI is 2560x1600. Quite a few games are sluggish at that resolution with even the best GPUs, and others cannot be maxed out. You are correct, for just 2D purposes nearly everything "supports" 2560x1600 - if I didn't enjoy gaming I would be pretty content with the HD 4000.

They only struggle because they have to run 4x MSAA and other effects.

As we constantly keep telling you high PPI does away with the need for AA.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
When I game on machines that can't handle the native resolution of whatever monitor they're connected to with AA enabled, I prefer a lower resolution combined with AA; it gives me better image quality than a higher resolution without any AA. At least with CS:S and a 7300 Go this was the case.

Going on the PPI argument: I have a phone with a 4.5" screen that runs at 720p, and playing games on it I notice jaggies even in benchmarks that run at the screen's native resolution. So even at "retina" levels of PPI there is still a need for AA. Resolution is not everything; it does reduce the need for AA, I will admit, but some level of AA is still necessary. There will always be some crawling jaggies at any resolution and PPI.

You can't use a phone as an example. Screen tech can make a huge difference.

Looking at the rMBP screen shows what's possible.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
They only struggle because they have to run 4x MSAA and other effects.

As we constantly keep telling you high PPI does away with the need for AA.

So if an SLI/CF rig struggles with some games at 2560x1600 (4,096,000 pixels) and 4x AA, do you think we're ready for 3840x2160 (8,294,400 pixels)?

Turning down eye candy (excluding AA) to run a higher-PPI monitor seems like a poor compromise. I'm all for higher-PPI monitors, but we aren't close to having the GPU power to do it yet.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
As we constantly keep telling you high PPI does away with the need for AA.
Wrong.

Higher PPI cannot eliminate all forms of temporal aliasing; spatial aliasing is something different. For screenshots it would work, but for scenes in motion there are definite limitations. Also, MSAA and even SGSSAA are much more quality/performance-efficient than increasing PPI. Look up "EER", read up on it, and you'll know why. Increasing PPI considerably is not the solution. It is one part of the puzzle (waaaay off in the future), nothing more.

3840x2160 without AA gives about 1/3 the fps of 1920x1080, but 1920x1080 with 4x SGSSAA gives about 1/2 the fps AND looks better.
I've been using downsampling (quadrupling the rendered pixel count) for about two years now, and I know what I'm talking about since I have a direct comparison on my setup.
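For anyone unfamiliar with the downsampling he means, here is a minimal sketch of the principle: render at twice the width and height (four times the pixels), then box-filter each 2x2 block down to one display pixel. Drivers use better filters, and SGSSAA places its extra shading samples on a sparse grid instead, but the cost/quality trade-off is the same idea.

```python
def render(width, height):
    """Stand-in for a real renderer: a hard diagonal edge, the classic jaggy case."""
    return [[1.0 if x * height > y * width else 0.0 for x in range(width)]
            for y in range(height)]

def downsample_2x(img):
    """Average each 2x2 block of the supersampled image into one display pixel."""
    out_h, out_w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x + 1] +
              img[2*y + 1][2*x] + img[2*y + 1][2*x + 1]) / 4.0
             for x in range(out_w)]
            for y in range(out_h)]

display_w, display_h = 640, 360
aliased  = render(display_w, display_h)                          # native-res pass, hard edges
smoothed = downsample_2x(render(display_w * 2, display_h * 2))   # 4x the shading work, soft edges
```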
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
So if an SLI/CF rig struggles with some games at 2560x1600 (4,096,000 pixels) and 4x AA, do you think we're ready for 3840x2160 (8,294,400 pixels)?

Turning down eye candy (excluding AA) to run a higher-PPI monitor seems like a poor compromise. I'm all for higher-PPI monitors, but we aren't close to having the GPU power to do it yet.

One 7970 GHz can run Eyefinity, so why can't it run 2x 1080p, which is what we are talking about?

High resolutions are already here when we run 3x 1080p. 4K is two-thirds of that resolution.

I have been playing Far Cry 3 without AA today and the aliasing is barely noticeable at 1920x1200 on a 24" screen. AA does smooth it out, but it also blurs it. It's a trick, whereas PPI is real resolution.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I'm wondering how this would work in games. Would the pixels be packed close enough that the eye can't see the jaggies, like on phone screens?
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
I'm wondering how this would work in games. Would the pixels be packed close enough that the eye can't see the jaggies, like on phone screens?

That's exactly what we are saying.

If you can't see the pixels, then you won't see the jaggies.

The game would have to render at a 1:1 pixel mapping and not upscale like some games do.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
One 7970 GHz can run Eyefinity, so why can't it run 2x 1080p, which is what we are talking about?

High resolutions are already here when we run 3x 1080p. 4K is two-thirds of that resolution.

I have been playing Far Cry 3 without AA today and the aliasing is barely noticeable at 1920x1200 on a 24" screen. AA does smooth it out, but it also blurs it. It's a trick, whereas PPI is real resolution.

One 7970 GHz can run Eyefinity (3x 1080p) if you reduce the eye candy settings. If the point of higher-PPI screens is better image quality, then why would you want to reduce eye candy to get there? Seems like you're chasing your tail. Today's graphics solutions just aren't powerful enough to run 4K at high/ultra settings.

Also, 4K = 4x 1080p, not 2x 1080p.
 

kache

Senior member
Nov 10, 2012
486
0
71
One 7970 GHz can run Eyefinity (3x 1080p) if you reduce the eye candy settings. If the point of higher-PPI screens is better image quality, then why would you want to reduce eye candy to get there? Seems like you're chasing your tail. Today's graphics solutions just aren't powerful enough to run 4K at high/ultra settings.

I was actually thinking: for a competitive FPS player, a super high resolution with super low details (which removes all the useless stuff from the screen) would actually be pretty good. :ninja:
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
I was actually thinking: for a competitive FPS player, a super high resolution with super low details (which removes all the useless stuff from the screen) would actually be pretty good. :ninja:

I thought the super competitive guys liked really low resolutions so pixels are bigger and thus the "hit box" is bigger too?