Forget Anti-Aliasing - Where Is PPI?


lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
27" and 30" monitors on a desk are huge when you sit less than a foot away, especially considering most people with 42" TVs normally sit 6-12 feet away.

I'd rather have more detail in a 24" monitor, with high DPI delivering smooth edges and even smooth text. Higher resolution brings far more benefits than MSAA.

I sit about 2 feet from my 30". I had a 1920x1080 20-incher with a really fine PPI, but it was a TN panel and the viewing angles were abysmal.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
MSAA is indeed shit for the performance hit you take; I'll take post-process AA over it any day. It's amazing how badly some games suffer from aliasing while others barely need any correction at all.
 

HeXen

Diamond Member
Dec 13, 2009
7,835
37
91
I'd rather have better AI and physics than stupid pixels and AA. Pixels don't make a game fun to play.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
I was pretty much feeling the same way after reading your ridiculous assertion that a higher pixel count is somehow more efficient than anti-aliasing. :rolleyes: If you think the performance hit for AA is bad (it really isn't), the performance hit for 2560x1600 compared to 1080p is far, far worse.

Apparently, you want a higher pixel count and expect your 1GB GTX 460s to handle it. You get upset that you can't use 8x MSAA on your 460s; perhaps you should look into upgrading your GPUs. Pretty sure 8x MSAA isn't a problem with 2GB at your resolution. But somehow, this is a pixel problem and not a GPU problem.

I get it. You want better graphics. You just don't want to pay for it with a better GPU or a better monitor.

Far Cry 3 is the first game I couldn't max out at 1920x1200.

If you check the reviews, you can't max out Far Cry 3 on a 7970 GHz Edition CF setup either at 2560x1600.

I'd rather have my GPUs driving pixels, because when you have high PPI you don't need AA to smooth the jaggies out of your game. AA softens the detail in games and is a pure trade-off.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Since AA is so costly on performance (playing Far Cry 3 shows just how much), why don't we just abandon all this AA trickery and focus on driving more pixels?

AA is redundant when the image is sharper, so why don't we move to 250 PPI gaming monitors and forget AA?

AA makes images more tolerable but doesn't exactly add any quality, whereas more pixels do.

It must be simpler to drive more pixels than to figure out all the complicated algorithms to improve the image at current resolutions.

Anyone else agree?


OP, just try to sift through all the low-IQ posts.

As much as I agree with you, I don't think we are ready for higher-PPI displays, for a few reasons:

GPUs are not really fast enough yet. If you want to see how a game would run at 3840x2160, just run 4x SSAA at 1920x1080, and you'll see that the performance is just not good enough yet.

While 2560x1440 in a 22-24" size is very plausible, people generally want bigger screens nowadays (especially when upgrading from an existing monitor, they usually want something larger). Plus, don't forget most people are not running high-end graphics, but integrated graphics or mid-range discrete cards.

And lastly, everything on the internet that is not vector graphics looks like absolute crap at high PPI. You don't realize this until you browse on a rMBP.

So yeah, lots of reasons why high PPI is not ready for the masses.
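The "4x SSAA at 1080p previews 4K" comparison above can be sanity-checked with quick arithmetic. A minimal sketch, counting shaded pixels only (it ignores bandwidth and the downsample step, and assumes an ordered-grid square sample pattern):

```python
def ssaa_shaded_pixels(width, height, samples):
    # Ordered-grid SSAA with a square pattern: 4x total = 2x per axis
    per_axis = int(samples ** 0.5)
    return (width * per_axis) * (height * per_axis)

# 4x SSAA at 1920x1080 shades exactly as many pixels as native 3840x2160
print(ssaa_shaded_pixels(1920, 1080, 4))  # 8294400
print(3840 * 2160)                        # 8294400
```

So a card struggling with 4x SSAA at 1080p gives a fair preview of how it would fare at native 4K, at least on the shading side.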
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Far Cry 3 is the first game I couldn't max out at 1920x1200.

If you check the reviews, you can't max out Far Cry 3 on a 7970 GHz Edition CF setup either at 2560x1600.

I'd rather have my GPUs driving pixels, because when you have high PPI you don't need AA to smooth the jaggies out of your game. AA softens the detail in games and is a pure trade-off.

Do you believe 2560x1600 with no AA will perform better than 1080p with AA? If I understood your argument correctly, you believe higher resolution/PPI/pixel count to be more efficient than lower resolution with AA. I strongly disagree: 2560x1600 will almost always perform much worse.

If you just want better graphics, I can respect that. But in no way is higher pixel count more efficient than higher detail with AA at 1080p. I disagree with you on this aspect of your argument, though I agree that higher resolution is a great thing. I'm very anxious to get a 4K2K monitor when they're on the market.
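For reference, the raw pixel-count gap behind this claim can be computed directly (assuming, as a simplification, that shading work scales roughly with pixels rendered):

```python
def megapixels(width, height):
    # Total pixels rendered per frame, in millions
    return width * height / 1e6

qhd = megapixels(2560, 1600)   # about 4.10 MP
fhd = megapixels(1920, 1080)   # about 2.07 MP
print(f"2560x1600 pushes {qhd / fhd:.2f}x the pixels of 1920x1080")
```

Roughly double the pixels per frame, before any AA cost is even considered at either resolution.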
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Do you believe 2560x1600 with no AA will perform better than 1080p with AA? If I understood your argument correctly, you believe higher resolution/PPI/pixel count to be more efficient than lower resolution with AA. I strongly disagree: 2560x1600 will almost always perform much worse.

If you just want better graphics, I can respect that. But in no way is higher pixel count more efficient than higher detail with AA at 1080p. I disagree with you on this aspect of your argument, though I agree that higher resolution is a great thing. I'm very anxious to get a 4K2K monitor when they're on the market.

I think PPI is more important than AA. I have games with 16x AA in the settings.

How much work have Nvidia and AMD put into their GPUs to accommodate AA? None of us here know how much hardware is dedicated to AA and what it costs.

In theory, to increase the pixel count you just add more cores, the same way SLI increases performance. That's far simpler than how AA works.

Today AA costs less in FPS, but who knows, the next GPU could change that if consumers wanted it. If we could get 250-300 PPI screens on gaming PCs, then AA would be redundant.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think PPI is more important than AA. I have games with 16x AA in the settings.

How much work have Nvidia and AMD put into their GPUs to accommodate AA? None of us here know how much hardware is dedicated to AA and what it costs.

In theory, to increase the pixel count you just add more cores, the same way SLI increases performance. That's far simpler than how AA works.

Today AA costs less in FPS, but who knows, the next GPU could change that if consumers wanted it. If we could get 250-300 PPI screens on gaming PCs, then AA would be redundant.

Alrighty, apologies for the earlier posts; I respect this opinion. Like you, I enjoy higher resolutions myself, but I don't see the performance dynamic (between higher AA/detail and higher resolution) shifting; maybe a new GPU architecture can change this. In terms of gaming, I think the trend with consoles will be higher detail at existing resolutions. I know the Xbox 720 and PS4 are both targeting 1080p at 30 fps.

Trust me, I cannot wait for 4K2K and Ultra HD to become standard. I love high PPI and high resolution, and I will be very happy indeed when that happens. I don't think GPUs are quite ready yet, though.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
To get rid of aliasing we need to increase the pixels per degree (PPD) the eye sees to about 85, the so-called retina level. Most desktop displays are actually quite close to this at a normal viewing distance but need a good 50% more pixels to exceed it. 2560x1600 does have a higher PPD than 1080p at the same screen size and viewing distance.

One thing I find interesting is that at the usual viewing distance, a 1080p TV is very much at retina quality. Either gamers are sitting too close, or the resolution reductions used in modern console games are what is driving the need for post-process AA. The next generation of consoles might not need AA at all if they can render 720p/1080p natively, and that is likely to drive not needing it on the PC as well. Combined with the industry's movement to higher-PPD screens (so far limited to laptops), I think it's quite likely that in two years' time AA will be limited to people who haven't upgraded their screens.
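The pixels-per-degree figures here can be estimated with basic trigonometry. A sketch; the screen sizes and viewing distances below are illustrative assumptions, and the ~85 PPD "retina" threshold is the poster's figure (60 PPD, one pixel per arcminute of 20/20 acuity, is another commonly cited cutoff):

```python
import math

def pixels_per_degree(diag_in, res_x, res_y, view_dist_in):
    """Pixels per degree of visual angle at the screen center,
    for a flat display viewed head-on."""
    width_in = diag_in * res_x / math.hypot(res_x, res_y)
    pitch_in = width_in / res_x  # physical width of one pixel
    deg_per_pixel = math.degrees(2 * math.atan(pitch_in / (2 * view_dist_in)))
    return 1 / deg_per_pixel

# 30" 2560x1600 desktop monitor at a 24" viewing distance:
print(round(pixels_per_degree(30, 2560, 1600, 24)))   # ~42 PPD
# 42" 1080p TV at 8 feet (96"):
print(round(pixels_per_degree(42, 1920, 1080, 96)))   # ~88 PPD
```

Which matches the post's point: a 1080p living-room TV at couch distance is near retina territory, while a desktop monitor at arm's length is nowhere close.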

I anticipate Microsoft will update their scaling and improve that; their OEMs must have been screaming at them to do so ever since the MacBook Pro released with the screen it did.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Yep, GPUs are far, far from making 4K, or even 2560x1440/1600, playable with average setups. Yes, you can game at resolutions like that with the right settings and top-end GPUs, but with no AA you can still easily see jaggies, and with too much AA you can crush even $1k+ worth of GPUs in a maxed-out game.

After considering all of that, look at the average gaming rig (Steam stats, etc.). People with even moderately decent GPUs are a distinct minority. The average gamer's PC is something in the range of a Core 2 Duo @ 3GHz, 2GB of RAM, and a 4850 or so.

Higher PPI will come in time, but it will be a fairly slow march. The major problem is that even if you show the average computer user a Retina MacBook Pro, they aren't very impressed, certainly not enough to pay a huge premium for it. Our market is absolutely dominated by the $300-$600 range of complete PCs/notebooks. Similarly with displays: probably 5,000 1080p-and-lower displays are sold for every 1440p/1600p screen, maybe even more. The vast majority of stores don't carry 1440p or higher screens at all. Even 1200p is nearly impossible to find.

We live in a world dominated by the lowest common denominator. Until that lowest common denominator is 4K (2025-2030?), I wouldn't expect to see a lot of movement forward. The best we can hope for in the interim is for GPU grunt to let us enthusiasts push higher resolutions more efficiently, and for 1080p games to feature better AA and higher-poly environments.

Take the best-looking game you've ever seen and look at it at 2560x1600. Now go look at a good Blu-ray on a 1080p plasma at a typical viewing distance. Do you see a bunch of jaggies? Which looks more realistic to you? That's why 1080p has a ton more to offer as time goes by, even if I personally would like to see higher-res AND higher-PPI screens become more readily available at sane prices.

Take something like BF3, give it 10x the poly count, better particle and color/brightness detail, and ~20x the texture resolution (so that even things at point-blank range are still no worse than 1:1 pixel detail), and you'd have something that looked quite nearly real. We're a few generations out from even being able to do that; it would require something on the order of 10 times as powerful as a GTX 680, with 32GB or more of insanely fast video memory. Again, this is probably 10 years out, assuming we keep pushing the bar a bit at a time.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
It's sad to say that the only company pushing PPI is Apple, but at least it will make the PC OEMs wake up.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
It's sad to say that the only company pushing PPI is Apple, but at least it will make the PC OEMs wake up.

I don't think that's the case. We may see Dell release some $2,000+ 30" 4K display in the next few years, but it's a long way off before moderately large screens become common in high PPI. There just aren't enough typical consumers who would walk into a Best Buy, look at a 27" 1080p for $240 and a 24" 4K for $1,200, and walk out with the 4K screen. It's a price-driven market. I'm also pretty sure PC makers are much more envious of the MacBook Air and its huge sales (hence the silly push to 'ultrabooks', despite terrible value for money) than of the tiny fraction of sales of the Retina MBP.

Now with small displays (iPad, iPhone, the high-end Androids), high PPI is dramatically more feasible both now and in the near future. We're probably within 5 years of non-'retina' mobile devices being nearly obsolete even at the entry level.

With PCs, and even Apple's own MacBook Air, retina displays are infeasible from a cost perspective due to three factors:

(1)- What a typical customer is willing to pay for it, leading to -

(2)- The screen itself is too expensive to fit (1), and -

(3)- The hardware necessary to drive such an extremely high-resolution screen (say 2560x1440 at 13") is also dramatically more expensive. Suddenly you're well out of the range of the iGPU, or even high-end discrete mobile GPUs, for pushing it effectively in anything but documents, spreadsheets, web browsing, and compressed HD video (sourcing content higher than 1080p is also a barrier in the industry until an official set-top standard is commonplace).
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
To cement this point, look at the Retina MBP model selection:

$1700 = 13" with Intel HD 4000: utterly useless for anything but internet and light office work, or gaming at 1/4 resolution. Driven by a 2.5GHz i5.

up the range to:

$2800 = 15" with an Nvidia GT 650M, a card approximately equal to a desktop GTS 450 at best. In other words, this thing couldn't even convincingly game at 1080p, let alone retina resolutions. It's far better than the HD 4000, but still basically useless for anything other than internet/office work at native resolution. It's paired with a 2.6GHz i7, but the CPUs in these form factors are heavily gimped compared to desktop models.

I know I'm sounding downbeat about this. I love high PPI; I'm just realistic about it for the near to mid-term future. It's going to remain niche outside of mobile for an irritatingly long time, and many of those factors are just the limits of today's technology combined with how much of a premium the typical consumer is willing to pay for the displays. It's also important to remember that the vast majority of computer buyers don't care about high-end gaming in the least. We're the frontier here. And even on "everyone's a millionaire with a Veyron" AT, most people are running 1080p and upper-midrange rigs not even capable of 2560x1600.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
To cement this point, look at the Retina MBP model selection:

$1700 = 13" with Intel HD 4000: utterly useless for anything but internet and light office work, or gaming at 1/4 resolution. Driven by a 2.5GHz i5.

up the range to:

$2800 = 15" with an Nvidia GT 650M, a card approximately equal to a desktop GTS 450 at best. In other words, this thing couldn't even convincingly game at 1080p, let alone retina resolutions. It's far better than the HD 4000, but still basically useless for anything other than internet/office work at native resolution. It's paired with a 2.6GHz i7, but the CPUs in these form factors are heavily gimped compared to desktop models.

I know I'm sounding downbeat about this. I love high PPI; I'm just realistic about it for the near to mid-term future. It's going to remain niche outside of mobile for an irritatingly long time, and many of those factors are just the limits of today's technology combined with how much of a premium the typical consumer is willing to pay for the displays. It's also important to remember that the vast majority of computer buyers don't care about high-end gaming in the least. We're the frontier here. And even on "everyone's a millionaire with a Veyron" AT, most people are running 1080p and upper-midrange rigs not even capable of 2560x1600.

Two people I know game at 1080p on a Retina MBP with an external monitor, far better than you would expect. The GT 650M is a far better mobile GPU than you would think.

Also, you're wrong about resolutions. Apple put a Retina screen in the iPhone, and now Sony, HTC, and even Oppo have 1080p screens in 5" phones. Samsung has a retina-class screen in the Nexus 10, less than 12 months after the Retina iPad came out.

Companies are putting out high-PPI screens now, and it's only a matter of time before someone takes the 15" Retina MBP screen and makes a 22" version with the same PPI. The 15" Retina MBP doesn't carry much of a real premium over the standard model, so I don't think these high-PPI panels are that expensive to produce.

Also, the world doesn't revolve around Best Buy, or the US for that matter. America is just one market; PC gaming is huge in Asia and Europe.

Consoles are the biggest problem for PC gaming, because of ports built around console-limited performance, like COD: Black Ops 2. The hardest part will be getting games made for high PPI.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Just wait till Apple launches a Retina iMac and we can snag cheap Chinese panels off eBay for a quarter of the price.

In 5+ years.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Two people I know game at 1080p on a Retina MBP with an external monitor, far better than you would expect. The GT 650M is a far better mobile GPU than you would think.

Also, you're wrong about resolutions. Apple put a Retina screen in the iPhone, and now Sony, HTC, and even Oppo have 1080p screens in 5" phones. Samsung has a retina-class screen in the Nexus 10, less than 12 months after the Retina iPad came out.

Companies are putting out high-PPI screens now, and it's only a matter of time before someone takes the 15" Retina MBP screen and makes a 22" version with the same PPI. The 15" Retina MBP doesn't carry much of a real premium over the standard model, so I don't think these high-PPI panels are that expensive to produce.

Also, the world doesn't revolve around Best Buy, or the US for that matter. America is just one market; PC gaming is huge in Asia and Europe.

Consoles are the biggest problem for PC gaming, because of ports built around console-limited performance, like COD: Black Ops 2. The hardest part will be getting games made for high PPI.

1) The GT 650M is junk.

2) Macs are also junk for gaming. The rMBP has significant ghosting and input lag issues, far more than any modern IPS panel.

3) 2560x1600 will never happen on a 22" PC monitor.

4) Higher resolution does not eliminate the need for AA.


Let's review how this entire thing started.

a) You say you're upset because AA causes a performance hit. This is because your 460s don't have enough VRAM, not to mention they're too old. You understand that VRAM use climbs rapidly as you increase resolution or AA, right?
b) You say that a higher resolution such as 2560x1600 will cause less of a performance drop than anti-aliasing. This is wrong. More pixels will always be less efficient than a lower resolution with added detail or anti-aliasing. This will also never change.

Like I said, I love higher resolution myself. But this is what I can't wrap my head around: you want higher resolution because AA causes too much of a performance drop. Well, sorry to say, the higher resolution will perform FAR worse. No GPU architecture will change this. More pixels to push will always mean less performance, period. 1080p with higher detail is more efficient, and this will not change because of consoles.

I think you're in for a big disappointment if you think a GT 650M or 460s are adequate for 2560x1600 gaming. They are so far from adequate, it isn't even funny.
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
You can get a pseudo-22" 2560x1600 monitor now. If you have a large enough desk, move your 30" back about another foot and it becomes what is effectively a 22" 2560x1600 monitor. You can always increase effective (angular) PPI by moving the screen farther away.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
We still need AA now with 30" screens; we'll still need it with 4K screens as well.

Exactly. Anyone saying higher resolution eliminates the need for AA is wrong.

And I find it incredible that anyone believes a "smaller" monitor would be more immersive. When it comes to a home theater for gaming, bigger is always better, and IMO that applies to PC gaming too.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
4) Higher resolution does not eliminate the need for AA.

If high enough, it absolutely does. When you have more pixels than your eyes are capable of discerning, you won't have visible aliasing. Basic signal theory.

2560x1600 will not ever happen on a PC monitor 22 inches in size like you suggest.

It is already shipping on 10" displays. While PC users of the last decade have demanded grossly inferior displays, the rest of the world has moved on.

2560x1600 at 22" is 137 PPI, which is, put another way, way too low to be anything but garbage. In the 27" range it is even worse at 112 PPI, and at 30" it is a painfully bad 101 PPI. 100 PPI is probably a good guideline for the line between 'garbage' and 'disgusting'.

2560x1600 is only considered high resolution in the pitiful PC display market at this point, sadly (10" consumer mass-market devices have that resolution, and 5" phones are 1080p).

High-end displays now are 3840x2160; on a 22" panel that would get us to 200 PPI, no longer embarrassingly bad. Still nowhere close to the 300 PPI pixel density normal people are getting used to, but a lot better than the utter crap we deal with now.
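The PPI figures quoted in this post check out against the standard diagonal formula:

```python
import math

def ppi(res_x, res_y, diag_in):
    # Pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(res_x, res_y) / diag_in

for diag in (22, 27, 30):
    print(f'2560x1600 @ {diag}": {ppi(2560, 1600, diag):.0f} PPI')
print(f'3840x2160 @ 22": {ppi(3840, 2160, 22):.0f} PPI')
```

This yields the 137, 112, 101, and 200 PPI values given above.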

I think PC enthusiasts need to wrap their heads around the fact that the entire rest of the world has blown past them in display technology, an area where the PC used to be leagues ahead of every other market.

More pixels will always be less efficient than a lower resolution with added detail or anti-aliasing. This will also never change.

Wrong again. The relative performance hit depends on the layout of the part and how its resources are allocated. I could build you a hypothetical part that would be faster at a higher resolution than with AA, and it would *absolutely* look better, by a stupidly huge amount.