Forget Anti-Aliasing - Where Is PPI?

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Since AA is so costly in performance terms (playing Far Cry 3 shows just how much), why don't we just abandon all this AA trickery and focus on driving more pixels?

AA is redundant when the image is sharper, so why not move to 250 PPI gaming monitors and forget AA?

AA makes images more tolerable but doesn't exactly add any quality, whereas more pixels do.

It must be simpler to drive more pixels than to figure out all the complicated algorithms needed to improve the image at current resolutions.

Anyone else agree?
 

biostud

Lifer
Feb 27, 2003
19,455
6,506
136
Buy a 1440p or 1600p monitor, which will be more demanding than AA, but also better looking.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Buy a 1440p or 1600p monitor, which will be more demanding than AA, but also better looking.

Adding inches along with resolution doesn't help, because you still need AA.

The whole point is higher pixel density.
 

biostud

Lifer
Feb 27, 2003
19,455
6,506
136
Higher PPI = more pixels (or a very small screen) = more calculations = lower FPS.
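That chain of reasoning is easy to put numbers on. A quick back-of-the-envelope sketch in Python (pixel counts only; real frame cost also depends on geometry, bandwidth, and the engine, so treat the ratios as illustrative):

```python
# Rough pixel counts for the resolutions discussed in this thread.
# Fragment shading work scales roughly linearly with pixel count,
# which is a first-order estimate only.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1200p": (1920, 1200),
    "1440p": (2560, 1440),
    "1600p": (2560, 1600),
}

def pixel_count(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

base = pixel_count("1080p")
for name in RESOLUTIONS:
    px = pixel_count(name)
    print(f"{name}: {px:,} pixels, {px / base:.2f}x the shading work of 1080p")
```

By this rough measure, 2560x1600 carries about twice the per-frame shading work of 1080p before any AA is even applied.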
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Higher PPI = more pixels (or a very small screen) = more calculations = lower FPS.

Whereas doing AA at a lower resolution incurs a much smaller performance hit.

What's more, even at 1080p, games still look like crap in terms of texture quality and polygon count. How about we first master rendering games at a low resolution before increasing PPI?
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Should also be noted that AA @1440/1600p looks better than AA @1080/1200p. AA looks better at high resolutions.

It's more irritating to me that game visuals have been relatively static since about two years after the X360 and PS3 launched, despite the many-fold increase in PC graphics hardware.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Whereas doing AA at a lower resolution incurs a much smaller performance hit.

What's more, even at 1080p, games still look like crap in terms of texture quality and polygon count. How about we first master rendering games at a low resolution before increasing PPI?

Sounds like guesswork with no real facts.

I have seen my FPS halve with 4x AA, and 8x can be unplayable.

A GTX 680 SLI setup can't even crank up AA in Far Cry 3 at 2560x1600.

But you could put 2560x1600 in a 22" monitor without the need for AA on a GTX 680 SLI.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Sounds like guesswork with no real facts.

I have seen my FPS halve with 4x AA, and 8x can be unplayable.

A GTX 680 SLI setup can't even crank up AA in Far Cry 3 at 2560x1600.

But you could put 2560x1600 in a 22" monitor without the need for AA on a GTX 680 SLI.

22" is too small
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I use the larger screen for more than the resolution.

Sounds like guesswork with no real facts.

I have seen my FPS halve with 4x AA, and 8x can be unplayable.

A GTX 680 SLI setup can't even crank up AA in Far Cry 3 at 2560x1600.

But you could put 2560x1600 in a 22" monitor without the need for AA on a GTX 680 SLI.

You're spouting nonsense. Some games STILL need AA even at 2560x1440; saying you don't is ridiculous. Some games benefit a ton. Granted, you might be able to get away with less, but AA will always reduce the temporal aliasing that resolution alone never removes.

The fact of the matter is that our graphics quality is far, far too low, and the polygon counts far too low as well. Have you ever seen a game that looks like a CG movie in real time during actual gameplay? No, because it cannot be done. Do you complain that Pixar movies and Avatar looked like crap at 1080p, or that a live-action movie doesn't look real enough at 1080p? I bet you never thought of that.

We need games to start adding more realistic visual effects with higher polygon counts on all the models.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If I could get a 22-inch 2560 monitor, I would in a heartbeat

Have you seen what 2560x1600 looks like on a 30-inch? Or what 2560x1440 looks like on a 27"? Fonts are small on both; using a smaller screen at those resolutions is ridiculous -- they are not usable on a 22". UNLESS you want to use Windows 8 / a tablet operating system with 7-inch tiles and fonts sized at 450% - and that is only within the operating system. Text within games would not be legible.

2560x1600 is already nearly twice the pixel density of 1080p. If you want 230 PPI packed into a 7-inch screen, get an iPad. That is obviously worthless for high-end gaming, but there you go. I highly doubt the OP has anything more than 1080p. If you want higher pixel density, get a 2560 monitor. Also, having used 2560 resolution for a long time now: anyone thinking that higher resolution/PPI eliminates the need for AA is mistaken. That is definitely not the case. You will still notice jaggies.
 
Last edited:

SomeoneSimple

Member
Aug 15, 2012
63
0
0
Sounds like guess work with no real facts.

I have seen my FPS half with 4x AA and 8X can be unplayable

No, it isn't. In every decent engine, MSAA is always cheaper than increasing the number of pixel samples (whether by raising the resolution or by using SSAA). That's the whole reason MSAA was invented in the first place.

There are a few cases (heavily deferred renderers) where MSAA costs nearly as much as simply supersampling the whole scene, but it should never exceed the cost of adding a similar number of pixel samples, unless the developers are extremely incompetent.
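The cost difference is easy to illustrate with raw sample counts. A minimal sketch, assuming an idealized forward renderer where 4x MSAA shades each covered pixel once but tests depth/coverage per sample, while SSAA shades every sample (illustrative counts, not measurements of any real engine):

```python
# Per-frame sample counts for 4x AA at 1080p in an idealized forward renderer.
# MSAA: pixel shader runs once per pixel (for primitive interiors), but
#       depth/coverage is evaluated at N samples per pixel.
# SSAA: every one of the N samples is fully shaded, like rendering at a
#       higher resolution and downscaling.
W, H, N = 1920, 1080, 4
pixels = W * H

msaa_shader_invocations = pixels       # shaded once per pixel
msaa_depth_samples      = pixels * N   # depth/coverage per sample
ssaa_shader_invocations = pixels * N   # every sample fully shaded

print(f"4x MSAA: {msaa_shader_invocations:,} shader runs, "
      f"{msaa_depth_samples:,} depth samples")
print(f"4x SSAA: {ssaa_shader_invocations:,} shader runs")
```

The gap in shader invocations (4x here) is exactly the work MSAA was designed to avoid, which is why it is normally far cheaper than brute-force supersampling or quadrupling the resolution.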

But you can put 2560x1600 in a 22" monitor without the need for AA on a 680 GTX SLI
An image without AA would still be aliased at that PPI, unless you sit two meters from your monitor.

The human eye has a knack for detecting unnatural edges and strong contrasts; even post-process AA would be a significant improvement over a non-AA'd image.

I gotta laugh at anyone saying this. Obviously haven't seen what 2560x1600 looks like on a 30 inch. Or what 2560x1440 looks like on a 27". Fonts are small on both, using a smaller screen for those resolutions is ridiculous. You're wrong if you think it is usable on a 22".
No, you're wrong, and you are generalizing horribly. Just because you have bad eyesight, or small fonts strain your eyes, doesn't mean that higher-PPI displays won't suit others.

I've used such 30" monitors for work, and at home I have been running 2352x1470, downsampled to the native resolution of my 22" monitor, for months now. So I know exactly what I am talking about.
Even non-native resolutions like that are a breath of fresh air, even while we suffer the non-existent scaling support in Microsoft's OSes. I'll take all the workspace I can get, and higher native resolutions would only make it better.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Sounds like guesswork with no real facts.

I have seen my FPS halve with 4x AA, and 8x can be unplayable.

A GTX 680 SLI setup can't even crank up AA in Far Cry 3 at 2560x1600.

But you could put 2560x1600 in a 22" monitor without the need for AA on a GTX 680 SLI.


No real facts, eh?

Higher resolution does not eliminate the need for AA. You will definitely still see jagged edges. Nor is it more efficient to add PPI rather than AA. Performance-wise, adding 2x MSAA is far cheaper than doubling the pixel count to go from 1080p to 2560x1600. Why on earth do you even NEED 8x MSAA? I hardly ever use anything higher than 2x.

Again, you're way off base if you think 2560x1600 is more efficient than anti-aliasing at 1080p or 1200p. Not to mention, I can assure you that 2560x1600 is not usable on a screen that small unless you're using a tablet operating system with super-huge 450% fonts.

Adding inches along with resolution doesn't help, because you still need AA.

The whole point is higher pixel density.

Higher resolution doesn't increase pixel density? What on earth are you talking about?
 
Last edited:

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
No real facts, eh?

Higher resolution does not eliminate the need for AA. You will definitely still see jagged edges. Nor is it more efficient to add PPI rather than AA. Performance-wise, adding 2x MSAA is far cheaper than doubling the pixel count to go from 1080p to 2560x1600. Why on earth do you even NEED 8x MSAA? I hardly ever use anything higher than 2x.

Again, you're way off base if you think 2560x1600 is more efficient than anti-aliasing at 1080p or 1200p. Not to mention, I can assure you that 2560x1600 is not usable on a screen that small unless you're using a tablet operating system with super-huge 450% fonts.

Really?

Guess I am the only person who has seen Diablo 3 run on a Retina MBP then. The resolution is huge, and believe me, it looked far better than on my AA-enabled x1050 MBP. Obviously unplayable, but it looked great.

I currently use 1920x1200 on a 24" monitor. Anything bigger than 24" on a desk distracts from the gameplay.

The people who talk about "small text" really need to understand that it's bad software, propagated by Windows, that is the problem.

I wonder how much of a GFX card's resources actually go to the AA intended to be used in games. GFX cards could be designed to drive more pixels rather than fancy features designed to compensate for low pixel density.

With the advent of 4K TVs, you can bet that PC games will go the way of resolution and PPI, just like consoles will too. It's only a matter of time.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
No real facts, eh?

Higher resolution does not eliminate the need for AA. You will definitely still see jagged edges. Nor is it more efficient to add PPI rather than AA. Performance-wise, adding 2x MSAA is far cheaper than doubling the pixel count to go from 1080p to 2560x1600. Why on earth do you even NEED 8x MSAA? I hardly ever use anything higher than 2x.

Again, you're way off base if you think 2560x1600 is more efficient than anti-aliasing at 1080p or 1200p. Not to mention, I can assure you that 2560x1600 is not usable on a screen that small unless you're using a tablet operating system with super-huge 450% fonts.



Higher resolution doesn't increase pixel density? What on earth are you talking about?

What are you talking about?

You said I should see 2560x1600 on a 27", which is no better than 1920x1200 on a 24" or 1080p on a 22".

I'm talking about PPI: putting 2560x1600 in a 22" or 24" screen.

Heck, what we really need is 4K resolution in a 24" monitor, and a focus on GFX cards that drive pixels, not fancy effects.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I currently use 1920x1200 on a 24" monitor. Anything bigger than 24" on a desk distracts from the gameplay.

Oh, so you're saying a SMALLER screen is better for an immersive gaming experience. Alrighty then :rolleyes:

With the advent of 4K TVs, you can bet that PC games will go the way of resolution and PPI, just like consoles will too.

Yeah, which is why the next-gen consoles are targeting 1080p. Because they will "go the way of PPI" :rolleyes:

And then this statement:

I wonder how much of a GFX card's resources actually go to the AA intended to be used in games. GFX cards could be designed to drive more pixels rather than fancy features designed to compensate for low pixel density.

I don't think you understand that adding more detail at 1080p with anti-aliasing is far, far more efficient than doubling the pixel count for 2560 resolution. The performance hit from resolution is tremendous; the performance hit from AA, not so much - especially for 2x MSAA or FXAA.
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Have you seen what 2560x1600 looks like on a 30-inch? Or what 2560x1440 looks like on a 27"? Fonts are small on both; using a smaller screen at those resolutions is ridiculous -- they are not usable on a 22". UNLESS you want to use Windows 8 / a tablet operating system with 7-inch tiles and fonts sized at 450% - and that is only within the operating system. Text within games would not be legible.

2560x1600 is already nearly twice the pixel density of 1080p. If you want 230 PPI packed into a 7-inch screen, get an iPad. That is obviously worthless for high-end gaming, but there you go. I highly doubt the OP has anything more than 1080p. If you want higher pixel density, get a 2560 monitor. Also, having used 2560 resolution for a long time now: anyone thinking that higher resolution/PPI eliminates the need for AA is mistaken. That is definitely not the case. You will still notice jaggies.

2560x1600 is not twice the pixel density. Pixel density only increases with resolution if the screen size stays constant, or at least doesn't grow so much that it negates the added resolution.
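For anyone wanting to check the numbers, PPI follows directly from the resolution and the panel diagonal. A small Python sketch using the standard PPI formula (panel sizes are nominal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1600, 30)))  # 30" monitor -> ~101 PPI
print(round(ppi(1920, 1080, 23)))  # 23" 1080p  -> ~96 PPI
print(round(ppi(2560, 1600, 22)))  # the hypothetical 22" panel -> ~137 PPI
```

So a 30" 2560x1600 panel is only modestly denser than a 23" 1080p one; the density jump the OP wants only appears if the resolution goes into a small panel, like the hypothetical 22" screen discussed above.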
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Really?

Guess I am the only person who has seen Diablo 3 run on a Retina MBP then. The resolution is huge, and believe me, it looked far better than on my AA-enabled x1050 MBP. Obviously unplayable, but it looked great.

I currently use 1920x1200 on a 24" monitor. Anything bigger than 24" on a desk distracts from the gameplay.

The people who talk about "small text" really need to understand that it's bad software, propagated by Windows, that is the problem.

I wonder how much of a GFX card's resources actually go to the AA intended to be used in games. GFX cards could be designed to drive more pixels rather than fancy features designed to compensate for low pixel density.

With the advent of 4K TVs, you can bet that PC games will go the way of resolution and PPI, just like consoles will too. It's only a matter of time.

You think 4K res will become standard overnight? HAHA... how long did it take for 1080p, and broadcasts still aren't in 1080p. What's even better is that you think consoles will handle 4K resolution in real-time gameplay. Look at what we have now: they don't even run at 1080p full time.

Anything bigger than 24" is distracting for gameplay, you say? What a load of crap. People have been gaming on screens measured in feet, not inches, for years, and they only mention how immersive it is, not how distracting. Sitting at a desk, a 27-30" display fills more of your field of vision. I don't know about you, but looking at a monitor and seeing the bezel around the image is the distracting part. A bigger screen, filling more of your vision, will not show the bezel as prominently.

Further, maybe you should do a little research on polygon counts.

Have you seen a game use a model with this amount of detail? And I bet it doesn't look blurry on a 1080p screen either, even though the image is low resolution and compressed to boot.

[Attached image: fn_scar_render_1.jpg]
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
What are you talking about?

You said I should see 2560x1600 on a 27", which is no better than 1920x1200 on a 24" or 1080p on a 22".

I'm talking about PPI: putting 2560x1600 in a 22" or 24" screen.

Heck, what we really need is 4K resolution in a 24" monitor, and a focus on GFX cards that drive pixels, not fancy effects.

If you want more pixels, the option is there. You can get a 2560 monitor, which has roughly double the pixel count of 1080p, with roughly double the performance hit.

Obviously, you're complaining about pixels but aren't willing to spend the money to upgrade your screen. The option is there if you want it. 2560x1600 isn't going to happen at 22-24 inches for the PC; as I said, that resolution is not remotely usable at that size. Furthermore, higher resolution (i.e., more pixels) is far less efficient than anti-aliasing. Your suggestion that a higher pixel count is somehow better than AA is ridiculous. AA has a far smaller performance hit.
 

SomeoneSimple

Member
Aug 15, 2012
63
0
0
Why on earth do you even NEED 8x MSAA? I hardly ever use anything higher than 2x.

I'm not sure if I should take your comment seriously; 2x MSAA is laughable, and I wouldn't even call it proper anti-aliasing. Unless you're using some kind of temporal, high-frequency sampling pattern (or a quincunx pattern), you are only applying AA along a single axis, either vertical or horizontal.

ALL forms of diagonal aliasing will still be present after applying 2x MSAA.
 
Last edited:

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
You think 4K res will become standard overnight? HAHA... how long did it take for 1080p, and broadcasts still aren't in 1080p. What's even better is that you think consoles will handle 4K resolution in real-time gameplay. Look at what we have now: they don't even run at 1080p full time.

Anything bigger than 24" is distracting for gameplay, you say? What a load of crap. People have been gaming on screens measured in feet, not inches, for years, and they only mention how immersive it is, not how distracting. Sitting at a desk, a 27-30" display fills more of your field of vision. I don't know about you, but looking at a monitor and seeing the bezel around the image is the distracting part. A bigger screen, filling more of your vision, will not show the bezel as prominently.

Further, maybe you should do a little research on polygon counts.

Have you seen a game use a model with this amount of detail? And I bet it doesn't look blurry on a 1080p screen either, even though the image is low resolution and compressed to boot.

[Attached image: fn_scar_render_1.jpg]

27" and 30" monitors on a desk are huge to sit less than a foot away from, especially when most people with 42" TVs normally sit 6-12 feet away.
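Viewing distance really is the crux here, and the common 1-arcminute visual-acuity rule of thumb lets you put numbers on it. A hedged sketch in Python (the 20/20 threshold is an approximation, not a hard limit):

```python
import math

def retina_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """PPI at which one pixel subtends the given visual angle.

    Above this density, a viewer with roughly 20/20 vision can no longer
    resolve individual pixels -- a rule of thumb, not a hard law.
    """
    theta = math.radians(acuity_arcmin / 60.0)          # 1 arcmin in radians
    pixel_size_in = 2 * viewing_distance_in * math.tan(theta / 2)
    return 1.0 / pixel_size_in

print(round(retina_ppi(24)))  # desk distance, ~2 ft -> ~143 PPI
print(round(retina_ppi(96)))  # couch distance, ~8 ft -> ~36 PPI
```

By this estimate, roughly 140 PPI already saturates the eye at a typical desk distance, while a couch-distance TV needs far less, which is why pixel density matters much more for monitors than for living-room sets.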

I'd rather have more detail on a 24" monitor, with high DPI giving smooth edges and even smooth text. Higher resolution brings far more benefits than MSAA.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
If you want more pixels, the option is there. You can get a 2560 monitor, which has roughly double the pixel count of 1080p, with roughly double the performance hit.

Obviously, you're complaining about pixels but aren't willing to spend the money to upgrade your screen. The option is there if you want it. 2560x1600 isn't going to happen at 22-24 inches for the PC; as I said, that resolution is not remotely usable at that size. Furthermore, higher resolution (i.e., more pixels) is far less efficient than anti-aliasing. Your suggestion that a higher pixel count is somehow better than AA is ridiculous. AA has a far smaller performance hit.

Talking to you is like banging my head against a wall.

I can't take you seriously when you claim high DPI makes text too small. That's a Windows problem, not a PPI issue.
 
Last edited:


lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Have you seen what 2560x1600 looks like on a 30 inch? Or what 2560x1440 looks like on a 27". Fonts are small on both, using a smaller screen for those resolutions is ridiculous -- they are not usable on a 22". UNLESS you want to use windows 8 / tablet operating system with 7 inch tiles and fonts sized at 450% - and that is only within the operating system. Text within games would not be legible.

2560x1600 is already nearly twice the pixel density of 1080p. If you want 230PPI packed into a 7 inch screen get an ipad. That is obviously worthless for high end gaming, but there you go. I highly doubt OP has anything more than 1080p. If you want higher pixel density , get a 2560 monitor. Also, having used 2560 resolution for a long time now - anyone thinking that higher resolution/PPI eliminates the need for AA. That is definitely not the case. You will still notice jaggies.

I missed where you asked if I have seen a 2560x1600 30-incher before. I've been looking at one daily since 2009. I initially had a U3007WFP-HC; now I have a U3011. I almost want to swap it for a 2560x1440 27-incher because it has a higher DPI.