AA versus Resolution


AzN

Banned
Nov 26, 2001

I don't think it matters much what resolution you run today, as long as you are running the LCD's native resolution. When you're used to running AA and then try to go back, you notice a whole lot more aliasing until you get used to not running AA again.

CRTs are different. They seem to give more detail when running higher resolutions because CRTs are capable of running a higher resolution at the same size as an LCD. For instance, a 19" CRT is capable of 1600x1200 or higher, but an LCD can only go 1280x1024 or 1440x900. Running higher resolutions means more detail and less aliasing on a CRT because more pixels are squeezed onto a smaller screen.
 

RockinZ28

Platinum Member
Mar 5, 2008
Yea, I would say that's valid; screen size can have a big impact. I have a 22" CRT that does 2048x1536, but I will not play games below 1600x1200. Even with 16xAA @ 1280x960 the games look horrid; it looks like I'm playing with Legos, the pixels are so big.
Since my screen is 22", going from 16x12 to 20x15 isn't as noticeable. Sure, the higher res looks better, but it's nothing like going from 12x9 to 16x12. Going to 20x15 has a monstrous impact on fps in new games and isn't possible for me, as I can never afford the best cards, let alone the two which may be necessary. So 16x12 with 2-4x AA is more practical.
I do enjoy games from like 2005 and before at 2048x1536 with AA at 85Hz though :).
 

n7

Elite Member
Jan 4, 2004
Originally posted by: jaykishankrk
I was wondering, what's the point in having AA beefed up to the maximum level possible when playing a game at, let's say, 1920x1080 resolution? Does it really make any difference?

What is the point of having AA beefed up at let's say 1280x800? :confused:

It's generally there to reduce the visibility of jaggies.


It's the same thing @ any other resolution.

I seriously don't get why people somehow assume aliasing magically disappears at higher resolutions.

The only time aliasing becomes less apparent is with a smaller pixel pitch. Pixel pitch, not resolution, is what determines visibility of aliasing.

Here's a list of what pixel pitch is for various common monitor sizes.
http://en.wikipedia.org/wiki/Dot_pitch
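
To put rough numbers on that, here's a quick back-of-the-envelope sketch (mine, not n7's; it assumes square pixels and ignores that a CRT's viewable diagonal is a bit smaller than its nominal size):

import math

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    # Approximate pixel pitch in mm, assuming square pixels.
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_inches * 25.4 / diagonal_px

# A 19" CRT at 1600x1200 vs. a 19" LCD at 1280x1024:
print(round(pixel_pitch_mm(19, 1600, 1200), 3))  # ~0.241 mm
print(round(pixel_pitch_mm(19, 1280, 1024), 3))  # ~0.294 mm

Same physical size, but the higher-resolution screen ends up with the smaller pitch, which is why the two usually get conflated.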
 

dookulooku

Member
Aug 29, 2008
Originally posted by: n7
The only time aliasing becomes less apparent is with a smaller pixel pitch. Pixel pitch, not resolution, is what determines visibility of aliasing.

That's not accurate. Given the same pixel pitch, a higher resolution should exhibit less aliasing.

None of the posts have really explained what aliasing is. Aliasing is caused by excessively fine detail (high frequencies) sampled at too low a resolution (sampling rate).

Now in computer graphics, a hard edge is essentially a step discontinuity when treated as a signal: the brightness jumps over technically zero width. A signal like that is not band-limited; its spectrum falls off slowly but never reaches zero, so it contains arbitrarily high frequencies. So in this case, raising the resolution DOES NOT get rid of the aliasing.

Raising the resolution does not get rid of edge aliasing for the reason I just mentioned, but it does reduce object aliasing. For example, let's say your field of vision is 90 degrees and you have a fine object that is only 1/10 of a degree wide. If you only have 640 pixels of horizontal resolution, then that object is only 0.7 pixels wide. This means that as you move around at a constant speed, maintaining the same distance, you will see it only about 70% of the time -- it will flicker. Now if you had 1600 pixels of horizontal resolution, then the object is 1.8 pixels wide, meaning it will cover 2 pixels about 80% of the time and 1 pixel about 20% of the time. Depending on the contrast between that object and its surroundings, this can be a huge reduction in aliasing.
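
To put the arithmetic from that example in one place (the 90-degree field of view and 0.1-degree object come from the paragraph above; the little helper is just for illustration):

def pixels_covered(fov_degrees, object_degrees, horizontal_res):
    # Width of a thin object in pixels at a given horizontal resolution.
    return horizontal_res * object_degrees / fov_degrees

print(pixels_covered(90, 0.1, 640))   # ~0.71: hits a sample point only ~70% of the time
print(pixels_covered(90, 0.1, 1600))  # ~1.78: always lands on at least one sample point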

When you turn on anti-aliasing, you increase the number of sampling points, meaning fine objects get sampled and drawn instead of flickering. The samples are usually also weighted (samples closer to the actual pixel location carry greater weight), so if the fine object is black, it gets drawn as gray. The weighting also causes an edge to get blended with its surroundings, reducing aliasing since the transition is no longer an instantaneous jump.
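
Here's a minimal 1D sketch of that idea (mine, not how any particular GPU does it; it uses four equally weighted sub-samples, whereas real AA modes use various sample patterns and weights):

def scene(x):
    # White background (1.0) with a thin black object between x = 10.0 and x = 10.3.
    return 0.0 if 10.0 <= x < 10.3 else 1.0

def point_sample(pixel):
    # No AA: one sample at the pixel centre.
    return scene(pixel + 0.5)

def supersample_4x(pixel):
    # 4x AA: four sub-samples averaged together.
    offsets = [0.125, 0.375, 0.625, 0.875]
    return sum(scene(pixel + o) for o in offsets) / 4

print(point_sample(10))    # 1.0: the 0.3-pixel-wide object is missed entirely
print(supersample_4x(10))  # 0.75: the object shows up as partial (gray) coverage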
 

blanketyblank

Golden Member
Jan 23, 2007
1920 x 1200 is definitely not middling if we are going by average users.
Most people will have 1280 x 1024 monitors or have upgraded to 1680 x 1050, but 1920 x 1080 and up is high end, with 2560 x 1600 being for professionals or the really high end.
I mean, really, if you wanted to you could just as easily add resolutions like 320 x 240, 640 x 480, 800 x 600, 1280? x 600 (whatever the new netbook res is), etc... if you're going to base the term "middling" on arbitrary resolution numbers and an arbitrary starting point.

That being said, resolution is a lot more expensive than anti-aliasing, especially with the newer cards, so it doesn't seem like a fair comparison. When I tried 1920 x 1080 with my 4850 I got about 50-60 fps in Devil May Cry 4, with like a 3-10 fps difference or so between the various anti-aliasing settings.
On the other hand, at 1360 x 768 I can get 60-100 fps with maxed settings.
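
For a rough sense of why the resolution jump costs so much more than the AA setting here (a simplification on my part: shading and fill-rate work scale roughly with the number of pixels drawn, ignoring the extra samples AA itself adds):

def megapixels(width, height):
    return width * height / 1e6

print(round(megapixels(1360, 768), 2))   # ~1.04 MP
print(round(megapixels(1920, 1080), 2))  # ~2.07 MP, roughly double the pixels to shade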
 

firewolfsm

Golden Member
Oct 16, 2005
Adding 4xAA is usually about equal to one step up in resolution in terms of jaggies; however, a lower resolution gives less actual detail, and even with a lot of AA it will seem blurred and simple.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: jaykishankrk

Considering one is running the game at 1920x1200 and 4xAA, irrespective of the games he plays, will there be a substantial amount of performance maintained throughout the gameplay without experiencing much of a glitch in frame movement?

I don't think one can experience smooth play when there is too much action, isn't it? And consider that one engages himself in a lot of action other than merely running around to admire the environment around him; this is what FPS games are made for, isn't it?
It depends on the game, and I don't run all games at 1920x1200 anyway; I generally use 1600x1200, 1760x1320 or 1920x1440.

As for performance, again, it all comes down to disabling game options to trade for AA if required. Who wants motion blur and costly shadow options when the entire screen is composed of serrated edges?

The only game I have where I can't manage 4xAA at 1600x1200 is Crysis, but that's only because I'm using a mid-range 4850 as an interim card.

I can assume a situation where this point holds its value. There can be situations where the character is aiming at a distant enemy camouflaged against the environment he is in; it can be tricky to knock him down if only 10% of his face is visible behind a certain particle which is jagged to an insane extent. Other than this, I hardly see AA playing any noticeable part in the gameplay.
You seriously underestimate the impact of aliasing. The fact is it's everywhere, but some people are used to it, so they don't know how much better things look without it. For those people it's worthwhile to run at 4xAA for a week or so and then go back to no AA. They'll spot jagged edges immediately.

As my screenshots plainly indicate, even 4xAA isn't enough.

Can you please elaborate on that?
When was the last time you saw an animated movie like Toy Story or Ice Age with jagged edges and sparkling/shimmering/crawling? It doesn't happen because the pixels are in the perfect position.

An offline render is too slow for real-time gaming but the effect can be replicated in real-time with a strong enough AA level.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: nRollo

For all practical purposes, 19x12 is the second highest resolution. High res CRTs are rare and very, very, very few people use them anymore.
By that same metric (market share and what people use) can we conclude Intel's GMA is a mid-range solution since the majority of the market uses it?

Can we also conclude a 4670 is the same as the GTX280 since their market share is a drop in the bucket compared to GMA? Can we lump the two cards as "high-end" just like you're lumping 1920x1200 and 2560x1600 as "high" based on what people use?

As far as the OP goes, love 25X16, and would take the sharpness of 25X16 w/o AA over low res with AA anytime.
That's because your device can't scale images at non-native resolutions like a CRT can.

Also, as far as sharpness goes, 2560x1600 is really 2133x1600 (1600 at a 4:3 ratio), since the extra width only adds more content on the screen but doesn't increase pixel density.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Again,

BFG makes accurate points, but the problem is that very few people use CRTs anymore. The technology is superior in some facets; in others it is inferior. As far as resolution, scaling, and color go, the CRT is the best ever. But it is plagued with geometry problems, requires more power, is heavier, and gives many people major headaches, myself included. Also, since CRTs are not manufactured anymore to my knowledge, they really cannot be considered. So in my opinion, the second highest resolution is 1920 x 1200, and I am therefore in agreement with Rollo.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: BFG10K
Originally posted by: nRollo

For all practical purposes, 19x12 is the second highest resolution. High res CRTs are rare and very, very, very few people use them anymore.
By that same metric (market share and what people use) can we conclude Intel's GMA is a mid-range solution since the majority of the market uses it?

Can we also conclude a 4670 is the same as the GTX280 since their market share is a drop in the bucket compared to GMA? Can we lump the two cards as "high-end" just like you're lumping 1920x1200 and 2560x1600 as "high" based on what people use?

As far as the OP goes, love 25X16, and would take the sharpness of 25X16 w/o AA over low res with AA anytime.
That's because your device can't scale images at non-native resolutions like a CRT can.

Also, as far as sharpness goes, 2560x1600 is really 2133x1600 (1600 at a 4:3 ratio), since the extra width only adds more content on the screen but doesn't increase pixel density.

BFG-

I know you want to keep pretending that we can speak of high resolution CRTs as if they are a part of the gaming landscape, but they aren't.

Your analogy about video cards doesn't work because people have a choice to use whatever video cards they like, but CRTs are largely extinct.

You might as well be arguing Glide vs. Direct3D. I'm sure there are used 3dfx cards available, and I imagine somewhere there's an outlet store that can sell you a new one, but for 99.9% of the world it's a moot point.