AA and LCDs... a question

Jun 14, 2003
10,442
0
0
At what point does it become pointless to keep upping the AA?

LCDs are just big grids of square pixels, so there's going to be some inherent aliasing just from that, isn't there?

You can't display a perfect diagonal line on an LCD, because it has to be made of squares in a staircase fashion.

So at what point do you get a situation where increased AA levels yield nothing in terms of IQ improvement, because you are limited by your monitor?

Are panels with larger pixel pitches going to be worse than ones with smaller pitches as well?

Or can LCDs change the orientation of their pixels?
 

Crafty35a

Senior member
Feb 2, 2003
253
0
76
I think you're misunderstanding the differences between LCDs and CRTs. A CRT also displays everything as a grid of square pixels, so it's equally impossible to draw a true perfect diagonal line on a CRT.

Anti-aliasing just makes it look better, basically by displaying the pixels along a diagonal edge as an average of the nearby colors, rather than one solid color or the other (that hard transition is what produces the aliasing effect). So antialiasing is really equally effective on a CRT and an LCD, all other things being equal.
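That per-pixel averaging can be sketched in a few lines: estimate how much of each edge pixel a shape covers by taking several sub-samples, then blend the pixel's color by that coverage. This is a toy supersampling sketch, not any real driver's algorithm; the function names are made up for illustration.

```python
def pixel_coverage(px, py, samples=4):
    """Fraction of sub-samples inside the region below the line y = x."""
    inside = 0
    for sy in range(samples):
        for sx in range(samples):
            # Sample at the center of each sub-cell within the pixel.
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y < x:  # below the diagonal edge
                inside += 1
    return inside / (samples * samples)

def shade(coverage, fg=255, bg=0):
    """Blend the edge color by coverage -- the 'average of nearby colors'."""
    return round(fg * coverage + bg * (1 - coverage))

# A pixel the diagonal cuts straight through gets an in-between gray,
# instead of snapping to pure black or pure white (the "staircase").
print(shade(pixel_coverage(0, 0)))
```

Upping the AA level corresponds to raising `samples`: more sub-samples give a finer coverage estimate, but past a point the blended result stops changing visibly.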
 
Originally posted by: Crafty35a
I think you're misunderstanding the differences between LCDs and CRTs. A CRT is also made up of square pixels, so it's equally impossible to draw a true perfect diagonal line on a CRT.

Anti-aliasing just makes it look better, basically by displaying the pixels along a diagonal edge as an average of the nearby colors, rather than one solid color or the other (that hard transition is what produces the aliasing effect).

oh right

So you're always going to have aliasing; it's just a matter of how well your eyes are deceived?
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
They definitely can't change the orientation of their pixels. Larger pixel pitches make aliasing worse to deal with, on both types of displays.

A CRT is not a grid of square pixels (see here for photos)...it is instead triads of circular phosphor dots behind a shadow mask, or vertical phosphor strips behind an aperture grille. The shadow mask makes aliasing appear more segmented than jagged, but the aperture grille is almost the same as an LCD in that regard.

To me, it looks like it can't get much better than 16xRGMS (enable with nHancer), which is pretty intensive. I wouldn't call anything remotely jagged-free until at least 8xS.

With sharp displays (LCDs and OLEDs) you can use a special type of antialiasing that works at the subpixel level. Applied to fonts, this is called subpixel rendering, and it's the basis of Microsoft's ClearType, which is actually a pretty poor implementation in practice. Graphics cards can't yet do this natively, and I'm not sure how it would work for colored graphics anyway.

Supersampling algorithms work on the premise that color 3n is three times as bright as color n (a linear intensity response), and use 2n as an in-between to blend them. The more accurately your monitor is calibrated, the less gray AA artifacting you will see, but it will never be completely eliminated. 10-bit color may expand the flexibility of AA as well, if it ever gets here.