No need for AA @ 4K?


Pottuvoi

Senior member
Apr 16, 2012
Yes, the only important metric is pixels per degree, but it seems that no one is using it.

Anyway, surface, edge, and BRDF/shader antialiasing methods are still needed even at very high resolutions.
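
For reference, here's a minimal sketch of that pixels-per-degree metric (the function name and the 24" 16:9 example numbers are illustrative assumptions, not from the thread):

```python
import math

def pixels_per_degree(h_res, diag_in, aspect, view_dist_in):
    """Pixels per one degree of visual angle at the screen centre."""
    # Physical panel width from the diagonal and the aspect ratio (e.g. 16/9).
    width_in = diag_in * aspect / math.hypot(aspect, 1.0)
    ppi = h_res / width_in
    # Length one degree of visual angle covers on the screen, in inches.
    one_degree_in = 2.0 * view_dist_in * math.tan(math.radians(0.5))
    return ppi * one_degree_in

# 24" 16:9 panels viewed from 24":
print(pixels_per_degree(1920, 24, 16 / 9, 24))  # ~38 ppd at 1080p
print(pixels_per_degree(3840, 24, 16 / 9, 24))  # ~77 ppd at 2160p
```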
 

SirPauly

Diamond Member
Apr 28, 2009
It depends on each individual's subjective tolerance for aliasing. I remember discussions when 1024x768 was a mainstream resolution, 1600x1200 was the dream resolution, and AA supposedly wasn't needed!

For me, 4K still needs AA, just much less of it. It also depends on the type of aliasing, considering there are many methods, each with its own pros and cons, strengths and weaknesses.
 

CakeMonster

Golden Member
Nov 22, 2012
I'm playing WoW now on my Dell 30" 2560x1600. Normal desk, viewing distance about 2 ft. With the patch earlier this week, they removed MSAA; the options are now FXAA and CMAA. Even though those new modes do take care of some of the shimmer and pixelation, the aliasing is greatly annoying me now. As a gamer of almost 20 years I certainly know what to look for, but I can't imagine how anyone would not be bothered by the way it looks now. It's grating to the degree that I enjoy the game much less.
 

bystander36

Diamond Member
Apr 1, 2013
CakeMonster said:
I'm playing WoW now on my Dell 30" 2560x1600. Normal desk, viewing distance about 2 ft. With the patch earlier this week, they removed MSAA; the options are now FXAA and CMAA. Even though those new modes do take care of some of the shimmer and pixelation, the aliasing is greatly annoying me now. As a gamer of almost 20 years I certainly know what to look for, but I can't imagine how anyone would not be bothered by the way it looks now. It's grating to the degree that I enjoy the game much less.

You should look into downsampling. It can help a lot in games with limited AA options.
 

jim1976

Platinum Member
Aug 7, 2003
Every resolution, no matter how big it is and how high a screen's DPI, needs AA.
The clarity of 4K is amazing, especially on a PC monitor, but that doesn't negate the fact that there is still aliasing, particularly with certain kinds of textures (alpha-tested ones) or in games whose engines are in serious need of it.

Nevertheless, if someone has to choose between AA and higher resolution, resolution ALWAYS comes first (leaving fps out of the equation for the sake of the example).
There's no such thing as 1080p with 4x MSAA or 4x SSAA (or DSR) looking like 4K.

So, to recap:
No AA can replace higher resolution (they don't produce the same IQ anyway), but that doesn't negate the need for AA at every resolution where it's possible.
 

dguy6789

Diamond Member
Dec 9, 2002
jim1976 said:
Every resolution, no matter how big it is and how high a screen's DPI, needs AA.

Not true at all. If the DPI is high enough that you cannot see individual pixels, then there is no discernible aliasing. Plenty of smartphones have already achieved this.
 

jim1976

Platinum Member
Aug 7, 2003
dguy6789 said:
Not true at all. If the DPI is high enough that you cannot see individual pixels, then there is no discernible aliasing. Plenty of smartphones have already achieved this.

No, they haven't. Try some alpha-test textures on them and you will see. :)
There's always a need for AA, simply because there's always aliasing. Not even the densest DPI can change this.
Look at some CGI studios' technical analyses and see why they are so "stupid" as to use insane amounts of AA to produce the best possible IQ.
 

BrightCandle

Diamond Member
Mar 15, 2007
It's definitely possible to beat the angular resolving power of the eye, such that the pixels not only disappear but we never see any aliasing effects, but it's a lot further off than you might imagine. The current estimate, based on what we know about the eye's resolution, is that today's 24" monitor would need a resolution of 12000p. Considering that at best we are now moving to 2160p with a 4K monitor at 24", we have at least 6x the density to go, and it took us a long time to get from DVD quality to HD to 4K.
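
As a rough sanity check of that figure (a sketch assuming a 24" 16:9 panel viewed from 24", and taking the common ~60 pixels-per-degree 20/20 figure as where single pixels stop resolving; aliasing artifacts are detectable well beyond that):

```python
import math

# Pixels per degree for a 24" 16:9 panel viewed from 24" (assumed numbers).
width_in = 24 * (16 / 9) / math.hypot(16 / 9, 1.0)    # ~20.9" panel width
one_degree_in = 2 * 24 * math.tan(math.radians(0.5))  # ~0.42" per degree

for v_res in (1080, 2160, 12000):
    ppi = (v_res * 16 / 9) / width_in
    print(f"{v_res}p: {ppi * one_degree_in:.0f} pixels per degree")
# 1080p: ~38 ppd, 2160p: ~77 ppd, 12000p: ~427 ppd -- far beyond the ~60 ppd
# pixel-visibility mark, which is roughly what a "never see any aliasing"
# threshold would seem to demand.
```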
 

Eric1987

Senior member
Mar 22, 2012
jim1976 said:
No, they haven't. Try some alpha-test textures on them and you will see. :)
There's always a need for AA, simply because there's always aliasing. Not even the densest DPI can change this.
Look at some CGI studios' technical analyses and see why they are so "stupid" as to use insane amounts of AA to produce the best possible IQ.

Do you know a specific app? I want to test that on my Nexus 5.
 

poohbear

Platinum Member
Mar 11, 2003
You will still want AA. The only thing I find questionable is every review using unrealistic settings such as 8x MSAA. Personally, I use FXAA for nearly everything now, as it has virtually no performance hit. Conversely, the performance hit for 8x MSAA is rather huge, so I only use that in older games (combined with SGSSAA at times).

AA was originally created for low resolutions; it was first advertised for 800x600 and below, for those of us who have been around long enough to remember when AA was introduced. That it's now supposedly "needed" at 4K is absurd.
 

Pottuvoi

Senior member
Apr 16, 2012
BrightCandle said:
It's definitely possible to beat the angular resolving power of the eye, such that the pixels not only disappear but we never see any aliasing effects, but it's a lot further off than you might imagine. The current estimate, based on what we know about the eye's resolution, is that today's 24" monitor would need a resolution of 12000p. Considering that at best we are now moving to 2160p with a 4K monitor at 24", we have at least 6x the density to go, and it took us a long time to get from DVD quality to HD to 4K.

Indeed. It's also important to know what causes the aliasing, as the source can be beyond the eye's resolution.

If you try the brute-force solution, you will notice it is not the best approach for something like the anisotropic surface of a CD with bump-mapped/parallax holes, even at very high resolution. ;)
One needs to be intelligent about how the samples are used; that gets a lot better quality, a lot more cheaply, than just increasing resolution everywhere.

I really hope we see resolution-independent shading in games soon; it should be quite nice for getting good fps at huge resolutions (you can use the same amount of shading work for different resolutions).

poohbear said:
AA was originally created for low resolutions; it was first advertised for 800x600 and below, for those of us who have been around long enough to remember when AA was introduced. That it's now supposedly "needed" at 4K is absurd.

Spatial AA was introduced to computer graphics to make it look decent, in film or otherwise (there were papers published in the '70s).
The Last Starfighter was rendered at 3000x5000, and it used AA... in 1984.
 

Black Octagon

Golden Member
Dec 10, 2012
BrightCandle said:
It's definitely possible to beat the angular resolving power of the eye, such that the pixels not only disappear but we never see any aliasing effects, but it's a lot further off than you might imagine. The current estimate, based on what we know about the eye's resolution, is that today's 24" monitor would need a resolution of 12000p. Considering that at best we are now moving to 2160p with a 4K monitor at 24", we have at least 6x the density to go, and it took us a long time to get from DVD quality to HD to 4K.

Based on what assumed viewing distance from the screen? As mentioned several posts ago, the same pixel density can look different depending on how far away you are.
 

Black Octagon

Golden Member
Dec 10, 2012
I also agree that the source makes a difference. It would be nice to have higher-resolution in-game textures to match the increases in display resolution.
 

UaVaj

Golden Member
Nov 16, 2012
I) The hardware side.

Per Apple's "retina" definition (if valid), retina is 300+ ppi at a 12"+ viewing distance.

4K on a 24" display (184 ppi) would qualify as retina at a viewing distance of 19"+, while normal desktop viewing is typically 20"-24".
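
A quick sketch of that arithmetic (assuming Apple's criterion scales as ppi x distance = constant; the helper names are made up for illustration):

```python
import math

def ppi(h_res, v_res, diag_in):
    """Pixel density from resolution and diagonal size."""
    return math.hypot(h_res, v_res) / diag_in

def min_retina_distance_in(panel_ppi, ref_ppi=300.0, ref_dist_in=12.0):
    # Same angular pixel size as the 300 ppi / 12" reference point, so the
    # threshold distance scales inversely with panel ppi.
    return ref_ppi * ref_dist_in / panel_ppi

p = ppi(3840, 2160, 24)
print(f"{p:.0f} ppi -> retina beyond {min_retina_distance_in(p):.1f} inches")
# 184 ppi -> retina beyond 19.6 inches
```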



II) The software side (the culprit).

The reality is that the textures being rendered are NOT 4K textures, so AA will always be needed to smooth out those lesser textures.
 

DominionSeraph

Diamond Member
Jul 22, 2009
SirPauly said:
It depends on each individual's subjective tolerance for aliasing. I remember discussions when 1024x768 was a mainstream resolution, 1600x1200 was the dream resolution, and AA supposedly wasn't needed!

A 21" (19" viewable) 1600x1200 CRT is 105 ppi; a 19" (18" viewable) one is 111 ppi. A 24" 1080p LCD is 92 ppi.
So that 15-year-old resolution still needs AA less than 1080p does.
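
Those densities check out; a quick way to verify (pixel count along the diagonal divided by the viewable diagonal in inches):

```python
from math import hypot

# ppi = pixels along the diagonal / viewable diagonal in inches
print(hypot(1600, 1200) / 19)  # ~105: 21" CRT (19" viewable) at 1600x1200
print(hypot(1600, 1200) / 18)  # ~111: 19" CRT (18" viewable) at 1600x1200
print(hypot(1920, 1080) / 24)  # ~92:  24" 1080p LCD
```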
 

Pottuvoi

Senior member
Apr 16, 2012
UaVaj said:
The reality is that the textures being rendered are NOT 4K textures, so AA will always be needed to smooth out those lesser textures.

For texture minification, mipmapping is the solution; without mipmapping, minification would be the greatest source of texture aliasing.
For texture magnification we have bilinear filtering, which basically cannot cause aliasing during magnification.

So no, texture size has no effect on the aliasing of a surface when proper filtering is used.
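
For illustration, a minimal sketch of what mipmap generation does (NumPy, assuming square power-of-two textures; GPUs build and sample these chains in hardware, with trilinear/anisotropic filtering between levels):

```python
import numpy as np

def build_mip_chain(tex):
    """Each mip level is a 2x2 box-filtered, half-resolution copy of the
    previous one, so a minified texel averages all the detail it covers
    instead of skipping most of it (the skipping is what aliases)."""
    levels = [tex]
    while levels[-1].shape[0] > 1:
        h, w, c = levels[-1].shape
        levels.append(levels[-1].reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3)))
    return levels

mips = build_mip_chain(np.random.rand(256, 256, 3))
print([m.shape[:2] for m in mips])  # (256, 256), (128, 128), ..., (1, 1)
```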
 

SolMiester

Diamond Member
Dec 19, 2004
OMG, I just got a 4K 28" screen for laughs! LMAO, wow, I can't read a thing, haha. I am amazed how much my sight must have degraded over the years to need AA on anything!
I have to use zoom just to read the browser page! LOL.
It's a 3840x2160 AOC U2868 something or other, and TBH I might have to change down to 1600p... LOL.
Edit: ...1440p.
 

sxr7171

Diamond Member
Jun 21, 2002
SirPauly said:
It depends on each individual's subjective tolerance for aliasing. I remember discussions when 1024x768 was a mainstream resolution, 1600x1200 was the dream resolution, and AA supposedly wasn't needed!

For me, 4K still needs AA, just much less of it. It also depends on the type of aliasing, considering there are many methods, each with its own pros and cons, strengths and weaknesses.

Exactly. That was because AA at those resolutions wasn't imaginable given the then-current hardware. Same now: AA at 4K is hard to imagine with the hardware we currently have. AA will always be needed to some extent, barring something like 500 ppi at one's viewing distance, and we'll probably still need some form of temporal AA even then.

I have to admit, though, that going beyond 4K for a TV setup, with a person sitting about 10 feet from a 60" TV, is overkill. I'd even say that at 12-15 feet from a 120" projector screen you won't need more than 4K. However, one will prefer that 4K image to be properly anti-aliased. Going to 8K on that setup would mostly reduce any remnant aliasing rather than add any discernible detail.
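
For what it's worth, running those two setups through the same pixels-per-degree arithmetic (a sketch assuming 16:9 screens and the usual ~60 ppd pixel-visibility figure for 20/20 vision) agrees:

```python
import math

def ppd(h_res, diag_in, view_in, aspect=16 / 9):
    # Pixels per degree of visual angle for a screen seen from view_in inches.
    width_in = diag_in * aspect / math.hypot(aspect, 1.0)
    return (h_res / width_in) * 2 * view_in * math.tan(math.radians(0.5))

print(ppd(3840, 60, 120))   # 60" 4K TV at 10 ft:      ~154 ppd
print(ppd(3840, 120, 144))  # 120" 4K screen at 12 ft: ~92 ppd
# Both sit past the ~60 ppd mark where individual pixels stop resolving,
# though edge crawl can stay detectable above it -- hence "still prefer AA".
```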
 

yhelothar

Lifer
Dec 11, 2002
4K on a regular desktop-sized monitor is essentially 2-4x supersampling AA, and SSAA is much higher quality than MSAA, or any other kind of AA for that matter.

You won't really need AA unless you have a 4K TV that's >50".
 

yhelothar

Lifer
Dec 11, 2002
jim1976 said:
No, they haven't. Try some alpha-test textures on them and you will see. :)
There's always a need for AA, simply because there's always aliasing. Not even the densest DPI can change this.
Look at some CGI studios' technical analyses and see why they are so "stupid" as to use insane amounts of AA to produce the best possible IQ.

Try 4x SSAA on a 1080p monitor. That's the highest-quality AA possible, since it's essentially rendering the scene at 4K and resizing it to 1080p. Now imagine that instead of resizing it back down to 1080p, you have a monitor that actually displays the 4K image. It's like 4x SSAA but higher quality.

Here's an example of 4x SSAA:
[image: 4xadmsaa.jpg]
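
A minimal sketch of the resolve step described above (NumPy; the "resize" in 4x SSAA is essentially a 2x2 box-filter average, a step a native 4K panel skips by displaying every sample directly; real resolves may use better filters than a plain box):

```python
import numpy as np

def ssaa_resolve_4x(hires):
    """Average each 2x2 block of the oversized render down to one pixel."""
    h, w, c = hires.shape
    return hires.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)  # stand-in for a 2160p render
frame_1080 = ssaa_resolve_4x(frame_4k)    # box-filtered down to 1080p
print(frame_1080.shape)                   # (1080, 1920, 3)
```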