Nvidia developing Subpixel Reconstruction Anti-Aliasing (SRAA) to combat ATI's MLAA

endlessmike133

Senior member
Jan 2, 2011
444
0
0
http://research.nvidia.com/publication/subpixel-reconstruction-antialiasing

Subpixel Reconstruction Antialiasing (SRAA) combines single-pixel (1x) shading with subpixel visibility to create antialiased images without increasing the shading cost. SRAA targets deferred-shading renderers, which cannot use multisample antialiasing. SRAA operates as a post-process on a rendered image with superresolution depth and normal buffers, so it can be incorporated into an existing renderer without modifying the shaders. In this way SRAA resembles Morphological Antialiasing (MLAA), but the new algorithm can better respect geometric boundaries and has fixed runtime independent of scene and image complexity. SRAA benefits shading-bound applications. For example, our implementation evaluates SRAA in 1.8 ms (1280x720) to yield antialiasing quality comparable to 4-16x shading. Thus SRAA would produce a net speedup over supersampling for applications that spend 1 ms or more on shading; for comparison, most modern games spend 5-10 ms shading. We also describe simplifications that increase performance by reducing quality.
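For anyone wondering what "combining single-pixel shading with subpixel visibility" actually means in practice, here's a rough CPU sketch I pieced together from the abstract alone. The buffer layout (2x2 subsamples per pixel), the 3x3 reconstruction neighborhood, the function names, and the affinity weights are all my guesses, not NVIDIA's implementation:

```cpp
// Hypothetical SRAA-style resolve, written from the abstract alone: color is
// shaded once per pixel, while depth/normal live at 2x2 subsamples per pixel;
// each subsample borrows the nearby shaded color whose geometry agrees best.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Geometric affinity between a subsample and a candidate shaded pixel:
// strong depth or normal disagreement drives the weight toward zero, so color
// is never borrowed across a geometric boundary (the property the abstract
// claims over MLAA). The falloff constant is arbitrary for this sketch.
static float affinity(float dSub, const Vec3& nSub, float dPix, const Vec3& nPix) {
    float dz = dSub - dPix;
    float nDot = nSub.x * nPix.x + nSub.y * nPix.y + nSub.z * nPix.z;
    return std::exp(-30.0f * dz * dz) * std::max(nDot, 0.0f) + 1e-6f;
}

// color          : W x H colors, shaded once per pixel (1x)
// depth / normal : 2W x 2H superresolution G-buffer (4 subsamples per pixel)
// out            : W x H antialiased result
void sraaResolve(int W, int H,
                 const std::vector<Vec3>& color,
                 const std::vector<float>& depth,
                 const std::vector<Vec3>& normal,
                 std::vector<Vec3>& out) {
    out.assign(static_cast<size_t>(W) * H, Vec3{});
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Vec3 pixel;
            for (int s = 0; s < 4; ++s) {              // the pixel's 4 subsamples
                int si = (2 * y + s / 2) * (2 * W) + (2 * x + s % 2);
                Vec3 sub; float wSum = 0;
                for (int oy = -1; oy <= 1; ++oy)       // reconstruct from the
                    for (int ox = -1; ox <= 1; ++ox) { // 3x3 shaded neighbors
                        int px = std::clamp(x + ox, 0, W - 1);
                        int py = std::clamp(y + oy, 0, H - 1);
                        int gi = (2 * py) * (2 * W) + (2 * px); // neighbor's geometry
                        float w = affinity(depth[si], normal[si],
                                           depth[gi], normal[gi]);
                        const Vec3& c = color[py * W + px];
                        sub.x += w * c.x; sub.y += w * c.y; sub.z += w * c.z;
                        wSum += w;
                    }
                pixel.x += sub.x / wSum;               // normalized reconstruction
                pixel.y += sub.y / wSum;
                pixel.z += sub.z / wSum;
            }
            // Box-filter the 4 reconstructed subsamples into the output pixel.
            out[y * W + x] = Vec3{pixel.x / 4, pixel.y / 4, pixel.z / 4};
        }
}
```

Note the cost is a fixed 4 subsamples x 9 neighbors of filtering per pixel no matter what's on screen, which lines up with the abstract's claim of fixed runtime independent of scene complexity, while shading is still evaluated only once per pixel.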

Why should anyone ever buy another ATI card?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Just read it on TPU and was about to post. Great news for nVidia card owners! MLAA is an amazing feature and the green team users will most definitely appreciate the increase in IQ, should it be similar to MLAA.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
And I will be boycotting any software developer that uses it!


On a serious note, I am always happy to see higher IQ being pushed by anyone.
 

Pohemi

Lifer
Oct 2, 2004
10,859
16,927
146
You really want to pay $650 and $400 for brand new cards again? $300 for old cards again?...

This is why I went ATI with my latest build. <$300 for a 6950 2GB card, versus $300-$550 for anything Nvidia 70-series or better (i.e. 470, 480, 570, etc.).
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Is this the one that's artifacting?

No, it stopped artifacting (and I really have no idea how it stopped)! :)
See, ATI cards fix themselves, among the other good they do in this world. Meanwhile, nVidia cards are responsible for global warming as well as the extinction of the dinosaurs. :D
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
Did they apply Microsoft's ClearType (subpixel rendering for fonts in Windows) to 3D? It also adds blurriness to all text, but makes it easier to read.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
And I will be boycotting any software developer that uses it!
1 lol was had right now.

I didn't read past the first page because it was so ridiculous, but I take it the guy was trolling?
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
And I will be boycotting any software developer that uses it!

Actually, game developers won't make the choice of using it or not; like MLAA, it would be a post-processing filter, and the choice would be yours to make.

boycott yourself then :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I welcome anyone to check the AA link in my signature for more info about AA.

MLAA didn't need a response; it was importing the "crappy console AA" to PC... the reason it was never used on PCs is that it's really bad compared to any form of real AA.

Why should anyone ever buy another ATI card?

I read through the nVidia statement, and SRAA is exactly the same as MLAA in all the details they provided. They claim it uses an unspecified "better algorithm", but nothing they said differs from MLAA or explains how it is better. I highly doubt it will be any less bad than MLAA.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
MLAA didn't need a response; it was importing the "crappy console AA" to PC... the reason it was never used on PCs is that it's really bad compared to any form of real AA.

I don't think "crappy console AA" uses any sort of intelligent algorithm to apply the blur effect; it just blurs the entire screen. I was comparing L4D2 on my PC and Xbox 360 side by side, and the blur effect on the Xbox 360 did nothing to reduce jaggies and was also not reproducible through any settings on my PC (including MLAA).

MLAA does do a good job of targeting only the transition areas that can use smoothing (see the sketch below), and it actually takes a substantial amount of resources I don't think consoles can spare, especially in shader-heavy games, since the post-effect runs on the shaders. It's unfortunate MLAA is so demanding, because I was hoping for a cheap AA method for Eyefinity, which it didn't turn out to be.
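To make the "only targets transitions" point concrete, here's a toy filter in the spirit of morphological AA. Real MLAA goes further (it classifies edge shapes such as L/Z/U patterns and computes coverage-based blend weights); this sketch, with its made-up threshold and names, only shows the edge-gated blending idea:

```cpp
// Toy morphological-AA-style pass: blend only where a luminance
// discontinuity is detected, leave flat regions untouched.
#include <cmath>
#include <vector>

struct Pixel { float r = 0, g = 0, b = 0; };

// Perceptual brightness; edges are detected on luma differences.
static float luma(const Pixel& p) { return 0.299f * p.r + 0.587f * p.g + 0.114f * p.b; }

void morphAAToy(int W, int H, const std::vector<Pixel>& in, std::vector<Pixel>& out) {
    const float kEdge = 0.1f;                 // discontinuity threshold (made up)
    out = in;
    for (int y = 0; y < H - 1; ++y)
        for (int x = 0; x < W - 1; ++x) {
            const Pixel& c     = in[y * W + x];
            const Pixel& right = in[y * W + x + 1];
            const Pixel& down  = in[(y + 1) * W + x];
            bool edgeH = std::fabs(luma(c) - luma(right)) > kEdge;
            bool edgeV = std::fabs(luma(c) - luma(down))  > kEdge;
            if (!edgeH && !edgeV) continue;   // flat area: left untouched
            // Blend across the detected discontinuity only; this is why
            // MLAA smooths jaggies without blurring the whole screen.
            const Pixel& n = edgeH ? right : down;
            Pixel& o = out[y * W + x];
            o.r = 0.5f * (c.r + n.r);
            o.g = 0.5f * (c.g + n.g);
            o.b = 0.5f * (c.b + n.b);
        }
}
```

The whole pass reads only the final color buffer, which is also why it runs on the shaders as a post-effect and works in deferred renderers where MSAA doesn't.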

So I think MLAA is quite a bit superior to what you're making it out to be, but still quite inferior to real AA.

I also don't see the point of SRAA so much, since nVidia usually supports real AA through its driver efforts and profiles anyway. The point of MLAA was for AMD to have a fail-safe AA method, so no one could say it doesn't support AA in Starcraft 2 again.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Check out Dead Space or Borderlands with MLAA and compare it with MSAA or SSAA. If anyone thinks the usual AA modes look superior in those two games, they must be blind.

MLAA is not the be-all and end-all AA mode. But in some games it offers a superior result compared to the regular modes - which either don't work at all or offer almost no IQ improvement. It's an extra feature that can be used separately or on top of the "usual" ones. Personally, I'm happy nVidia is following AMD in providing more IQ choices for consumers :thumbsup:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This thread is a train wreck. Either post about MLAA/SRAA or stay out.

Super Moderator BFG10K.