Nobody is hating, man, come on, get real. FXAA, SMAA, and MLAA are not NVIDIA- or AMD-dependent features; they run on either vendor's card.
They're s**t. If you want to accept lowered IQ, go ahead and smile while you're fed a crap sandwich. I thought we were supposed to be moving forward with visual fidelity, not backwards.
The problem is the NVIDIA zealots who get pumped when anything NV-related is released and rage if someone doesn't jump on board. Good grief, I couldn't care less about NV and Unreal's tech demo; there isn't even a game out there that looks like this. I do care about devs, and now NVIDIA, trying to push a feature that reduces IQ.
FXAA = blur city [screenshot]
No FXAA = crisp textures and clarity [screenshot]
Four-year-old game looking better than its sequel in 2011 = priceless [screenshot]
Or compare
Crysis 1 PC vs 360
http://www.youtube.com/watch?v=xQ_1m...&feature=feedu
Crysis 2 PC vs 360
http://www.youtube.com/watch?v=xtHAP...424&feature=iv
Crysis 2 on PC vs 360 is not much different. Crysis 1 on PC vs 360 is like comparing a Ferrari to a Pinto.
I still don't see why you insist FXAA has no value, Groove. We have to face the reality that PC development comes second to consoles, and many console titles do not support native AA. This is where fast AA comes in handy: even if the game does not support native AA, MLAA/SMAA/FXAA will generally work.
The other scenario is a demanding game that would take too large a performance hit from MSAA (e.g., BF3). In these cases a user can take advantage of FXAA, which looks pretty good with little performance penalty, whereas MSAA carries a large performance penalty in comparison.
I can't argue against those who see value in it. For me, it's crap; I'd be willing to use more cards to be able to run deferred MSAA, but that's just me. It certainly has value if someone isn't concerned with IQ to the point that they'd prefer no aliasing at the expense of IQ. It's a trade-off, is all.
I prefer to have it all and would rather not compromise. I run BF3 on Ultra with 2x MSAA and FXAA off. I can't run 4x MSAA consistently; I get hitching with it, unfortunately. Two GK110s had better resolve that for me.
These new AA modes are popular exactly because of consoles and cross-platform titles. They are perfect for consoles because they come at such a low performance cost. I would prefer that PC games push boundaries, though, and not compromise on IQ. I have serious doubts they will ever eliminate the blur introduced by FXAA/MLAA/SMAA. Even today MSAA is not as good as SSAA, so all these post-AA modes have a long way to go.
Although I agree with the console part, I disagree with the rest.
Don't take such an obtuse view of post-process AA methods just because FXAA and MLAA are poor implementations. Full-scene SSAA hasn't been viable for a long time now, and MSAA + TrSSAA is quickly becoming less viable; the future is almost certainly some kind of post-process AA. They just need to get it right. SMAA is on the right track; it's surprisingly good already. Even in its baby stage (1x), it's superior to any other AA method in deferred-shading games, and it only gets better with SMAA S2x and 4x. It doesn't blur textures, and it provides polygon, transparency, and shader anti-aliasing with almost zero performance hit. You can't get that with any other AA method today except full-scene SSAA, which is simply not an option.
No, he was being serious with his straw man.
Anandtech - Where opinions and subjectivity go to die.
It does blur textures. It's becoming a necessary evil, with deferred-shading games and devs not wanting to implement proper MSAA.
Even one of the creators of SMAA acknowledges it is inferior, and that it's only good because other options are not being offered or because the performance hit of proper MSAA is too steep.
http://forum.beyond3d.com/showpost.php?p=1596576&postcount=1105
SMAA 4x actually incorporates 2x MSAA and 2x TrAA, which is why it looks improved. But the blur is still there, and it starts to carry a performance cost as well. It is a good solution, but full-on MSAA or SSAA is preferable. I think SSAA is always going to be too much for the newest games, but older games can make use of it. MSAA needs to be kept on the table.
Personally, I think it's a good option if you don't mind the reduced IQ, but there are hardware setups capable of pushing MSAA in deferred-shading games, and it needs to be kept on the table.
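For what it's worth, here's a toy sketch in Python/numpy of why contrast-driven post-AA filters soften textures. This is an illustration of the principle only, not the actual FXAA or MLAA algorithm (the real filters are far more selective): a filter that keys off local luma contrast can't tell a jaggy polygon edge from legitimate high-frequency texture detail, so both get averaged.

```python
# Toy illustration only -- NOT the real FXAA/MLAA algorithm.
# Blends any high-contrast pixel toward its neighbors, which is
# why texture detail gets softened along with the jaggies.
import numpy as np

def toy_post_aa(luma, threshold=0.1):
    # Clamp-pad so every pixel has 4 neighbors.
    p = np.pad(luma, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]

    # Local contrast: range of the pixel and its 4 neighbors.
    contrast = (np.maximum.reduce([luma, n, s, w, e])
                - np.minimum.reduce([luma, n, s, w, e]))

    # The "AA": a 5-tap average applied wherever contrast is high.
    blurred = (luma + n + s + w + e) / 5.0
    return np.where(contrast > threshold, blurred, luma)

# A checkerboard is pure texture detail with no geometric aliasing,
# yet every pixel trips the contrast test and gets blended.
checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)
print(toy_post_aa(checker))
```

The real algorithms try to restrict the blend to pixels that look like edges, but as the screenshots above show, high-contrast texture detail still gets caught.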
But MSAA by itself is not very effective on deferred shading. It's pretty awful on anything besides edges.
And nothing is perfect. Even FS SGSSAA blurs everything unless you force a negative LOD bias, which doesn't really restore the textures as they're meant to be seen; it's just a cheat, i.e., like bicubic downsampling with sharpening.
Considering the results SMAA 4x can potentially put out, at a fraction of the cost of MSAA (never mind MSAA + TrSSAA or FS SGSSAA), it's a very good solution.
Think of it this way:
FXAA = 50% IQ for a 5% hit
SMAA = 90% IQ for a 10% hit
MSAA + TrSSAA = 95% IQ for a 30%+ hit
FS SGSSAA = 99% IQ for a 50%+ hit
Which one would you take? I beg you to think practically, not strictly as an enthusiast. Which is going to benefit the majority of gamers? Which should developers focus on?
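Rough back-of-envelope in Python on what those percentages mean in practice, treating each "% hit" as a straight frame-rate reduction from an assumed 60 fps no-AA baseline (an assumption of mine; real scaling varies with scene and GPU):

```python
# The percentages are the illustrative figures above, not benchmarks,
# and the 60 fps no-AA baseline is an assumption.
base_fps = 60.0
options = [("FXAA", 0.05), ("SMAA", 0.10),
           ("MSAA + TrSSAA", 0.30), ("FS SGSSAA", 0.50)]

for name, hit in options:
    print(f"{name:>14}: {base_fps * (1.0 - hit):4.0f} fps")
```

Seen that way, the practical question is whether the last few percent of IQ is worth going from 57 fps down to 30.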
Lottes has 4.0 on his blog, but in motion SMAA T2x looks a lot better than both MLAA and FXAA and loses only slightly to SSAA visually, yet carries merely a ~2 ms performance penalty. It's a shame these post-AA modes are criticized so harshly, as they're built into DICE's and Crytek's newest engines, and into the consoles. SSAA isn't practical because of the performance hit, and this particular demo running on Kepler is a great example of the practical use of a low-cost post-AA: it can reduce artifacts like aliasing while keeping the frame rate within budget. The blur is the cost.
I think the "oh, this looks horrible" opinions will be overlooked once devs push visuals like Samaritan and Crysis out on platforms that aren't budgeted for traditional AA types. The poor man's AA for the console age. Heheh.
All of them. I want the choice of all of them, not just the best-performing but lowest-quality option. FWIW, you can force SGSSAA in Crysis 2; it's pretty intensive, but I could live with the frame rate. So the option should be there, IMO; there's no reason we have to have just one rather than all the possible choices.
[image]
GeForce 670 Ti
Blower-type fan, she must scream something awful!
I agree completely, but you already know you're not going to get that. We never have.
1) FS SSAA has never been an option in games.
2) MSAA is not always present, and at times it has AMD/NV quirks (requiring more work from the developer, e.g., the UE3 engine).
FXAA and SMAA can be injected painlessly on both vendors' cards, address shader-induced aliasing, and have almost no performance hit.
Which do you think is going to have all the focus for further optimization?
Say you could only pick between NO AA and one of the 4 options I gave you before, only one. Which would you pick?
"That board we ran the Samaritan demo on is the same board we're running the Unreal Engine 4 demos on, we can get so much more out of the card than what you saw in Samaritan."
Editor's note
I was actually at the event and personally asked Mark Rein about the differences in rendering year to year.
The three-way GTX 580 setup was run with 4x MSAA. The same demo was run yesterday on a single 'Kepler' card at the same settings, but with MSAA replaced by FXAA. The end result was a comparable image, with far less computational penalty for the pixel-shader-based AA than for conventional MSAA.
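For a sense of scale, a rough Python back-of-envelope on why dropping MSAA frees up so much headroom in a deferred renderer. The resolution, target count, and formats below are my assumptions, not Epic's actual numbers; the point is only that every G-buffer target has to carry all the MSAA samples, while FXAA is a single full-screen pass over the already-resolved image.

```python
# Assumed 1080p frame with a 4-target G-buffer at 8 bytes/pixel each
# (e.g. RGBA16F) -- illustrative numbers, not the Samaritan demo's.
width, height, samples = 1920, 1080, 4
gbuffer_targets, bytes_per_pixel = 4, 8

per_target = width * height * bytes_per_pixel
print(f"G-buffer, no MSAA: {gbuffer_targets * per_target / 2**20:5.0f} MiB")
print(f"G-buffer, 4x MSAA: {gbuffer_targets * per_target * samples / 2**20:5.0f} MiB")
# FXAA touches each resolved pixel once, for a roughly fixed cost
# (on the order of a millisecond on hardware of this era).
```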
For what it's worth, a 'Kepler' card is fundamentally faster than a GTX 580 at exactly the same settings, though I can't say by how much.
As for Unreal 4...
Edit: A little more info on the demo...
http://hexus.net/tech/news/graphics...itnessed-running-unreal-4-unreal-3-samaritan/
Have you tried version 2?
Also you said you had the same problem with games like WoW, where it would blur text.
I'm using SMAA 1.2 (FXAA does not work) and am not able to reproduce the blurring of text.
Could have also been motion blur.
Oh I hate motion blur.
Here is when I see motion blur IRL:
1. When looking at the road VERY close to my car while driving at 50+ MPH (the motion blur applies only to the ground very near the car, the first 2-3 feet; everything farther out is unblurred).
Here is when I don't see motion blur IRL:
1. Turning my head fast.
2. Turning my head very slowly.
3. Walking
4. Running
5. Jumping
6. Falling
And the worst part is their stupid argument of "In real life it's not like you have an HD camera floating six feet off the ground."
No, in real life I have eyes floating six feet off the ground, and they don't do this kind of horrid motion blurring.