
Anyone here NOT use AA when gaming?

I never use AA. Even if I'm playing an older game that my rig could easily run with AA on, I don't use it. While I'm playing I'm usually too focused to notice jaggies.
 
I run 1920x1200. While I do see some improvement when running AA, it doesn't look THAT much better, and given that running AA at that resolution is too hard on any card out there, I choose not to. As I said before, at such a resolution the difference in quality is minimal anyway.
 
Originally posted by: OneOfTheseDays
I see a lot of people here are adamant about AA, but I've never really been bothered by jaggies at 1920x1200. If I have to run a game at 1024x768 then I usually turn on AA, but when running at my native res it's really a non-issue for me. Anyone else feel the same way?

Same here; no need for that drop in performance when I'm already pushing everything to the limit at 1920x1200.

AA was a good thing for texture-rich games with low poly counts; today's high poly counts and shaders don't have as much need for it.
 
I can easily see the need, even at 1920x1200, some games more so than others. Some engines are more prone to jaggies, and the Battlefield engines are to me. BF2 was very jaggy, even at such a high res; AA clears it up nicely. Of course, if it drops the framerate too low to be playable, I don't use it. But if it's a playable option, I always use AA. Jaggies are a pet peeve of mine, like nails on a chalkboard. It's one reason why I'm annoyed some games don't support AA.
 
I don't like AA. My current card can handle AA on older games, but I find that it makes textures a bit blurry. I'd rather have sharp textures than a screen that is slightly blurry from AA.
 
Depends on the performance. AA is the first thing to go for me to maintain a good framerate; then comes resolution.

16xAF is a MUST for me. I can't stand anything lower.
 
I have yet to have a rig capable of using AA on current games... I upgrade once a year, but I always turn off AA on new games and usually lower the res to keep things playable...
 
I am really surprised to see everyone's comments here, save for a few.

I am with BFG10K1 on this issue.

I will drop the resolution (provided it scales well) to enable AA. IMO, no game should be without it. But I will agree that some games need it more than others. Crysis does not need AA as much as many indoor-engine-based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

Even so, I can understand turning it off to get performance, but I cannot understand someone not using it when they have the performance available... I can't believe some people think AA looks bad... My guess is they were running the shatty Quincunx algorithm or whatever that was, because that was horrible.
 
Originally posted by: ArchAngel777
I am really surprised to see everyone's comments here, save for a few.

I am with BFG10K1 on this issue.

I will drop the resolution (provided it scales well) to enable AA. IMO, no game should be without it. But I will agree that some games need it more than others. Crysis does not need AA as much as many indoor-engine-based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

Even so, I can understand turning it off to get performance, but I cannot understand someone not using it when they have the performance available... I can't believe some people think AA looks bad... My guess is they were running the shatty Quincunx algorithm or whatever that was, because that was horrible.

Reducing resolution, especially on an LCD, then enabling AA makes no sense...
 
I can easily see the jagged edges at any resolution up to 2048x1536 (my monitor's max), but I keep it off anyway as I rarely have any performance to spare on my aging X1900XTX in relatively recent games.
 
My current card can handle AA on older games, but I find that it makes textures a bit blurry.
Unless you're running a super-sampling or CFAA derivative, AA will have absolutely no effect on texture quality; MSAA and CSAA only affect geometry edges.

16xAF is a MUST for me. I can't stand anything lower.
I totally agree.
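The distinction above is easy to see in terms of shading work; here's a rough back-of-the-envelope sketch (not any real graphics API) of why MSAA leaves textures alone while super-sampling can soften them:

```python
# MSAA runs the texture/pixel shader once per pixel and only multiplies
# coverage/depth samples at geometry edges; SSAA shades every sub-sample
# and then averages (downfilters), which is what can blur textures.

def shader_invocations(width, height, samples, mode):
    """Count texture-shading invocations per frame for a given AA mode."""
    pixels = width * height
    if mode == "msaa":           # shade once per pixel; extra samples are
        return pixels            # coverage/depth only, so edges smooth out
    if mode == "ssaa":           # shade at every sub-sample, then average;
        return pixels * samples  # the averaging can soften texture detail
    raise ValueError(mode)

# At 1920x1200 with 4 samples:
print(shader_invocations(1920, 1200, 4, "msaa"))  # 2304000
print(shader_invocations(1920, 1200, 4, "ssaa"))  # 9216000
```

Same edge quality on polygon silhouettes, but 4x the texture-shading work (and a slight blur from the downfilter) in the super-sampled case.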
 
I do sometimes on my Mac, as 1920x1200 is pushing it for a 7800GS. Kind of a tragedy, as LCDs 'need' AA far more than a CRT does.
 
Originally posted by: Throckmorton
Originally posted by: ArchAngel777
I am really surprised to see everyone's comments here, save for a few.

I am with BFG10K1 on this issue.

I will drop the resolution (provided it scales well) to enable AA. IMO, no game should be without it. But I will agree that some games need it more than others. Crysis does not need AA as much as many indoor-engine-based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

Even so, I can understand turning it off to get performance, but I cannot understand someone not using it when they have the performance available... I can't believe some people think AA looks bad... My guess is they were running the shatty Quincunx algorithm or whatever that was, because that was horrible.

Reducing resolution, especially on an LCD, then enabling AA makes no sense...


Explain.
 
I kind of agree with Throckmorton here.
When not running at an LCD's native resolution, everything becomes blurry, which negates the impact of running AA.

That is of course unless you have one of those killer Gateway 30 inchers...
 
Originally posted by: taltamir
you have a 2048x1536 monitor with a X1900XTX? How big and expensive was your monitor?

It's a CRT. It cost $700 or something a few years ago. It's easily superior to anything else I have seen.

I haven't upgraded the card yet; until a month ago it made no sense to buy the 8800 GTS/GTX cards given how long they had been out, and now that the 8800GT is out, I have no time to play games. 😛 If that D9E February rumor is true, I'll just wait for that.
 
Originally posted by: OneOfTheseDays
I see a lot of people here are adamant about AA, but I've never really been bothered by jaggies at 1920x1200. If I have to run a game at 1024x768 then I usually turn on AA, but when running at my native res it's really a non-issue for me. Anyone else feel the same way?
Yes, but the native res of my LCD is 1280x1024 (primary). Haven't moved to gaming on the secondary yet (1680x1050).
 
Originally posted by: ArchAngel777
But I will agree that some games need it more than others. Crysis does not need AA as much as many indoor-engine-based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

QFT. If my card can handle it, I'll use AA. But Crysis, for instance, just doesn't really need it due to the mostly organic outdoor environment (not that there is any hardware available that could run it with high AA at a reasonable resolution and high settings anyway). FEAR is indeed the perfect example of a game that looks like a sack of shit without lots of AA.

 
I have a very clear set of criteria for using AA:

I have the Samsung 204B, native res 1600x1200@60Hz, so the first thing I do is set the game to 1600x1200 and medium detail with VSync.

If I hit 60FPS I raise the detail (not shadows, though); if I still hit 60 (like I do now in Hellgate: London, which is very fun BTW 🙂 ) I'll start using AA.

Usually I set up my games to hit 50-60FPS, but that's a personal choice.
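That tuning procedure can be sketched as a short decision chain; here's a minimal, hypothetical version (the `measure_fps` benchmark and the settings keys are made up for illustration):

```python
def tune(measure_fps, target=60):
    """Stepwise tuning: native res + medium detail + VSync first,
    then raise detail (shadows stay at medium), then add AA —
    each step only if the previous one still holds the target FPS."""
    settings = {"resolution": (1600, 1200), "vsync": True,
                "detail": "medium", "shadows": "medium", "aa": 0}
    if measure_fps(settings) >= target:
        settings["detail"] = "high"   # raise detail, but not shadows
        if measure_fps(settings) >= target:
            settings["aa"] = 4        # AA is the last thing enabled
    return settings

# With a stub benchmark that always reports 60 FPS, everything gets enabled:
print(tune(lambda cfg: 60))
# With one that reports 45 FPS, the settings stay at the safe baseline:
print(tune(lambda cfg: 45))
```

The point of the ordering is the same as in the post: resolution and baseline detail are fixed first, and AA is only spent once there's framerate headroom left over.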
 
I see no need for it, especially with the huge performance hit it usually incurs. I guess I'm too focused on actually playing the game to notice if an edge is slightly jaggy or not.

Solid FPS > Slightly smoother lines

/shrug
 
The only game I don't use it on is Crysis, for obvious reasons... I can run at higher graphics detail if I keep AA off, and it looks decently nice at 1280x1024 without it, anyway.

In 2 years when I have hardware that can blow it away, I'll probably turn AA all the way up, like I do with every other game.
 
Priorities:

#1 - Run native resolution of 1680x1050.

#2 - Max out in-game quality settings.

#3 - Enable 2xAA and 8xHQAF or better.

#4 - Upgrade if I can't do #1, think seriously about upgrading if I can't do #2, and upgrade if there is a significant upgrade for very little money out of pocket if I can't do #3. Crysis is the notable exception here. I play this game with 8xHQAF and no AA at native resolution and I'll drop quality settings to do so. I would love to run Crysis with at least 2xAA due to major foliage jaggies, but it just isn't happening with a single 8800GT.
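The priorities above amount to a small decision procedure; a hypothetical sketch (function and argument names are invented, not from any real tool):

```python
def upgrade_advice(native_res_ok, max_quality_ok, aa_af_ok,
                   cheap_upgrade_available=False):
    """Apply the #1-#4 priorities: native res is non-negotiable,
    maxed quality settings are worth serious thought, and AA/AF
    alone only justifies an upgrade when it's nearly free."""
    if not native_res_ok:
        return "upgrade"             # can't do #1: upgrade outright
    if not max_quality_ok:
        return "consider upgrading"  # can't do #2: think seriously
    if not aa_af_ok:                 # can't do #3: only if it's cheap
        return "upgrade" if cheap_upgrade_available else "wait"
    return "keep current card"
```

Crysis would be the exception the post calls out: #2 is deliberately sacrificed (quality settings dropped) to keep #1 and the 8xHQAF part of #3.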
 
Originally posted by: adairusmc
I game at 1680x1050 - and I have never seen the need to run any AA.

I also game at 1680x1050 on a 20" monitor. I remember playing Oblivion with HDR and no AA at first, and then the hack to enable HDR+AA came out and it made a big difference visually, big enough that I never wanted to go back even though I had to sacrifice performance.
 