Anyone here NOT use AA when gaming?


SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I never use AA. Even if I'm playing an older game that my rig could easily run with AA on, I don't use it. While I'm playing I'm usually too focused to notice jaggies.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
I run 1920x1200. While I do see some improvement when running AA, it doesn't look THAT much better, and given that AA at that resolution is too hard on any card out there, I choose not to. As I said before, at that resolution the difference in quality is minimal anyway.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
Originally posted by: OneOfTheseDays
I see a lot of people here are adamant about AA, but for me I've never really been bothered by jaggies at a 1920x1200 resolution. If I have to run a game at 1024x768 then I usually turn on AA, but when running at my native res it's really a non-issue for me. Anyone else feel the same way?

Same here, no need for that drop in performance when I'm already pushing everything to the limit at 1920x1200.

AA was a good thing for texture-rich games with low poly counts; today's high poly counts and shaders don't have as much need for AA.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I can easily see the need, even at 1920x1200. Some games more so than others. Some engines are more prone to jaggies, and the Battlefield engines are to me. BF2 was very jaggy, even at such a high res. AA clears it up nicely. Of course, if it drops the frames too low to be playable for me, I don't use it. But if it's a playable option, I always use AA. Jaggies are a pet peeve of mine... like nails on a chalkboard. It's one reason why I'm annoyed some games don't support AA.
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
I don't like AA. My current card can handle AA on older games; however, I find that it makes textures a bit blurry. I'd rather have sharp textures than a screen that is slightly blurry from AA.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Depends on the performance. AA is the first thing to go for me to maintain a good framerate. Then comes resolution.

16xAF is a MUST for me. I can't stand anything lower.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I have yet to have a rig capable of using AA on current games... I upgrade once a year but I always turn off AA on new games and usually lower the res to keep things playable...
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I am really surprised to see everyone's comments here, save for a few.

I am with BFG10K1 on this issue.

I will drop the resolution (provided it scales well) to enable AA. IMO, I don't think any game should be without it. But I will agree that some games are more prone to it than others. Crysis does not need AA as much as many indoor engine based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

Even so I can understand turning it off to get performance, but I cannot understand someone not using it when they have the performance available... I can't believe some people believe AA looks bad... My guess is they are running the shatty algorithm of Quincunx or whatever that was, because that was horrible.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
Originally posted by: ArchAngel777
I am really surprised to see everyone's comments here, save for a few.

I am with BFG10K1 on this issue.

I will drop the resolution (provided it scales well) to enable AA. IMO, I don't think any game should be without it. But I will agree that some games are more prone to it than others. Crysis does not need AA as much as many indoor engine based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

Even so I can understand turning it off to get performance, but I cannot understand someone not using it when they have the performance available... I can't believe some people believe AA looks bad... My guess is they are running the shatty algorithm of Quincunx or whatever that was, because that was horrible.

Reducing resolution, especially on an LCD, then enabling AA makes no sense...
 

CP5670

Diamond Member
Jun 24, 2004
5,667
766
126
I can easily see the jagged edges at any resolution up to 2048x1536 (my monitor's max), but I keep it off anyway as I rarely have any performance to spare on my aging X1900XTX in relatively recent games.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You have a 2048x1536 monitor with an X1900XTX? How big and expensive was your monitor?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: kmmatney
My current card can handle AA on older games; however, I find that it makes textures a bit blurry.
Unless you're running super-sampling or a CFAA derivative, AA will have absolutely no effect on texture quality; MSAA and CSAA only affect geometry edges.
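
To illustrate (a rough sketch only, assuming a plain GLFW/OpenGL setup rather than any particular game): you ask the driver for a multisampled framebuffer, and it resolves the extra coverage samples along triangle edges; each fragment still does its texture fetches once, so MSAA has nothing to blur textures with.

```cpp
// Minimal sketch: request a 4x multisampled framebuffer with GLFW.
// MSAA stores extra coverage samples per pixel and resolves them at
// geometry edges; shading (and texture sampling) still runs once per
// fragment, which is why it can't soften texture detail.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_SAMPLES, 4);   // 4x MSAA; 0 would turn it off
    GLFWwindow* win = glfwCreateWindow(1280, 1024, "MSAA sketch", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw geometry here; edge pixels get blended from the extra
        // coverage samples, interior texels are untouched ...
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```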

Originally posted by: LittleNemoNES
16xAF is a MUST for me. I can't stand anything lower.
I totally agree.
 

Dainas

Senior member
Aug 5, 2005
299
0
0
I do sometimes on my Mac, as 1920x1200 is pushing it for a 7800GS. Kind of a tragedy, as LCDs 'need' AA far more than a CRT does.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Throckmorton
Reducing resolution, especially on an LCD, then enabling AA makes no sense...


Explain.
 

Blurry

Senior member
Mar 19, 2002
932
0
0
I kind of agree with Throckmorton here.
When not running at an LCD's native resolution, everything becomes blurry, which negates the impact of running AA.

That is of course unless you have one of those killer Gateway 30 inchers...
 

CP5670

Diamond Member
Jun 24, 2004
5,667
766
126
Originally posted by: taltamir
You have a 2048x1536 monitor with an X1900XTX? How big and expensive was your monitor?

It's a CRT. It cost $700 or something a few years ago. It's easily superior to anything else I have seen.

I haven't upgraded the card since then. Before a month ago it made no sense to buy the 8800 GTS/GTX cards given how long they had been out, and now that the 8800GT is out, I have no time to play games. :p If that D9E February rumor is true, I will just wait for that.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,415
404
126
Originally posted by: OneOfTheseDays
I see a lot of people here are adamant about AA, but for me I've never really been bothered by jaggies at a 1920x1200 resolution. If I have to run a game at 1024x768 then I usually turn on AA, but when running at my native res it's really a non-issue for me. Anyone else feel the same way?
Yes, but the native res of my LCD is 1280x1024 (primary). Haven't moved to gaming on the secondary yet (1680x1050).
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: ArchAngel777
But I will agree that some games are more prone to it than others. Crysis does not need AA as much as many indoor engine based games because there are very few straight lines. FEAR is a good example of a game NEEDING it. FEAR looks like ASS without AA.

QFT. If my card can handle it, I'll do AA. But Crysis, for instance, just doesn't really need it due to the mostly organic outdoor environment (not that there is any hardware available that could run it with high AA at a reasonable resolution and high settings anyway). FEAR is the perfect example indeed of a game that looks like a sack of shit without lots of AA, though.

 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
I have a very clear set of criteria for using AA:

I have the Samsung 204B, native res 1600x1200@60Hz, so the first thing I do is set the game to 1600x1200 and medium detail with VSync.

If I hit 60 FPS I raise the detail (not shadows, though), and if I still hit 60 (like I do now in Hellgate London, which is very fun BTW :) ) I'll start using AA.

Usually I set up my games to hit 50-60 FPS, but that's a personal choice.
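
Written out as a sketch (purely illustrative; the settings struct and measure_fps() are made-up stand-ins, not any real game or driver API), the order is: native res + medium detail + VSync first, then detail, then AA:

```cpp
// Illustrative only: the tuning order described above as a tiny program.
// measure_fps() is a hypothetical benchmark hook, not a real API.
#include <iostream>

struct Settings {
    int width = 1600, height = 1200;   // native res first, always
    bool vsync = true;
    int detail = 1;                    // 0 = low, 1 = medium, 2 = high (shadows left alone)
    int aa_samples = 0;                // 0 = AA off
};

double measure_fps(const Settings&) { return 60.0; }  // stand-in benchmark

int main() {
    Settings s;                        // start: 1600x1200, medium detail, VSync on
    if (measure_fps(s) >= 60.0) {
        s.detail = 2;                  // hitting the VSync cap? raise detail first
        if (measure_fps(s) >= 60.0)
            s.aa_samples = 4;          // still capped? only now turn on AA
    }
    std::cout << "detail=" << s.detail << ", AA=" << s.aa_samples << "x\n";
}
```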
 

RyanPaulShaffer

Diamond Member
Jul 13, 2005
3,434
1
0
I see no need for it, especially with the huge performance hit it usually incurs. I guess I'm too focused on actually playing the game to notice if an edge is slightly jaggy or not.

Solid FPS > Slightly smoother lines

/shrug
 

manowar821

Diamond Member
Mar 1, 2007
6,063
0
0
The only game I don't use it on is Crysis, for obvious reasons... I can run at higher graphics detail if I keep AA off, and it looks decently nice at 1280x1024 without it, anyway.

In 2 years when I have hardware that can blow it away, I'll probably turn AA all the way up, like I do with every other game.
 

Golgatha

Lifer
Jul 18, 2003
12,400
1,076
126
Priorities:

#1 - Run native resolution of 1680x1050.

#2 - Max out in-game quality settings.

#3 - Enable 2xAA and 8xHQAF or better.

#4 - Upgrade if I can't do #1, think seriously about upgrading if I can't do #2, and upgrade only if there is a significant step up for very little money out of pocket if I can't do #3. Crysis is the notable exception here. I play it with 8xHQAF and no AA at native resolution, and I'll drop quality settings to do so. I would love to run Crysis with at least 2xAA due to major foliage jaggies, but it just isn't happening with a single 8800GT.
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Originally posted by: adairusmc
I game at 1680x1050 - and I have never seen the need to run any AA.

I also game at 1680x1050 on a 20" monitor. I remember playing Oblivion with HDR and no AA at first, and then the hack to enable HDR+AA came out and it made a big difference visually, big enough that I never wanted to go back even though I had to sacrifice performance.