
1920x1200 resolution: do you use AA/AF?

I use 16xAF and 4xAA as mandatory minimums, unless the game doesn't allow AA.

For Crysis I use 2xAA as 4xAA is too slow.
 
At 1920x1200? AF, yes, absolutely, 8x or 16x; there really isn't a performance difference between the two. AA when I'm able to. Crysis gets 0x, but everything else gets at least 2x.
 
It's typically no AA for me in recent games. The jagged edges are easily noticeable at any resolution up to 2048x1536 (the max I can use), but I can't spare any frames for AA in modern titles. I use 4x or 6x in most games older than two or three years though.

I have AF set to 16x for everything. AF's performance hit on most modern cards is generally so small that it makes no sense not to use it.
 
I'll turn on AA if the game runs smoothly enough with it on. Generally speaking, though, jaggies don't bother me all that much, at least nowhere near as much as low FPS does.
 
Now the question becomes: do you use v-sync? I have noticed that my fps is generally locked to no more than 60fps when v-sync is on. Does this make any difference to anyone? I wish there were a way around this, but tearing is a big issue.

I was just curious how many people on these forums use AA in their games. AF carries no penalty, I know.
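The 60fps cap mentioned above can be sketched roughly. With plain double-buffered v-sync, a finished frame waits for the next monitor refresh tick before it is displayed, so the effective frame time is rounded up to a whole number of refresh intervals. This is a simplified model (real drivers, triple buffering, and render-ahead behave differently), but it shows both the cap at the refresh rate and the sharp drop when a frame just misses a tick:

```python
# Simplified sketch of the v-sync frame-rate cap (not from this thread).
import math

def effective_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Frame rate with v-sync on: frame time is quantized to refresh ticks."""
    tick_ms = 1000.0 / refresh_hz
    ticks = max(1, math.ceil(render_ms / tick_ms))  # wait for the next refresh
    return 1000.0 / (ticks * tick_ms)

# A GPU fast enough for 120 fps is held to 60 fps on a 60 Hz display...
print(effective_fps(8.3))         # ~60
# ...and a frame that just misses a tick (18 ms, ~55 fps uncapped) drops to 30 fps.
print(effective_fps(18.0))        # ~30
# On a 100 Hz CRT, the same 8.3 ms frame runs at 100 fps.
print(effective_fps(8.3, 100.0))  # ~100
```

This also explains the CRT comments later in the thread: a higher refresh rate raises the v-sync ceiling.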
 
Originally posted by: PhatoseAlpha
Don't most LCDs have a refresh rate of 60hz anyway, so more fps wouldn't help?

But you see, since you're capped at 60fps, the game runs at a lower frame rate than it would without v-sync.
 
Originally posted by: cmdrdredd
Now the question becomes: do you use v-sync? I have noticed that my fps is generally locked to no more than 60fps when v-sync is on. Does this make any difference to anyone? I wish there were a way around this, but tearing is a big issue.

I was just curious how many people on these forums use AA in their games. AF carries no penalty, I know.

As a general rule, I try to use at least 4xAA/16xAF and v-sync at my native rez of 1920x1200. Of course, there are situations when that is not possible. V-sync is usually the first thing I sacrifice, and then I try 2xAA and maybe a bump down in resolution. I really dislike 0xAA.
 
The most I use is 4x at 1920x1200. In games that take a big hit with AA, such as UT3 and Crysis, I just disable AA altogether.

The lack of AA still makes the image look 'dirty', even at 1920x1200.

I use 16x AF in everything.
 
Now I wonder: if everyone says games look crummy at 1920x1200 with no AA, what about the times when you had to use 1024x768 or 1280x1024 and that was the absolute highest resolution you had? I know some games ran poorly at those resolutions way back when; how did you manage? 😉
 
Originally posted by: cmdrdredd
Now I wonder: if everyone says games look crummy at 1920x1200 with no AA, what about the times when you had to use 1024x768 or 1280x1024 and that was the absolute highest resolution you had? I know some games ran poorly at those resolutions way back when; how did you manage? 😉

:laugh: werd, DOG!


I can see the future:
2D image? Let me get this right, you stared at a flat monitor? How is that possible? How did you manage!?!? Aight later, pluggin' back in my 7-dimension Virtual Reality. :thumbsup:
 
Originally posted by: cmdrdredd
Now I wonder: if everyone says games look crummy at 1920x1200 with no AA, what about the times when you had to use 1024x768 or 1280x1024 and that was the absolute highest resolution you had? I know some games ran poorly at those resolutions way back when; how did you manage? 😉

Didn't know any better my friend 🙂

Still looked better than the blocky PS1 GFX 😀
 
Originally posted by: Dkcode
Originally posted by: cmdrdredd
Now I wonder: if everyone says games look crummy at 1920x1200 with no AA, what about the times when you had to use 1024x768 or 1280x1024 and that was the absolute highest resolution you had? I know some games ran poorly at those resolutions way back when; how did you manage? 😉

Didn't know any better my friend 🙂

Still looked better than the blocky PS1 GFX 😀

That's why I bought a Nintendo 64 at that time =)
4x AA / 16x AF is standard for most games. For HL2/CSS and games of that era, 8xQ AA (as it looks glorious).
 
Originally posted by: Dkcode
Originally posted by: cmdrdredd
Now I wonder: if everyone says games look crummy at 1920x1200 with no AA, what about the times when you had to use 1024x768 or 1280x1024 and that was the absolute highest resolution you had? I know some games ran poorly at those resolutions way back when; how did you manage? 😉

Didn't know any better my friend 🙂

Still looked better than the blocky PS1 GFX 😀

Exactly. You also have to remember that games overall didn't look as good back then as they do now. Quake 2 without AA isn't exactly the same as, say, Oblivion without AA, because Quake 2 doesn't look very realistic anyway, and the textures aren't as high-rez. Just because this rocked my world at one time doesn't mean I don't appreciate the higher IQ with AA now.
 
I just don't understand why people say that 1920x1200 with 0xAA looks bad when a lot of people are using 4xAA at 1280x1024 and the game looks spectacular. I would think the increased resolution is more beneficial than AA.

Anyway, I just got a 24" monitor here and have been messing with it. I may not get to keep it (Dad bought it for his new system at work), but I was curious how people handle the decreased FPS at higher resolutions and what level of AA they typically use. If I do get to keep it, I'll have an idea of the best settings to use.
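Some back-of-the-envelope numbers help frame the "higher res vs. lower res plus AA" question above. 4x MSAA takes four coverage samples per pixel, so 1280x1024 with 4xAA resolves more edge samples than 1920x1200 with none, while the shaded resolution (and thus texture detail on screen) is still lower. This is only a rough comparison; shading cost and memory use vary per game:

```python
# Rough sample-count arithmetic for the resolution-vs-AA trade-off
# (illustrative only; assumes 4x MSAA = 4 coverage samples per pixel).
def pixels(w: int, h: int) -> int:
    return w * h

native = pixels(1920, 1200)      # shaded pixels at native res, 0xAA
lower = pixels(1280, 1024)       # shaded pixels at the lower res
lower_4x = lower * 4             # edge samples at the lower res with 4x MSAA

print(native)    # 2304000
print(lower)     # 1310720
print(lower_4x)  # 5242880
```

So the lower-res 4xAA image wins on edge samples, but the native-res image shades almost twice as many real pixels, which is why opinions in the thread split both ways.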
 
I think you misunderstood me a little. Inevitably the image in some games looks a little dirtier, but it's far from looking bad. I would rather have the higher res with no AA than a lower res with AA.

Games like UT3, which have depth of field and motion blur, take some of the aliasing away.

I took some screens for you:

Call of Duty 4 - 0X AA
Call of Duty 4 - 0X AA
Call of Duty 4 - 4X AA
Call of Duty 4 - 4X AA

Unreal Tournament 3 - 0X AA
Unreal Tournament 3 - 0X AA
Unreal Tournament 3 - 4X AA
Unreal Tournament 3 - 4X AA

As you can see, there is hardly any noticeable difference in UT3, but the frame rate difference speaks for itself.
 
Originally posted by: cmdrdredd
Originally posted by: PhatoseAlpha
Don't most LCDs have a refresh rate of 60hz anyway, so more fps wouldn't help?

But you see, since you're capped at 60fps, the game runs at a lower frame rate than it would without v-sync.

That's why you game with a CRT =) 1280x1024, v-sync at 100Hz.
 
I always use the highest AA that I can.

In older games, 4x or more AA is no problem on my X1800 XT; in many newer games, AA is not a viable option.
 
Originally posted by: zeroburrito
Originally posted by: cmdrdredd
Originally posted by: PhatoseAlpha
Don't most LCDs have a refresh rate of 60hz anyway, so more fps wouldn't help?

But you see, since you're capped at 60fps, the game runs at a lower frame rate than it would without v-sync.

That's why you game with a CRT =) 1280x1024, v-sync at 100Hz.

1600x1200 @ 100hz here 🙂 (Dell P1230 22" Trinitron)
 
Originally posted by: zeroburrito
Originally posted by: cmdrdredd
Originally posted by: PhatoseAlpha
Don't most LCDs have a refresh rate of 60hz anyway, so more fps wouldn't help?

But you see, since you're capped at 60fps, the game runs at a lower frame rate than it would without v-sync.

That's why you game with a CRT =) 1280x1024, v-sync at 100Hz.

Heat, weight, space? No thanks.
 