Originally posted by: BenSkywalker
If you play games at 1280x1024 or 1600x1200 then you really don't need AA
On what, a 14" monitor? Even running 2048x1536 you still benefit from AA (although that setting is much crisper than 1600x1200), and you will also benefit from AF. On a typical consumer display with normal eyes, you would need to run a resolution around 8000x6000 to eliminate noticeable aliasing, although you certainly reach a point of diminishing returns well before that.
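A rough way to see why AA helps even at high resolutions: N-x supersampling-style AA takes roughly N samples per pixel, which approximates rendering at sqrt(N) times the resolution on each axis and then downscaling. This is a back-of-the-envelope sketch of that idea; the function name and the square-grid assumption are mine, not from the thread (MSAA only applies this at polygon edges, so it's an upper bound):

```python
import math

def effective_sample_grid(width, height, aa_samples):
    # Assuming a roughly square sample grid, the per-axis
    # sampling rate scales with sqrt(aa_samples).
    scale = math.sqrt(aa_samples)
    return round(width * scale), round(height * scale)

# 4xAA at 1600x1200 samples edges roughly like a 3200x2400 render
# downscaled to 1600x1200 -- still far short of the ~8000x6000
# mentioned above, which is why AA remains visible at high res.
print(effective_sample_grid(1600, 1200, 4))
print(effective_sample_grid(2048, 1536, 4))
```

So even 4xAA at 2048x1536 only gets you edge sampling comparable to about 4096x3072, well below the point where aliasing stops being noticeable.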
If you have any card faster than a 9700 Pro, there's no reason to use anything lower than 16xAF. The best combination of AA/AF is 4xAA/8xAF; if you have an Nvidia card, 2xQAA/8xAF.
Originally posted by: g33k
I have a question. What happens if you enable AA and AF through the video driver and the game? Say it's doom3 and you have x4AA enabled in the game, but you have x2 enabled in the driver. Does the game override the driver or is it vice versa?
So which one really taxes the GPU more, AA or AF?
Originally posted by: otispunkmeyer
Originally posted by: g33k
I have a question. What happens if you enable AA and AF through the video driver and the game? Say it's doom3 and you have x4AA enabled in the game, but you have x2 enabled in the driver. Does the game override the driver or is it vice versa?
Drivers override the game... unless, of course, you have the drivers set to application-controlled.
16x is just a bit better than 8x and not worth it; I would rather have the performance increase, IMO.If you have any card faster than a 9700 Pro there's no reason to use anything lower than 16xAF.