
Enabled 16xAF in Half-Life 2 - can't tell the difference.

In my never-ending search to tweak things for no good reason, I tried bumping Half-Life 2 from 8xAF to 16xAF to see if my frame rate could handle it. I haven't seen much of a drop, but on the other hand I didn't see a big visual improvement either... Thinking maybe my card (a BFG 6800 OC) didn't handle 16xAF in Half-Life 2 - it won't run 6xAA; if you select that option the game just disables AA - I checked, but no, there it is, right in the Nvidia display panel, so it should be enabled.

Anyone else see a big difference with 16xAF?
 
In many real-world situations, you will not see an appreciable difference between 8x and 16xAF. Beyond a certain point, the textures just can't be filtered any further (which is why both ATI and NVIDIA use adaptive AF these days; even 4x filtering can be overkill at times).
 
16xAF pretty much doesn't do any more than 8x does.

By the time you get past 8xAF, textures just don't need it any more, so the card doesn't do it. I think 16xAF only gets applied in rare cases where extreme angles are involved... hence why you can't see an immediate difference, and hence why 16xAF doesn't impact performance like you'd think. I ran the stress test and a few other graphics tests, and the difference between 16x and 8x was always less than 5 fps (on the top rig in my sig).
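The angle-dependence described above can be sketched roughly: the hardware estimates how stretched a texture's footprint is on screen and only takes as many samples as that stretch warrants, so the 8x vs. 16x cap only matters at extreme grazing angles. A hypothetical Python sketch (not actual driver code; function name and inputs are made up for illustration):

```python
import math

def aniso_degree(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Estimate the anisotropic filtering degree for one pixel.

    The ratio of the texture footprint's major axis to its minor axis
    decides how many extra samples are worth taking; the user-selected
    cap (8x, 16x, ...) only engages when the footprint is very
    stretched, e.g. a floor seen at a grazing angle.
    """
    # Footprint lengths along screen x and y (from texcoord derivatives).
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    major = max(px, py)
    minor = max(min(px, py), 1e-8)  # avoid division by zero
    # Clamp the stretch ratio to the selected maximum.
    return min(max_aniso, math.ceil(major / minor))

# A wall viewed head-on: footprint is nearly square, so the cap never matters.
head_on = aniso_degree(1.0, 0.0, 0.0, 1.0)   # degree 1 whether the cap is 8 or 16
# A floor at a grazing angle: heavily stretched footprint, so the cap decides.
grazing_16 = aniso_degree(12.0, 0.0, 0.0, 0.5, max_aniso=16)
grazing_8 = aniso_degree(12.0, 0.0, 0.0, 0.5, max_aniso=8)
```

On the head-on wall the two settings pick the same degree, which is consistent with seeing no visual or performance difference in mostly-enclosed HL2 maps.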
 
Also, Nvidia cards don't do 6x FSAA - it's an ATI-only thing. 2x, 2xQ, 4x, and 8xQ are your only options as far as FSAA goes.
 
The games where you'll see the most difference between 8x and 16x are ones with very long view distances (Serious Sam, Unreal, flight sims, etc.). If HL2's areas are quite small, you won't see much difference.
 
I thought nVidia did support 6xAA, but only super-sampling and not multisampling... I run 6xAA with my 6800GT in Source and it seems to smooth the models quite nicely--then again, I also don't take screencaps and zoom in on the edges of every part of the image to see if there are any jaggies (and then proceed to cry on a tech forum "OMFG JAGGIES NVIDIA SUX"). Running 1280x1024 may also have something to do with the image quality...

Anyway, like everyone else said, there's no point to 16xAF.
 
Graphics do look better with 16xAF, but in HL2 I doubt you'll notice, because most of the time you're in enclosed areas where 8x is just fine. 16xAF does clear up some blurry textures at the far end of what 8xAF handles, but the performance drop (small, but still a drop), the rarity of its actual usage, and the near-negligible IQ increase make it an inefficient feature. But of course that's all opinion.
 
Shinei - when I enabled 6xAA in the Source video stress test, it effectively disabled AA. Remember, I was enabling AA through the in-game console, NOT in the Nvidia control panel. I bet if I forced my card to run 6xAA through the Nvidia panel it might work in Source, but I'm not positive.
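For reference, Source exposes the in-game AA setting through the `mat_antialias` console variable, and a value the hardware can't do falls back to no AA, which matches the behavior described. A hypothetical console session (the exact values each card accepts depend on the GPU and driver):

```
// In the Source developer console:
mat_antialias 4    // 4x MSAA - supported on both ATI and Nvidia
mat_antialias 6    // 6x - not a GeForce 6-series mode, so AA gets disabled
```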
 
That's because Nvidia can't do 6xAA, as someone already pointed out.

2xAA, 2xQAA, 4xAA and 8xAA are your only choices, 8xAA being a supersampling/multisampling mishmash that kills performance but looks wicked. You can get away with it on the original HL and UT, though 😛
 