
Resolution or AA/AF?

DJFury

Member
I'm just not getting the kind of performance I expected with this setup:

P4 3.4GHz, 80mhz FSB
1GB DDR2 533MHz
74GB Raptor
160GB Barracuda w/ NCQ
20.1" Dell 2001FP
256MB ATI X800 XT

I'm running CS: Source at 1600x1200, 4xAA, 8xAF, VSync off, and my fps is consistently around 50-60, sometimes dropping to the 30s in big firefights or rising to 80 in smaller tunnels/hallways.

Running at 1280x1024 doesn't help much. Is it better to lower the resolution to, say, 1024x768, or stay at 1600x1200 and decrease to 2xAA/4xAF? Should I decrease mipmap detail? What about reflection quality? Which would look better? My video card is running at around 50°C; is that what's affecting performance? I need more frames, help!!
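For anyone who wants to put numbers on those fps figures: a quick sketch converting them to per-frame times. The drop from 60 to 30 fps doubles the time each frame sits on screen, which is why firefight dips feel so much worse than the averages suggest.

```python
# Frame time in milliseconds for the fps numbers mentioned above.
# 1000 ms / fps = the time the CPU and GPU have to produce one frame.
for fps in (30, 50, 60, 80):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```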

 
It is definitely the 80mhz FSB... I mean... damn.. 🙂

Anyway, I have heard a lot of the "professional" reviewers state that it is indeed better to run at high res than at low res with AA and AF. I am in the middle ground...
 
I don't seem to get much of an increase in HL2 at all, except for turning the water from "Reflect All" down to "Reflect World". You might get a decent increase in frame rates by turning AA down to 2x; I have to do that in Far Cry because 4x is just too slow most of the time.

For reference, I have a 6800GT @ Ultra speeds and a 2005FPW (1680x1050).
 
Counter-Strike is hammering your CPU; it's a very CPU-bound game, especially in big firefights. I get similar drops at the lower resolution of 1280x1024 with 2xAA and 8xAF on a 40-player office server, and it can only be the CPU's fault.
 
Originally posted by: JonnyBlaze
At 1600x1200 you don't need 4xAA. It's just dumb. Shut it off. Crank up the AF though.

JBlaze

I don't know that it's "dumb" necessarily - I can easily see the jagged/stairstep lines even at 1600x1200 (or in my case, 1680x1050)... But like I said above, 2xAA is a lot better, and usually enough (and all you're going to get at decent frame rates in newer games 😛)
 
I play at 1600x1200, but on a 19" screen. I guess on a bigger screen you'll see them more. I'd say try an even higher res, but you can't on an LCD.

JB
 
Turn AA down to 2x and turn on temporal AA. AF can be cranked up to 16x if you want; it shouldn't hurt performance. AA hurts performance. I didn't use AA when I was at 1600x1200 myself; I have a 19" monitor.
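A rough sketch of why AA is expensive while AF is comparatively cheap: multisampling multiplies the color/depth samples stored per pixel, so framebuffer traffic grows roughly with the sample count. This is a hypothetical back-of-envelope model (4 bytes color + 4 bytes depth per sample, one write per frame, ignoring compression and overdraw), not measured data.

```python
def framebuffer_gb_per_s(width, height, fps, aa_samples, bytes_per_sample=8):
    """Naive estimate of framebuffer write traffic: every pixel stores
    aa_samples samples, each with 4 B color + 4 B depth, once per frame."""
    return width * height * aa_samples * bytes_per_sample * fps / 1e9

for aa in (1, 2, 4):
    print(f"{aa}xAA at 1600x1200, 60 fps: ~{framebuffer_gb_per_s(1600, 1200, 60, aa):.1f} GB/s")
```

The absolute numbers are only illustrative; the point is the linear scaling, 4xAA writing roughly four times the data of no AA.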
 
Yeah, I notice quite a bit of aliasing even at 1600x1200 and 2xAA on my 21", about the same as 2xAA at 1024x768 on my 17"; 4xAA in either case pretty much clears that up, and 6xAA is just butter. That said, with an LCD I would try to keep the game at your native resolution of 1600x1200 with whatever mix of eye candy you prefer and can spare, or drop it down to 800x600 for even scaling and crank everything.

However, DJFury's performance figures still seem off to me. Running my Barton at 2.2GHz with an X800 XT PE, I play CS:S at 1600x1200 with everything, including AA and AF, cranked, and manage to keep my framerate in the 30s or above. Given he has a notably faster CPU, it seems he should be doing at least a little better than me instead of worse.
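On the lower-the-resolution question earlier in the thread: dropping resolution only reduces per-pixel (GPU) work, which is why it barely helps a CPU-bound game. A quick comparison of the pixel counts involved:

```python
# Pixel counts for the resolutions discussed in this thread,
# relative to 1600x1200.
base = 1600 * 1200
for w, h in [(1600, 1200), (1280, 1024), (1024, 768), (800, 600)]:
    px = w * h
    print(f"{w}x{h}: {px:,} pixels ({px / base:.0%} of 1600x1200)")
```

So 1280x1024 already cuts the pixel load by about a third; if fps barely moves at that point, the bottleneck is almost certainly the CPU, not fill rate.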
 
At 1600x1200 you don't need 4xAA. It's just dumb. Shut it off. Crank up the AF though.

JBlaze
I don't think so, because I still notice jaggies. While they are forgivable, I don't think it's dumb if you have the bandwidth to burn.

Try turning just AA off.
 
Originally posted by: malak
Turn AA down to 2x and turn on temporal AA. AF can be cranked up to 16x if you want; it shouldn't hurt performance. AA hurts performance. I didn't use AA when I was at 1600x1200 myself; I have a 19" monitor.

I wouldn't advise temporal AA in HL2... if I enable it on my X800 XT PE I get drops to 42 fps; with it off I get 100+.


edit: I just tested 1600x1200 on a 12-man Aztec server and was usually around 80 fps, but in firefights it sometimes dropped to 50ish, with highs of 90, so your numbers seem about right.

double edit: with that shiny RAM you have, you should be able to overclock your 3.4 fairly well
 
There's no reason to turn down any of the features, as 30 fps as a low is still very playable. Your eyes can't notice the difference unless it gets lower than that, so it doesn't make sense to me to turn anything down.
 
Who told you all that horse crap? I can notice 240+ frames per second.

Where do these people get this information? Some say 30, some say 40. It's BS.

30fps is the standard minimum that constitutes playability.
 
Originally posted by: VIAN
Who told you all that horse crap? I can notice 240+ frames per second.

You are both full of horse crap; you wouldn't notice the difference between a steady 61 FPS and a steady 300 FPS, and I would bet my life on it.

Your eye can only see a max of 60 FPS. End of story, it's a fact. The rest is myth, and "playability" depends on who is playing.
 
Anything above 40fps is blended together seamlessly by the human eye. Movies in the US run at about 30 fps. The problem is getting that to hold smoothly...
 
Actually, film is at 24 FPS, and that is very playable in games other than first-person shooters (e.g. RTS games, RPGs, adventure games, etc.)
 
Originally posted by: TheOasis
Originally posted by: VIAN
Who told you all that horse crap? I can notice 240+ frames per second.

You are both full of horse crap; you wouldn't notice the difference between a steady 61 FPS and a steady 300 FPS, and I would bet my life on it.

Your eye can only see a max of 60 FPS. End of story, it's a fact. The rest is myth, and "playability" depends on who is playing.

Yeah, your arm can only shot-put 40 ft; the rest is a myth. Seriously, some people's eyes/brains are more adept at discerning framerate than others, just like some people can chuck a shot put further than others. But if someone told you he could put a shot 240 ft, I wouldn't take his word for it. 😉
 
The general rule of thumb is to leave AF maxed, raise the resolution as high as possible and then use AA if you've still got power to burn.

at 1600x1200 you dont need 4xAA. its just dumb
I run 1920x1440 and in some games I use as much as 8xAA.

I wouldnt advise temporal aliasing in Hl2
I wouldn't advise it anywhere.

as 30 fps as a low is still very playable.
you wouldn't notice the difference between a steady 61 FPS and a steady 300 FPS
Anything above 40fps is blended together seamlessly by the human eye.
Nonsense.
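The rule of thumb above (max AF first, then raise resolution, then spend any leftover headroom on AA) can be made explicit as a priority list. The `next_upgrade` helper below is purely hypothetical, just a sketch of the ordering, not anything from a driver or game API.

```python
# Hypothetical priority order for spending spare GPU headroom,
# per the rule of thumb above: AF is cheapest, AA is most expensive.
PRIORITY = ["max AF", "raise resolution", "add AA"]

def next_upgrade(already_applied):
    """Return the next setting to raise, following the thumb-rule order."""
    for step in PRIORITY:
        if step not in already_applied:
            return step
    return None  # everything is already maxed

print(next_upgrade([]))          # AF comes first
print(next_upgrade(["max AF"]))  # then resolution
```

Working the list backwards gives the matching advice for too-low framerates: drop AA first, then resolution, and touch AF last.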
 
I have an FX-55, 2GB RAM, 2x BFG 6800GTs, and a pair of Raptors, and at 1920x1200 4xAA 8xAF I notice slowdowns during certain portions of the game.
 
Wow, thanks for all the input. I lowered to 2xAA and my fps went up 20, to 70-80 fps on average, with no noticeable difference in image quality. Oh yeah, is it better to adjust settings within the game or in the ATI Catalyst driver? Also, if my vid card is running at 50°C, is it advisable to overclock?
 