Rig:
AMD Athlon 64 3200+ (Winchester) @ 2.4GHz
2 x 1GB DDR400 RAM on the 333 divider, running @ 399MHz 1T 3-3-3-8
Radeon X1900XTX
All in-game settings maxed at 1280x1024, except in-game AA as noted
CCC: default
1) In-game AA disabled
Avg: 48.6
Max: 148
Min: 29

2) In-game AA enabled
Avg: 49.5
Max: 146
Min: 29.5

Yes, the FPS actually increased with AA enabled.

3) CCC 16x HQ AF + 4x AA
Avg: 47
Max: 150
Min: 26
Now, the thing I didn't like about the last option is that the minimum drops to just 26 FPS, whereas I'd want it to stay at least around 30 FPS. Does a min FPS of 26 vs. 30 make a big difference?
Also, is the in-game benchmark more stressful than real-world gameplay?
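One way to put the 26 vs. 30 FPS gap in perspective is to convert the minimums to frame times. A minimal sketch (Python, hypothetical helper name, FPS numbers taken from the runs above):

def frame_time_ms(fps: float) -> float:
    # Milliseconds spent rendering each frame at a given frame rate
    return 1000.0 / fps

for label, fps in [("In-game AA min", 29.0),
                   ("CCC 16x AF + 4x AA min", 26.0),
                   ("Target min", 30.0)]:
    print(f"{label}: {fps} FPS = {frame_time_ms(fps):.1f} ms/frame")

# 26 FPS works out to about 38.5 ms per frame vs. 33.3 ms at 30 FPS,
# so the worst frames take roughly 5 ms longer with the CCC-forced settings.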