
Performance Increase?

imported_Noob

Senior member
I just bought an X800 Pro and have noticed no significant performance increase over my original 9800 Pro 256MB, even in highly demanding games such as HL2 and SC3. I know the X800 Pro is still a 12-piper, but that means about 50% more rendering power plus higher clock speeds. Could my CPU be limiting the card's potential to boost the frame rates?
 
Your CPU is pretty decent. You should definitely be noticing an improvement over your 9800, though. I'm talking night-and-day differences.
 
Do you have the settings turned up?
Did you have the game re-optimize for the new card? (I think HL2 did this when I first installed it)
 
Originally posted by: YOyoYOhowsDAjello
Do you have the settings turned up?
Did you have the game re-optimize for the new card? (I think HL2 did this when I first installed it)

How do you get the game to reoptimize for the new card? Should I just reinstall the games?

 
I wouldn't reinstall until someone else confirms that it optimizes when you install... I can't remember if it did or not, just that it might have.

When you go into the advanced video settings, there is an asterisk next to some settings that you can change once the game is installed. Those mark the values the game recommends for your system. I'm not sure if optimizing for the new card would just change the asterisk values or if it would change more of the stuff like texture loading sizes, etc.

When you say you didn't notice any improvement, were you playing at the same resolution? So the framerate didn't get better?

Did you try turning it up to higher quality settings than you were able to play before? (like more AF and AA?)
 
An example is the SC3 demo. With my 9800 Pro I played at 1024x768, all settings maxed, no AA or AF. With the X800 Pro I played at the same resolution and settings with 6x AA and 16x AF, and I did notice a difference at that resolution. But then when I set it to 1280x1024, the performance plunged very significantly. I could see it plunging this much at 1600x1200, but this much just from moving to a resolution with about 67% more pixels? It seemed a little much to me.
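To put numbers on the resolution jump, here's a quick back-of-the-envelope sketch (just the pixel counts for the resolutions discussed in this thread):

```python
# Pixel counts for the resolutions mentioned in the thread.
def pixels(width, height):
    return width * height

base = pixels(1024, 768)    # 786,432 pixels
mid = pixels(1280, 1024)    # 1,310,720 pixels
high = pixels(1600, 1200)   # 1,920,000 pixels

# The 1280x1024 step is a much bigger jump than it looks:
print(f"1280x1024 vs 1024x768: {mid / base - 1:.0%} more pixels")   # ~67% more
print(f"1600x1200 vs 1024x768: {high / base - 1:.0%} more pixels")  # ~144% more
```

That 67% more work per frame, on top of 6x AA and 16x AF, goes a long way toward explaining a big frame-rate drop.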
 
What score do you get in 3DMark05? You should get about 4.5-5k with an X800 Pro, I think; if not, something's probably wrong!
 
Originally posted by: Noob
An example is the SC3 demo. With my 9800 Pro I played at 1024x768, all settings maxed, no AA or AF. With the X800 Pro I played at the same resolution and settings with 6x AA and 16x AF, and I did notice a difference at that resolution. But then when I set it to 1280x1024, the performance plunged very significantly. I could see it plunging this much at 1600x1200, but this much just from moving to a resolution with about 67% more pixels? It seemed a little much to me.

6x AA and 16x AF will start to bring all cards down as the resolution goes up. To play at higher resolutions you need to lower the AA and AF; it's also not needed as much at higher resolutions.

I play HL2 at 1680x1050 (widescreen), I believe with 4x AA and 6x AF; I can't be sure since I'm not at my home PC. With Far Cry it was even lower settings at the same resolution.
 
And what exactly does the program do? Analyze heat, stability, performance through video stress tests, etc? I have never used a benchmarking program before.
 
Originally posted by: Noob
I played at the same resolution and settings with 6x AA and 16x AF, and I did notice a difference at that resolution. But then when I set it to 1280x1024, the performance plunged very significantly.

6xAA!

Try 4xAA, maybe 8xAF. Going from 6xAA to 4xAA will give a good performance boost at minimal image quality loss.
 
Originally posted by: Concillian
Originally posted by: Noob
I played at the same resolution and settings with 6x AA and 16x AF, and I did notice a difference at that resolution. But then when I set it to 1280x1024, the performance plunged very significantly.

6xAA!

Try 4xAA, maybe 8xAF. Going from 6xAA to 4xAA will give a good performance boost at minimal image quality loss.

I assume AA provides more of an image quality enhancement? How much does AF do, though?
 
How exactly does 3DMark work? I do not want to put my computer through a rigorous test and risk ruining parts. And how much of an IQ difference does AF make?
 
They do different things for image quality. It's hard to say which gives "more" image quality enhancement.

AA is easy enough to understand: it blends the colors of jagged edges, so that a black line on a white background will not be a 'stairstep' of black pixels, but a black line with some grey pixels in places to make the edges look smoother.

AF is a texture filtering quality enhancement and is most noticeable on floor textures. It's pretty difficult to explain what it does.
Here is an older article that shows what texture filtering does:
http://www.nvnews.net/previews/geforce3/anisotropic.shtml

It's OLD, discussing the GF3, but the part on filtering is a pretty good example of what it's doing.

4x AA and 8x AF is the sweet spot for me.
4x AA provides a noticeable quality improvement, while 6x AA brings a pretty big performance hit.
For AF, the most annoying to me is how you will get a "line" in front of you where the LOD changes. Using "8x performance" is generally enough to blend this line in a way that I don't notice it, and as long as I don't notice it, I'm happy.

Personally the LOD thing is a huge annoyance to me. I'd rather run lower res, no AA and 8xAF than higher res or 6xAA and no AF. People are different though, find what you prefer.

3DMark is no more rigorous on your system than running a game. It merely provides a comparison point... and it's pretty too 🙂 . It's basically a program that uses its own game engine to evaluate your 3D performance and compare it to other systems.
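The stairstep-blending idea behind AA is easy to sketch in code. Here's a toy downsampling filter, roughly the idea behind crude supersampling AA (purely illustrative, not what the card actually does in hardware):

```python
# Toy illustration of AA as edge blending: render at 2x resolution, then
# average each 2x2 block into one output pixel. Edge pixels come out grey.
def downsample_2x(image):
    """Average each 2x2 block of a high-res image into one output pixel."""
    height, width = len(image), len(image[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            block = [image[y][x], image[y][x + 1],
                     image[y + 1][x], image[y + 1][x + 1]]
            row.append(sum(block) / 4)  # 0.0 = black, 1.0 = white
        out.append(row)
    return out

# A diagonal black/white 'stairstep' edge at 2x resolution (4x4 -> 2x2).
hi_res = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
print(downsample_2x(hi_res))  # -> [[0.0, 0.75], [0.75, 1.0]]
```

The blocks that straddle the edge come out as in-between grey values (0.75 here) instead of pure black or white, which is exactly the smoothing effect described above.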
 
Originally posted by: Concillian
They do different things for image quality. It's hard to say which gives "more" image quality enhancement.

AA is easy enough to understand: it blends the colors of jagged edges, so that a black line on a white background will not be a 'stairstep' of black pixels, but a black line with some grey pixels in places to make the edges look smoother.

AF is a texture filtering quality enhancement and is most noticeable on floor textures. It's pretty difficult to explain what it does.
Here is an older article that shows what texture filtering does:
http://www.nvnews.net/previews/geforce3/anisotropic.shtml

All I have heard is that AF preserves the 3D image as it fades into the background, but that sounds a little too basic. And thanks for telling me some optimal settings; I didn't know that 6x AA could give a big performance hit right after 4x. And I do see the LOD problem you are talking about with AF. I got something to that effect too.

Will enabling Temporal AA give a huge performance hit?

It's OLD, discussing the GF3, but the part on filtering is a pretty good example of what it's doing.

4x AA and 8x AF is the sweet spot for me.
4x AA provides a noticeable quality improvement, while 6x AA brings a pretty big performance hit.
For AF, the most annoying to me is how you will get a "line" in front of you where the LOD changes. Using "8x performance" is generally enough to blend this line in a way that I don't notice it, and as long as I don't notice it, I'm happy.

Personally the LOD thing is a huge annoyance to me. I'd rather run lower res, no AA and 8xAF than higher res or 6xAA and no AF. People are different though, find what you prefer.

3Dmark is no more rigorous on your system than running a game. It merely provides a comparison point... and it's pretty too 🙂 .

 
My understanding of temporal AA is that it will not activate unless your FPS is already above your monitor's refresh rate. In other words, it will never do anything unless it can do it without penalty.
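That gating behavior can be sketched as a one-line condition (hypothetical logic to illustrate the description above, not ATI's actual driver code):

```python
# Sketch of the described temporal AA gating: it only activates when the
# frame rate already exceeds the monitor's refresh rate, so enabling it
# should never cost you frames. Purely illustrative, not real driver logic.
def temporal_aa_active(fps, refresh_rate_hz):
    return fps > refresh_rate_hz

print(temporal_aa_active(90, 60))  # True  -> temporal AA kicks in
print(temporal_aa_active(45, 60))  # False -> falls back to ordinary AA
```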

AF is something you have to experience. You can't really show how annoying that line is in a still picture. You need to have motion for AF to give a very noticeable benefit over trilinear filtering. If you see the benefit, then turn it off, then up to 4x, then up to 8x, then 16x. Stop when you can't really tell a difference while you're playing.
 
When AF was at 4x I saw a little black line appear from time to time, plus stuff just looked out of place. But 8x AF seems to be fine.
 
Originally posted by: YOyoYOhowsDAjello
Do you have the settings turned up?
Did you have the game re-optimize for the new card? (I think HL2 did this when I first installed it)

How do you have the game reoptimize for the card?
 
Originally posted by: Noob
Originally posted by: YOyoYOhowsDAjello
Do you have the settings turned up?
Did you have the game re-optimize for the new card? (I think HL2 did this when I first installed it)

How do you have the game reoptimize for the card?

If it did it, it did it automatically when it installed. I'd have to reinstall the game myself to see if it actually did it. Maybe you should try it if you don't have anything else to do.
 