RADEON X800 PRO Performance with Athlon XP article, now updated with P4 "C" CPUs

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Thanks for the link. It's about time someone got around to this. He even benched some games at 2048x1536, which a few people on this forum have asked for.
 
Oct 16, 1999
10,490
4
0
That would have been a lot more helpful if they had gone into detail about the minimum framerates with the various setups. It'd be nice to know if that brand-spanking-new $400 video card still leaves you with sub-30fps slowdowns because of your CPU bottleneck, or vice versa. I really don't give a crap where my bottleneck is if I'm hitting 40+ fps; I want to get rid of whatever is dropping me down to 30 or lower.
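It's not even hard to report. Something like this rough sketch (assuming your benchmark can dump per-frame times in milliseconds to a text file - the filename and format here are made up) pulls out min/avg/max and counts the sub-30 dips:

```python
# Rough sketch: min/avg/max fps plus sub-30 dips from a frame-time log.
# Assumes "frametimes.txt" holds one frame time in milliseconds per line
# (hypothetical format - adjust for whatever your benchmark tool dumps).

def frame_stats(path, slow_fps=30.0):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / t for t in times_ms]          # instantaneous fps per frame
    avg = len(times_ms) * 1000.0 / sum(times_ms)  # true average over the run
    drops = sum(1 for v in fps if v < slow_fps)   # frames below the cutoff
    return min(fps), avg, max(fps), drops

lo, avg, hi, drops = frame_stats("frametimes.txt")
print(f"min {lo:.1f} / avg {avg:.1f} / max {hi:.1f} fps, {drops} frames under 30")
```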
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I agree, I too would like to see a min/max/avg graph for this kind of testing. It really doesn't mean that much otherwise.
 

Dman877

Platinum Member
Jan 15, 2004
2,707
0
0
Originally posted by: Killrose
I agree, I too would like to see a min/max/avg graph for this kind of testing. It really doesn't mean that much otherwise.

I'd go even further: leave out the average/max fps altogether and just show us the minimum framerates, because that's all I care about.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
I figure an overclocked mobile Barton will place in between the 3200+ and the A64, which is nice :)
If you don't think this article says enough, go do your own benches.
 
Oct 16, 1999
10,490
4
0
Originally posted by: Avalon
I figure an overclocked mobile Barton will place in between the 3200+ and the A64, which is nice :)
If you don't think this article says enough, go do your own benches.

Brilliant and completely feasible, thanks!
 

CU

Platinum Member
Aug 14, 2000
2,415
51
91
This is great. I have been looking for benchmarks pairing the newer cards with older CPUs. I really want to see the P4 article, though, since that's what I have, but the AXP one is good enough for now.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
It's good to see Athlon 64 and 2048x1536 results in there. Hopefully this will become a trend.
 

mstegall

Member
May 10, 2004
155
0
0
Nice find - interesting to see the comparisons between the Pentium and the AMD chips with the same video cards. The Pentium seems to do pretty well.
 

pookie69

Senior member
Apr 16, 2004
305
0
0

Thanks loads for posting this, mate - definitely something I have wanted to clear up for some time. The article applies very well to someone in my situation, particularly as a fan of UT2004 running an overclocked P4 2.0A. HUGE-a$$ bottleneck right there, I see :(

Anyways, thanks again.

:cool:
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Hmm... even an X800 Pro can't produce 30 fps in Far Cry at 2048x1536, and the performance drop from 16x12 to 2048x1536 is huge as well. Maybe the X800 Pro or the game itself isn't optimised for 2048x1536 yet?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It is true that CPU speed limits performance at 800x600 and up to 1280x1024, but beyond that, in most games, CPU speed stops mattering, as the latest FiringSquad article shows. In many cases the gap between the 2.4GHz and 3.2GHz processors was way too close to matter, with 0-5 frames between them. And where the CPU difference was noticeable, performance was already high even on the 2.0GHz system.
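A crude way to picture it (all the numbers below are made up, purely to show the shape of the curve): the fps you actually get is capped by whichever of the CPU or the card runs out first.

```python
# Toy model: delivered fps is min(what the cpu can feed, what the gpu can draw).
# Every number here is invented for illustration - not a measurement.

cpu_caps = {"2.0GHz": 55, "2.4GHz": 70, "3.2GHz": 90}   # fps the cpu can feed
gpu_caps = {                                             # fps the card can draw
    "800x600": 200, "1280x1024": 110,
    "1600x1200": 60, "2048x1536": 35,
}

for res, gpu_fps in gpu_caps.items():
    row = ", ".join(f"{cpu} -> {min(cpu_fps, gpu_fps)} fps"
                    for cpu, cpu_fps in cpu_caps.items())
    print(f"{res}: {row}")

# At 800x600 the three CPUs land 55/70/90 fps apart; at 2048x1536 they all
# collapse to 35 fps, because the card is the wall, not the CPU.
```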

Personally, I still think that pairing a 2.0A with an X800 Pro is better than a 3.2GHz with a 9800XT. Why? Because in shader-intensive new games like T:AOD, Far Cry, and Halo, the former system crushed the latter. Considering new games will only incorporate more intense shaders in the future, the graphics card will matter even more. Ideally it makes sense to pair the fastest components you can afford, but if someone had to choose between upgrading the CPU or the GPU first (primarily for games), I would always tell them the GPU first, unless their CPU is really, really slow.

Finally, my only gripe is that comparing the 9800XT to the X800 Pro is somewhat illogical, since the 9800XT costs only slightly less. The articles should have paired the 9800 Pro, which costs only $200, against the X800 Pro; then it would make sense to argue the point of a CPU bottleneck at lower resolutions plus the advantage of saving additional funds. That would have carried a stronger weight in the outcome of a purchase. Of course, the user is also left with the option of upgrading the CPU later and extracting more of the current video card's performance, if it's already there, which again gives a slight nod towards buying the faster card. On a final note, for simulators, performance was slow regardless of the CPU in all cases, because those games are very demanding. So for the user with a slow CPU, my vote will always go to the faster video card, as that provides the best return on your investment. *PII 233MHz CPU users need not apply :laugh:

_____________________
***Also, if you take the time to compare the XP and P4 articles together, the P4 2.4C beats the AXP 2800+, and often the XP 3200+, in these benches at most resolutions and detail settings when paired with either the 9800XT or the X800 Pro, further proving that the old rating system was indeed way off. In fact, the 2.4C absolutely pounded the 2500+, and people often argue the 2500+ is faster in games. Now I'm starting to think Anand's comment about the 2.8C being 20-30% faster than the 2800+ XP is actually true, especially after analyzing these benches together.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: StrangerGuy
Hmm... even an X800 Pro can't produce 30 fps in Far Cry at 2048x1536, and the performance drop from 16x12 to 2048x1536 is huge as well. Maybe the X800 Pro or the game itself isn't optimised for 2048x1536 yet?

This /may/ be the reason for that large performance drop:

With 4 quads enabled, the Hierarchical Z-Buffer has around 4 megapixels of (lower-level) Z-buffer storage capability, which should be good for resolutions over 2048; high-definition resolutions like 1920x1080 are fully covered. Should the screen resolution exceed the maximum capability of the Hierarchical Z-Buffer, it is not disabled entirely; instead, a portion of the Z-buffer is set up in the Hierarchical Z-Buffer up to its maximum storage capability, and anything that falls outside that range falls back to the early pixel-level reject, so the majority of the screen can still be captured by the Hierarchical Z-Buffer.

The X800 Pro has only three quads enabled, possibly not enough to fully cover 2048x1536. Still, three quads should cover most of 20x15, so I'm not sure why the big drop-off. Maybe the drivers just aren't optimized for that rare res yet.
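Some quick back-of-envelope math on that (my own numbers, and I'm assuming Hi-Z coverage scales linearly with enabled quads, which may not be exactly how the hardware carves it up):

```python
# Back-of-envelope: how much of the screen does the Hierarchical Z-Buffer cover?
# ASSUMPTION: coverage scales linearly with enabled quads, per the
# "4 quads ~ 4 megapixels" figure quoted above. Treat as illustrative.

PX_PER_QUAD = 1_000_000  # assumed Hi-Z coverage per quad, in pixels

for name, quads in [("X800 XT (4 quads)", 4), ("X800 Pro (3 quads)", 3)]:
    capacity = quads * PX_PER_QUAD
    for w, h in [(1600, 1200), (1920, 1080), (2048, 1536)]:
        pixels = w * h
        covered = min(1.0, capacity / pixels)
        print(f"{name} @ {w}x{h}: {pixels:,} px, ~{covered:.0%} under Hi-Z")

# 2048x1536 is 3,145,728 pixels, so three quads (~3 MP) leave roughly 5% of
# the screen falling back to the slower early pixel-level reject path.
```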
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Yeah, it's probably the drivers. Catalyst 2.2 was very poor at resolutions higher than 1600x1200, but Catalyst 2.3 increased performance by around 50%.

It's likely the same thing is happening now at 2048x1536.
 

pookie69

Senior member
Apr 16, 2004
305
0
0

This bit I have DEFINITELY found to be true:

"Being CPU-limited isn't necessarily a bad thing. From an optimist's viewpoint, eye candy features such as anti-aliasing and anisotropic filtering are 'free', meaning they come with no performance hit. You can also crank up the texture and geometry detail and screen resolution with little or no drop in performance."

I'm running an overclocked P4 2.0A @ 2.44GHz with an overclocked 9800SE clocked higher than a stock 9800XT (only the 4 pipes, though :( ). Anyway, up until very recently I had been playing UT2004 at 1280x1024 with normal/low settings - I didn't realise the effect of this CPU limitation thing. I'm running now with all settings at max, and after an initial fall in FPS I quickly hit a plateau: an average of 40-50 FPS on most levels, as opposed to 50-60 with low/normal settings, occasionally dropping to the high 30s. That's still very playable for me. I used to be hung up on FPS before, but not anymore, now that my game looks the SH!T :)

One thing I've noticed: I can easily turn AF all the way up to 8x without ANY effect on the FPS, yet as soon as I turn on 2xAA there is a noticeable drop in FPS below my plateau, whether that's AA by itself or alongside AF. Any ideas why? LOL - ATi's adaptive AF again? Is AA a more intensive trick to pull off than AF, then? It doesn't concern me too much, as at 1280x1024 I don't mind AA being off. AF, though, I need, and I can't believe how far I can crank that up.

I used to think my gfx card was pants. After reading that article, playing with settings, and convincing myself that I do NOT need to be running 60+ FPS for good gameplay, I'm amazed at how far I can turn up the settings without any loss in FPS once I've hit that 40-50 FPS plateau. Who needs an X800 when you can pull this off with a 9800SE?!

QUESTION: like I said, turning on AA drops the FPS, unlike AF and other settings, once I've hit that 40-50 FPS range. Does that mean I'm no longer CPU-limited with that settings setup, so even if I were to replace my 2.0A with a 3.06B, I would not see any increase in FPS with AA turned on? Or am I mistaken?
 

Chebago

Senior member
Apr 10, 2004
575
0
0
marcyes.com
I was never frustrated with this trend of average FPS in reviews until I started playing Far Cry. My 9800 Pro runs the game pretty well at high resolutions until there is a lot of movement; then it drops from about 90 FPS to about 30 FPS, which makes it start to lag, and that just drives me nuts. Then I read benchmarks trying to find the best balance of eye candy and performance, and I get statements like "ran great at 54 FPS average." But I need to know if it ever dropped below 30; I don't care what the average was. If I got a 100 on one test and a 60 on another, my average would be 80, a B- which is barely acceptable, but that doesn't change the fact that I still got a D- on one of my tests.
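Put that in fps terms (the numbers are invented, but it's exactly the problem):

```python
# Invented fps samples from a run that "averages fine" but stutters:
fps = [90, 88, 85, 92, 30, 28, 90, 87, 31, 89]

avg = sum(fps) / len(fps)
print(f"avg {avg:.0f} fps, min {min(fps)} fps")  # -> avg 71 fps, min 28 fps

# A review quoting "71 fps average" hides that 3 of the 10 samples were
# crawling along around 30 fps - the part you actually feel in-game.
```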
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Yes, (I'm pretty sure that) AA requires more memory bandwidth than AF, thus the bigger framerate hit. If your 9800SE has a 128-bit memory bus, that's your answer right there. You really need a 256-bit memory bus to comfortably use AA at 12x10.
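Very rough back-of-envelope on why (this ignores color/Z compression and texture traffic entirely, so treat the numbers as showing the direction of the cost, not the real cost):

```python
# Crude framebuffer-traffic estimate for no-AA vs 4x multisampling at 1280x1024.
# ASSUMPTIONS: 32-bit color + 32-bit Z/stencil per sample, an invented 2.5x
# overdraw factor, and no compression - real hardware does far better.

w, h, fps = 1280, 1024, 60
bytes_per_px = 4 + 4     # color + Z/stencil, per sample
overdraw = 2.5           # assumed average overdraw

for samples in (1, 4):   # no AA vs 4xAA
    gb_s = w * h * bytes_per_px * samples * overdraw * fps / 1e9
    print(f"{samples}x samples: ~{gb_s:.1f} GB/s of framebuffer traffic")

# A 128-bit bus at ~340MHz DDR moves roughly 10.9 GB/s, a 256-bit bus twice
# that. Add texture reads on top of the 4x figure and the narrow bus saturates.
```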
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: Pete
Yes, (I'm pretty sure that) AA requires more memory bandwidth than AF, thus the bigger framerate hit. If your 9800SE has a 128-bit memory bus, that's your answer right there. You really need a 256-bit memory bus to comfortably use AA at 12x10.

My 9800SE has a 256-bit bus. However, I'm running 768MB of PC2100 memory in my main system :( - sad, I know. Maybe that's the choker right there? Or maybe my P4 2.0A? Methinks I need to do an upgrade sometime soon :confused:

When it comes to the workings of gfx cards and related features, I'm kinda at a loss. HDDs are much more my thing :D
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: pookie69
QUESTION: like I said, turning on AA drops the FPS, unlike AF and other settings, once I've hit that 40-50 FPS range. Does that mean I'm no longer CPU-limited with that settings setup, so even if I were to replace my 2.0A with a 3.06B, I would not see any increase in FPS with AA turned on? Or am I mistaken?

If changing a video setting lowers your FPS considerably, you have hit a point where you are no longer CPU-limited but rather limited by your video card. So you are correct: a faster processor would not help you there. With only 4 pipelines, the 9800SE is going to be much slower with AA than a 9700 Pro/9800/9800 Pro.
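If you want to sanity-check which side you're on, the rule of thumb is simple (toy numbers below - plug in your own runs):

```python
# Rule of thumb: raise a GPU-heavy setting (resolution/AA) and re-run the
# benchmark. If fps falls a lot, the card was the limit; if it barely
# moves, the CPU was. The 10% tolerance here is an arbitrary choice.

def bottleneck(fps_before, fps_after, tolerance=0.10):
    drop = (fps_before - fps_after) / fps_before
    return "GPU-limited" if drop > tolerance else "CPU-limited"

print(bottleneck(48, 47))  # AF 1x -> 8x barely moved: CPU-limited
print(bottleneck(48, 33))  # AA off -> 2x dropped hard: GPU-limited
```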