UT2k4 performance

otispunkmeyer

Jun 14, 2003
with this system

Athlon XP 2000+, 512MB PC2100, Radeon 9500 Pro, 2x 40GB 7200rpm drives with 2MB cache (regularly defragged), MSI KT333 Ultra 2 mobo.

Playing online in Onslaught, I average around 30fps. This dips when there's a lot going on at once in front of me, though.

I would like to know:

1) What part of my system is most limiting game performance?
2) Which component would be best to upgrade for better performance?
3) Are there any in-game adjustments I can make to improve things?

Bear in mind I play at 1024x768 @ 75Hz with no AA/AF. I have tried reducing the resolution and playing with the fog distance slider (which seems to do nothing at all to performance or image quality).

Any recommendations welcome! Thanks.
 

SneakyStuff

Diamond Member
Jan 13, 2004
FYI, on new hard drives, defragmenting regularly shows absolutely no performance gains; it's usually just in your head. Only in cases of severe fragmentation will you notice a difference. Just thought I'd save you some time. And as for what limits you most, I would have to say your CPU.
 
otispunkmeyer

Jun 14, 2003
Why? It's way above what they recommend. An XP 2000+ is about 1.67GHz, isn't it? That puts me a good 400MHz above the recommended speed!

I thought it might be the RAM; I heard PC2100 was getting phased out for being too slow.
 

SneakyStuff

Diamond Member
Jan 13, 2004
I use PC2100 and am pretty much completely satisfied with it. Most games out nowadays are CPU or GPU limited. Keep in mind that exceeding the "recommended" spec just means extra performance gains. A prime example is Warcraft 3: it recommends a 600MHz processor, but you are obviously going to notice a substantial performance gain on a 2GHz processor. I am 100% sure your Radeon is not crippling you, unless you forgot to plug in the Y power splitter (wink wink :)). The 9500 Pro is a very nice card. BTW, my UT2k4 slows down a bit in Assault mode when I set the world detail to high, so try toning that down a tad.
 
otispunkmeyer

Jun 14, 2003
Just been reading: the chip I have has a 133MHz FSB (effectively 266, because like DDR it transfers on both clock edges), so the 266MHz RAM makes sense, seeing as it matches the FSB nicely, which I've been told is good for latencies. So upgrading to 333 RAM (PC2700) would merit no extra performance unless I got a 333 FSB chip.

So you say turn down world detail? Would toning the physics down a bit help as well, seeing as physics is just loads of maths for the CPU to crunch?

Is FSB speed more influential on games than actual CPU clock speed?
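For anyone who wants the arithmetic behind matching the RAM to the FSB, here is a quick back-of-the-envelope sketch; the figures are the standard rated ones, rounded, and the variable names are just for illustration:

# Why PC2100 "matches" a 133MHz Athlon XP front-side bus.
# DDR transfers data on both clock edges, so the effective rate doubles.
fsb_clock_mhz = 133        # Athlon XP 2000+ FSB clock
transfers_per_clock = 2    # DDR: double data rate
bus_width_bytes = 8        # 64-bit memory bus

effective_mt_s = fsb_clock_mhz * transfers_per_clock   # 266 MT/s
bandwidth_mb_s = effective_mt_s * bus_width_bytes      # the rated figure uses
                                                       # 133.33MHz, giving ~2133 MB/s,
                                                       # hence the name "PC2100"
print(effective_mt_s, "MT/s ->", bandwidth_mb_s, "MB/s")
# PC2700 (333 MT/s, ~2.7GB/s) only pays off if the FSB can actually
# consume it, which is why synchronous FSB/RAM is the usual advice here.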
 

SneakyStuff

Diamond Member
Jan 13, 2004
I've got EVERYTHING on normal and all boxes checked except for projectors, and I get very nice performance. I just enable 2xAA/2xAF when I want it to look nice, and even then it runs smooth.
 

kylebisme

Diamond Member
Mar 25, 2000
They both help, but it is widely known that the Unreal engine is very CPU limited. As for good performance in UT2004, I can't get your hopes up there. I run my CPU at 3000+ (12.5x174), a gig of RAM at 174MHz, and a 9800 XT, and I still get dips into the 20s in Onslaught using anything but the most butt-ugly low settings. The game just tries to push things too much for people like us who appreciate a good framerate.
 

kylebisme

Diamond Member
Mar 25, 2000
Ya, some people are just more sensitive to framerate than others. The game doesn't choke at all, so someone who can deal with the framerate dipping into the 20s will claim it runs smooth, but for those of us who require a higher framerate it is unacceptable.
 
otispunkmeyer

Jun 14, 2003
Yeah, I mean at 30fps you wouldn't know anything was wrong. I think it's very good they've got it to run as smoothly as it does; most of the time even 20fps feels smooth, and I probably wouldn't notice 20fps if I didn't have Fraps on! Haha. But as soon as you get in a firefight with a few people, then it becomes important: as the fps dips it gets harder to hit people. I'm hoping the real game will play a bit smoother though; after all, UT2k3 plays real nice at about 45-50fps in most cases.

On the upside, I can actually get away with 4xAA and 8xAF without any drop, so I guess my 9500 Pro is serving me well. I might try 6xAA and 16xAF!

Guessing I'm gonna need serious upgrades to get what I want!
 

RussianSensation

Elite Member
Sep 5, 2003
Your CPU is the most limiting factor. Personally I would change the colour depth to 16-bit because you won't notice any difference. Secondly, disable shadows, because I doubt you care about your opponents' shadows while playing a fast-paced FPS. These two things should free up the video card somewhat.
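To put a rough number on how much 16-bit frees up, here is a simplified sketch of the colour-buffer traffic alone; it ignores the z-buffer, textures, and overdraw, so treat it as illustration rather than a benchmark:

# Simplified colour-buffer traffic at 1024x768, 16-bit vs 32-bit.
width, height, fps = 1024, 768, 30

for bits in (16, 32):
    bytes_per_frame = width * height * (bits // 8)
    mb_per_frame = bytes_per_frame / 2**20
    print(bits, "bit:", round(mb_per_frame, 2), "MB/frame,",
          round(mb_per_frame * fps), "MB/s at", fps, "fps")
# 16-bit halves this traffic, but it only buys framerate when the
# video card, not the CPU, is the bottleneck.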
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: otispunkmeyer
Yeah, I mean at 30fps you wouldn't know anything was wrong. I think it's very good they've got it to run as smoothly as it does; most of the time even 20fps feels smooth, and I probably wouldn't notice 20fps if I didn't have Fraps on!

Personally I can notice anything below ~40fps, but under 30fps is when I become irritated by it.

Originally posted by: otispunkmeyer
But as soon as you get in a firefight with a few people, then it becomes important: as the fps dips it gets harder to hit people.

Ya, very important.

Originally posted by: otispunkmeyer
I'm hoping the real game will play a bit smoother though; after all, UT2k3 plays real nice at about 45-50fps in most cases.

I'd bet that the real game will have even worse performance on some maps; Epic doesn't seem to really care about good framerates. Load up UT2003 and I'll bet you hit single digits in BR-Disclosure; when I had a 1900+ and a 9700 Pro I know I saw things drop as low as 5fps.

Originally posted by: otispunkmeyer
Guessing I'm gonna need serious upgrades to get what I want!

I think you are going to need a time machine to go steal hardware from the future. ;)


 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: RussianSensation
Personally I would change the colour depth to 16-bit because you won't notice any difference.

Assuming you are legally blind! :D
 
Oct 16, 1999
The more CPU and memory bandwidth you throw at UT2K4, the better it will run. Since the demo came out I have gone from
AthlonXP 2600 (T-bred, 16x133) w/512 PC2100 to
AthlonXP 2600 (T-bred, 11x200) w/512 PC3200 to
AthlonXP 2600 (T-bred, 11.5x200) w/512 PC3200 to
Mobile Athlon XP 2600 (Barton, 11x200 - still trying to get faster) w/512 PC3200
and have seen a noticeable increase in how well the game played at every step, even with the Barton clocked 100MHz lower than the T-bred. My video card is a 9800, and even running at 1024x768 w/ 4xAA & 16x perf AF I was still getting increases from the CPU & RAM upgrades.

As it stands, the lowest I've seen it hit with things as they are currently was the mid 20s, and that was with a heinous battle going on in the tunnel of the Onslaught map while I zoomed in on it with the lightning gun. Normal heavy firefights stay in the upper 30s, and otherwise it's always at 40+. This is with all the eye-candy settings in the demo as high as they will go. I don't think there is hardware available today that will keep this game from being choked down to less than 30fps in really insane moments; the best you can do is throw all you can at it to keep those moments as short as possible.
 

RussianSensation

Elite Member
Sep 5, 2003
Originally posted by: TheSnowman
Originally posted by: RussianSensation
Personally I would change the colour depth to 16-bit because you won't notice any difference.

Assuming you are legally blind! :D

Why, can you? Hmm, have you actually run a game at 16-bit colour depth, or are you just assuming it's theoretically noticeable? Just try switching the desktop colour quality setting to 16-bit: you won't notice a change in your wallpaper or background while it's standing still. Now imagine trying to notice it in a moving game.

Hmm, maybe in explosions at best.

I am saying it will increase performance while sacrificing maybe 5% of the visuals, and most people probably won't even notice.

The difference going from 8-bit to 16-bit colour is huge; 16-bit to 32-bit is debatable.
 

kylebisme

Diamond Member
Mar 25, 2000
16-bit has banding all over the place. I was around before we even had 32-bit colour, and I was very happy to see it arrive. ;)
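The banding falls straight out of the bit counts; a few lines of maths show how few shades each channel gets (16-bit here assumed to be the common RGB565 layout):

# Distinct levels per colour channel: 16-bit RGB565 vs 32-bit (8 bits/channel).
for name, bits in (("16-bit red/blue (5 bits)", 5),
                   ("16-bit green (6 bits)", 6),
                   ("32-bit, any channel (8 bits)", 8)):
    print(name + ":", 2**bits, "levels")
# 32 or 64 steps instead of 256 is why smooth gradients (skies, fog,
# dark corridors) break up into visible bands at 16-bit.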
 
otispunkmeyer

Jun 14, 2003
I've wondered about 16-bit vs 32-bit. I'm sure I read somewhere that the human eye can only see so many colours, and it was nowhere near the number that 32-bit produces. Still, I could be wrong.
 

kylebisme

Diamond Member
Mar 25, 2000
I wouldn't be surprised if you are right about reading such claims, but I assure you that the person who wrote it was wrong. ;)
 

nRollo

Banned
Jan 11, 2002
Personally I can notice anything below ~40fps, but under 30fps is when I become irritated by it.

Will you be forcing DX8 in Far Cry and HL2 then, Snowman?

 

Darkhound

Member
Mar 1, 2004
My system is identical to yours except I have a 1x 80GB 7200rpm HDD and a GeForce FX5700 Ultra 128MB, and I average 60fps. So I reckon it's the video card.

That's playing at 1280x1024, everything on High with 2xAF (the TFT hates anything below its native res).
 
Oct 16, 1999
Originally posted by: RussianSensation
Originally posted by: TheSnowman
Originally posted by: RussianSensation
Personally I would change the colour depth to 16-bit because you won't notice any difference.

Assuming you are legally blind! :D

Why, can you? Hmm, have you actually run a game at 16-bit colour depth, or are you just assuming it's theoretically noticeable? Just try switching the desktop colour quality setting to 16-bit: you won't notice a change in your wallpaper or background while it's standing still. Now imagine trying to notice it in a moving game.

Hmm, maybe in explosions at best.

I am saying it will increase performance while sacrificing maybe 5% of the visuals, and most people probably won't even notice.

The difference going from 8-bit to 16-bit colour is huge; 16-bit to 32-bit is debatable.

You've got two problems here: the difference between 32-bit and 16-bit is noticeable, and going from one to the other is not going to affect performance if your CPU is the bottleneck.

Originally posted by: Darkhound
My system is identical to yours except I have a 1x 80GB 7200rpm HDD and a GeForce FX5700 Ultra 128MB, and I average 60fps. So I reckon it's the video card.

That's playing at 1280x1024, everything on High with 2xAF (the TFT hates anything below its native res).

There's no way you're averaging 60fps with that rig on the Onslaught and Assault maps.
 

memories2002

Senior member
Apr 2, 2002
I have an Athlon T-bred 2000+, 512MB DDR266, 9500 Pro, Audigy ES, A7N8X-X, WD 80GB SE 7200rpm 8MB.
I haven't gone below 40fps; if I have, it's been very rare and I haven't noticed. No idea why you're running so slow.
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: Rollo
Personally I can notice anything below ~40fps, but under 30fps is when I become irritated by it.

Will you be forcing DX8 in Far Cry and HL2 then, Snowman?

In the Far Cry demo I have quite a bit of stuff turned down so I am not irritated by low framerate, but I still have enough options on to get the nice bumpy-shiny PS2.0 effects. However, I have heard from beta testers that more recent builds have done great things for performance, so quite possibly I will be playing the retail version with not just the bumpy-shiny PS2.0 effects but everything else turned up as well. As for HL2, I can't rightly say what settings I will use until I get a chance to try it, but considering the Radeon doesn't really take a hit for doing PS2.0 over DX8-level pixel shaders, I don't see why I would want to force DX8.

Sorry about that though, Rollo; I know that isn't the answer you wanted to hear. ;)
 

batmang

Diamond Member
Jul 16, 2003
Onslaught is awesome; I can't wait to buy this game.

I haven't really run a benchmark yet, but I will after I get home from work. I'm assuming I get more than 40fps while playing, but I play at 800x600 with the settings on normal and everything unchecked details-wise, because all those details do is annoy you when you're trying to kill someone.
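If you log framerates with Fraps while you play, something like this will crunch them into average and worst-case numbers; the file name and the one-reading-per-line format are just assumptions about how the log was saved, so adjust the parsing to whatever your version writes:

# Minimal sketch: summarize a log of per-second fps readings.
# "fraps_log.txt" and its one-number-per-line format are hypothetical.
with open("fraps_log.txt") as f:
    samples = [float(line) for line in f if line.strip()]

average = sum(samples) / len(samples)
print("avg", round(average, 1), "fps, min", min(samples), "fps")
# The minimum matters more than the average: that's the firefight dip.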