What frame rate do you get in UT?

Deeko

Lifer
Jun 16, 2000
30,213
12
81
TBird 800@920
224MB RAM
K7M
V5 5500
SBLive! value

utbench.dem, 1024x768, min desired rate 0
D3D - 36.81
Glide - 38.05
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Tbird 800 @ 950
Geforce2 GTS 32MB @ 200/390
256MB Ram @ 133MHz

When playing CTF with 12 bots on November, I see averages in the 70s and minimums in the 40s (1024x768x16)
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
UT Timedemo1
1024x768 = 59.06
800x600 = 70.24

Duron 600 @ 896MHz
Abit KT7
128meg @ 145MHz
V3 3k AGP, stock.
SB Live!
 

TAsunder

Senior member
Oct 6, 2000
287
0
0
Duron 700 @ 900 on A7V
32 Bit Color, High Skin, Medium World, High Sound qualities
Radeon 32mb DDR (stock speed)
224mb PC-100
SB Live! Value

Frame Rate varies greatly depending on level:

Single Player Galleon: 40-50fps average
Most Levels: 55-65fps
Min: 22ish
Max: 75-85fps (when game has started).

Multiplayer (when I host): 5-10fps lower across the board

I don't think a particular benchmark is worthwhile as an indicator of performance, since each level is different, and even some areas within a given level differ frame-wise. The best indicator is to play about 15 levels and note the frame rate stats at the end, IMO.
 

Eug

Lifer
Mar 11, 2000
24,046
1,675
126
I don't think playing levels is very accurate, because there are far too many variables. I never get the same numbers, but with benchmarks the numbers are always within 2% of each other with the settings unchanged. Benchmarks have their problems too, but it helps if you run multiple ones. Unfortunately, it seems the only one commonly used is the UTBench.dem. AnandTech uses its own, which is completely different.

At 800x600 Glide, settings on high but with minimum frame rate = 0, I get in the high 30's. I get about 41 with minimum frame rate = 40. This is running a Celeron 533A at 880, and a Voodoo 3 2000 at 179. The frame rates are much higher during gaming most of the time, but sometimes get very slow in the most congested fragfests.

Get UTBench here.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Nice OGL score at 1280. OGL runs well on the GeForce, but I couldn't get the gamma up to an enjoyable setting. Is there a way to adjust the OGL gamma?
 

Rickr

Senior member
Oct 21, 1999
339
0
0
57 average on cityintro with D3D (1024x768, 32-bit)
33.5 average on UTbench

Cel2 633@950
Geforce DDR


 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
5500 @ 183, glide, detail textures = true

1280x1024:
Thunder 68.8
cityintro 65.6
UTbench 45.8

zoom. :p

if you used OpenGL or D3D, go to the console, type "preferences", open the "Rendering" section, and select the API. Then set "Detail Textures" to "True". In Glide it is already "True"; in D3D and OpenGL it defaults to "False". Then rerun your benchmarks.
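For anyone who'd rather edit the config directly, the same toggle lives in UnrealTournament.ini. A sketch of the relevant lines (the OpenGL section name matches the ini posted later in this thread; exact defaults may vary by patch level):

```ini
; UnrealTournament.ini -- enable detail textures per renderer
[D3DDrv.D3DRenderDevice]
DetailTextures=True

[OpenGLDrv.OpenGLRenderDevice]
DetailTextures=True
```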

:)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Robo

"if you used OpenGL or D3D, go to the console, type "preferences", open the "Rendering" section, and select the API. Then set "Detail Textures" to "True". In Glide it is already "True"; in D3D and OpenGL it defaults to "False". Then rerun your benchmarks."

OK, my scores were... Now install the second texture CD and rerun the benches with S3TC textures and 32-bit enabled for a truly fair comparison......



oh, sorry about that ;)

BTW, 32-bit definitely does make a difference running UT with S3TC enabled.

Merlocka-

"Nice OGL score at 1280. OGL runs good on the Geforce, but I couldn't improve the gamma to an enjoyable setting. Is there a way to adjust the OGL gamma?"

Set it in Display Properties before you enter the game. I bump mine up 0.5 from whatever it happens to be set at for best results. That is, if you are using the S3TC patch; regular OpenGL performance kinda blows for me, D3D is much faster.
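If Display Properties isn't cutting it, UT also stores its own brightness level in the ini. A sketch, assuming the stock [WinDrv.WindowsClient] section (0.0–1.0 scale; 0.5 is the shipping default, so the value here is just an illustrative bump):

```ini
; UnrealTournament.ini -- in-game brightness
[WinDrv.WindowsClient]
Brightness=0.7   ; raise above the 0.5 default if OpenGL renders too dark
```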
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Ben, 32-bit makes a difference ONLY because 16-bit in OGL and D3D is inferior to Glide's 16-bit.

Remember, it's 16-bit artwork at its source. If the 16-bit is rendered properly, it's as good as it can get in UT. Ain't no "true" 32-bit art there to be rendered.

I do agree tho, in D3d and OGL, 32-bit looks better. In fact, 16-bit D3d looks HORRID!!!!!!

and Ben, I had 32-bit forced for those. At 1024, 16-bit was only about 8 fps faster, but at 1280 it's almost 15 fps faster.

Honest question here, have you been playing UT a lot with the patch? I've heard very conflicting opinions about its stability. Lots of peeps are bitching about missing/tearing textures, etc. on some maps.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Robo-

"Honest question here, have you been playing UT a lot with the patch? I've heard very conflicting opinions about its stability. Lots of peeps are bitching about missing/tearing textures, etc. on some maps."

Through the entire one-player game, start to finish, four different times, not to mention hundreds of "practice" matches and replaying portions of the one-player game testing various things with alternating numbers of bots. All my "spare" time for the last couple of weeks has been devoted to UT and this patch in one way or another (including getting a Linux install up and running on my PC).

I have had well over a hundred emails, possibly two or three hundred (my HD got wiped thanks to PartMgc last week and I hadn't counted them all up; my last backup was two weeks prior to writing the article), about various problems with many different boards. To date, I have been able to fix every issue with supported hardware that I have had time to figure out, with the exception of the Radeon and, oddly enough, the S3 boards (MetaL doesn't work under Win2K), aside from some Z-buffer problems with Rage128 boards (and it appears now the Radeon also, but no confirmation from ATi yet).

Certain boards need to have certain tweaks in place. I am working on a troubleshooting guide for it, and have been trying to get information on all the boards. As of now, I have very little on the Radeon and NONE on the V5 (wanna give it a shot?). The patch does offer significantly better performance for every other board, provided you have enough system RAM (the S3TC textures make UT even more of a pig unfortunately; far less of an issue under Linux). Rage128, TNT1/2 and Matrox G400 users have all given me feedback about enjoying the increased performance and superior visual quality over the D3D code or native OpenGL, even though it doesn't enable S3TC for them. Some of these boards need an additional tweak in the .ini file here or there, but they all work very well when set up properly.

"Remember, it's 16-bit artwork at its source. If the 16-bit is rendered properly, it's as good as it can get in UT. Ain't no "true" 32-bit art there to be rendered."

Anything that has native transparency benefits greatly from 32-bit. Not to mention, one of the issues with the OpenGL UT code, even in non-patched mode, leaves some rather serious tearing issues and missing textures due to the 16-bit Z-buffer (the first time I have ever seen 32-bit Z be a requirement for proper display in a game, btw). You need to use a 32-bit Z-buffer, which has been somewhat problematic for ATi users (though I assume, by the lack of complaints, that this problem has been fixed with Matrox boards).

If you are bored or whatever at some point, I would greatly appreciate someone trying to run the OpenGL.dll file on the V5, even though it doesn't support texture compression. I know, Glide will almost certainly still be better, but it won't hurt anything if any V5 owners just try it out. It works with the 4.28 or 4.32 patch and you only need to overwrite your OpenGL.dll file, which you could back up so you don't lose it if you find the patch intolerable :)
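On a Linux install (which Ben mentions he's been running), the backup-and-overwrite step can be sketched like this; the paths are hypothetical, so point them at your actual UT System folder and the downloaded renderer:

```shell
UT_SYSTEM="$HOME/UnrealTournament/System"   # hypothetical install path; adjust to yours
NEW_DLL="$HOME/Downloads/OpenGL.dll"        # the updated renderer you downloaded

# Keep the stock renderer so you can roll back if the patch is intolerable
cp "$UT_SYSTEM/OpenGL.dll" "$UT_SYSTEM/OpenGL.dll.orig"

# Drop in the updated renderer
cp "$NEW_DLL" "$UT_SYSTEM/OpenGL.dll"

# To revert later: cp "$UT_SYSTEM/OpenGL.dll.orig" "$UT_SYSTEM/OpenGL.dll"
```

On Windows the same two copies can be done in Explorer; the point is simply that a saved `.orig` copy makes the swap risk-free.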

I'm trying desperately to get a full report from every reasonably current video card on the market with this patch. I've combed over so many .ini and .log files that I'm getting to the point where a quick scan tends to point out 75% of all problems, and I would be greatly appreciative if anyone who feels curious and happens to own a V5 could give me some feedback (I'M begging here, in case no one has figured it out yet :D).
 

TAsunder

Senior member
Oct 6, 2000
287
0
0
I installed the second CD and then set UseS3TC=True in the ini file... and it was slower. What am I doing wrong here? I installed some other dll and changed the ini according to the readme, but now I can't even run it in OpenGL :(

What's the secret :)

UTBench with everything High, 32-bit, 1024x768: 38.7; with Medium World / High Skin: 41

Duron 700@900
Radeon 32mb ddr @ 180 (still tweaking)
256mb Crucial PC133 CL2 RAM
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Ben, have you checked out www.3dspotlight.com?

It has, without a doubt, the best UT tweaking guide out there. You may want to give it a shot.

and yeah, I'll check UT out with the OGL patch, tho I'm in the process of trying to get the CD2 textures into FXT1 format.

anyway, one of the points I was trying to make is that OGL in UT is problematic. Yes, the S3TC helps a lot, but it isn't a be-all, end-all solution. There are other issues with the OGL code (as you've seen).

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
TAsunder-

My recommendations for a Radeon:

[OpenGLDrv.OpenGLRenderDevice]
UseGammaExtension=0
UseModulatedGamma=0
UseS3TC=1
UseTNT=1
MinDepthBits=16
MaxLogUOverV=8
MaxLogVOverU=8
UseMultiTexture=1
UsePalette=1
UseAlphaPalette=0
ShareLists=0
AlwaysMipmap=0
DoPrecache=1
Translucency=True
VolumetricLighting=True
ShinySurfaces=True
Coronas=True
HighDetailActors=True
DetailTextures=True
UseTrilinear=True

Use with the OpenGL.dll file on this page. This blend seems to give the best mix of performance and image quality. I know the TNT line in there looks weird, but it is used as a compatibility option and for some reason doesn't disable texture compression on the Radeon(although it does on S3 boards).

Post back with your results or feel free to email me.

Robo-

"and yeah, I'll check UT out with the OGL patch, tho I'm in the process of trying to get the CD2 textures into FXT1 format."

Hmmm, dl the S3TC software decompressor, decompress all textures then precompress with an FXT1 utility and overwrite the default textures? Sounds very interesting.

"anyway, one of the points I was trying to make is that OGL in UT is problematic. Yes, the S3TC helps a lot, but it isn't a be-all, end-all solution. There are other issues with the OGL code (as you've seen)."

Uhm, I can't comment at this time:)
 

TAsunder

Senior member
Oct 6, 2000
287
0
0
Thanks, Ben. I'll check it out this evening. Somehow I doubt it's going to be as good as I hoped :p
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Ben:



<< "Hmmm, dl the S3TC software decompressor, decompress all textures then precompress with an FXT1 utility and overwrite the default textures? Sounds very interesting." >>



eh? where dat???? Where do I find the S3TC software decompressor?



<< "anyway, one of the points I was trying to make is that OGL in UT is problematic. Yes, the S3TC helps a lot, but it isn't a be-all, end-all solution. There are other issues with the OGL code (as you've seen)."

Uhm, I can't comment at this time >>



eh? Whatfore you say that?
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
TAsunder, Ben, the ini that I'm using that works on the Radeon looks a little different than the one you posted.
[OpenGLDrv.OpenGLRenderDevice]
UseGammaExtension=0
UseModulatedGamma=0
UseS3TC=1
UseTNT=1
MinDepthBits=16
MaxLogUOverV=8
MaxLogVOverU=8
UseMultiTexture=1
UsePalette=1
UseAlphaPalette=0
ShareLists=0
AlwaysMipmap=0
DoPrecache=0
Translucency=True
VolumetricLighting=True
ShinySurfaces=True
Coronas=True
HighDetailActors=True
DetailTextures=True
UseTrilinear=False

Edit: I tried it with those two settings (DoPrecache and UseTrilinear) changed, and it still works just fine.
 

yellowperil

Diamond Member
Jan 17, 2000
4,598
0
0
My FPS actually increases as I raise the resolution (up to an extent). I'm getting:

640x480: 32-35 fps
800x600: 40-42 fps
1024x768: 45-48 fps

This seems really strange, but whatever.
 

TAsunder

Senior member
Oct 6, 2000
287
0
0
D3D with nearly all tweaks recommended at the page above set to Quality instead of Speed (including Detail Textures and Volumetric Fog on): 41

OpenGL with Ben's Ini: 37.8

1024x768
Radeon 32mb DDR @ 180mhz/180mhz
256mb Crucial PC133 CL2 RAM
Duron 700 @ 900

Is that good or bad? :)