Got my hacked Radeon LE, and I'm not that impressed with Unreal Tournament using D3D.

Eug

Lifer
I guess Glide and Voodoo are really the way to go at lower resolutions and on slower machines.

At 800x600x16, my old Voodoo 3 overclocked to 179 MHz was still several fps faster (39 fps) than my Radeon (35 fps) in D3D, running UTBench with the same settings. That's about a 10% difference, which may not seem like much, but the minimum frame rates were noticeably better on the Voodoo. At 1024x768x16 the two are similar, with the Radeon scoring about the same (35 fps) as it does at 800x600x16. I'm running a Celeron 880. People may say UTBench is far too intense, but I disagree: those intense situations are when you need the fps the most.
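
To put numbers on that, here's a rough sketch in Python. The 39/35 fps figures are my UTBench results from above; the per-frame timings are made up purely to illustrate why the minimums matter more than the average:

    # Percent difference between the two cards' UTBench averages.
    voodoo3_fps = 39.0
    radeon_fps = 35.0
    print((voodoo3_fps - radeon_fps) / radeon_fps * 100)  # ~11% in the Voodoo's favor

    # Hypothetical frame times (ms) for one intense scene. The average
    # looks fine, but the two slow frames are what you feel as stutter.
    frame_times_ms = [16, 17, 18, 50, 45, 16, 17]
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)  # ~39 fps average
    min_fps = 1000 / max(frame_times_ms)                        # 20 fps minimum
    print(avg_fps, min_fps)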

At 1024x768x32 on the Radeon, the game becomes far too stuttery. The overall fps is OK, but it really chokes on some scenes. (oldfart says the 64 MB card is better there, but I had decided to save my coin.)

What about texture compression and/or OpenGL?

The Radeon will be better at other games, but it's amazing just how well the 'lowly' Voodoo 3 does.
 

duragezic

Lifer


<< I guess Glide and Voodoo are really the way to go at lower resolutions and on slower machines. >>



Maybe that's it. Tribes 1 ran better on my Voodoo Banshee and Celeron 550 than on my Radeon 64 and Celeron 550. Heh.

But yours should still run much better. I ran it great at 1024x768x32 on my Radeon 64 and Celeron 550 when I had it. Although that was the 64 MB Radeon, I doubt the extra memory made a big difference at those settings.

Are you running Win98 or Win2k?
 

Eug

Lifer
I am using Windows 2000, which is probably part of the reason. Also, the Radeon 64 is supposed to run much better at 1024x768x32 than the Radeon 32.

See here.
 

duragezic

Lifer
In one of Anand's reviews, the 64 MB card was 5 fps faster in average frame rate than the 32 MB at 1024x768x32. Depends how you look at it, I guess.

But regardless, your 32 MB card with a Celeron 880 SHOULD run it better than my 64 MB with a Celeron 550.

Which Win2k drivers are you using? I'm using the 3132 betas, but the only game I've played is Tribes 2, and it has near-identical performance to Win98.

You could also try OpenGL to see if that is faster.
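
If memory serves, you switch the renderer in UnrealTournament.ini under [Engine.Engine]; something like this (the exact layout may vary a bit by patch version):

    [Engine.Engine]
    ;GameRenderDevice=D3DDrv.D3DRenderDevice
    GameRenderDevice=OpenGLDrv.OpenGLRenderDevice

Glide users get GlideDrv.GlideRenderDevice there instead.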
 

Raincity

Diamond Member
You're not alone. UT runs like crap for me in OpenGL with texture compression on my Radeon 64 in Win2k with the latest beta driver. D3D runs fine for me @ 1024x768x32.

Rain
 

Eug

Lifer
I'm not sure I can really use Anand's numbers, because judging by them, it's a light benchmark. I prefer the more intense ones like UTBench.

I am using the Jan 7 Win 2000 drivers, which are the latest release drivers (non-beta). I've deleted all of the 3dfx driver registry entries that I've seen.
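
For anyone else hunting down leftovers, here's a little Python sketch that just lists registry keys mentioning 3dfx so you can review them by hand. The HKLM\SOFTWARE starting point is an assumption on my part; check anything it finds before deleting:

    import winreg  # standard library, Windows only

    def find_3dfx_keys(path=r"SOFTWARE"):
        """Return subkey paths under HKLM whose names mention 3dfx."""
        hits = []
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
            subkey_count, _, _ = winreg.QueryInfoKey(key)
            for i in range(subkey_count):
                name = winreg.EnumKey(key, i)
                if "3dfx" in name.lower():
                    hits.append(path + "\\" + name)
        return hits

    for k in find_3dfx_keys():
        print(k)  # review by hand before deleting anything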

By the way, I'm not saying I'm getting 35 fps most of the time. No, most of the time I'm over 60, but I bench at under 35 with UTBench, with minimum frame rates of around 20.

On the other hand, I get great numbers in Q3. It blows away the old Voodoo 3: everything ultramaxed at 1024x768x32 gives me 72 fps in demo001. Looks beautiful and it's very smooth.
 

DieHardware

Golden Member
I have to agree with you guys: UT is "stuttery" with my Radeon LE with the registry hack, running the 7075s and the card at 166.5/166.5 MHz (through raid-on). My lowly Rage Fury Pro seems to run "smoother" in UT at 1024x768/32-bit (stock) :(.
 

Deeko

Lifer
Oh yuck, UT in D3D with 16-bit color. It looks good in Glide 16-bit or D3D 32-bit, but D3D 16-bit is just plain ugly.
 

BFG10K

Lifer
Don't discount your Celeron as part of the problem, either. UT is a CPU-hungry program, and a CPU like a Celeron with crippled caches and memory/FSB bandwidth will suffer.