New 5500 drivers (1.03) kick some arse!

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Very nice framerate increase in 32-bit, FSAA and higher resolutions. The low resolutions are still kinda slow, but if you shell out $250 for a video card to run in slow resolutions, beat yourself silly.

Presently, the 5500 is @ 185 (tho 193 is quite do-able, just a bit too warm for my tastes - add a few fps @ 193 if you like)

a few nifty little benchmarks using the new drivers:

Quake3 1024x768x32, SHQ (textures and geometry maxed) - 84.9 fps
Q3, 1024x768x32, default HQ - 90.2 fps
Q3, 1024x768x32, SHQ, "hi-vis" config (turn off blood, gibs, weapon, marks, brass and dynamic lighting - all the frag-hindering stuff) - 108.1 fps (!!!!!)
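For anyone who wants to try the "hi-vis" setup, a config along these lines can be saved as a .cfg and exec'd from the console - these are the standard Q3 cvars for the settings described above, though the exact values are my guess at what was used:

```
// hi-vis.cfg - a sketch of the frag-visibility settings described above
seta com_blood "0"       // no blood
seta cg_gibs "0"         // no gibs
seta cg_drawGun "0"      // hide the weapon model
seta cg_marks "0"        // no marks on walls
seta cg_brassTime "0"    // no ejected shell casings
seta r_dynamiclight "0"  // no dynamic lighting
```

Drop it in baseq3 and run "exec hi-vis.cfg" from the console.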

MDK2 - 1024x768x32, maxed textures, mipmap enabled - 72.2 fps (Very nice improvement - strangeness in the drivers tho, as this isn't much slower than what I get @ 640x480x32, go figure!)
MDK2 - 1024x768x22, FSAAx2, maxed textures, mipmap enabled - 74.1 fps
MDK2 - 1600x1024x22, maxed textures, mipmap enabled - 75.5 fps
presently, I'm leaning toward the FSAAx2 as being the best for this game. the jaggies don't disappear @ 1600, and the FSAA + 22-bit post filter REALLY makes this game look great!

UT - Glide, high quality world textures and skins, detail textures = true, cityintro: 1600x1200 - 56.2 fps, 1280x1024 - 69.4 fps - WH000T!!!!!!

Evolva, 1024x768x32 bit - 91.7 ave

for OGL, "depth precision" is set to "faster" for 32-bit, and "disabled" for 16-bit (cuz it gives funky weirdness in 16-bit)
alpha blending = sharper
mipmap dithering = enabled (16-bit), disabled (32-bit)
lodbias = -0.50 (-0.75 for 2xFSAA)
3d Filter quality = high

nice performance increase, and they're stable!

That's what's nice about this driver set in particular: it's a great performance improvement, and it does NOT introduce bizarre system instabilities like some other manufacturers' *coughnvidiacough* driver releases do.





 

Rawdog

Member
Aug 30, 2000
83
0
0
Hey RoboTECH, if you have it, can you test the fps in MSFS2000. That's where I'm having my problems.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Rawdog, I don't have it

you gotta do something weird to some ini file for FS2000 tho. I can't remember exactly what.

Specialist, you're either full of $hit, or you are clueless.

Somehow, your DDR posts faster scores in UT than my 64MB GTS o/c'ed to 220/400.

please explain how you got such a mutant video card.

although, judging from your sig, I take it you're not too serious. :)
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
The KT133 problem was not nvidia's fault, but it's fixed. 3dfx doesn't even support AGP fully, LOL. That's why they don't have too many mobo problems.
 

nitrousninja

Golden Member
Jun 21, 2000
1,095
0
76
There is no real reason to support AGP fully right now anyway. Just check benchmarks from any video card that has AGP/PCI versions available. There is usually a 2% difference. Just so you know.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Doomguy, think about what you just said.

You laughed at the 5500 because "it doesn't support AGP fully".

Yet, the 5500 has NO problems with ANY motherboards out there, and the GTS and GeForce originals have had TONS of problems, because "they support AGP fully"

Now then, just what do you get with your wonderful AGP?

Nothing.

I had to go from AGPx2 to AGPx1 (using PowerStrip) to get both of my GTS's past 133 FSB with no stability issues. I lost a grand total of 1 fps in Q3.

Wow. Darn good thing they have such "wonderful, full-featured AGP implementation"

I'd much rather have 1 fps in Q3 than stability across all platforms.

<rolls eyes>
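For perspective on why dropping from AGP 2x to 1x costs so little: it halves theoretical bus bandwidth, but textures resident in local video memory never touch the bus. A quick sketch of the peak numbers, assuming the standard ~66.66 MHz, 32-bit AGP bus:

```python
# Theoretical peak AGP bandwidth: ~66.66 MHz clock x 4 bytes per
# transfer, multiplied by the transfer mode (1x, 2x, 4x).
BASE_CLOCK_HZ = 66_660_000   # AGP bus clock
BUS_WIDTH_BYTES = 4          # 32-bit bus

def agp_bandwidth_mb_s(mode: int) -> float:
    """Peak AGP bandwidth in MB/s for a given transfer mode."""
    return BASE_CLOCK_HZ * BUS_WIDTH_BYTES * mode / 1_000_000

print(agp_bandwidth_mb_s(1))  # AGP 1x: ~266 MB/s
print(agp_bandwidth_mb_s(2))  # AGP 2x: ~533 MB/s
```

Half the theoretical bandwidth on paper, ~1 fps in practice - which is the whole point.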
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Doom, that "it's not Nvidia's fault" list is getting longer by the day.

"It's not Nvidia's fault" that Q3 TC looks like crap on GeForce cards.

"It's not Nvidia's fault" that D3D performance is poor in UT with Nvidia cards.

"It's not Nvidia's fault" that 2D is poor quality on Nvidia cards.

"It's not Nvidia's fault" that there are AGP problems with certain motherboards.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
oldfart, you forgot

"it's not nvidia's fault that you have to play driver shuffle if you play something other than Quake3"

 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Robo: Please show some proof for this "driver shuffle". All the nvidia drivers I've ever used worked with all the games I've played, like Deus Ex, Quake 3, Quake 2, Quake 1, Thief, System Shock 2, Half-Life, and Elite Force.

It's a very common, yet false, myth. Nice try.

Also:

1. Really, it's not 3dfx's fault they don't have true trilinear filtering and anisotropic filtering.

2. It's not 3dfx's fault their OpenGL drivers are poor.

3. It's not 3dfx's fault the 5500 and 6000 use huge amounts of power.

4. It's not 3dfx's fault their Linux drivers suck.

Obviously sarcasm.

Shall I go on more?
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Ok, first of all, to all you nvidia flamers:

UT and Deus Ex are both built around the Unreal engine. Guess what the Unreal engine was built for from the start: GLIDE!!! They threw in extremely horrible and slow Direct3D code at the very end. OpenGL is a joke and doesn't even work. How can you compare the two if one of the games was made entirely for Voodoo chipsets? It is just stupid. When we compare Quake 3 scores, the playing field is equal because it is OpenGL. No one card has the advantage. And guess what, the VOODOO 5 gets spanked.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
"the VOODOO 5 gets spanked"

How do you figure?

My system:
P3-600e, overclocked to 800MHz
256 megs of RAM
V5-5500 AGP
WinME

Q3, High Quality, 1280x1024 resolution
New drivers: 54.5 fps
GeForce 2: 56.7 fps (based on Anand's benchmarks, in which they used a 1GHz CPU)
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
"Robo: Please show some proof for this 'driver shuffle'"

GeForce Game List

read the notes at the bottom

or how about

GeForce FAQ addresses the driver shuffle

"When we compare quake 3 scores, the playing field is equal because it is open gl. no one card has the advantage. and guess what, the VOODOO 5 gets spanked."

hmmmm... 84.9 fps for the 5500, and ~90-95 fps for the GTS (with ugly-ass TC enabled)

Hardly spanked. Disable TC on the GTS to even the playing field a bit, and you'll find their framerates are almost equal.

 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
When will it end?

RoboTech, Ben Skywalker and others post hugely informative messages regarding intricate details of gaming hardware, and someone always poops the thread with "the card I wish I had is better than the card you have been using for 3 months".

Ever wish you could toss a bitchsmack through IP?


 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
>hmmmm.....84.9 fps for the 5500, and ~90-95 fps for the GTS (with ugly-ass TC enabled)

True story. I started playing Rocket Arena 3 after I got my GeForce. I instantly noticed the blocky sky, which, albeit ugly (and I hope they fix it), I don't really notice that much. I then thought to myself... these guys did a lot of work with the textures. The walls all seemed slightly different; some places had discolorations to break up the plain patterns of the walls. Some areas looked scorched... etc... I really thought it was supposed to be like that :) TC on the GeForce produces some weird stuff. I've since disabled it (and knocked the textures down to 16-bit, a decent tradeoff), so I don't know if it has been fixed yet. I laughed pretty hard when I realized the artifacts were errors in TC.

Neither card is perfect, if one of them were perfect no one would own the other one. I have a theory that the more people argue about 2 cards the more equivalent they are.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
>Q3, 1024x768x32, default HQ - 90.2 fps

Are the Firingsquad numbers that far off? They have like 72 fps at this setting. Granted, they are un-overclocked, but even with linear gains that would be like 81 fps.
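The linear-gain estimate works out like this, assuming the 5500's stock core clock of 166 MHz versus the 185 MHz overclock reported earlier in the thread:

```python
# Rough linear scaling of a benchmark with core clock - an upper
# bound, since real-world gains are usually sub-linear.
STOCK_MHZ = 166        # Voodoo5 5500 stock core clock
OVERCLOCKED_MHZ = 185  # clock reported earlier in the thread

firingsquad_fps = 72.0
estimated_fps = firingsquad_fps * OVERCLOCKED_MHZ / STOCK_MHZ
print(round(estimated_fps, 1))  # ~80.2 fps, close to the "like 81" guess
```

Which still leaves a ~10 fps gap to the 90.2 fps figure - so the clock difference alone doesn't explain it.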
 

pen^2

Banned
Apr 1, 2000
2,845
0
0
Sudheer Anne: no no no! The Unreal engine originally had a software renderer, which was first ported to Glide, Glide being the only applicable API back in those days.

"it's not Nvidia's fault" that Q3 TC looks like crap on GeForce cards
yes, the Radeon looks a hell of a lot better at it
"it's not Nvidia's fault" that D3D performance is poor in UT with Nvidia cards
maybe, but the UT engine sux anyways. It is ugly AND slow... blame the stupid game developers using the POS engine.
"it's not Nvidia's fault" that 2D is poor quality on Nvidia cards
how could it be? nvidia isn't really responsible for this, IMHO. Hercules tends to have somewhat less fuzzy output, and that's one of the reasons why they are so highly regarded! It is up to the card manufacturer to make their cards more attractive.
"it's not Nvidia's fault" that there are AGP problems with certain motherboards.
yes and no... should texture thrashing happen, AGP DOES make a difference. You need to give nvidia some credit for implementing *relatively* new tech, unlike 3dfx. Had the mobo manufacturers stuck to the AGP specs, there wouldn't have been all those 'agp problems'.
 

pen^2

Banned
Apr 1, 2000
2,845
0
0
Robotech:
>>Q3, SHQ, &quot;hi-vis&quot; config (turn off blood, gibs, weapon, marks, brass and dynamic lighting - all the frag->>hindering stuff), 1024x768x32, SHQ - 108.1 fps (!!!!!)

Good point! I am sorta amazed how people don't use the hi-vis config as their standard benchmarking setup. I never ever bother playing the freakin' game with all that crapola turned on!!! :eek::D:|:(:D
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Does it matter who specifically to point the finger at? The fact is that the nVidia cards do have their shortcomings, just like any piece of hardware does. It's just that the hardcore nVidia supporters can't admit it, and have to blame something else.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
The firingsquad #'s aren't that "far" off. They just went into the drivers, took the default setting for "depth precision", and reset it to "disabled". Kinda BS if you ask me.

Half the beauty of the V5's drivers is just how much you can configure them.

There is a nifty new variable called "depth precision".

With depth precision disabled, 1024x768x32, I get about 73 or so fps. With it @ "fast", I get ~78.

At 183, I get 85 fps @ SHQ with it @ "faster".

In 16-bit, it gives a mild speed increase, but causes some rather noticeable visual anomalies.

In 32-bit, it gives a MAJOR speed increase, and I have YET to see ANY game show ANY visual &quot;goofiness&quot; with it.

So far, so good.

As far as the "hi-vis" config, websites don't use it because they are either

a) too lazy
b) too worried about following "default standards"

It's this type of thing that keeps the websites from bothering to look into the 5500's driver tweakability. Blame 3dfx for making stuff too complicated <g>

BrotherMan, as far as your comments regarding the Unreal Tournament engine, you're nuts. It looks awesome, but to each his/her own

 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Interesting, I wonder if depth precision is changing the Z-buffer depth or something. Which "depth precision" setting gets the 90.2 fps at 1024x768 32bpp with default HQ?