
V5 5500 vs the Geforce 2 GTS in Unreal Tourney, BIG DIFF?

JokerF15

Golden Member
hi,

i have a geforce 2 gts, an Elsa Gladiac. I am getting ok fps on my Athlon 750@900. But i want to know, if i traded my gts for a v5 5500 (agp or pci, either one), would my fps increase in unreal tourney, deus ex, half-life (cs), and madden 2001? These are the games i play the most now. im not really happy with the geforce; i figure i could get almost similar results from a geforce ddr 256, not the same, but similar. Will a v5 5500 increase speed? I dont want to shell out 600 bux for a v5 6k.

thx for any help
 
DAMN!! serious? that's hella lotz!!!

it's worth it there.. is there a 32 mb version, or only 64? also, which is better, pci or agp? i heard something about the pci being better if im overclocking the fsb, which i am about to do. well, ill just get agp (Accelerated Graphics Port), that's wut my mind tells me. thx!!!
 
Yes, it will. Especially in Deus Ex.

And once you use FSAA in Madden2001, you won't understand how you could live without it.
 
In Deus Ex the 5500 would be faster because of Glide support. However, it is also a very CPU-dependent program. Same for Half-Life (it's CPU dependent), but from what I've seen you wouldn't see any advantage for the 5500 there; the GTS should be faster (in HL).

UT is a different issue: the 5500 does not support the S3TC high-res textures, whereas the GTS does. This not only makes the game look better, it runs faster too. I haven't seen benchmarks comparing the GTS with S3TC in GL vs the 5500; it's likely the 5500 is still faster, but it should be much closer overall.
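A rough sketch (my own numbers, not from the thread) of why S3TC matters for those high-res textures: the DXT1 variant of S3TC stores each 4x4 block of texels in 64 bits, i.e. 4 bits per texel, so a compressed texture takes a fraction of the memory and bandwidth of an uncompressed one:

```python
# Back-of-envelope S3TC (DXT1) texture memory math -- illustrative only.
# DXT1 encodes each 4x4 texel block in 64 bits, i.e. 4 bits per texel.

def texture_bytes(width, height, bits_per_texel):
    # Memory for a single mip level at the given bit depth.
    return width * height * bits_per_texel // 8

uncompressed_32 = texture_bytes(1024, 1024, 32)  # 32-bit uncompressed
compressed_dxt1 = texture_bytes(1024, 1024, 4)   # S3TC/DXT1

print(uncompressed_32 // 2**20, "MiB uncompressed")     # 4 MiB
print(compressed_dxt1 // 2**10, "KiB with S3TC")        # 512 KiB
print(uncompressed_32 // compressed_dxt1, ": 1 ratio")  # 8 : 1
```

That 8:1 saving against 32-bit textures is why the high-res pack can look better and still run faster: far less texture memory traffic per frame.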

"And once you use FSAA in Madden2001, you won't understand how you could live without it."

Well, don't forget you can use FSAA on the GeForce also; it may not be quite as good, but it certainly is there and works well.
 
hmm.. im thinking i should give the v5 5500 the go-ahead. it seems to be better in the games that i play and a little worse in most other games, which i dont play. im not happy with my gts2, so... ill give it a go and see if someone will trade their v5 5500 for my gts. hopefully

thx for all your help.
 
Deus Ex would definitely get a good boost, but other than that the games you listed will see only a minor performance improvement, and UT is rather ugly on a V5 after you get used to the S3TC-enhanced look on a Radeon or GF/GF2.

For UT, are you running the Loki OpenGL patch? If you aren't, you can improve your performance a decent amount and gain a significant boost in visual quality (UT is ugly in D3D or Glide). For FPS, I got a 16-24FPS boost at higher resolutions with the Loki patch.

Half-Life seems to play smoother on a GF-based board under OpenGL than anything else I have seen, though I haven't played with the 1.03 drivers on the V5 yet.
 
The Loki OpenGL driver is not that useful, as it tends to bog down during firefights.

I easily get 60FPS+ in UT with my Geforce 256/PIII 933 using D3D.

Seeing as how I've actually HAD BOTH a V5 5500 AND a Geforce 256, I'm not saying this out of my a$$.

Deus Ex COULD be better.... The D3D version they're using is a really old one. Unfortunately you can't just swap in UT's D3D DLLs.

 
Yeah.. I saw that too.. was wondering heh.. everybody and their baby's momma's uncle seems to beat the pantz off the v5 5500.
 
"The Loki OpenGL driver is not that useful, as it tends to bog down during firefights."

No problems with excessive slowdown here at all during firefights, not even with 16 bots plus myself on the reactor level (well, it isn't fast, but it is faster than D3D by a decent amount). Only once in a great while is there an occasional pause when virtual memory is hit, which more RAM would solve.

Minimum FPS using the Loki patch tends to be about 10-15FPS higher than under D3D running 4.28 or 4.32 with any of the Det3 drivers or Det2 5.32 (I have tested those extensively).

Which version of the Loki patch were you running (there have been several updates), and what settings were you using?
 
Hey shazam, at which resolution do you get 60+ fps with a geforce?
I got a P3 700@933 with a v3, and i get 63 fps in Liandri with high details, decals on, 800*600 res, 16-bit, and 7 Adept players. I doubt you get the same performance in d3d with these settings.
 
I just wish that people would stop talking out of their asses. If you haven't actually tried a card, how can you say anything about it? You can't just look at a site's benchmarks and think that you know everything about a card! :|

Anyway, back to the subject. My cousin has a Voodoo5 5500 and we play Deus Ex at 1024x768 with 4x FSAA, and it's pretty smooth. I don't know the exact frame rate, but with the new drivers there are NO slowdowns at any point (apart from when looking at smoke, which for some reason makes my Voodoo3 crawl too...). Before the new drivers came out, 4x FSAA would be a bit choppy, so we were playing at 1600x1200 with no FSAA.

We use exactly the same settings for UT too, so I guess it's the same for every game with the UT engine.

I haven't used a GeForce2, so I don't know if it is going to be slower or not. What I do know is that there is no way you will be able to use FSAA in those games with a GeForce2, and FSAA just looks so sweet. One other thing: with the new drivers, in some cases using 2x FSAA doesn't affect frame rate at all, especially in lower resolutions like 1024x768.

To be honest, even though a Voodoo5 5500 would be much better in Glide-based games, I wouldn't advise you to swap your card, for two reasons. One, the Voodoo doesn't like overclocking the AGP port very much, and two, Glide is basically dead; few, if any, new games will be based on it.
 
"I just wish that people would stop talking out of their asses. If you haven't actually tried a card how can you say anything about it?"

"I haven't used a GeForce2 so I don't know if it is going to be slower or not, what I do know is that there is no way you will be able to use FSAA in those games with a GeForce2, and FSAA just looks so sweet."

Runs just fine on my GF1 DDR, and I haven't seen a GF2 yet that has any problems either. UT and Deus Ex are both CPU-bound games the overwhelming majority of the time; FSAA chews up fillrate, not CPU cycles. The reason Glide-based boards have an edge even when the D3D code is solid (such as UT 4.32) is that Glide is less CPU intensive. For Deus Ex, the V5 probably does run faster with FSAA on than the GF/GF2 or Radeon would; that game has major issues with its D3D code, which creates a situation where FSAA gives even less of a relative performance hit. The more CPU bound you are, the less of a hit FSAA will give you.

FSAA is a fillrate hog, and Glide doesn't give any board additional fillrate. You can argue that the V5's FSAA looks better, but FSAA still will run on a GF, GF2, GF2 MX or Radeon for that matter.
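The CPU-bound point can be put in a toy model (my own simplification, not anyone's benchmark): supersampled FSAA multiplies the pixels the card must fill, while the CPU work per frame stays the same, and the frame time is set by whichever stage is slower:

```python
# Toy model of why FSAA costs less when a game is CPU-bound.
# Supersampling roughly multiplies fill time by the sample count,
# but leaves per-frame CPU time untouched.

def fps(cpu_ms, fill_ms, fsaa_factor=1):
    # Frame time is set by the slower of the CPU and the card's fill work.
    frame_ms = max(cpu_ms, fill_ms * fsaa_factor)
    return 1000.0 / frame_ms

# Fillrate-bound game: 4x FSAA cuts the frame rate by the full 4x.
print(fps(cpu_ms=5, fill_ms=10))                  # 100.0 fps
print(fps(cpu_ms=5, fill_ms=10, fsaa_factor=4))   # 25.0 fps

# CPU-bound game (like UT/Deus Ex): the relative hit is much smaller.
print(fps(cpu_ms=20, fill_ms=10))                 # 50.0 fps
print(fps(cpu_ms=20, fill_ms=10, fsaa_factor=4))  # 25.0 fps
```

In the CPU-bound case the frame rate only halves instead of quartering, which matches the thread's claim that heavily CPU-limited engines take a milder FSAA hit.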
 
xtreme2k, for some reason AnandTech tests the 3dfx cards using D3D. Why they don't use the card to the best of its ability, I am not sure. If they were to use Glide, you'd see an improvement.
 
someone should compare these cards again with each running its BEST API

however
as far as i know
Glide doesn't work in 32-bit
 
You are telling me that your Geforce 2 GTS does not play these games well enough!??!?

What resolution do you play them at? My voodoo 3 pretty much does the job for these games...
 
Rellik: 1024*768*32. Detail Textures on. The only thing I have to do is set model detail to medium. I can run 1280*1024*16 with everything on max at ~75fps.

Once again, the myth about UT being "better" in Glide keeps getting perpetuated. This is simply NOT TRUE.

Ben, I consider anything less than ~45fps to be not acceptable. The Loki driver will sometimes dip down to the 30's for me, which doesn't make me too happy. This is with the latest Loki driver and 6.34 driver rev.

To add to your statement about UT being CPU bound, UT is also texture-bound. UT in 32-bit uses an extraordinary amount of texturing, it seems.
 
"Ben, I consider anything less than ~45fps to be not acceptable. The Loki driver will sometimes dip down to the 30's for me, which doesn't make me too happy. This is with the latest Loki driver and 6.34 driver rev.

To add to your statement about UT being CPU bound, UT is also texture-bound. UT in 32-bit uses an extraordinary amount of texturing, it seems."


Performance doesn't change much at all for me between 16 and 32-bit (do you have an SDR or DDR?), with the exception of 1280x1024 32-bit vs 16-bit under D3D; OpenGL still produces nearly identical (within 1-2FPS) results. Of course, I don't have a CPU that is too quick anymore (Athlon 550).

If you want the best performance with the Loki patch, I would advise using the 6.31 drivers; they not only perform better but don't have the minor glitches the 6.34s do.

I can't stand playing UT in ugly mode anymore; I will gladly spend the extra time to get the Loki S3TC patch running rather than deal with the default textures... but if speed is your main concern, then a V5 will be better in absolute terms (16-bit Glide, low res particularly).
 


<< FSAA is a fillrate hog, and Glide doesn't give any board additional fillrate. You can argue that the V5's FSAA looks better, but FSAA still will run on a GF, GF2, GF2 MX or Radeon for that matter. >>



I was not referring to differences between the Voodoo5's and the GeForce2's FSAA in general, but to the fact that Deus Ex's D3D is so bad that with 4x FSAA a GeForce2 would crawl at resolutions like 1024x768, although it's a more powerful card.

Edit: There is a beta D3D patch for Deus Ex; why don't you give that a try? If I were you, though, I would wait for the official patch before doing anything.
 
How can you run Deus Ex or UT at either 1600x1200, or at 1024x768 with 4x FSAA, on a Voodoo5? I'm running on a Tbird 800 with a Voodoo5 at 1024 with NO fsaa on, and the game runs too slow for my tastes! Hell, even an old game like EverQuest I can't run with 4x FSAA, because it's too slow when all the spell effects fly around.

Besides, in Deus Ex the text got all blurry and unreadable with FSAA on, so i was sorta forced to turn it off. Either way, i can't seem to use 4x FSAA in ANY game because it's too slow.

I tried 4x fsaa in q3a just to see how it would look... and for some reason, i get this horrible stuttering. If i walk in a straight line, it's almost like i'm limping along... instead of STEP STEP STEP STEP i get STEP..... STEPSTEP..... STEP.... STEPSTEP
It's really strange and unplayable like that. I love how FSAA looks, but it seems to make most of my games unplayable.
 
I would not make that swap.
The GF2 is much better in OpenGL than the V5 is.
They seem pretty close in D3D.
Of course the V5 has glide, but that is dying off.

I've seen a comparison of D3D vs Glide in UT... Glide was maybe 2% faster in 16-bit and won't run in 32-bit. When 3dfx designed Glide it was great and made a lot of sense; D3D didn't exist and OpenGL wasn't on PCs. Now that D3D and OGL are both out there and superior, Glide doesn't make sense anymore; even 3dfx has acknowledged this.

I personally find Half-Life runs nicest in OpenGL mode, at which the GF is superior.
UT or Deus Ex would be pretty close to the same speed, especially since you have such a fast CPU; the UT engine is a lot more CPU-dependent than the Quake engines.
Madden might be nicer on the V5 because of the better FSAA, but that would be about it..

The only time I can see using a V5 over a GF2 is for racing games/flight sims, where the superior FSAA helps more to reduce pixel popping. All in all, I'd keep the GF2. Maybe get a V5 PCI and use both, but don't replace the GF2 with a V5. Personally I don't think it would be very worth it.

The only card I would consider over a GF2 is a Radeon, and that's because of the massive feature set, not for speed.
 
shazam: thx for answering. I have the details on high (texture detail).
I do know that this setting significantly affects the framerate in d3d. Even Tim Sweeney (Epic's lead programmer) confirms this.

I have had the chance to test a GeForce DDR in my 700E (@700 at that time), and the difference between D3D and Glide (V3) was about 5-10 fps.
It was good for gaming, but there were ALWAYS these hiccups that did not occur in Glide. This was version 428. 432 is largely a network improvement, so except for the separate DLLs that came after 428, performance should be about the same. The general feeling i got from gaming was that my v3 beats the GeForce DDR in UT. But this was only because of the Glide API. In every other game, my p3 700 struggled to keep up with my friend's p2 400 with a GF DDR....

Anyway, trading a gf2gts for a V5 only seems sensible if one plays nothing but Glide games or is stuck without an agp slot.....

I wouldnt mind getting the v5 for say, 50 bucks, though....
 
Ben, try either Deck 16 II, or play the Domination level with the one big building and all the billboards (I can't remember the name). You'll see some slowdown in 1024*768*32.

I have a CLAP Pro, 132/332
 