Geforce 2 vs. Geforce 3 chipset, something stinks here!

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
This is the second French site I have found that confirms a suspicion of mine. Don't flame me too badly; I don't have a Geforce 3 here to compare, so I am listening to what you guys tell me and drawing a conclusion.

http://www.nvchips-fr.com/articles/article.php?IDa=33&p=4

I am strongly suspecting that the Geforce 3 chipset does not play any games faster than a Geforce 2 chipset at the same clock speeds. DirectX 8 this, DirectX 8 that, that's all I hear anymore. No doubt the Geforce 3 chipset, which supports all the so-called DirectX 8 features, soundly whips any Geforce 2 in synthetic benchmarks. But does it actually play DirectX 8 games, or any games for that matter, faster? I questioned this in "3DMark 2001 Question!". My Gainward Geforce2 Ti 500 at 275/522 hits 11011 in 3DMark 2000 and averages 40fps in Jane's WW2 Fighters at 1600 x 1200 x 32 with FSAA off. Of course I am penalized in 3DMark 2001 (5502) because I can't get a score for the Nature benchmark, but it plays the Nature demo just fine.

Are you Geforce 3 guys getting any better? If I am wrong I will go buy the PNY Geforce 3 Ti 200 with 4ns RAM at Newegg right now; it is only $159. But I have a feeling I might be wasting my money! I know what I am saying borders on heresy in this forum, but let's get some input on this and get to the bottom of it. Like I said, this is the second article I have seen (still can't find the first site's link) that shows Aquanox and a couple of other DirectX 8 games at no advantage for the Geforce 3.

Edit: I am wrong about this site, it was a Geforce 3 Ti 450 they were testing, but keep the comments coming, you are convincing me to get the Geforce 3 Ti 200!
 

EdipisReks

Platinum Member
Sep 30, 2000
2,722
0
0
Let's put it this way: a friend of mine has a similar system to mine, but with a Geforce2 Ultra. In Max Payne, I can run at 1024x768 with 4x anti-aliasing and 64-tap anisotropic and there is never a slowdown. He can't do that. The Aquanox demo also runs a hell of a lot better on my system than his. The Aquanox engine was built around a Geforce3, so any site that shows a Geforce2 being as fast as a Geforce 3 in that game is full of it. The Geforce3 is a lot faster than a Geforce2 in most any game. Buy the Geforce3 and you will be happy, trust me.

--jacob
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Like I said in my post before, that site has NO GeForce2's in the comparison. It is testing Gainward's GeForce 3 Ti 450... The GeForce2 GPUs get crushed by GF3s because the GF3's architecture is clearly far superior. Benches I see of recent games across a range of resolutions show GF3 boards well ahead of the GF2 Ti, and at 1600x1200 they show the GF3 pumping out twice the frame rate.
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
I'm not convinced yet, guys. For one thing, any site showing a Geforce2 Ti vs. a Geforce 3 is gonna put the 2 at a 60MHz deficit in RAM speed, something that will have a big effect on frame rates at higher resolutions. With all due respect, I could use my old Geforce2 Pro at 215/467 at 1024 x 768 x 32 with 4X FSAA and not get any slowdowns in Max Payne. What about 3DMark 2000 scores? Looks like a Geforce2 at the same clock speeds as a Geforce 3 is pretty close. Does anybody out there have both cards, to do a back-to-back comparison on the same machine?
Common sense tells me the 3 has gotta be faster, but I am pulling some awesome framerates on this Geforce2 Ti 500 at 275/522. I am still gonna find that French site that compared a Geforce 3 Ti 200 at 175/400 against a Gainward Geforce2 Ti 500 at 270/520, where the Geforce 3 got beaten on everything but 3DMark 2001.
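To put that "60MHz RAM deficit" point in numbers, here is a rough back-of-the-envelope sketch (assuming the 128-bit DDR memory bus that both GF2 and GF3 boards actually use; the clocks are the effective DDR rates quoted in the thread). On paper the overclocked GF2 Ti has more raw bandwidth than a stock GF3 Ti 200, which is why the architecture, not the clock, has to explain any GF3 win:

```python
# Peak memory bandwidth for a 128-bit DDR bus.
# Clocks are effective (double-data-rate) MHz as quoted in the thread.

BUS_WIDTH_BITS = 128  # 128-bit memory bus on both GF2 and GF3 boards

def bandwidth_gb_s(effective_mem_mhz: float) -> float:
    """Peak bandwidth in GB/s = clock (Hz) * bus width (bytes)."""
    return effective_mem_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

gf2_ti_oc = bandwidth_gb_s(520)    # Gainward GF2 Ti 500 at 520 effective
gf3_ti200 = bandwidth_gb_s(400)    # stock GF3 Ti 200 at 400 effective

print(f"GF2 Ti @520: {gf2_ti_oc:.2f} GB/s")   # ~8.32 GB/s
print(f"GF3 Ti200 @400: {gf3_ti200:.2f} GB/s")  # ~6.40 GB/s
```

So the overclocked GF2 actually has about 30% more raw bandwidth; the GF3's crossbar memory controller and occlusion culling simply waste less of what it has.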
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
Play Giants and see if you get any slowdowns when a lot is happening. My GTS Ultra at 300 core / 515 memory can't handle the game at 1024 x 768 32-bit. The slowdowns are just too much and I have to revert to 800 x 600. It only happens when a lot is going on, but it drops to 20 fps, which is just unplayable.
 

Xtasy

Banned
Nov 23, 2001
568
0
0
Look at these benches.
http://www.digit-life.com/articles/gaingf2ti/index.html
This is a review of the spankin' hot Gainward GF2 Ti 500/XP. As you can see in the benches, it tramples all over the GF3 Ti 200 at 16-bit color, but the Gainward GF2 Ti overclocked to 310/520 can't beat the GF3 Ti 200 at 32-bit color at resolutions above 640 x 480 (all in Q3A). The reason is basically better architecture, like the Lightspeed memory and so on. If you overclocked that GF3 Ti 200 to, let's say, 240/520, I bet the 16-bit color benches would be dead even, but at 32-bit color the Ti 200 will pump 1/3 to 1/2 more frames than the overclocked GF2 Ti.

Pixel and vertex shaders, I would say, are of little use unless you play Aquanox, but the GF3 really is faster than the GF2 overall. Look at the Tom's Hardware roundup of Titanium cards and you see the difference. In Wolfenstein, the GF2 Ti gets about 50 frames average at 1600 x 1200 x 32-bit color (60 frames when overclocked to the max), while the GF3 Ti 200 gets like 70 average (90 frames when overclocked to the max). In Giants, the GF3 leads the GF2's by 15 frames. The GF3 Ti 500 literally gets double the frames of a GF2 Ti in Max Payne (1600 x 1200 x 32-bit), and these extra boosts hold for almost all the modern games out there. Not only that, the GF3 doesn't take such a big hit when 4x or Quincunx FSAA and anisotropic filtering are on. I disabled all the anisotropic and FSAA on my GTS so I can get the most frames possible (badly needed frames, in the 15 to 60 range) out of this oldie.
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
Good links, lots of info to digest there. This is what I learned at Tom's Hardware from the review of 21 Titanium boards: in 16-bit color, the Geforce 3 and Geforce 2 chipsets are equal. An overclocked Geforce2 Ti 500 is faster than a stock-speed Geforce 3 Ti 200 in all the gaming benchmarks, though not by much. When you overclock the Geforce 3 Ti 200, it is definitely faster than any Geforce2 chipset, and the higher the resolution, the bigger the advantage.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
When Anand reviewed the Det4 drivers along with a bunch of GF cards, including a GF2 Ultra and a GF3, the Ultra got roughly 60 FPS in Q3 1600x1200x32 max detail, while the GF3 got just over 100 FPS at the same setting.

If I remember correctly, the GF3 Ti200 got something like 95 FPS, and the Ti500 just over 120 FPS.

The GF3 cards are vastly superior to the GF2 cards when resolutions and detail levels go up, but if you're playing at 640x480x16, you won't really see much if any difference.

Just go look at any GF3 review, especially ones done after the Detonator4 drivers were released, the GF3 absolutely crushes the GF2's in any somewhat high detail benchmark.
 

esc

Senior member
Dec 4, 2001
314
0
0
I don't exactly understand the discussion but:

the GF2 Ti 500 is presumably meant to be the GF3 Ti 500.

The original GF3 sits in the middle,

and the GF3 Ti 200 is at the bottom of the GF3 line.

The Ti 500 is far superior to any GF card available.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
What you are forgetting is that the Geforce 3 uses HSR only in 32-bit mode. In 16-bit mode the Geforces are more fill-rate sensitive, as less bandwidth is needed (I hope that makes sense). I.e., a Geforce 2 Ti (250MHz core) is likely to be faster than a Geforce 3 Ti 200 (175MHz core) in 16-bit, because the newer design's only major differences are the crossbar memory and the vertex shader (I don't think the programmable T&L will make much of a performance difference at the moment anyway).
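Mingon's 16-bit vs. 32-bit point can be sketched with some rough arithmetic (the overdraw factor of 3 is an illustrative assumption, and this counts only color-buffer writes, ignoring Z and texture traffic): doubling the color depth doubles the per-pixel bandwidth demand, which is exactly the regime where hidden-surface removal and the crossbar memory controller pay off.

```python
# Approximate color-buffer traffic per frame at 16-bit vs 32-bit color.
# Overdraw of 3 is a hypothetical round number, not a measured value.

def color_traffic_mb(width: int, height: int, bytes_per_pixel: int,
                     overdraw: float = 3.0) -> float:
    """Approximate MB written to the color buffer per rendered frame."""
    return width * height * bytes_per_pixel * overdraw / 1e6

mb_16bit = color_traffic_mb(1600, 1200, 2)  # 16-bit color, 2 bytes/pixel
mb_32bit = color_traffic_mb(1600, 1200, 4)  # 32-bit color, 4 bytes/pixel

print(f"16-bit: {mb_16bit:.2f} MB/frame")  # ~11.52 MB
print(f"32-bit: {mb_32bit:.2f} MB/frame")  # ~23.04 MB
```

At 60 fps that is very roughly 0.7 GB/s vs. 1.4 GB/s of color writes alone, a large share of a GF2's total bandwidth at 32-bit, which is why a bandwidth-saving design helps the GF3 there while the fill-rate-bound 16-bit case favors the higher GF2 core clock.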