Which is better? GF2 or ATi 9200?


Mloot

Diamond Member
Aug 24, 2002
You know, there was a PCI video card review over at Sudhian, which benched the very cards he is asking about.
 

ss284

Diamond Member
Oct 9, 1999
Have you actually tried playing games with the mx440 compared to the gf2ti? Or using a game demo run instead of a synthetic benchmark? Most game benchmarks online give the 440 an edge in almost all games.

-Steve
 

ShawnD1

Lifer
May 24, 2003
Yes I have tried the cards in games. The GF2 crushes the other cards in Neverwinter Nights. The cards are about the same in BF1942.
 

LocutusX

Diamond Member
Oct 9, 1999
Originally posted by: ShawnD1
Originally posted by: JBT
My old GeForce2 Ti 64MB scored around 5000 in 3DMark01. My laptop with a Mobility 9000 64MB, which is slower than a desktop 9200, gets 7000-8000. I forget the exact score as I don't usually benchmark it. Maybe I will give it a run or two when I get home just to be sure.

I realize 3DMark sucks, but it is a decent benchmark for general system performance.

3DMark is the AOL of benchmarking, seriously. It's also flawed because it gives a card an incredibly low score if the card is unable to render a scene. I benchmarked both my GF2 Ti and my FX5200 in 3DMark 2001, and since some of the scenes could not be rendered on the GF2, its score tanked. 3DMark 2001 said the FX5200 was much, much faster than the GF2, but another program called GL Excess said the GF2 was faster. I tried playing Neverwinter Nights with both cards and found that the GF2 Ti really was faster than the FX5200, just like GL Excess said.
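(He's right about that scoring quirk, by the way. Here's a toy sketch of the math with made-up fps numbers; this is NOT 3DMark's real formula, just an illustration of how averaging in a zero punishes a card that can't render a scene at all:)

def toy_score(scene_fps):
    # Average fps across all scenes, scaled to a 3DMark-ish number.
    # NOT the real 3DMark 2001 formula; purely illustrative.
    return sum(scene_fps) / len(scene_fps) * 100

# Made-up numbers: the GF2 Ti is faster in the scenes it CAN run,
# but the two DX8 scenes it can't render count as zero fps.
gf2_ti = [60, 55, 0, 0]
fx5200 = [45, 40, 35, 25]

print(toy_score(gf2_ti))   # 2875.0
print(toy_score(fx5200))   # 3625.0 -- "faster" overall despite losing
                           # every scene both cards can actually render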


Please don't use NWN as a yardstick for vid card performance. It's so biased towards OLD nvidia technology it's not funny.

For example, I've been using a GeForce256 DDR (GeForce 1!) for the past 4 years or so (1). When I upgraded to a 9800 Pro 128MB 256-bit, my "feeling of performance" in NWN was WORSE, not better. Indeed, benchmarking (2) showed I was only getting about 3-4 fps more in NWN with the spiffy new (& expensive) hardware at the exact same settings I had used previously! And a casual glance at the Bioware forums shows that other ATI R3xx hardware owners have an equally tough time with this game.



(1) Note that my overclocked GeForce256 DDR, which was an AGP 2x card, could probably trump the PCI-based GeForce2 MX that is the subject of this thread! Hilarious!


(2) My benchmarking of NWN: using FRAPS to bench the introductory in-game cinematic of SoU, where all the kobolds attack that dwarf.
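For the curious, here's roughly how I crunch that FRAPS run into numbers. A minimal sketch in Python, assuming FRAPS's optional per-frame log (a CSV of frame number plus cumulative time in milliseconds); the file name is just a hypothetical example:

def fps_stats(path):
    # Read FRAPS's per-frame log: one "frame, cumulative ms" row per frame.
    times = []
    with open(path) as f:
        next(f)  # skip the header row
        for line in f:
            _, ms = line.split(",")
            times.append(float(ms))
    # Per-frame durations from the cumulative timestamps
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_fps = 1000 * len(deltas) / (times[-1] - times[0])
    worst_fps = 1000 / max(deltas)  # the single slowest frame
    return avg_fps, worst_fps

avg, worst = fps_stats("frametimes_sou_intro.csv")  # hypothetical file name
print("avg: %.1f fps, worst frame: %.1f fps" % (avg, worst))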
 

LocutusX

Diamond Member
Oct 9, 1999
Originally posted by: ShawnD1
Yes I have tried the cards in games. The GF2 crushes the other cards in Neverwinter Nights. The cards are about the same in BF1942.

Another point: you're right that, for OpenGL stuff, the GF2 Ti can "hold its own"... but throw some Direct3D apps into the mix and you'll find that ATI rises to the occasion...
 

ss284

Diamond Member
Oct 9, 1999
Originally posted by: LocutusX
Originally posted by: ShawnD1
Yes I have tried the cards in games. The GF2 crushes the other cards in Neverwinter Nights. The cards are about the same in BF1942.

Another point: you're right that, for OpenGL stuff, the GF2 Ti can "hold its own"... but throw some Direct3D apps into the mix and you'll find that ATI rises to the occasion...

Like how my 9700 pro is a dog in cs/dod :( .

-Steve
 

Childs

Lifer
Jul 9, 2000
Originally posted by: ShawnD1
Originally posted by: Childs
I don't believe there was a GF2 Ti200; that was a GF3. Just GF2 and GF2 Ultra.

The box my card came in says "GeForce2 Ti200" right on it. Xbit Labs makes reference to the GF2 Ti as being a Ti200 in an article. link
"GeForce2 Ti doesn't show any significant performance improvement compared to the GeForce2 Pro and is very unlikely to be as attractive for the customers as GeForce2 Ti 200, so we wouldn't dare predict its fortune."

Reference bolded.

In the article, the Ti200 is referring to a GF3. That quote has to be a typo, because the article is talking about the newly released GF3 Ti500, GF3 Ti200, and GF2 Ti. I did, however, forget about the GF2 Ti; it's easy to forget since it came out with the GF3s, not the GF2s.
 

LocutusX

Diamond Member
Oct 9, 1999
Originally posted by: ss284
Originally posted by: LocutusX
Originally posted by: ShawnD1
Yes I have tried the cards in games. The GF2 crushes the other cards in Neverwinter Nights. The cards are about the same in BF1942.

Another point: you're right that, for OpenGL stuff, the GF2 Ti can "hold its own"... but throw some Direct3D apps into the mix and you'll find that ATI rises to the occasion...

Like how my 9700 pro is a dog in cs/dod :( .

-Steve

dude, not sure what you're saying. and we're not talking about high-end cards, just old crap. didn't ATI used to have an OGL implementation that was painful in comparison to nvidia's (which has been "good" for a couple of years now)? And when I say "used to have" I mean like back in the day of the original Radeon and such...
 

ss284

Diamond Member
Oct 9, 1999
Originally posted by: LocutusX
Originally posted by: ss284
Originally posted by: LocutusX
Originally posted by: ShawnD1
Yes I have tried the cards in games. The GF2 crushes the other cards in Neverwinter Nights. The cards are about the same in BF1942.

Another point: you're right that, for OpenGL stuff, the GF2 Ti can "hold its own"... but throw some Direct3D apps into the mix and you'll find that ATI rises to the occasion...

Like how my 9700 pro is a dog in cs/dod :( .

-Steve

dude, not sure what you're saying. and we're not talking about high-end cards, just old crap. didn't ATI used to have an OGL implementation that was painful in comparison to nvidia's (which has been "good" for a couple of years now)? And when I say "used to have" I mean like back in the day of the original Radeon and such...

Well, for some reason I've never had a perfect experience playing CS on an ATI Radeon card. I've owned many cards from both camps over the years, and for some reason nvidia is just able to deliver a smoother ride fps-wise. Even with my P4 3.0/9700 Pro, framerates drop into the low 40s at times, even lower on Aztec. I've never (i.e., very, very rarely) had the fps drop below 60 on an nvidia card (with the possible exception of the GeForce 1 and older) in a HL-based game. The game is a heavily modified Quake engine; it doesn't make sense that a Radeon 9700 would run it at anything less than a smooth 60+ fps.

I mentioned it because there are still things, such as OpenGL, that nvidia still excels at.


-Steve
 

ShawnD1

Lifer
May 24, 2003
Originally posted by: LocutusX
didn't ATI used to have an OGL implementation that was painful in comparison to nvidia's (which has been "good" for a couple of years now)? And when I say "used to have" I mean like back in the day of the original Radeon and such...

It's still pretty half-assed. My Radeon 9600XT has difficulty playing Half-Life mods at 1024x768.

Strange thing about ATI's drivers: there seems to be an old and a new type of OpenGL. The Omega driver installer asks whether you want the old OpenGL for games like CS or the new OpenGL for modern games.
 

LocutusX

Diamond Member
Oct 9, 1999
Strange; I never got that "old/new" option when I tried the Omega 4.5 drivers, measured no significant performance difference, and as a result wrote the thing off as a gimmick.

And if my memory goes back far enough, didn't nvidia have the "crap" OpenGL implementation at one point, with the 3Dfx Voodoo being the OGL king?

The roles have shifted: nvidia is the new 3Dfx and ATI is the new nvidia. Or maybe I'm just blathering...