OK, how do the Radeon 8500 and the GF3 Ti200 scale, respectively?

Jul 1, 2000
10,274
2
0
Here is the new system that I am building...

P4 1.8a
512MB PC800 Samsung RDRAM
Intel D850MV Mobo
Sound Blaster Audigy
80GB Seagate Barracuda IV, 7200 RPM
Sony 16X DVD
Sony 16X CD-RW
Sony Floppy

all in an Antec SX830 case

I know that the Radeon 8500 is a kick-ass card. The GF3 Ti200 would be nice to put in there so I can upgrade it to a GF4 Ti4200 later without a format and reinstall. I bought the GF3 Ti200 from Gateway, and I intend to pick up the Radeon 8500 on Sunday from BB with all of the gift cards I have accumulated over the ages. ;)

Does anyone think my easy GF3-to-GF4 upgrade path idea has any merit? Or... should I just get an 8500, say to hell with it, and wait for the next generation?

I mean - it is an incredible buy for $119 - if that is truly going to be the price.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Try them both and see what you think. Personally I'd use the Radeon 8500 unless it had issues that I didn't like.
 

Wolfsraider

Diamond Member
Jan 27, 2002
8,305
0
76


<< Try them both and see what you think. Personally I'd use the Radeon 8500 unless it had issues that I didn't like. >>



great call (me too)

hope this helps ;)
 

spanky

Lifer
Jun 19, 2001
25,716
4
81
<< I mean - it is an incredible buy for $119 - if that is truly going to be the price. >>


for the 8500? or gf3 ti200? and how?
 

DClark

Senior member
Apr 16, 2001
430
0
0
If you're talking about going from a GeForce3 to a GeForce4 for $119, then you've just been sucked into the nVidia marketing ploy that is the GeForce4MX.

The GeForce4MX is inferior to the GeForce3 Ti200 in many respects, and in my opinion it will create a lot of displeased nVidia customers who thought they were getting a "real" GeForce4 (the GeForce4 Ti versions). Don't even think about going for a GF4MX if you already have a Dx8 card (the GF4MX is essentially a tweaked GeForce2).
 

giocopiano

Member
Feb 7, 2002
120
0
0
But the GF4MX is a real GF4 ... it only loses the shader features. The kind of person who buys a GF4MX probably wouldn't even notice the difference over the lifetime of the card, given that someone has to actually program for those features.
Still, I think the GF4MX would be better named "GF4 Lite"
 

MemnochtheDevil

Senior member
Aug 19, 2001
521
0
0
I have to disagree with the statement "But the GF4MX is a real GF4 ... it only loses the shader features". The GF4MX has the improved memory system and the new AA functions; other than that, it's a GF2MX with a higher clock speed and faster RAM. The improvement in image quality you'd expect from a new generation just isn't in the MXs, and the AA is too big of a speed hit to be usable in future games (unlike on the Ti4600). Naming the MXs something else, or calling the Ti cards a GF5, would have been better.

These are good cards for those who know what they're buying and accept the limitations, but there is going to be some confusion in a year or so when the new Doom comes out and all the new games are using pixel and vertex shaders. Look at the Aquanox benchmarks in Tom's review: the GF3 Ti200 beats the GF4 MX440 in all but one test, and it loses to the GF4 MX460, which is nowhere near as popular as the 440 right now.

Your average gamer, who makes up a huge % of sales, is going to think he's buying a card that's better than the GF3, and when it won't run games with all the pretty details in several months he's going to piss and moan. And game developers are going to get stuck explaining why his new card is not the best choice for games.

Worse, this may slow down the adoption of pixel and vertex shaders by developers. Some may decide not to make heavy use of a feature that a lot of their target market does not support. Things would have been much better if nVidia had used some other naming convention.
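
To make that concrete, here is roughly the check a DX8 game runs at startup to decide whether it can turn the shader effects on at all (a minimal C++/Direct3D 8 sketch; the function name is my own):

#include <d3d8.h>

// Ask the driver which shader versions the card exposes in hardware.
// A GeForce3 Ti200 or Radeon 8500 reports at least 1.1 for both; the
// GF4MX reports no pixel shader support, so a game doing this check
// quietly falls back to the fixed-function, DX7-style rendering path.
bool SupportsDx8Shaders(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return false;

    return caps.VertexShaderVersion >= D3DVS_VERSION(1, 1)
        && caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);
}

That fallback is exactly the fork in the road I'm worried about: on a GF4MX, all the DX8 eye candy simply stays off.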

Mem
 

xerx

Junior Member
Aug 23, 2001
24
0
0
I think the GF4 MX440 is a bargain right now. It comes with 4ns RAM, DVD decode acceleration, and some of the new features of the GF4 Ti. Change the heatsink and add some ramsinks, and you'll be able to reach GF3 Ti200 speed when it's paired with a fast CPU, except in games that use the nfiniteFX (shader) engine, of course.
 

MemnochtheDevil

Senior member
Aug 19, 2001
521
0
0
But that's the rub: more and more games should be using DX8 features. And unlike most of us on these boards, most people only own one or two video cards over the life of a computer. If you're going to OC, you can get a good GF3 Ti200 up to Ti500 levels a good percentage of the time, which is a lot better than an overclocked MX will do. I just think the close-out deals on the GF3 (like the current VisionTek deal) are, and will be, hugely better deals than buying an MX. I want more detail in games, not less, and the MX seems to be a step in the wrong direction.

But the original question in the thread was about the 8500 and the GF3 Ti200. I'd say either of those would work well for at least 12-18 months, by which time we should see nVidia's next product. If you want GF4 speed in a couple of months, then the move from the GF3 will be easy. I guess it comes down to your preferences for gaming.
 

MoMeanMugs

Golden Member
Apr 29, 2001
1,663
2
81
You can drop by my place and I'll demonstrate the 8500 for ya. We'll put that crappy Nvidia idea in your head to sleep for all time my friend. :p
 

xerx

Junior Member
Aug 23, 2001
24
0
0
Those deals only apply in the US and sometimes Canada. Those of us who live outside the US have to pay higher shipping plus VAT.
 

DClark

Senior member
Apr 16, 2001
430
0
0
If you've got a GeForce3 Ti200, then just stick with it. The retail Radeon 8500 is faster and has some nice features, but in my opinion it's not enough faster to justify the money such a modest upgrade would cost (especially since the Ti200 should be more than enough power for today's games). If you want to upgrade and try an ATi card, then wait for either the Radeon 8800 or the R300-based ATi cards.

The 8800 should be announced sometime early to mid-March, and available probably towards the end of the month or early April. The R300 is ATi's next all-new core (and the first ATi core from the ArtX team responsible for the Flipper chip in the GameCube and the graphics in the Aladdin7 northbridge), and is said to be impressive. It probably won't be released until a few months before Dx9 is set to ship, though, as it's a Dx9 card. That should put the release date sometime in the summer (June or July).

Btw, you don't have to format and re-install when changing graphics card manufacturers. If you're concerned about remnants of nVidia drivers causing problems, then just use Google.com to search for the "Detonator Destroyer", a program which uninstalls nVidia drivers.
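
If you're curious what a tool like that actually does, the gist is just hunting down leftover driver files after Windows has switched to another display driver. Here's a rough C++ illustration that only reports leftovers rather than deleting them; the file names are the usual Windows 2000/XP Detonator ones and are my assumption, so double-check them on your own system:

#include <cstdlib>
#include <filesystem>
#include <iostream>

int main()
{
    namespace fs = std::filesystem;

    // %SystemRoot% is normally C:\WINDOWS or C:\WINNT.
    const char* windir = std::getenv("SystemRoot");
    if (!windir)
        return 1;

    const fs::path sys32 = fs::path(windir) / "System32";

    // Typical Detonator-era leftovers (assumed names; verify first).
    const fs::path leftovers[] = {
        sys32 / "nv4_disp.dll",             // display driver
        sys32 / "Drivers" / "nv4_mini.sys", // miniport driver
        sys32 / "nvcpl.dll",                // control panel extension
    };

    for (const auto& file : leftovers)
        if (fs::exists(file))
            std::cout << "leftover nVidia file: " << file << '\n';

    return 0;
}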