Nvidia GeForce 3 vs ???

Page 2 - AnandTech Forums
Jan 28, 2001
True, it is rather pricey, but Nvidia isn't worried, because that's not where they make their money. They make it on previous chips and on chips sold in OEM machines. So don't get too worked up over it. Grab yourself a nice GF2 card and enjoy it.

<<In fact the price to performance ratio with these new cards(not just nv20) is very poor compared to any prior card.>>

That's not true. The GeForce2 MX has one of the best price/performance ratios on the market, and IIRC it was the last GF2 chip to be released, unless the Pro came after it; I can't really remember.
 

typedef

Junior Member, joined Mar 4, 2001
I'm just going to reiterate what I've been saying in other forums. Let me start off by saying that anybody defending the GF3 is really trying hard to do so. Allow me to expand.

The major premise for buying a GF3 this Spring is its DX8 features. Now, we all know just how badly software development lags hardware, as evidenced by the slow acceptance of easy-to-implement technologies like S3TC (please read George Broussard's recent thread concerning S3TC and Duke Nukem) and T&L. How many games are out right now of which you can truly say, "this game simply wouldn't be the same without this nVidia T&L unit"? None. Some games will be listed as a "T&L game", but you come to find that there's very little difference in visual quality, and very little in terms of speed. And I'm not talking about synthetic benchmarks...I'm talking about real-world games. After all, those are the only things that really matter.

Only now do we see a title or two take advantage of T&L (Giants), and even then, I'm not entirely sold on it being a difference maker. Back to my point. DX8 games will NOT be coming out in waves until the end of the year, or in/around the time the XBox ships (whichever comes first). Therefore, you can fully expect these developers to be spending their time preparing for the XBox Christmas. These guys aren't stupid. They know that very few people will be plopping down $500-600 for this card this Spring. If they would like to make some money off the PC software, why not simply wait until their respective DX8 titles have been released for the XBox? It would be a piece of cake to "port" (this word is almost meaningless when porting from XBox to PC) them to the PC.

So, this is what we've got...We've got a card that, basically, will generally edge out an Ultra by a slim margin, outside of FSAA, in current games. Forget that ridiculous crap about 3-7 times faster...Forget using words like "easily outperforms the Ultra in...". Then, there's DX8.

If these features aren't going to be utilized until this Fall, what's the point of wasting $600 on it now? We already know that a NV25 will be released by then, and these things will drop in price...and if you really want cutting edge, you might as well spring for the NV25. At least you will be getting something that will coincide with some software which might actually use the hardware that you just blew a small fortune on.

I make good $$, and could easily afford to buy one...but I'm not, because I know nVidia is ripping you off big time with this chip. Microsoft funded the whole damn thing, for starters. $200 million is a stunning amount of money for any R&D project...The once "exotic" 460 MHz DDR is no longer hard to find, and you can bet it has come down in price as well. All things being equal, the only difference between the $330 Ultra on Pricewatch and the $550 GF3 is the chip, and the chip makes up perhaps the smallest percentage of the total cost of the unit. If nVidia wanted to, they could easily put this thing in the price range the GTS had when it was introduced. Anyhow, enough about pricing...we all know it's a rip-off, and we are also aware that if a Radeon2 were to be announced next week, and that chip went toe-to-toe with the GF3, the prices would sink faster than the Titanic.

Which, ultimately, brings me to the conclusion. I shall wait for other chips to be announced, and probably won't buy a damn thing until this Fall. Main reason? I've had a 64 MB GTS since they were first announced, and can honestly say there aren't any games out now which require anything more powerful. Secondly, no DX8 games. I don't see this being an issue until later this year, when XBox is released....Make no mistake, they will be coming...But not until the end of the year. I feel that they're definitely gouging consumers, as far as the pricing...Which is fine. They know there will be bozos out there who will buy this thing, regardless of cost, DX8 game availability, etc.

Finally, I look forward to some other chips coming out in the not too distant future. I really like what I've heard about Kyro2, and ATI might surprise us with the Radeon2. In the case of Kyro2, I know they won't even think about approaching the GF3's price point....and the current chip just needs some more fill-rate to really compete with the big boys.
 
Jan 28, 2001
If you don't like it, DON'T GET IT. Plain and simple. It's not going to hurt Nvidia's feelings if you don't buy the GF3; that is not what they make their money off of. You have a handful of choices here:

1. Go out and blow $500-$600 on a GF3.
2. Stick with your current GF2, or buy a high-end GF2 when the prices drop.
3. Wait for the NV25, and get either that or the GF3 when the prices come down a bit.
4. Stick with your other brand of video card (ATI, Matrox, 3dfx, etc.) or go out and buy one.

I'm sure there are a few other choices I might have missed, but that pretty much sums them up. The concept is not hard to grasp. Choose your company, choose your card, use it, like it, don't like it, cuss at it, break it in half, return it, get another card, wait for another card, and so on and so forth. I just can't stand it when people bicker back and forth about what is better and what is not. You don't have to publicly display your preference, then rant on and on about it. Maybe this belongs in a new thread, but oh well. Don't take information from my previous posts and try to use it against me, because it is not biased toward any one manufacturer or chip. Those were just general statements pointing out certain information that helped prove my point. I wouldn't mind trying products from all four major companies: Nvidia, ATI, Matrox, and 3dfx (even though they are dead). The only thing limiting me from doing that is $$$$$$ :frown:
 

GourmetPC

Junior Member, joined Mar 4, 2001
I have been reading everyone's in-depth discussion of the GeForce3 card and I thought I would add my meaningless opinion. I am currently putting together a completely new system, and I intend on purchasing the GeForce3 for a few good reasons. About every 2 years I upgrade my entire system, because I like to keep up with technology and, to a certain degree, I like having the latest and greatest. Granted, it will be an expensive purchase, but in the long run an extra couple hundred dollars over an older and cheaper card isn't gonna hurt. If I'm going to have this new system for 2 years, I want to be able to enjoy it and get the most out of it for that time.
 

UKtaxman

Senior member, joined Mar 3, 2001
What you've also got to consider is whether this card has the power to push all those DX8 features. I'm beginning to wonder!
I also can't remember any video card I've ever owned still being relevant 1 year from purchase, never mind 2, and I think this will get worse! I think we are going to see some enormous jumps in the processing power of new cards; i.e., they have gone from 1M polys to 1.5M to 2M to 4M to 7M to 15M to 20M. The jump between each generation in terms of raw output is growing larger. This may result in their lifespans being reduced substantially.
 

BenSkywalker

Diamond Member, joined Oct 9, 1999
UKtaxman-

"What you've also got to consider is has this card got the power to push all those dx8 features? I'm beginning to wonder!"

Plenty of power. Check out any benches, even with the very early pre-release drivers running something at least close to a CURRENT game (I think I'll start a thread on that topic after I post this, as I don't want to take this thread off subject:)). Not only that, remember that the Ultra completely blew away every other card on the market until the release of the GF3, by nearly 50% over the next fastest card in many cases. Besides that, the new features of the GF3 and their performance are, for the most part, completely unrelated to anything that doesn't use them. It is like running a high poly test that doesn't use hardware T&L and saying a hardware T&L board doesn't have enough power:)

"Why are these rendering cards a ton more expensive than 3D gaming cards? Are 3D gaming cards good at rendering scenes (say on LightWave or Maya)? Are there all-in-one cards, good rendering and gaming power?"

Market share. The high end pro 3D boards sell in very, very small numbers, and they must recoup development costs, which aren't any cheaper just because they don't sell as many;) For a solid solution for both gaming and pro 3D, nVidia based boards are right now the only offering that competes very well in both realms. As a matter of fact, SGI is currently offering nVidia hardware in their workstations. The difference between the GeForce and Quadro is drivers. Certain features are disabled by the drivers for the GF, and the Quadro drivers are optimized for pro 3D where the GF series is aimed more at gaming. They both, however, work extremely well in each other's realm. Turning a GeForce into a Quadro only requires resoldering a couple of resistors on your vid card.

"In fact the price to performance ratio with these new cards (not just nv20) is very poor compared to any prior card."

Actually, if you look at completely fillrate/bandwidth limited situations (when using FSAA), the GF3 comes in less expensive per FPS than the Radeon 64MB or GF2. The GF2U dropping to ~$350 also makes it a very good deal, speaking strictly in terms of cost to performance.
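To make the "less expensive per FPS" comparison concrete, here is a minimal sketch of the metric being argued about. The prices and FSAA frame rates below are placeholder numbers, NOT figures from this thread; only the dollars-per-FPS calculation itself is the point.

```python
# Cost-per-frame comparison sketch. All card prices and FPS values here
# are hypothetical placeholders, not benchmark results from the thread.

def dollars_per_fps(price_usd: float, fps: float) -> float:
    """Cost-effectiveness metric: dollars paid per frame per second."""
    return price_usd / fps

# (price in USD, FSAA frame rate) -- illustrative numbers only
cards = {
    "GF3 (hypothetical)":         (500.0, 60.0),
    "GF2 Ultra (hypothetical)":   (350.0, 35.0),
    "Radeon 64MB (hypothetical)": (250.0, 22.0),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per FPS")
```

With numbers like these, the fastest card can still be the cheapest per frame: a pricier board wins the metric whenever its FPS advantage outgrows its price premium, which is the shape of the claim being made for FSAA-limited cases.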

typedef-

"The major premise to buying a GF3 this Spring are DX8 features."

Why? Because everyone is already running a GF2 Ultra? Because no one wants significantly faster FSAA? Because people only play games from 1999? I'll say that with very early beta drivers the GF3, in a more current game, is outperforming my GF DDR to the tune of ~600%. I dropped $320 for my Herc DDR when I bought it new, and I need to upgrade soon.

"How many games are out right now, that you can truly say....'this game simply wouldn't be the same without this nVidia T&L unit.' None."

And Half-Life is still a great game running in software. Once you get past the idea that a great game is still a great game even with inferior graphics, both of the titles that I play most often, Sacrifice and Giants, benefit quite a bit from hardware T&L. Those are two GOTY award winners, and to date easily the two best-looking games I have seen that have shipped.

"Only now, do we see a title or 2 take advantage of T&L (Giants), and even then, I'm not entirely sold on it being a difference maker."

Have you played the game? Rev threw up some numbers with his GHz Athlon, and hardware T&L was about 50% faster than software T&L running 1024x768 32bit. If you have played it, then I'm sure you realize that it is one game in clear need of added performance.

"So, this is what we've got...We've got a card that, basically, will generally edge out an Ultra by a slim margin, outside of FSAA, in current games."

Current games? What benches have you seen for any game released within the last year? The one set I have seen had the GF3, without FSAA, running ~50% faster when everything was cranked, and even that was not what I would call a current game (nearing a year old).

"If these features aren't going to be utilized until this Fall, what's the point of wasting $600 on it now?"

Have to agree with that, though the highest MSRP I have seen to date is $520 (Elsa, IIRC), and it is a given that you can always do quite a bit better than MSRP when buying anything (the MSRP on the GF2 Ultra was still $500 a few weeks back and you could find them for less than $400). I would have to say that you are absolutely wasting money if you are going to spend $600 on this board.

"I make good $$, and could easily afford to buy one...but I'm not, because I know nVidia is ripping you off big time with this chip. Microsoft funded the whole damn thing, for starters. $200 million is a stunning amount of money for any R&D project....The once 'exotic' 460 MHz DDR is no longer hard to find, and you can bet it has come down in price as well."

MS did not fund the thing at all. The money given to nVidia was advance payment for X-Box chips, which nVidia still has in the bank until they start delivering the product (there is a nasty clause in the contract if they can't deliver what they are obligated to). The "exotic" 460 MHz DDR isn't on the GeForce3. The 4.3-4.4ns DDR SDRAM chips used on the Ultra have been replaced by 3.8ns DDR SDRAM ones, rated at ~525 MHz effective (though from what I have seen so far, no OEM is going to default them at that; Herc would be my odds-on favorite for that move). I wouldn't be shocked if people can push them into the 550 MHz effective range along with a decent core OC, which would make the board even more impressive.
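For anyone wondering where figures like "460 MHz" and "~525 MHz effective" come from, they follow from the chips' ns ratings: a chip rated at N nanoseconds per cycle clocks at 1000/N MHz, and DDR transfers data on both clock edges, doubling the effective rate. A minimal sketch of that arithmetic (the helper function name is mine, not from the thread):

```python
# DDR timing arithmetic: ns rating -> effective (double-data-rate) clock.
# A chip rated at N ns per cycle runs at 1000/N MHz base clock; DDR
# transfers on both rising and falling edges, so effective rate is 2x.

def ddr_effective_mhz(ns_rating: float) -> float:
    """Effective DDR clock in MHz for a given per-cycle ns rating."""
    base_clock_mhz = 1000.0 / ns_rating  # nanoseconds -> megahertz
    return 2.0 * base_clock_mhz          # two transfers per clock cycle

# 4.4 ns chips (GF2 Ultra class): ~455 MHz effective, i.e. the "460 MHz DDR"
print(round(ddr_effective_mhz(4.4)))  # -> 455
# 3.8 ns chips (GF3): ~526 MHz effective, matching the ~525 MHz figure above
print(round(ddr_effective_mhz(3.8)))  # -> 526
```

So the quoted ratings line up with the ns grades of the RAM rather than with anything the card defaults to, which is why OEMs can still choose to clock them lower.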

"If nVidia wanted to, they could easily put this thing in the GTS price range (when it was introduced)."

nVidia is currently pricing the chip at $70; you sure you want to blame them?

If you are happy with what you currently have, then of course don't upgrade. There are many benefits for those who do want to upgrade, particularly those, like me, who refuse to upgrade with every refresh cycle (which means I've been sitting on this DDR for over a year now). I'm not saying the GF3 will definitely be it (that will catch a few of the regulars by surprise;)), but neither the price nor, certainly, the performance of this board bothers me.