What's up with ATi's R300 and their integrated motherboard?


AGodspeed

Diamond Member
Jul 26, 2001
3,353
0
0


<< oh my god...people just aint getting it.

the pace at which technology progresses is NOT the issue...

say we have a processor at 10ghz now, and it'll reach 14ghz in exactly ONE year irrespective of anything. Would it seem right to introduce 8 new processors in 50mhz increments from here until next year, while with each incremental release, the value of your processor plummets into the ground? Or would it seem more reasonable to release, say, two 200mhz increments, one half way through the year and the other at the end of the year, while preserving the value of your processor...

the latter is what the server market is about...the former is what nvidia's about...except they'd release 400 new processors each year in 1mhz increments.
>>

Really? Give me an example. Exactly how is nVidia only making small and insignificant leaps in performance that get your panties in a wad about the value of your product? The GeForce3 Ti500 is noticeably faster than the GeForce3 and GeForce3 Ti200. The GeForce3 is significantly faster than the GeForce2 Ultra, not to mention it brought more features as well.

Besides, nVidia isn't releasing new products more than 2-3 times a year. You're understating the size of the performance jumps nVidia delivers with each release. In general your logic makes no sense, since the competition isn't doing any better a job than nVidia.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106


<<

<< what's your problem with ATI and screen resolutions ?

I have a 19" and 1600x1200 is definitely a bit high, and 1280x1024 is NOT 4:3, but 5:4. What did I do?

I got PowerStrip from EnTech, and I have yet to find ONE custom-made resolution which does NOT work with my setup! I can choose the 'oddest' custom resolution...and they all work fine!

My current favourite desktop/work resolution is 1368x1026 (4:3 aspect) btw.
>>


Powerstrip doesn't work under NT completely, and for some reason I cannot get it to work with my XP box either (with a RadeonLE). When I was running 2000 it didn't work either. However, I shouldn't have to hack my registry etc. just to get a simple screen resolution. Maybe I'll try again - it's been a few months since I've tried.
>>



eug,

that's weird. PowerStrip works like a charm under my XP Pro. (Also under 98, no problem.)

Don't forget that XP itself has a problem with refresh rates...I used PowerStrip mainly because of this...and I discovered the "custom resolutions" feature later.

Maybe you have some odd leftover entries in your registry from using "tweakers" like RadeonTweak and similar...it SHOULD definitely work. Maybe use regclean.exe to get rid of old/wrong registry entries.
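Btw, if anyone wants to double-check the aspect-ratio math from the quoted post, it's just a greatest-common-divisor reduction. A quick illustrative sketch in Python (the resolutions are the ones mentioned above):

```python
from math import gcd

def aspect(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    g = gcd(width, height)
    return (width // g, height // g)

# 1280x1024 really is 5:4, not 4:3:
print(aspect(1280, 1024))   # (5, 4)
# the classic CRT resolutions are 4:3:
print(aspect(1600, 1200))   # (4, 3)
# and the custom 1368x1026 works out to exactly 4:3:
print(aspect(1368, 1026))   # (4, 3)
```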
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
The thing I like most about nVidia cards is that clock for clock they are very efficient. All too often we hear that 'nVidia just uses brute force', and yet when comparing technologies they have either a better core design or better drivers.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
nortexoid does have a simple point. In another thread I stated: Doesn't nVidia's 6-month cycle seem a little too aggressive? With the way they keep dishing out faster and faster video cards, it's almost impossible to keep up, let alone get a long life span out of a $400 video card. Maybe it's just me, but I feel nVidia is being a little careless, overly ambitious and wasteful with the video card market.

Now I know careless is a little far fetched, but wasteful and a little TOO ambitious is in line in my eyes. When a new technology comes out, I feel it should be A LOT faster than the older tech I am used to. We all remember the whole GF2 Ultra and GF3 fiasco; for the most part the GF2 Ultra could pretty much keep up with and even beat the GF3. I know that nVidia could have released a Ti500 when the GF3 came out, but they opted to milk people for their cash. The thing that aggravates me is that instead of focusing on technology and innovation, as opposed to brute-force MHz battles (memory, core, etc.), they squeeze every cent out of consumers. They trick people by throwing different names on the same products with very minor upgrades. In my eyes that's a bit deceitful, greedy, and overall just a complete lack of respect for people's hard-earned money. When you make an investment you expect it to last and give you your money's worth, but right now high-end video cards are as bad an investment as a $40,000+ SUV. I just don't think there are that many people who have the money to throw down 400 bucks on a new video card every few months, and currently that seems to be who nVidia is catering to. Now this is just how I 'feel', so take it however you please.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106


<< Again...excluding the GF4: Already forgot that the Radeon was #1 on the 3DMark list for a long time? If ATI is/was good at anything, it was benchmarks...I think you missed something....
Since when did 3DMark become a valuable measure of gaming performance? I think...never! ;)
>>


since always...3DMark has always been *THE* de-facto standard - and we all know that 3DMark scores usually translate 100% into equivalent gaming/application performance :)

Ok...I am just KIDDING :) It was the poster before me who stated "...they beat ATI in most benchmarks"...so it was only legitimate to tell him that this was not right - at least *before* the GF4 came out (GF3 vs. Ti500).



<<
Oh...and nVidia's product cycles are better? Actually...what you said applies to nVidia times 10...in fact it was ATI that came out with an innovative product FIRST, one which introduced many new features (DX8.1, PixelShader 1.4)...de-throning nVidia for the first time...a REAL new product when it came out...while the GeForce line of cards offered NOTHING, nothing new for a very long time, from the GF2 to the GF3 and all the so-called "Ti xxxx" series in between...the same old story...continued even now with their "glorious" GF4MX etc. We aren't talking about the 'real' GF4 now...that is a card a generation AFTER the Radeon, and even the GF4 is not really a "new" product, and ATI even has SOME features that not even the GF4 has.

Flexy, you're creating a double standard here. First you say that the Radeon 8500 brought some new "features" to the table (PS 1.4, etc.) that nVidia couldn't match with their best product at the time (GeF3 Ti500). This is a reasonable statement. However, a couple of sentences later you say that even though the GeForce4 Ti series brings new features to the table, it doesn't "count" because ATi is still using the Radeon 8500.
>>



hmmm...I said that? Maybe it came out wrong...I basically do not think that the GF4's strength is the number of (new) features! I see that the GF4 of course is faster (it would be pathetic to debate whether an 8500 is on par with a GF4 :) - but not because of "new features" - rather it's "only" an overall faster card, with better AA performance etc. But nVidia certainly did NOT introduce "many new features" with the GF4. What new features would those be?



<<
The GeF4 brought plenty of new features to the table whereas ATi is still riding the Radeon 8500 from last year. Not that it's a bad thing, but clearly the GeF4 line dwarfs anything ATi has in terms of gaming performance, and certainly its features are competitive with ATi's.
>>



hehe...now YOU jump on the wagon comparing the GF4 to the 8500 :) Personally, for me, I do NOT think it's worth switching from a Radeon 8500 to a GF4. I don't think the differences in performance are THAT high...and STILL (even if the GF4 is actually another generation)...the GF4 has problems with aniso (performance drop), and I wish it at least had some DX8.1 hardware support (PixelShader 1.4)...which (strangely enough!) the OLD Radeon has. I am waiting instead for the next-gen cards from Matrox, ATI...and/or the NV30. However, for others a Ti4200 or Ti4400 may be a good deal as an upgrade from a GF2 or something...I don't say these cards are bad *per se*! No way!

 

anthrax

Senior member
Feb 8, 2000
695
3
81
Without ATI, nVidia would have a free hand to rip us off...quite honestly, I'm starting not to like the 6-month product cycle...and the release of so-called new products which are just another iteration of an old one...
Right now I think that video card development has outpaced the ability of games to catch up...There is really no huge reason for me to get the next generation of cards right now or in the near future...
There is really no killer app coming up soon that would require a GF4 Ti 4600...even now there aren't that many DX8 games around, let alone DX9 stuff...currently the newest cards only offer a 30% performance advantage over the last generation, which at least to me isn't really worth the high cost of the latest technology...

Or to put it another way...the benefit gained by using the latest iteration of technology is becoming smaller and smaller (when compared to earlier technology).

As you can see, even nVidia and ATI know this, and they are all trying to diversify their business by moving into other areas.


 


AmdInside

Golden Member
Jan 22, 2002
1,355
0
76


<< There is really no killer app that would require a GF4 4600Ti >>



Have you played Comanche 4?
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
<< There is really no killer app that would require a GF4 4600Ti >>



Have you played Comanche 4?


That game sucks, so it's not really a "killer app"; plus, it's more of a resource hog than anything. I'd rather play Desert Strike on the Super Nintendo or Sega Genesis.
 

Wolfsraider

Diamond Member
Jan 27, 2002
8,305
0
76


<<

<< There is really no killer app that would require a GF4 4600Ti >>



Have you played Comanche 4?
>>



Yes I have, on 98SE and XP Pro, using a Radeon All-In-Wonder. No problems on my 21-inch monitor at all.

But try Star Trek: Bridge Commander with 13 Warbirds vs. 13 Federation ships in Quick Battle, lol, there is a test of power.

And my lil' ole Radeon handles this fine, but she doth strain with more ships.

Or try Sierra's Homeworld maxed out with 4 enemies and no restrictions, like no research, 9990 injections every 1 minute, lol.

She bogs down a little here, but like Tiger Woods she plays through, lol.


mike
 

Valinos

Banned
Jun 6, 2001
784
0
0
I really don't see a problem in releasing new iterations of the same core every 6 months. Nvidia may not be doing the general "I just bought a Dell" joe sixpack crowd a favor by releasing so many (somewhat confusing) products, but to most informed consumers, it is great. I like having the choice of so many cards and pricepoints. I keep up with the numbers, the features, and the prices. I know what is the best bang for the buck. That's what I get. I think it is great for Nvidia to have a new product every 6 months. Sure, their flagship may be $300-400, but if I want the highest end, I can get it.

It is all about choice and variety. I always know I got the choice of spending $40 on a GF2MX or $400 on a GF4 Ti 4600 and then everything between. Intel and AMD do the same thing. Some people like to have the best out there so they buy the P4 2.4 or the Athlon XP 2100+. That just makes everything below it cheaper.

I'm all for 6 month product cycles. I'm for 1 month product cycles. I love seeing fast paced technology. It gives me something to read about, drool over, and eventually buy six months down the road when the price has been halved. Yet the performance is still more than enough.

ATI doesn't give me that choice.

And as far as features are concerned...the ENTIRE GF4 line (MX and Ti) from EVERY manufacturer offers built-in dual display and S-Video. The insane horsepower that the GF4 Ti series offers is also a great feature. I can run ANY game (except Dungeon Siege, which is a HOG) with 4xFSAA at 1280x960 and sometimes 1600x1200 with max details and still get 60+ FPS. That alone might make the card worth $300-400 for some people. And Quincunx FSAA is awesome too: hardly a performance hit over 2x, yet it looks better. The Radeon can't offer that.

I don't mean to get into a war, but what you say about nVidia is totally biased and wrong. I want ATI to succeed. The only way to do that is for them to get off their asses, make some performance products on a regular basis, and give the informed consumer lots of CHOICE! That's why nVidia is in the major leagues. They flood the market with great products at all price points, and continue to deliver. I have yet to be dissatisfied with nVidia (well, I do agree the GF4 MX naming convention is a load of horse dung).

Until I see some cards from ATI that offer the bang for the buck that Nvidia can offer, my next card is going to be Nvidia.

By the way, Best Buy is selling the PNY Geforce 3 (original) for $179.99 with $40 mail-in rebate bringing it down to $139.99. I just picked one up today for another system I'm upgrading. Not a bad deal, considering Newegg is selling the same card for $165. I was hoping to find a clearance Radeon 8500 64mb there for $105, but they had none left :/
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
here's one problem with nvidia's 6 month cycle:

NO INNOVATION.

this is what they do: they wait until memory manufacturers start pumping out faster memory in 6 months so they can stick it on their new boards, and during that whole time, they're pointlessly tweaking the core for higher mhz while providing no architectural improvements beyond a brute bump in clock and mem speeds.

memory speed increases shouldn't count as a design win for nvidia - i.e. as a new Geforce line of cards...there's nothing new about it.

come out with a card, maybe even two per series, not 6, and then don't release another series until you make some real architectural improvements.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,572
126


<< Good lord, it's called competition and innovation, the way most technology companies work. >>

it used to be called planned obsolescence.




<< The GeForce3 Ti500 is noticeably faster than the GeForce3 and GeForce3 Ti200. >>

i'm not really sure why this is... when a ti200 is getting half the score of a ti500 which doesn't have that much more memory bandwidth something smells fishy. i wouldn't put it past nvidia to sandbag the performance of the lesser cards.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76


<< here's one problem with nvidia's 6 month cycle:

NO INNOVATION.

this is what they do: they wait until memory manufacturers start pumping out faster memory in 6 months so they can stick it on their new boards, and during that whole time, they're worthlessly tweaking the core for higher mhz while providing no architectural improvements besides a brute mhz bump in clock and mem speeds.

memory speed increases shouldn't attribute a design win to nvidia - i.e. as a new Geforce line of cards....there's nothing new about it.

come out with a card, maybe even two per series, not 6, and then don't release another series until u make some real architectural improvements.
>>



Geforce 2 Ultra -> Geforce 3 brought programmable pixel shaders and vertex shaders, the Lightspeed Memory Architecture, the nfiniteFX engine and a memory crossbar to improve efficiency; Multi-Sample AA was also introduced to reduce the performance hit of AA, plus 3D textures and shadow buffers.

Geforce 3 -> Geforce 3 Ti 500 and Geforce 3 Ti 200 brought no significant changes. It did make Geforce 3 technology mainstream with the Ti 200 at a $200 price tag, which for the most part could overclock to Ti 500 levels. But I think NVIDIA was just not ready with the NV25 and thus had to release the Ti series to combat the Radeon 8500. Anand hinted at this a while back in his NV25 review. The Geforce 3 Ti 500 was about 9%-17% faster than the Geforce 3 overall.

Geforce 3 Ti 500 -> Geforce 4 Ti 4600 brought dual monitor support, an improved Lightspeed Memory Architecture and nfiniteFX engine for overall improved efficiency (at the same clock speeds, the Geforce 4 Ti is faster than a Geforce 3 Ti 500), nView as a replacement for TwinView, and better-looking AA.

So you can see that NVIDIA has done more than just ramp up the memory and core speeds.

What has ATI done that was new? First with a T&L card? Nope, that was NVIDIA. First with dual monitor support in a mainstream card? Nope, that was Matrox. First to improve memory bandwidth by optimizing efficiency? Nope, that was PowerVR. First card to offer programmable pixel and vertex shaders? Nope, that was NVIDIA. First to bring AA to the mainstream? Nope, that was 3dfx. Sheesh.

 

Valinos

Banned
Jun 6, 2001
784
0
0


<< here's one problem with nvidia's 6 month cycle:

NO INNOVATION.

this is what they do: they wait until memory manufacturers start pumping out faster memory in 6 months so they can stick it on their new boards, and during that whole time, they're worthlessly tweaking the core for higher mhz while providing no architectural improvements besides a brute mhz bump in clock and mem speeds.

memory speed increases shouldn't attribute a design win to nvidia - i.e. as a new Geforce line of cards....there's nothing new about it.

come out with a card, maybe even two per series, not 6, and then don't release another series until u make some real architectural improvements.
>>



*cough* Give me the name of one computer hardware manufacturer/designer that DOESN'T just bump up speeds and size and sheer power every few months.

Let me think...AMD and Intel both release a new core (which is usually just a die shrink or added cache) about once a year, which could still just be considered a product refresh, as the Northwood is basically a revamped Willamette. The Palomino is a revamped Tbird. The Geforce 4 is a revamped Geforce 3. See a pattern?

Hell, I can take it to memory and hard drives...

IBM's 75GXP, 60GXP, 120GXP...no big changes, just increased speed, platter size, and buffer size.
Same goes for WD and Maxtor.

Memory? PC1600 DDR, PC2100, PC2400, PC2700, etc. Rambus just hasn't moved at all, lol.

Don't give me that worthless no-innovation argument. The entire PC hardware industry is exactly the same. And what is ATI offering me? A new card that stays pretty stagnant in price for about a year (the Radeon 8500 was $300 retail until recently).

I like the variety that nVidia offers, all at different price points. I also like how AMD and Intel handle their business. I've got plenty of choice, from a Duron 900 for $50 to a P4 2.4GHz for $600. All of which are pretty fast processors and can keep up with most of today's applications.

Again, your argument is invalid.
 

anthrax

Senior member
Feb 8, 2000
695
3
81
Isn't Comanche 4 more CPU intensive...and frame rate CPU limited (to an extent)...which is typical with flight-sim type games...


<< The GeForce3 Ti500 is noticeably faster than the GeForce3 and GeForce3 Ti200. >>

i'm not really sure why this is... when a ti200 is getting half the score of a ti500 which doesn't have that much more memory bandwidth something smells fishy. i wouldn't put it past nvidia to sandbag the performance of the lesser cards.

Well, it's business; it's called market segmentation...value, midrange, premium...you can't sell premium stuff if the value stuff performs almost as well as the top of the line...

The thing is...compare a GF4 Ti 4600 to a GF3 from 12 months ago...you get an average of around 30-40% difference in performance...but really, what huge benefit does it bring?
Take a GF3 on an XP 1700+ platform: ~98 fps at 1280x1024x32.
Take a GF4 Ti 4200: ~100 fps.
Even if the GF4 Ti 4600 is 30% faster and does 130 fps, it's kinda pointless to the vast majority of people out there...

Even at 100 fps, most mainstream monitors cannot do 1280x1024x32 at 100 Hz...
let alone 130 Hz plus...
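To put rough numbers on that point (hypothetical frame rates in the spirit of the post, and an assumed 85 Hz mainstream CRT refresh), a sketch of why an on-paper speedup can vanish at the monitor:

```python
def visible_fps(rendered_fps, refresh_hz):
    """A monitor can't display more frames per second than it refreshes."""
    return min(rendered_fps, refresh_hz)

# Hypothetical numbers, roughly matching the post above:
gf3_fps, gf4_fps = 98, 130      # rendered frame rates, ~33% apart on paper
refresh = 85                    # assumed mainstream CRT refresh at 1280x1024

paper_gain = (gf4_fps - gf3_fps) / gf3_fps
print(round(paper_gain * 100))          # 33 -- the on-paper speedup in percent
print(visible_fps(gf3_fps, refresh))    # 85
print(visible_fps(gf4_fps, refresh))    # 85 -- no visible difference on screen
```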


 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
AmdInside, all the real architectural improvements you just noted came with NEW PRODUCT LINES, not from 6 different rehashes of the same line (i.e. the GF4 line has 6 products)...so are you supporting me or something?...thanks.

Valinos, are you awake?...I'm referring to the GRAPHICS chipset market, not the memory or CPU market, which are entirely different...WHAT FEATURES CAN YOU ADD TO SDRAM???...you tell me.

CPUs are another story entirely as well...they ramp far better than GPUs, and an increase in MHz translates to a proportionate increase in performance (for the most part)...with GPUs, an increase in MHz may translate to virtually no increase in performance...especially in nVidia's case, since they're so memory-bandwidth limited...their brute-force strategy is failing.

and my argument's "invalid"?...take a class in logic or something.
 

Valinos

Banned
Jun 6, 2001
784
0
0
Nortexoid, how can you accuse nvidia of not innovating and not include the rest of the PC hardware manufacturers?

I don't care if it is the graphics market or if it is HSFs. They all offer various "rehashes" of the same product. You can't just lump nvidia into one market segment and not the others and accuse nvidia of not innovating. Like I said before, I like the variety that is available. The variety and different price points let any consumer choose what is best for them. If they want to stay one cycle behind the game they can still get some killer graphics for under $200. If they want the most FPS and the biggest bragging rights they can get a GF4 Ti 4600. What's it matter to you? How is it hurting you? It just shows that ATI is staggering and can't keep up. Of course, I don't want to see nvidia become a monopolist and prices go up, but at the moment they are the leaders. They have the best hardware out there and they innovate constantly. Even if the Geforce 2 is just a reiteration of the Geforce, it still added quite a few features that made it stand out and perform better than the Geforce...what's wrong with this?

If you were running the company would you just put out two cards a year? One for $300 and one for $100? I think not. You have quite a few people willing to shell out over $100 for graphics but not quite $300 or $400.

Again, you have been invalidated.

INVALID
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
ok ok, i'll laugh this up, since i'm getting nowhere.

but let me make a last remark:

nvidia shouldn't be selling a so-called high-end part for 300 and a low-end part for 100...because in reality, the low-end part is manufactured on the same process using the same architectural design but crippled in some way (i.e. by memory bandwidth + core speed, or whatever)..

meaning, the high-end part should be selling for like 150-200 if the low-end part is 100.

and yes, i do hate hdd manufacturers for selling high-capacity drives which merely add a working side to the platter or the like while boosting prices dramatically...i hate any other manufacturer (i.e. intel) who cripples so-called high-end products and resells them as low-end when they cost, from a manufacturing standpoint, virtually the same.

screw it, i'll confess my hatred for nvidia right now...godspeed and others know where i'm coming from...i hate their entire business ethics, marketing stratagem (for the stupid), and undeserved recognition as a benchmark standard (for the most part)....

they produce decent graphics chipsets, but they still have a lot to learn...brute GPU power isn't everything....
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< they produce decent graphics chipsets, but they still have a lot to learn...brute GPU power isn't all.... >>



aah yes, because the Geforce 3 runs at such high levels of core/mem compared to the ATI Radeon design :rolleyes:
And let's not forget the Geforce 4, which even in Ti4200 guise, running at the same speeds as an 8500LE, is so inefficient it just dawdles along, doesn't it?
Can you name a more efficient card for the high end, or even the middle, of the graphics market? Let's see, the only thing that gets close is the Kyro II, and that is getting beaten by the supposedly low-performing MX440.
And let's not forget of course that the MX440 is just a horrifying rename of a Geforce 2; we wouldn't see that from any other manufacturer *cough* Radeon 7500 *cough*
 

Valinos

Banned
Jun 6, 2001
784
0
0



<< aah yes, because the Geforce 3 runs at such high levels of core/mem compared to the ATI Radeon design :rolleyes:
And let's not forget the Geforce 4, which even in Ti4200 guise, running at the same speeds as an 8500LE, is so inefficient it just dawdles along, doesn't it?
Can you name a more efficient card for the high end, or even the middle, of the graphics market? Let's see, the only thing that gets close is the Kyro II, and that is getting beaten by the supposedly low-performing MX440.
And let's not forget of course that the MX440 is just a horrifying rename of a Geforce 2; we wouldn't see that from any other manufacturer *cough* Radeon 7500 *cough*
>>



Exactly.

Also, to nortexoid,



<< meaning, the high-end part should be selling for like 150-200 if the lowend part is 100. >>



So we'd have the low end and high end in two segments and two prices, within $100 of each other. Wow, that makes business sense. They have to make a profit to make investors fatter and richer so the execs can get fatter and richer and everything is dandy. I don't see a problem with offering a premium on a product that bears your invention. People buy for the nVidia brand and its known performance and power. nVidia has yet to trip up. They continue to offer a good product and people pay for it. It is the same as Sony putting a premium on all Sony products. That goes for any industry, dude. Take an economics class. It just makes sense for profit's sake.

The early adopters will buy it, and they'll tell/show their friends and their friends adopt when it comes down in price.



<< and yes, i do hate hdd manufacturers for selling high-capacity drives which merely add a working side to the platter or the like while boosting prices dramatically...i hate any other manufacturer (i.e. intel) who cripples so-called high-end products and resells them as low-end when they cost, from a manufacturing standpoint, virtually the same. >>



Again, it is business, and putting in a middle product is the way to catch those on-the-fence people. Hmm...should I spend $200 on the GF4 Ti 4200 or $400 on the Ti 4600?...oh wait, there's the Ti 4400 for $300.

People get attracted to having that extra option, so they put it out there to be fed on. More money, and that is what running your own business is all about!



<< screw it, i'll confess my hatred for nvidia right now...godspeed and others know where i'm coming from...i hate their entire business ethics, marketing stratagem (for the stupid), and undeserved recognition as a benchmark standard (for the most part).... >>



Then you spill the beans and admit you are extremely biased and that your opinion doesn't really count. Ok kid, thanks for playing. You have once again totally invalidated yourself. Just another kid rooting for the underdog "indie" player. I guess you wanna be a rebel against the mainstream to vent your anger. Cool off bud, you need to worry about more important things.

Denied.

Valinos :)


*Edited a couple spelling mistakes.




 

AGodspeed

Diamond Member
Jul 26, 2001
3,353
0
0
nortexoid, I think you're exaggerating a bit. nVidia doesn't have 6 "rehashed" GeF4 cards. They have two. The GeForce4 Ti4400 and Ti4600 (soon the Ti4200). The GeF4MX cards don't count, they're just really beefed up GeForce2 MX cards with a GeF4 Ti memory controller (hence they perform between GeF2 GTS and GeF3 Ti200 levels). There are only two GeF4 MX's, the 420 and 440 (the 460 doesn't exist, never did).

Besides, what about ATi? Look at the Radeon 8500, 8500LE, and 8500LE LE. Does that not fit your criticism? Your own statement - "in reality, the low-end part is manufactured on the same process using the same architectural design but crippled in some way (i.e. by memory bandwidth + core speed, or whatever)" - applies just as well: ATi does the same thing.

Btw, in the last year nVidia has hardly ever released GPUs that were not modified in some shape or form: GeF2 Ultra --> GeF3 --> GeF3 Ti500 --> GeF4 Ti4600. Two out of those three releases were significantly modified from their predecessors, performance-wise and feature-wise.

Still, I'm not sure what you're arguing. Are you saying that companies (nVidia in this case) should price products in such a way as to make less money? (no $300+ video cards). Well what do you expect, for these companies to not sell $200-300 cards because their $100 cards aren't that different? Cmon!

What's stopping you from getting a $100 card? What are you complaining about? nVidia is offering some fast and feature-filled video cards for $100-150. Why would you care if they're selling $300 cards; is it because your card will be worth less when you trade it in later? Bah, I'm sure nVidia is really worried about people who buy and sell cards every few months and is therefore going to drop their $300 video card lines and lower their profit margins as a result. :rolleyes:
 

Agent004

Senior member
Mar 22, 2001
492
0
0


<< nortexoid, I think you're exaggerating a bit. nVidia doesn't have 6 "rehashed" GeF4 cards. They have two. The GeForce4 Ti4400 and Ti4600 (soon the Ti4200). The GeF4MX cards don't count, they're just really beefed up GeForce2 MX cards with a GeF4 Ti memory controller (hence they perform between GeF2 GTS and GeF3 Ti200 levels). There are only two GeF4 MX's, the 420 and 440 (the 460 doesn't exist, never did). >>



I think you're being unfair here. If a card is marketed under the same line/name - the GF4 xx series (either Ti or MX) - it does count as a GF4 card. It doesn't matter how it performs, but rather how it is marketed by nVidia. So yes, nVidia does indeed have 6 (4?) 'rehashed' GF4 cards.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
And ATI has how many exactly? Let's see: 4 retail Radeon 8500s (64MB and 128MB, 64MB AIW and now the 128MB AIW). We can then add to that the 3 versions of the LE chipset (64MB, 128MB and the LELE; also the Hercules version if you're being picky), and that's before we get to the 7500-based cards, of which I can't remember all the different types. So as you can see, ATI is just as bad.