What's up with ATi's R300 and their integrated motherboard?


Athlon4all

Diamond Member
Jun 18, 2001
5,416
0
76


<< If a card is marketed under the same line/name, GF4 xx series (either Ti or MX), it does count as a GF4 card. >>

I agree, but I fail to see what the point is in relation to the discussion.
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
Mingon


<< let's not forget the geforce 4, which even in ti4200 guise, running at the same speeds as an 8500le, is so inefficient it just dawdles along, doesn't it. >>


huh? just because something performs well doesn't mean it's efficient...performance doesn't entail efficiency. my 16-cylinder engine running on 8 cylinders may still perform as well as a tweaked-out 4- or 6-cylinder engine (running on all 4 or 6 cylinders, respectively), but it surely isn't efficient by any means...that's nvidia: the 16-cylinder engine running on 8 cylinders, praying for faster memory to be released 6 months down the road so they can rehash the same product, slightly tweaked, for an outrageous price.
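to put a rough number on it: efficiency is just performance normalized by the resources spent to get it. here's a toy sketch in python (every figure is made up, and fps-per-MHz-of-memory-clock is just one crude yardstick i picked for illustration, not some official metric):

```python
# Toy illustration of "performance doesn't entail efficiency".
# All numbers are invented for the example -- not real benchmarks.

def efficiency(fps: float, mem_clock_mhz: float) -> float:
    """Frames per second per MHz of memory clock: one crude way to
    normalize raw performance by the resources spent to achieve it."""
    return fps / mem_clock_mhz

# Two hypothetical cards with identical raw performance:
brute_force = efficiency(fps=100.0, mem_clock_mhz=650.0)  # leans on fast, pricey memory
lean_design = efficiency(fps=100.0, mem_clock_mhz=400.0)  # does the same with less

print(f"brute force: {brute_force:.3f} fps/MHz")  # ~0.154
print(f"lean design: {lean_design:.3f} fps/MHz")  # ~0.250 -- same speed, more efficient
```

same frame rate, very different efficiency...that's the whole point.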


<< And ati has how many exactly? let's see: 4 retail radeon 8500s (64mb and 128mb, 64mb aiw and now the 128mb aiw) >>


huh? (again)...the AIW is a product line of its own, and memory configurations are rather irrelevant for the most part...because then nvidia would have something like 9 products or more in their series...so we'll just ignore that

Valinos

i don't have to like poor business ethics, so stop frothing at the mouth. also, the GF4 line is 6 cards - 3 MX parts and 3 Ti parts...or are there even more now? after all, it's been a couple months or so. so how does that fit into your 3-card segment scheme? now it's 3 "low-end" parts and 3 "high-end" parts per product line? are they eventually gonna jump to 9 cards per series, adding 3 "mid-end" parts to their product line?

the Geforce 4 u-dumb 50/60/70 IQ.

it boils down to this:

1) nVidia's product cycle is too short, whether enthusiast, capitalist gamers like it or not...the rest of us don't like it (or shouldn't!).
2) they've got too many products per line. it's like they're trying to target more segments than the market actually contains. We don't need 200 of the same card, each one costing a dollar less than the next and performing a wee bit worse than the one above it.

i realize that rehashing the same product under different names is a compelling move from a [profitable] business standpoint, but that doesn't change the point...they're still OVERLY rehashing the same product. i don't care if that's good for business because little trolls run out and pick up a new card as soon as reviews hit the web showing a graph that disproportionately depicts a 4fps performance increase.

i poo on the remains of this thread.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,573
126


<< Well it's business, it's called market segmentation...value, mid-range, premium...can't sell premium stuff if the value stuff performs almost as well as the top-of-the-line stuff. >>

well i know what segmentation is... i still think nvidia is sandbagging drivers on the ti200 vs the ti500.

<< nvidia shouldn't be selling a so-called high-end part for 300 and a low-end part for 100...because in reality, the low-end part is manufactured on the same process using the same architectural design but crippled in some way (i.e. by memory bandwidth + core speed, or whatever).. >>

the memory is the expensive part. remember that nvidia is only selling the chips themselves, which are running somewhere between $50 and $75 for the GF4 (not the GF4 MX; i'm not gonna count that rebadged GF2 MX POS as a GF4).
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< huh? just because something performs well doesn't mean it's efficient...performance doesn't entail efficiency. my 16-cylinder engine running on 8 cylinders may still perform as well as a tweaked-out 4- or 6-cylinder engine (running on all 4 or 6 cylinders, respectively), but it surely isn't efficient by any means...that's nvidia: the 16-cylinder engine running on 8 cylinders, praying for faster memory to be released 6 months down the road so they can rehash the same product, slightly tweaked, for an outrageous price. >>



So what does that say about ati then? What are the next products due from them? The 8800 (or 8500 ultra if you wish), and the rv250, which by all accounts is a combination of 7500 and 8500 technologies (my, does that sound like a geforce4mx or what?). The point is not that it is an efficient design in absolute terms, more that at the moment it is the most efficient; you can only assess a product against its competitors, and at the moment nothing comes close.



<< huh? (again)...the AIW is a product line of its own, and memory configurations are rather irrelevant for the most part...because then nvidia would have something like 9 products or more in their series...so we'll just ignore that >>



Oh I see, because it's got its own little name, that makes it special, does it? Right, ok then. And to say that the ati memory configurations are irrelevant is just plain naive: performance increases with core/memory speed as well as with memory size, and these factors influence people's decisions on which version to buy.
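To give a rough sense of why those clocks matter: theoretical memory bandwidth scales directly with memory clock and bus width, so the clocked-down LE genuinely gives up headroom. A quick back-of-the-envelope sketch in python (clock figures are approximate, quoted from memory rather than official spec sheets):

```python
# Rough theoretical peak memory bandwidth. Clock numbers below are
# approximate recollections for illustration, not official specs.

def bandwidth_gbps(mem_clock_mhz: float, bus_bits: int, ddr: bool = True) -> float:
    """Peak bandwidth in GB/s: transfers per second times bus width in bytes."""
    transfers_per_sec = mem_clock_mhz * 1e6 * (2 if ddr else 1)  # DDR moves data twice per clock
    return transfers_per_sec * (bus_bits / 8) / 1e9

print(f"Radeon 8500 (~275 MHz DDR, 128-bit):    {bandwidth_gbps(275, 128):.1f} GB/s")  # ~8.8
print(f"Radeon 8500 LE (~250 MHz DDR, 128-bit): {bandwidth_gbps(250, 128):.1f} GB/s")  # ~8.0
```

That gap, plus the core clock difference, is exactly what buyers are paying for between versions.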



<< i poo on the remains of this thread. >>



Stands to reason - all you've done so far is sh*t in the rest of it. Grow up; perhaps when you have spent some time in the real world you might be able to respond intelligently and without such blatantly troll-like responses.
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
I just don't get the frothing-at-the-mouth GeForce MANIA going on here and everywhere else in the computer world.
They're not that amazing. They've done nothing that hasn't been done by many other video card makers, and the image quality was really lacking right up to the GeForce4s! (It may have improved, finally.)

But nVidia is usually taking the brute force "faster, faster, faster" approach, while other video card makers are thinking of alternatives to expensive super-fast RAM. Like more textures per pass, more pipelines, whatever.

I'm not saying other graphics producers don't do the same; nVidia just seems to have adopted this policy more than others. Real innovations only come out every 2-3 years; in the meantime they churn out versions 1-8 of the same card, slightly improved or slightly CUT. Yes, ATI has its rehashes, and even pulled the same kind of stunt as nVidia by slicing a core in half and putting it under the same label as its big brother. (Radeon VE/7000, Geforce2 MX... see the similarities?)

Since the Voodoo1 I just haven't seen anything REALLY IMPRESSIVE, something really innovative.... different.
But hey, Research and Development is expensive. ;)
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
mingon, no reason to get stupid...

too late.

also, the AIW is a separate line simply because it targets a different market than the original 8500...i don't count nvidia's Personal Cinema cards as part of the same product line as regular gamer GF4 cards...different target market.

'most efficient' presumes most efficient at the moment (or most efficient at time t; t = time of comment)...so what the hell are u talking about? there's no object/universe-independent efficiency meter out there in another universe by which we determine the absolute efficiency of things...is there?

bluemax

thank god!...some sensibility creeps through...
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< thank god!...some sensibility creeps through... >>


You only say that b/c he agrees with your point
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
elfenix, AFAIK, nvidia actually sells memory they get a deal on to their outsourcing partners - like asus or MSI or whatever...but i do realize that memory is a big factor in price determination. still, they're charging a hefty premium on their hand-picked "ultra" GPUs.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< there's no object/universe-independent efficiency meter out there in another universe by which we determine the absolute efficiency of things...is there? >>



Wow, showing your outstanding grasp of all things with that comment. When lacking an 'independent efficiency meter' or whatever you call it, you measure against your nearest competitor - i.e. nvidia vs. ati - which is just simple business sense. Whenever two competing manufacturers are tendering for work they will always outline their strongest points, and at the moment ATI can't compete on efficiency and performance of design. That is what you measure a competitor's card against, so when the new matrox card is shown and it is x times faster than the geforce 4 line, that card will become the benchmark.
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0


<< You only say that b/c he agrees with your point. >>


No - he agrees with mine. ;) I'd just like to see something really..... new.
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
so would i...

which is obvious if you've ever seen the inside of my nearly 3-year-old computer...slot-A athlon, original GF256 (i confess, i fell for the first-ever "GPU"), etc.

nothing's compelled me to upgrade...sure, graphics cards are faster now, and they have programmable T&L, vertex/pixel shaders, etc., but there's a lack of software at the moment (of course, not the GPU chipset manufacturers' fault).
 

Valinos

Banned
Jun 6, 2001
784
0
0
Why aren't you gone yet, nortex? You've only said you're done with this thread two times now.

Let me reiterate...

ATI:

Radeon 7000
Radeon 7500
Radeon 8500 LE 64
Radeon 8500 64
Radeon 8500 LE 128
Radeon 8500 128
Soon to come Radeon 8800 and probably a 8800 LE

Just because these cards are cheaper than the competition now doesn't mean a thing. They're a generation behind nvidia now and have to bring their prices down because their performance doesn't compare.

During the time of the Radeon 8500's release nvidia had one new card. (Actually slightly before, but within the same product cycle)

Geforce 3

6 months later, there were two more choices

Geforce 3 Ti 200
Geforce 3 Ti 500

6 months later there were six more

Geforce 4 MX 420
Geforce 4 MX 440
Geforce 4 MX 460 (which does exist; we have them stocked at work)
Geforce 4 Ti 4200 (which has not yet been released, but should be in the next few days according to reports)
Geforce 4 Ti 4400
Geforce 4 Ti 4600

Now I do agree Nvidia shouldn't have branded their MX line as Geforce 4 unless it was going to have pixel shaders, but there isn't much difference in the number of cards ATI and Nvidia have released in the same time frame, which would be approximately a year.

I don't know what the fuss is about. I see both companies using similar tactics, except that ATI hasn't released their next generation card yet. Remember, ATI also prices their flagship cards at $300-$400.

Just stay calm nortex. ATI isn't going to give you a cookie for being their little zealot. Big players don't like groupies. :D
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
who said i was an ATI zealot?...it's assumed that if you're not an nvidia fan-boy, u must be an ATI one....that's simply wrong.

as for me and this thread, we're done (again).

(one last thing tho, it's nice that u markedly pointed out the memory configurations of the ATI cards but not for the GF4 cards...i won't even bother since nvidia's product line would look like the indefinite expansion of pi)...

....................................................................................................................(....)
 

Valinos

Banned
Jun 6, 2001
784
0
0
Well, what would you rather I call the Radeon 8500 (and LE) 64 MB parts and the 128 MB parts? Just "Radeon 8500" all in one group? Gee, that wouldn't confuse buyers at all.

Radeon 8500 MX and Radeon 8500 Ultra?

What's your point? If it matters so much:

All GF3s had 64 MB.

The Geforce 4 Ti series has 128 MB, except for some individual manufacturers that are making 64 MB versions of the Ti 4200. The MX series all have 64 MB.

Ok, I'm done with you. Go home.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I think the next major innovation from NVIDIA and ATI will come when DX9 is released. Of all the major video card companies, I think NVIDIA has been the biggest innovator. Just my $.02.

 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The real problem at ATI in my eyes is their lack of vision. Let me explain what I mean here.

ATI builds a video chipset with all these cool features and such. And what happens? The only things that use them to their full extent are benchmarks and a few games with old engines (Half-Life). Sure, Serious Sam uses TruForm, but you really can't notice it unless you play the demonstration map where it shows a side-by-side. So in reality ATI's innovations mean squat.

Meanwhile, over at Nvidia, they're developing a new chipset with greater memory bandwidth, more memory on board, better drivers, and the ability to actually run 4x AA above 20 fps (unlike ATI). And guess what? These innovations or developments, whatever you call them, actually become useful to us in our games. Maybe the 2D quality isn't up to ATI's standard, but gamers don't usually play in Photoshop all day, do they?
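To see why running 4x AA above 20 fps is such a feat: with supersampling-style AA (which, as far as I know, is what the 8500's SMOOTHVISION does), the card renders roughly 4x the pixels per frame, so a fill-limited scene loses roughly 4x the frame rate. A back-of-the-envelope sketch, with an invented base frame rate:

```python
# Why 4x AA is brutal on fill-limited scenes: supersampling renders
# roughly aa_factor times the pixels, so throughput drops accordingly.
# The base frame rate here is invented for illustration.

def fps_with_ssaa(base_fps: float, aa_factor: int) -> float:
    """Estimated frame rate under supersampling AA, assuming the card
    is completely fill-rate/bandwidth limited (the worst case)."""
    return base_fps / aa_factor

print(fps_with_ssaa(80.0, 4))  # 20.0 -- right at the edge of playable
```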
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I think you're all missing or forgetting one very large market that ATI used to own but in which it now falls second.

This market is the Macintosh market. That's right...the graphic artist, sound recording, video editing market. Macs now come standard with nvidia's GF4 MX. Sure, not a gamer's card, but you can get a GF4 Ti in one too. And the 2D isn't that bad either...in fact I'd say that, even against a good PC with an ATI card, Nvidia's Mac cards - which are built by Apple in-house, BTW - do 2D better.

This market may not be as large as the PC gamer's market, but it's more demanding...especially when you get crabby customers whose 250,000-copy job just missed the red tone by 3 degrees because of poor 2D color and now has to be rerun, slowing down output on the press for another week and pushing everyone else back.
 

xype

Member
Apr 20, 2002
60
0
0


<< but gamers don't usually play in Photoshop all day, do they? >>



no, but I do. it's a shame that lots of computer artists who _do_ 2D work don't have a clue and are happy with a cheap gf2mx200. oh well. I do more 2D than 3D work, which means I'll most likely go ATI next time I buy. NVidia, however, has navigated too deep into the gamer market, where there isn't as much money to be made as with FireGL-class material. I guess we'll be able to tell in a few years which of the two made the better decisions.

oh and even if ATI doesn't innovate much, it's enough to keep them in the game (hehe, clever wording, eh?). :)
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76


<< NVidia, however, has navigated too deep into the gamer market, where there isn't as much money to be made as with FireGL-class material. I guess we'll be able to tell in a few years which of the two made the better decisions. >>



NVIDIA Named Workstation Graphics Market Leader by Gartner, IDC and JPR

So much for the FireGL. NVIDIA has the desktop and workstation markets. It will probably take a while, but I can see them eventually catching up to ATI in the mobile market as well.
 

xype

Member
Apr 20, 2002
60
0
0


<< So much for the FireGL. NVIDIA has the desktop and workstation markets. It will probably take a while, but I can see them eventually catching up to ATI in the mobile market as well. >>



what workstation market? UNIX, cheap WinPC, Mac? sure, NVidia might sell a bunch of overclocked geforce cards labeled as Quadro, but the Wildcat market is a different league, and I think NVidia can't quite compete there. NVidia makes PC gaming gfx cards, not _real_ workstation stuff that could play in the SGI/Sun league. of course it's probably a fine product for low-level 3D and CAD, but that's not really what I call the workstation market. it has to do with high-end PCs "catching up" with low-end workstations.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
Heh. I'd say the Quadro competes quite well:

Performance of Wildcat 6110 to Quadro 900 XGL.

In some ways the Wildcat is better because it is faster, but the Quadro has newer, game-oriented features that the Wildcat lacks, and that will help propel the Quadro further into the workstation market, since many developers use these cards to develop games. Heck, the Wildcat 6110 isn't even a DirectX 8 card. It's a DirectX 7 card, though it does support OpenGL 1.3.