Apple and nVidia vs. ATI

nortexoid

Diamond Member
May 1, 2000
Given Apple's previously long and strong relationship with ATI, why did they move over to nVidia and put GeForce4 MX based cards in their top-end machines?

The GF4 MX cards are clearly inferior to the Radeon 8500, and the 8500 LE surely doesn't cost much more than the higher-end GF4 MX models. Not only that, does price really even matter a whole lot when it comes to Apple?

I could see them adopting the GF4 Ti series, simply because those ARE faster than the 8500s, albeit not by a whole lot... but the GF4 MX? Come on, Apple can do better than that.

(They could've at least gone with an S3 Trio+ or something.)
 

bluemax

Diamond Member
Apr 28, 2000
As I recall, Apple (or was it just Jobs?) was steamed at ATI for spilling the beans about their new Apple computer before the big Apple release "surprise party". They threatened huge lawsuits and pulled almost all ATI support on the spot.

All ATI had done was run advertising saying its products would be used in the new Apple computers.

I think Wozniak and Jobs are/were just a little weird..... ;)
 

AGodspeed

Diamond Member
Jul 26, 2001
It's because of the compact nature of the iMac.

The Radeon 8500 is physically larger and draws more power than the GeForce4 MX series, which doesn't suit the iMac's small, tightly packed enclosure.

In addition, the GeForce4 MX series is quite a bit cheaper than the Radeon 8500, by more than $50 on average (per Pricewatch). And of course, nVidia very likely gives Apple special pricing when it buys GeForce4 MXs in large quantities.

I would hope that Apple uses the Radeon 8500 in its G4 towers, though. Not giving the consumer that option would be utterly ridiculous.
 

BD231

Lifer
Feb 26, 2001
<< I would hope that Apple uses the Radeon 8500 in its G4 towers, though. Not giving the consumer that option would be utterly ridiculous. >>

10-4; it would be a bad call IMHO.
 

kgraeme

Diamond Member
Sep 5, 2000
Simple answer: driver development and a lack of commitment on ATI's part.

ATI made platform-specific boards for the Mac because of endian issues, and supporting those boards became harder and harder. So when the Rage 128 Pro or Radeon came out, it came out for both Mac and Windows, but you couldn't just buy a Windows card, put it in a Mac, and use the Mac drivers. That was especially frustrating since the Windows card was generally half the price of the Mac card. The performance of the Mac drivers also generally lagged behind that of the Windows counterparts. And recall that this was back in the day of ATI writing crappy drivers for Windows, so just imagine how bad they were on the Mac.
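
To give a flavor of the endian problem: PowerPC Macs are big-endian and x86 PCs are little-endian, so any multi-byte value the driver shares with the card (pixel data, registers, command buffers) may need its bytes reversed on one platform or the other. Here's a toy byte-swap in C, purely illustrative and not anybody's actual driver code:

    #include <stdint.h>

    /* Reverse the byte order of a 32-bit word, e.g. so a value laid out
       for a little-endian x86 host reads correctly on a big-endian
       PowerPC Mac. */
    static uint32_t swap32(uint32_t x)
    {
        return (x >> 24)
             | ((x >> 8) & 0x0000FF00u)
             | ((x << 8) & 0x00FF0000u)
             | (x << 24);
    }

    int main(void)
    {
        uint32_t pixel = 0x00FF8040u;                 /* an xRGB word on one platform */
        return swap32(pixel) == 0x4080FF00u ? 0 : 1;  /* same bytes, reversed order */
    }

Every structure shared with the hardware needs that kind of treatment, which is a big part of why the boards and drivers couldn't simply be carried across platforms.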

While the Radeon was being developed, the GeForce was becoming the card of choice for gamers on Windows. This was also when Jobs returned to Apple and was trying to reinvent the platform. One of his goals was to push gaming on the Mac. Some of you may recall that Quake 3 was first demoed on a Mac running a Radeon card. NVIDIA saw the Mac platform as an opportunity and began internal development.

In short succession, the Radeon was included in the G4 tower. The Cube was developed and was going to include the Radeon, but the Cube was still a top-secret project. A day (or was it two?) before the unveiling of the Cube, ATI announced that the Radeon would be in it. That announcement stole Jobs's surprise, and as punishment he stripped the Radeon out and released the Cube with only the Rage 128 Pro. The customer was the victim. At the same conference, NVIDIA announced that starting with the GeForce2 MX chip, every new architecture would have Mac support built in.

The GeForce2 MX came to the Mac about six months later as Apple's "high end" graphics solution. Strange, since as an "MX" part it still couldn't compete with the Windows gaming market. When the GeForce3 came out, NVIDIA stood by its promise to support the Mac, and the card actually debuted on the Mac before it was seen on the PC. The game used to demo it was Doom 3, itself shown for the first time.

That's it in a nutshell. ATI sucked. NVIDIA stepped up. Of course, not much has really changed on the Mac. Gaming hasn't surpassed Windows. Drivers still aren't updated. The coolest cards still aren't available at retail, etc.
 

Eug

Lifer
Mar 11, 2000


<< I could see them adopting the GF4 Ti series, simply because those ARE faster than the 8500s, albeit not by a whole lot... but the GF4 MX? Come on, Apple can do better than that >>


You can get a GeForce4 Titanium if you're willing to spend the extra dough. Otherwise you can get a Radeon 7500 at the low end.
 

Leon

Platinum Member
Nov 14, 1999


<< The GF4 MX cards are clearly inferior to the Radeon 8500, and the 8500 LE surely doesn't cost much more than the higher-end GF4 MX models. >>

Actually, the 8500 LE costs much, much more than the GF4 MX: much more to design (almost twice the transistor count) and much more to produce.

You are judging by retail prices ($149 for the VT MX440). It doesn't work that way in the OEM channel.

In this case, nVidia sells thousands of MX440 chips at $15 each and still gets a healthy 30% margin on them. Apple then outsources production of the card itself to various manufacturers and handles driver development.
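
To put numbers on that (same figures as above, with gross margin defined the usual way, as profit over selling price):

    cost = price x (1 - margin) = $15 x (1 - 0.30) = $10.50 per chip

So nVidia clears roughly $4.50 on each $15 chip; the $149 retail card tells you nothing about what Apple actually pays.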

 

nortexoid

Diamond Member
May 1, 2000


<< Apple then outsources production of the card itself to various manufacturers and handles driver development. >>

Uh, Apple doesn't outsource at all; nVidia does, and it's to third-party board manufacturers. Also, I believe Apple does the driver development, but it's merely a reference driver to that effect...

Anyway, I still think that's one ridiculous reason for dropping the Radeon line: ATI being quick to jump the gun and gloat that its chipset was being used in the Cube...

At any rate, the 8500 should still be used in the tower at least. I've seen some EXTREMELY small Radeon 7500 cards, which I think are surely better than the low-end GF4 MX series cards, and which would EASILY fit in the eMacs and iMacs.

I don't know; I just wish I knew what was behind nVidia's sweet-talking Apple into adopting its chipsets over ATI's.