ATI was always the innovator

VIAN

Diamond Member
Aug 22, 2003
This is a short paper that I wrote for English class. It seems to me that Nvidia never really did anything except follow the DX specifications, while ATI always added something else to the formula. I think that once ATI got its drivers together, Nvidia was never a match. Nvidia was just a rebound from 3dfx to ATI.
What do you think?

It is purposely simplified so that a wide variety of people can comprehend it and not just be baffled by numbers.


The year 2000 was the start of heated competition between these three companies. Nvidia was the first to release its new chip onto the world, on April 26th, called the Geforce2 GTS (Giga Texel Shader). This chip supported DirectX7 effects and offered twice the power of its predecessor, debuting at $349. (Shimpi, Anand Lal) 3dfx answered that call on July 11th with its VSA-100 (Voodoo Scalable Architecture). This chip, although a lot slower than Nvidia's, was meant to be placed alongside another of its kind to double the power. Two of these chips (Voodoo 5 5500) brought DirectX7 effects and their own exclusive cinematic effects to the market two years earlier than everyone else. This chip debuted at $299. (Andrawes, Mike) Six days later ATI jumped in with its Rage6C (Radeon) chip. This chip was impressive, offering ingenuity in architecture and bringing DirectX7 effects as well as some of its own exclusive effects and some of the yet-to-be-released DirectX8 programmable effects. It debuted at $279. (Witheiler, Matthew) Between these chips the performance leader was the Geforce2 GTS, with the dual VSA-100 behind it, slightly beating the Rage6C chip. The VSA-100 was the last chip 3dfx would produce. A year of delays cost 3dfx millions, its chip did not sell as well as expected, and thus 3dfx was no more.

The year 2001 continued without 3dfx. February 27th marked the debut of the Geforce3 from Nvidia. This chip was not that much more powerful than its predecessor, but offered DirectX8 programmable effects at the price of $499. (Shimpi, Anand Lal) October 17th would be ATI's counterattack with the R200 (Radeon 8500) chip. This chip was decently more powerful than its predecessor and possessed DirectX8 programmable effects, plus some of its own effects and some of the unreleased DirectX8.1 programmable effects. This great chip debuted at $299. (Shimpi, Anand Lal) Overall, because of bad drivers, the R200 trailed behind the Geforce3.

The year 2002 had a big twist to it. On February 6th, Nvidia released its Geforce4 Ti 4600 chip. This chip offered a much bigger jump in power over its predecessor than the previous chip had over its own. It packed full DirectX8.1 programmable effects, debuting at $299. (Shimpi, Anand Lal) ATI promised to release a chip that was insanely powerful on August 19th. The R300 (Radeon 9700 Pro) chip was indeed insanely powerful and featured unreleased DirectX9 cinematic effects plus some of its own personal effects. This chip debuted at $399. (Shimpi, Anand Lal) The R300 won, outperforming the Geforce4 by a factor of two. This caused a major reaction from enthusiasts and started one of the most controversial topics since 3dfx was alive.

In 2003, Nvidia had a chance to counter and see if it could steal back the crown. On January 27th, the GeforceFX 5800 Ultra chip was unleashed, featuring twice the power of its predecessors, DirectX9 cinematic effects, and some extensions of its own, debuting at $399. (Shimpi, Anand Lal) Even with such power, the GeforceFX 5800 Ultra could only perform about as well, sometimes falling behind, and presented awful image quality due to bad drivers. These results, plus a really noisy fan that took up a lot of space, made consumers jump onto the ATI bandwagon. (Shimpi, Anand Lal) ATI quickly countered on March 6th with the release of the R350 (Radeon 9800 Pro) chip. It was a small update of personal effects and power for $399. (Shimpi, Anand Lal) On May 12th, Nvidia countered again with the GeforceFX 5900 Ultra. This chip was updated with some personal effects, trade-offs in power for more balance, better drivers, and a less noisy heatsink. The package was worth $499. (Shimpi, Anand Lal) The outcome was a decent performance lead over the R350, with image quality slightly under the R350's. It was also discovered that Nvidia was cheating with its drivers to get better performance. After many complaints, Nvidia changed them, and the decent performance difference turned into a slight one. Most enthusiasts turned to ATI after that.


I also have a more elaborate version, about 12 pages long but still simple, that tells more about the history of each company.
 

Pete

Diamond Member
Oct 10, 1999
The wacky formatting makes your post hard to read--maybe you can fix that?

As for nV not innovating, don't forget nV debuted with a quad-based (rather than triangle-based) card that bombed. :) They scrapped that and their next, more conservative (triangle-based) release started them on their steady road of success, leading to their current speedbump of a card (the FX in its current Clark Kent disguise ;)). I'm sure they'll regroup again.

I think it's less that nV followed DX specs and more that DX specs tended to follow nV's superior architecture.

Apparently the Radeon and 8500 were forward-looking for their time, albeit not the fastest performers. The tables seem to have turned, with some twists--now the FX is the more forward-looking part, but it's slower than the more incremental 9700 (which is based heavily on the 8500's PS1.4 architecture, according to people who appear to know more than me over at OpenGL.org).

Just throwing in my understanding of the recent past, perhaps in slight contrast to your view that nV was a follower, not an innovator. In fact, innovation comes not only from features, but from effective (read: fast) implementations of those features, something nV succeeded greatly at for quite a while (from roughly the TNT/2 to the GF4Ti).

Accelenation also has a 7 Years of Graphics roundup, if you're interested in corroborating evidence.
 

modedepe

Diamond Member
May 11, 2003
To say ati has been an innovator while nvidia has done nothing but follow is completely false. You could turn the whole thing back around and say ati hasn't been innovative. You could say that the r300 was completely uninnovative, that all they did was add some pipelines, up the clock speed, give it more bandwidth, and throw in dx9. But the truth is both companies have been innovative. To get cards to perform the way they do, these companies have done many things behind the scenes that most of us do not realize.

Another thing--drivers count as innovation. What good is a blazing fast card if it isn't compatible with anything? True that ati's drivers are now good, but that was not always the case. Not all that long ago nvidia's drivers were completely dominating. Their drivers were both compatible and also able to nearly double the performance of some of their cards.
 

VIAN

Diamond Member
Aug 22, 2003
Sorry about the format. It's all just so strange, though, with the FX line. How could such a company go bad and cheat? It just boggles my mind. And then there's that thing about Nvidia's main focus not being on graphics cards - that they're becoming a multimedia company. Did they just give up? And also the unstable NF3 IDE drivers. It seems like the company has just been lucky in the past. Maybe some of us are too attached. We've given them chances to change their ways and release superior products, and still nothing. All we can hope to do is wait for the next 3D GPU.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Two of these chips (Voodoo 5 5500) brought DirectX7 effects

I'm trying to recall if the Voodoo5 offered any of the new features DX7 brought to the table, anyone remember?

As far as the real heated competition, it was never really there between the three companies at the same time in the same market segment. 3dfx was an IP purchase by the time ATi made any headway in the enthusiast market. I'd say the start of the fall of 3dfx from the top rung of the enthusiast market was the TNT, with the GeForce1 pretty much finishing them off (due to how much earlier it hit than the VSA-100 parts and the troubles that 3dfx was having at the time). Also, the introduction of what would evolve into the programmable shader didn't start with the Radeon; it started with the GeForce1's register combiners (Doom3's baseline is built around that level of functionality).

The year 2001 continued without 3dfx. February 27th marked the debut of the Geforce3 from Nvidia. This chip was not that much more powerful than its predecessor, but offered DirectX8 programmable effects at the price of $499.

This was sort of true, and I assume that you mentioned it in your expanded version, but the GeForce2 Pro and GeForce2 Ultra were both considerably faster than the GF2 GTS. Also, the GF3 ended up significantly faster than the GF2 Ultra in many things after a few driver revisions.

It was also discovered that Nvidia was cheating with its drivers to get better performance.

I assume in the expanded paper you bring up when ATi was discovered cheating with the Radeon? Another time, which I know predates your paper, they also got nailed with what I still consider the most blatant cheat ever seen in the industry, back in the days of the RagePro (their first 3D chip): running ZD's bench, they actually instructed their drivers to skip entire frames (they detected how the application rendered frames and then just flat out skipped a great deal of them, around 25% IIRC).
 

TerryMathews

Lifer
Oct 9, 1999
Your paper's interesting, but here in reality, ATi has never been a leader in graphics performance UNTIL the 9700 Pro.

Yes yes, I know that arguably the 8500 is faster than the GF4 series, but it wasn't at the time.

ATi an innovator? Please. Without degenerating this into an ATi/nVidia flame-fest, I'm going to choose my words carefully and say that nVidia has stumbled a bit on its path, but one or two mistakes does not a failed company make. ATi, OTOH, has had few good steps, all recently, on an otherwise very poorly executed path.

I will grant you that ATi was very innovative when it came to multimedia. The All-In-Wonder series was WAAAY ahead of its time. That being said, it's plagued by the same driver support problems (or lack thereof) as all other ATi cards. Present generation excluded.

Some people say that ATi has turned over a new leaf. I hope so, since I just bought a 9700 Pro. But every time I think back to getting screwed over and over again with the RageIIc, Rage Pro, Rage 128, etc... I shudder.
 

Wedge1

Senior member
Mar 22, 2003
I think this is an interesting thread. An honest discussion of what has taken place over the past 3 - 4 years with the leading graphics card makers. It's a nice departure from the ATI vs Nvidia war-like threads.

I remember all of these cards debuting, and how I was in shock at their prices. $300 and $400... man, and I wanted them all. And now I have one that outperforms just about all of them, and it didn't cost a penny over $249.

I like them both, fellas. Let the competition continue, because as long as it does, we will win.

Honestly, don't you ever feel a little spoiled when you are sitting down playing your favorite game, knowing that there has been no other time in history when man was able to have such a toy? I certainly do. It has to do with my dad's story, too. He was dirt-poor when he grew up, so even if such technology had existed in his youth he would not have known it. I bet many of your dads can relate to what I am saying. Oh well, I'm getting a bit off topic, but the point I'm trying to make is that we live in a great time. It's good to step back and assess all of this so that you don't take it for granted, because that is so easy to do.
 

DefRef

Diamond Member
Nov 9, 2000
The article is fatally flawed due to the fact it's premised on a mistaken idea: That ATI has innovated more than Nvidia.

To cover this prejudice, unless there's an earlier part you left out, you start too late in the game for your points to be valid. You really need to go back to Fall '98, when Nvidia dropped the first TNT chip and 3dfx dropped a load in their shorts. You may remember that 3dfx's reaction was to mock the TNT's 32-bit color and AGP interface. "No games use 32-bit color, and the AGP bus isn't as good as local memory," they said, and stuck with 16-bit-only cards until the V4/V5.

Your next omission was the original GeForce, which brought the term T&L - Transform & Lighting - to the table. Once again, Nvidia was accused of rushing out cards on an arbitrary cycle out of greed when there were no existing games that could take advantage of what it was capable of.

What some caught on to, but others seem to have forgotten, is the fact that game development and hardware development boil down to a bit of a chicken-or-the-egg situation in which developers won't develop games if nothing can run them and cards may come out before anything can truly tax them. 3dfx once again mocked Nvidia and laughed themselves out of business.

Nvidia knew that if they didn't get mass quantities of their cards into circulation, developers wouldn't make the games that would inspire people to buy more Nvidia cards to play. See how that works?

As for the paper itself, it's a dry recitation of facts, figures, DX versions and MSRPs. Numbers are deadly in essays (why do card reviews use charts to show results? Exactly!) and you've got a lot of numbers there. You started with an incorrect thesis and poorly explained an arcane topic in a stultifying manner.

Hope you don't think I'm being too harsh, but an honest teacher would tell you the same, except for maybe the thesis being wrong to start. (Like he'd know.)

Good luck. Good writing takes practice, but even more importantly, it requires THOUGHT and an ordered method of expressing thoughts to communicate your points. Communication only occurs when the message received is the same as what was sent. If I describe a 4-legged animal with brown fur and white spots and you think "dog" when I was describing a cow, we aren't communicating. I may be EXPRESSING myself, but we aren't communicating.
 

VIAN

Diamond Member
Aug 22, 2003
I cut out the introduction and the conclusion. The thesis: the graphics chip market is one of the most competitive in the PC industry. And I chose to start in the year 2000 because it was hard for me to find ATI cards from before 2000 that were actually graphics cards and not just regular video solutions. I took the easy way out; who at school would know? And the school is DeVry University - a crappy, expensive school, but it accepts these types of essays.

I know the Geforce3 was clocked similarly to the Geforce2 Pro (I think it was that one) but does more work per cycle. I chose not to mention anything other than released products.

And I can't back up the claim that the Voodoo5 had DirectX7; I just judged by the timing.

I do agree that this essay, and even the expanded one, has a lot of flaws. I'm only 19 and my first computer was given to me in late 2000. Although I'm a newcomer to the group, I've learned a lot. I remember when I was in the market to upgrade my video card from the TNT2 M64 and stumbled upon a Voodoo 5 for $250. I didn't know anything about it, but it looked pretty, and judging from what some of my friends told me it was a great card. I didn't have the money to buy it, though. But most of the flaws are there because of my lack of experience with the past. Before my first computer, I knew stuff like Windows and double-clicking on a folder to open it. That's it. Now I can't list everything that I know, but I'd be hard pressed to find anyone I know who knows as much as me. I have even surpassed those friends with their recommendations. So I know it's not the best, but it's something.

And I also agree, now, that Nvidia really pushed to get the graphics out there. Thanx
 

Sahakiel

Golden Member
Oct 19, 2001
I think the Voodoo5 was a DX5 part with some DX7 features. From what I remember, 3dfx had a habit of using model numbers to denote DirectX generation.

A video solution is still a video card; just because the RageIIc seriously sucked ass (I had one) doesn't mean it should be declassified as a video card.

I agree that the paper should start earlier; at the very least include 3dfx at its height in popularity. I see no mention of Glide or OpenGL for that matter.

Sudhian had a pretty good multi-part overview of the last five years or so.
 

SectorZero

Member
Sep 29, 2002
It's an interesting paper, Vian. ATI has indeed made some significant contributions over the years, but I think you're selling Nvidia too short.

Nvidia has been a driving force in the graphics industry since the TnT. Circa late '97 or early '98? With their incredibly aggressive six-month product cycle and outstanding driver team, Nvidia consistently released better and better products, with near-perfect execution.

One of Nvidia's most significant advances was to not have a proprietary API. This was unheard of at the time. Everybody had their own API in those days: 3Dfx, ATI, Rendition, even Matrox. They were all fighting it out for dominance, hoping to capture the market so that anyone who wanted to play games on a PC would have to buy their cards. Of course 3Dfx looked unbeatable, and we were headed for a Glide-based gaming industry. Nvidia's support for the open standards DirectX and OpenGL, and id releasing Quake2 with full OpenGL support, completely leveled the playing field.

I think this is one of Nvidia's most significant achievements because it freed the consumer. Most of us "old timers" will always feel slightly reverent towards 3Dfx for starting the whole revolution. But looking back, it's clear now who had a better vision of where the industry should be headed.

One year ago ATI released the 9700 Pro, and it completely dominated everything on the market, which is exactly what the GeForce 256 did when it was released.

Whatever the graphics industry may look like today, and however superior most of us feel ATI's technology is, it could completely change a year from now.

I really like how you've researched your paper, but I think you should go back at least two more years. In those days gaming on ATI was a joke. While that is certainly not the case today, I think it would more clearly show the cyclical nature of this incredibly competitive industry.

Nvidia has been getting a lot of bad press lately, most of it deservedly so, but whatever you think of their current technology or business practices, you cannot deny their past contributions, and the contributions they will most likely make in the future.