ATI Was Always The Innovator (Elaborate Edition): 3D Accelerated Competition

VIAN

Diamond Member
Aug 22, 2003
For those of you who asked about the history, here is the full write-up, done in early 2003 right after the 5900 Ultra was released. I changed the format, which took me about 20 minutes to arrange, but it's all there in its original form.


Competition is one of the things that drives man. If competition didn't exist, we would have taken far longer to reach our present state. It's interesting to see what sorts of things are competitive. Many people may not know it, but the 3d accelerator market is one of the most competitive in the PC industry. And the question arises: what makes this market so competitive? Sure, all that 3d accelerators provide is power for video games, but there are many gamers in the world. So many, in fact, that big companies like 3dfx, ATI, and NVIDIA have done everything they could to swing most of the market share in their favor.


To get a better understanding of 3d accelerators and how they communicate with the rest of the computer, I have provided some extra information. The 3d accelerator itself is just a chip. This chip is placed on a Printed Circuit Board (PCB), where it interacts with the resources it needs; the combination is called the graphics card. Graphics cards plug into a slot on the motherboard, which connects the card to a motherboard bus: a set of wires printed on the motherboard PCB that carry data to different parts of the system. There are two types of slots, each connecting to its own bus: the Peripheral Component Interconnect (PCI) slot, which runs on the PCI bus, and the Accelerated Graphics Port (AGP) slot, which runs on the AGP bus. Which bus is the better of the two? AGP. Thomas Pabst of Tom's Hardware Guide says, "AGP enables graphics hardware to do its job faster." The AGP bus is similar to the PCI bus except that it is quicker and has extensions to keep track of data. The main advantage of AGP is that it lets the card fetch textures from main memory much faster than the PCI bus can; textures are kept in main memory to reserve the quicker local memory for higher-priority data. Now that we've covered the hardware side, let's take a look at the software portion of 3d accelerators. AGP - A New Interface For Graphic Accelerators (http://www.tomshardware.com/graphic/19970805/index.html)


Hardware is totally useless without software, so software is a must. Apart from the graphics driver, which allows the computer to communicate with the graphics card, an Application Programming Interface (API) is needed. According to Nvidia, an API is "a standardized programming interface that enables developers to write their applications to a standard and without specific knowledge of hardware implementations. The benefit is that a single application can run on a wide range of hardware platforms instead of needing to be rewritten for each of those hardware platforms. The software driver for the hardware intercepts the API instructions and translates them into specific instructions tailored to specific hardware." There are three APIs of note in the industry. 3D Glossary
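
To make that definition concrete, here is a minimal sketch in C, assuming the classic GLUT toolkit (my addition, not something from the essay's sources). Every call below is a standard OpenGL API call; the vendor's driver, whether from 3dfx, Nvidia, or ATI, translates each one into its own hardware's instructions, so the same program runs on any of their cards.

    #include <GL/glut.h>

    void display(void) {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);           /* standard API calls: the driver  */
        glColor3f(1.0f, 0.0f, 0.0f);     /* maps each one to 3dfx, Nvidia,  */
        glVertex2f(-0.5f, -0.5f);        /* or ATI hardware instructions    */
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();
        glutSwapBuffers();
    }

    int main(int argc, char **argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutCreateWindow("Same code, any 3d accelerator");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }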


The Open Graphics Library (OpenGL) API was developed in 1992 by Silicon Graphics. It is used by developers of all sorts because of its support for a wide range of platforms, its efficiency, and its stability. This API is an open standard, updated periodically by a collection of companies called the Architecture Review Board (ARB). When it needs to expose new hardware features between releases, it relies on its extension mechanism. In my experience, this API proves to be the best in image quality and performance. Direct3D vs. OpenGL: Which API To Use When, Where, And Why
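
As a hedged example of that extension mechanism: a program can ask the driver which extensions it supports before using one. The calls below are standard OpenGL 1.x; GL_ARB_multitexture is just one real extension picked for illustration, and an OpenGL context is assumed to exist already.

    #include <GL/gl.h>
    #include <stdio.h>
    #include <string.h>

    /* Returns nonzero if the driver advertises the named extension.
       (A rigorous check would match whole tokens; strstr is fine for
       a sketch.) */
    static int has_extension(const char *name) {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        return exts != NULL && strstr(exts, name) != NULL;
    }

    void report(void) {
        if (has_extension("GL_ARB_multitexture"))
            printf("driver exposes multitexturing\n");
        else
            printf("feature missing; fall back to multipass rendering\n");
    }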


Another API is called Glide. Glide is no longer used, but I felt it needed to be mentioned. Glide was a proprietary API developed by 3dfx for its Voodoo chipsets. This efficient, easy-to-program API boosted performance on Voodoo cards only and is one of the main reasons why 3dfx succeeded as much as it did. Glide died with 3dfx in 2000. It is regarded, however, as one of the best APIs of its time. Direct3D vs. OpenGL: Which API To Use When, Where, And Why


The third API is Microsoft's DirectX, originally developed by RenderMorphics. It was a poor API when it debuted, but on the seventh try, with DirectX 7, it finally gained praise from developers for working properly. Now, with the recent release of DirectX 9, its power lies in shaders. The major downsides of this API are its lengthy code and its once-a-year update cycle, which is too slow for the graphics industry. Direct3D vs. OpenGL: Which API To Use When, Where, And Why (http://www.gamedev.net/reference/articles/article1775.asp)


Now that you know how graphics cards communicate with the computer, it's time we take a look at three companies that have influenced our lives by competing against each other for the top.


This company should need no introduction. 3dfx Interactive fathered the 3d accelerator industry. 3d accelerators existed prior to 3dfx, but it was 3dfx that turned up the heat and made 3d graphics what they are today by releasing a powerful and affordable 3d accelerator in October of 1996. The only problem was that it functioned only in 3d; the computer needed a separate 2d card for 2d applications. In time, though, it turned out that this wasn't a problem at all, as consumers overlooked it and kept 3dfx atop the industry with its Voodoo chipset. 3dfx made an attempt at combining a 3d accelerator and a 2d accelerator into one with the Voodoo Rush, but it provided mediocre 2d quality and slower 3d performance than the regular Voodoo, and it failed to attract consumers. It was two years before 3dfx released the follow-up, the Voodoo 2, in March of 1998. This card supported higher resolutions at twice the speed of its predecessor and allowed a two-card connection known as Scan Line Interleave (SLI), which provided almost double the performance by having each card render alternate scan lines (a toy sketch of the idea follows this paragraph). The reign of the Voodoo 2 was when 3dfx was at its most popular. The Voodoo Banshee was the second attempt at a combined 2d/3d graphics card, and again it was slower than the regular version, but it sold very well bearing the 3dfx logo. Six months later, in April of 1999, the new Voodoo 3 chipset came out, along with a new logo and TV spots, which 3dfx pioneered. 3dfx also acquired a graphics PCB maker known as STB, which allowed 3dfx to make its own cards, gaining more profit per card but distributing fewer. The Voodoo 3 came in three flavors: a slower budget card, a regular mainstream card, and a quicker high-end card. It was the first Voodoo to use an AGP slot, although it didn't support AGP texturing, which almost defeats the purpose of having an AGP connection. The Voodoo 3 outperformed the competition, but because it lacked AGP texturing, true color, and some other features, consumers started switching to the competition, choosing image quality over performance. History Of 3dfx (http://www.revolutionpc.com/articles/index.php?mode=content&articlenumber=6)
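
The name explains the trick. Here is a toy sketch in C of the scan-line interleave idea; it is my illustration, not 3dfx code, and render_line() is a hypothetical stand-in for one card rasterizing one row of the screen.

    #include <stdio.h>

    static void render_line(int card, int y) {
        /* hypothetical stand-in for one card rasterizing row y */
        printf("card %d draws scan line %d\n", card, y);
    }

    int main(void) {
        const int height = 8;              /* tiny example screen */
        for (int y = 0; y < height; y++)
            render_line(y % 2, y);         /* even rows: card 0, odd rows: card 1 */
        return 0;
    }

With the fill work split roughly in half per card, throughput nearly doubles, which is exactly the "almost double the performance" claim above.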


3dfx's next 3d accelerator would fix many of the Voodoo 3's issues as well as introduce features beyond its time. The Voodoo Scalable Architecture 100 (VSA-100) 3d accelerator features a scalable architecture that allows it to run two or more chips, up to 32, in parallel. The VSA-100 powers the Voodoo 4 4500 with one chip, the Voodoo 5 5500 with two chips, and the Voodoo 5 6000 with four chips, providing almost double the performance and bandwidth each time the number of chips is doubled. The VSA-100 finally gave Voodoo fans true color rendering at high resolutions, but it still lacked AGP texturing and a recent feature known as T&L, explained later. To make up for that, 3dfx introduced a feature known as the T-Buffer, announcing cinematic effects two years before the rest of the industry. These features made the VSA-100 a very attractive 3d accelerator with excellent performance, but due to nine months of delays, the otherwise promising chip debuted with performance that trailed behind cheaper existing cards. Chip shortages also plagued the launch, limiting distribution. 3dfx Voodoo5 5500


The VSA-100 delay had cost 3dfx millions of dollars, and limited distribution hurt profits. With little money left to stay in business and debts worth millions, 3dfx decided to sell its assets to its rival Nvidia for a total of 70 million dollars and 1 million shares. 3dfx shipped a number of its Voodoo 4 4500 and Voodoo 5 5500 cards, and manufactured only a couple hundred of the Voodoo 5 6000, before closing its doors. In the eyes of many, the Voodoo 5 5500 was, at the time of release, the king of video cards, and some still consider it a great card, refusing to upgrade. 3dfx Dissolves, Sells Assets To Nvidia (http://zdnet.com.com/2100-11-526472.html?legacy=zdnn)


When 3dfx closed its doors, it had a prototype of an updated version of the VSA-100, known as Daytona, as well as a prototype of one of three planned next-generation 3d accelerators. The Spectre line featured two Rampage rasterizers and a Sage geometry processor in the high-end solution. This card had the theoretical power to surpass the next year's Geforce 3 from Nvidia. A prototype existed for the low-end Spectre with only one Rampage and a Sage. The Fear line was supposed to feature a Fusion rasterizer, a more advanced version of Rampage, and the Sage2 geometry processor. Finally, the Mojo line featured a rasterizer and geometry processor combination that boasted the innovation of tiled architecture. That is all that is known about 3dfx's future products. 3dfx Tribute: Rampage, Sage, Fear, Mojo...


At this point in the story, Nvidia should also need no introduction, because it is now the 3d accelerator king. Formed in January of 1993 by three industry veterans, Nvidia is one of the most innovative companies and always keeps pushing graphics to the next level. The NV1 was the first card Nvidia released, back in January of 1995. The NV1 wasn't just a graphics card but a multimedia card capable of excellent graphics, both 2d and 3d, plus sound and input/output processing. It used Quadratic Texture Maps, which use curved surfaces to display smoother graphics with fewer calculations than today's polygon standard. The NV1 also stored textures in system RAM, as today's AGP technology does. Although the NV1 was technologically superior to other graphics cards and some sound cards, when Microsoft finalized the DirectX API, choosing polygons as the 3d standard, Nvidia's multimedia card became useless, as developers were unwilling to program for graphics cards that didn't support Microsoft's API. Nvidia seemed doomed, because there was no way to come up with a new graphics card in time to save itself from going under. Its savior came in the form of Sega, who funded research and development of the NV2 for use in a future home console called the Dreamcast. Ultimately, Sega dropped the project; still, if not for Sega's funding, Nvidia might not be here today.

Before Nvidia started work on a brand new PC 3d accelerator, it set guidelines: focus only on cheaper single-chip 2d/3d accelerators, adopt DirectX as its native API, and keep to a six-month product cycle, which would, as Allan Dang likes to say, "provide a safety net that prevented any single mistake from becoming a company-ending disaster." Nvidia's next card, the Riva 128, promised to outperform the Voodoo accelerator, be the first to use AGP (although incompletely), and be the first to integrate a full hardware triangle setup engine. The chip debuted in the fall of 1997, and because it lacked the image quality features of the Voodoo chipset, the Riva never made it big; because of its low cost, however, it became an excellent choice for Original Equipment Manufacturers (OEMs) such as DELL. Six months later, a revamped version of the card was released, called the Riva 128ZX, featuring more speed and more memory. One month later, Nvidia announced the Riva TNT, using TwiN Texel (TNT) technology that produced two pixels per clock, effectively doubling performance. The TNT provided support for many image-enhancing features, namely true color. When it was released, the TNT came in right behind the Voodoo 2 in performance while providing better image quality. Six months later, in March of 1999, the TNT2 was announced, delivering performance near the Voodoo 3 with better image quality; many gamers decided to switch from 3dfx accelerators to Nvidia accelerators at this point. Six months later, in September of 1999, Nvidia announced the first Graphics Processing Unit (GPU) in the form of the Geforce 256. GPU technology integrates Transform and Lighting (T&L) into the 3d accelerator itself, relieving the CPU of those calculations and thereby increasing performance versus regular 3d accelerators. T&L integration, combined with a lot of memory and high speeds, put this graphics card above all others and pointed Nvidia toward the king's table. History Of Nvidia
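
To make "transform" concrete, here is a small sketch of the per-vertex math a T&L unit takes over from the CPU. It is my own illustration of the standard 4x4 matrix transform (column-major, as OpenGL stores it), not Nvidia's code.

    /* Transform one vertex by a 4x4 column-major matrix. Pre-GPU
       cards left this loop to the CPU for every vertex, every
       frame; the Geforce 256 moved it onto the graphics chip. */
    typedef struct { float x, y, z, w; } Vec4;

    Vec4 transform(const float m[16], Vec4 v) {
        Vec4 r;
        r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w;
        r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w;
        r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w;
        r.w = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w;
        return r;
    }

A scene of, say, 100,000 vertices means 100,000 of these multiplies, plus lighting math, every frame; that is the load the Geforce 256 lifted off the CPU.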


In April of 2000, Nvidia announced the successor to the Geforce 256, the Geforce 2 Giga Texel Shader (GTS). The GTS fully supported DirectX 7 and came with more memory and higher speeds than its predecessors. The Nvidia Shader Rasterizer (NSR) was a feature that implemented the first version of a pixel shader, explained later. The GTS's performance was unsurpassed, and it continued to reign for another six months, when Nvidia released speed updates. Nvidia Geforce2 GTS


One year later, in 2001, Nvidia released the Geforce 3, the first programmable GPU, with full DirectX 8 capabilities. The NFiniteFX Engine was capable of vertex shaders and pixel shaders. Vertex shaders manipulate the vertexes of a 3d model, allowing it to change shape; pixel shaders manipulate lighting and surface effects (a small example follows this paragraph). Lightspeed Memory Architecture (LMA) saved memory bandwidth, a big limitation of the Geforce 2. The Geforce 3 had amazing image quality features, but it didn't have the speed to execute them efficiently, although it still performed above the competition. High-Tech And Vertex Juggling - Nvidia's New Geforce3 GPU (http://www6.tomshardware.com/graphic/20010227/index.html)
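
For a taste of what "programmable" meant, here is a hedged sketch: a DirectX 8-era vertex shader in the assembly language of the day, embedded in a C string the way a game would carry it. The syntax is recalled from the DirectX 8 documentation, and the register choices (matrix in c0 through c3, color in v5) are conventions of mine, not requirements.

    /* vs.1.1: transform the incoming position by the matrix held in
       constant registers c0-c3, and pass the vertex color through. */
    const char *vertex_shader_asm =
        "vs.1.1\n"
        "m4x4 oPos, v0, c0\n"   /* output position = position * matrix */
        "mov  oD0, v5\n";       /* output diffuse  = input color       */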


Another year later, in 2002, Nvidia released the Geforce 4 Ti 4600 to quickly replace the Geforce 3. The Geforce 4 had a few hardware optimizations to boost speed: the NFiniteFX II engine with dual vertex shaders, and LMA II, which saved memory bandwidth more efficiently. Other features included a new method of Anti-Aliasing (AA) known as Accuview, which aims for high performance along with high image quality, and NView, which supports multiple displays. This card continued the Nvidia tradition of outperforming the competition. Nvidia GeForce4 - NV17 And NV25 Come To Life


Another year later, in 2003, the repeatedly delayed Geforce FX 5800 Ultra was released, bringing gamers the "dawn of cinematic computing," as Nvidia likes to call it, with full DirectX 9 capabilities. This chip provided almost double the clock speed of previous chips, and an increased number of vertex shaders and pixel shaders for the higher performance and effects provided by the CineFX Engine. The clock speed of the FX chip is so high that a noisy, elaborate cooler named FX Flow had to be designed for it. An optimized version of LMA II became Intellisample technology. For all of these features and speed, the card was only about equal in performance to the competition, which ran at a much lower clock speed. The 5800 was heavily criticized for the noise created by its elaborate cooler, its high price for merely decent performance, and its poor image quality. Because of the bad reception, Nvidia decided to release a revision of the Geforce FX 5800 Ultra three months later, named the Geforce FX 5900 Ultra. GeForceFX - NVIDIA Goes Hollywood (http://www6.tomshardware.com/graphic/20021118/index.html)


The Nvidia Geforce FX 5900 Ultra featured a slightly lower clock speed; a change of memory interface that allowed more memory bandwidth while producing less heat; a more efficient, less noisy cooler; and a new driver release that substantially increased performance and image quality. A new feature was also included, named UltraShadow technology, which allows easier programming of shadows. In the end, the 5900 outperformed the competition by a fair margin and had image quality on par with it, restoring many gamers' faith in Nvidia. Nvidia is still the king of 3d accelerators. Nvidia GeForceFX 5900 Ultra - The Way FX Is Meant To Be Played!! (http://www6.tomshardware.com/graphic/20030512/index.html)


ATI Technologies Inc. was founded in 1985, yet ATI was never a big contender in the 3d accelerator industry until 2000. The reason ATI never made it big was its terrible drivers. Before the year 2000 and the release of the Radeon, ATI created decent hardware, sometimes excelling, but was always dragged backwards by its drivers; hardware is only as good as its drivers. The reason ATI has existed for so long is its relationships with OEMs. ATI Radeon 64MB DDR


In July of 2000, ATI released the Radeon 256. The Radeon featured T&L integration similar to the competition's. The Charisma Engine was capable of Vertex Skinning and Keyframe Interpolination, and Pixel Tapestry was ATI's own NSR; these would provide better performance and effects in games that support them. The Radeon also supported some features of the then-unreleased DirectX 8. The HyperZ feature avoids rendering pixels hidden from the viewer, saving memory bandwidth (a conceptual sketch follows this paragraph). Despite these great features, the card was slower than the competition and didn't perform as well. ATI Radeon 64MB DDR
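
Here is a conceptual sketch of the idea behind HyperZ's bandwidth savings; it is my own simplification, since the real hardware rejects whole blocks of pixels hierarchically before any shading work is done.

    /* Depth-test-first pixel write: when the incoming depth says the
       pixel is hidden, skip the color and depth writes entirely
       (real hardware also skips the texture fetches). */
    void write_pixel(int i, float z, unsigned texel,
                     float *zbuf, unsigned *framebuf) {
        if (z >= zbuf[i])     /* behind what is already drawn      */
            return;           /* early reject: no further traffic  */
        zbuf[i] = z;
        framebuf[i] = texel;
    }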


In October of 2001, ATI released a card quicker than the competition, named the Radeon 8500. The Charisma Engine II introduced Smart Shader and Pixel Tapestry II. Smart Shader gave ATI the power of vertex shaders; Pixel Tapestry II gave ATI the power of DirectX 8.1 pixel shaders. Smoothvision provided AA and Anisotropic Filtering (AF), a feature that restores detail to surfaces viewed at steep angles, at less of a performance penalty while providing a top-quality image. HyperZ II is a more efficient bandwidth-saving feature. Video Immersion II and Hydravision were added for better video quality and multiple-display support. Truform is a very interesting feature, adding more polygons to older games to make the graphics look smoother (a toy sketch follows this paragraph); the technique, unfortunately, was never perfected. This time ATI had the speed and still had the features, but a driver update from the competition provided enough performance to leave the 8500 behind. After that day, ATI started to update its drivers often. ATI's Radeon 8500 - She's Got Potential (http://anandtech.com/video/showdoc.html?i=1544&p=1)
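
Here is a toy sketch of what Truform-style tessellation does. This is my illustration of the general idea, not ATI's N-Patches algorithm, which also curves the new surface using vertex normals: split each triangle into four by inserting edge midpoints, quadrupling the polygon count.

    typedef struct { float x, y, z; } V3;

    static V3 midpoint(V3 a, V3 b) {
        V3 m = { (a.x + b.x) / 2, (a.y + b.y) / 2, (a.z + b.z) / 2 };
        return m;
    }

    /* One input triangle (a, b, c) becomes four output triangles. */
    void subdivide(V3 a, V3 b, V3 c, V3 out[4][3]) {
        V3 ab = midpoint(a, b), bc = midpoint(b, c), ca = midpoint(c, a);
        out[0][0] = a;  out[0][1] = ab; out[0][2] = ca;   /* corner at a  */
        out[1][0] = ab; out[1][1] = b;  out[1][2] = bc;   /* corner at b  */
        out[2][0] = ca; out[2][1] = bc; out[2][2] = c;    /* corner at c  */
        out[3][0] = ab; out[3][1] = bc; out[3][2] = ca;   /* center piece */
    }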


In July of 2002, ATI dented Nvidia's crown, finally emerging from the shadows with its Radeon 9700 Pro. This card has full DirectX 9 support, allowing cinematic effects. HyperZ III added new tricks to save even more bandwidth than previously possible. Smoothvision 2.0 was implemented with a few extra optimizations to deliver a better-quality picture. ATI also introduced the Video Shader, which handles video data more efficiently. The Radeon 9700 Pro finally beat the competition. ATI Takes Over 3D Technology Leadership With Radeon 9700 (http://www6.tomshardware.com/graphic/20020718/index.html)


Six months later, ATI released the Radeon 9800 Pro, featuring minor optimizations to the 9700 that increased performance substantially. It was only beaten by the Nvidia Geforce FX 5900 Ultra. Although ATI still has a lot of bugs to work out in its drivers, they have come a long way and now actually give the competition a real fight. Nvidia GeForceFX 5900 Ultra - The Way FX Is Meant To Be Played!! (http://www6.tomshardware.com/graphic/20030512/index.html)


In conclusion, this essay details how competitive the market is, making it one of the most competitive in the PC industry. The downfall of one great company in so short a time, and product releases just about every six months to keep companies competitive, prove this, while demand for better graphical effects and more powerful 3d accelerators will ensure that the market keeps going.
 

Kongzi

Member
Jul 6, 2003
Christ. Get rid of that double spacing :D. It makes it awfully hard to read. May be good for a printed report... not for this.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Couple of things-

'Vertexes' isn't a word; vertex is singular, vertices is plural. A minor thing that people regularly get confused about, just to let you know.

Keyframe -interpolation- not sure if you had a typo there or not.

The VSA-100 finally gave Voodoo fans true color rendering at high resolutions, but it still lacked AGP texturing and a recent feature known as T&L, explained later. To make up for that, 3dfx introduced a feature known as the T-Buffer, announcing cinematic effects two years before the rest of the industry.

Besides lacking AGP texturing and hardware T&L, it was also missing proper trilinear filtering, anisotropic filtering, EMBM, EMCM and Dot3, all of which actually made games more 'cinematic', while none of 3dfx's T-Buffer effects were ever used (i.e. supported by developers; not FSAA, which was application independent). The T&L issue was a big one for 3dfx, but they were missing a lot more features than that.

In July of 2000, ATI released the Radeon 256. The Radeon featured T&L integration similar to the competition's. The Charisma Engine was capable of Vertex Skinning and Keyframe Interpolination, and Pixel Tapestry was ATI's own NSR; these would provide better performance and effects in games that support them.

Vertex skinning was also part of the GeForce1; ATi increased the number of matrices, they didn't introduce the feature. Pixel Tapestry was DOA, and nV's NSR ended up proving the more desirable solution (see evidence: Doom3, which is based on the GF1's register combiners).

The NV1 also stored textures in system RAM, as today's AGP technology does. Although the NV1 was technologically superior to other graphics cards and some sound cards, when Microsoft finalized the DirectX API, choosing polygons as the 3d standard, Nvidia's multimedia card became useless, as developers were unwilling to program for graphics cards that didn't support Microsoft's API. Nvidia seemed doomed, because there was no way to come up with a new graphics card in time to save itself from going under. Its savior came in the form of Sega, who funded research and development of the NV2 for use in a future home console called the Dreamcast.

Actually, the NV1 was the chip used in the Sega Saturn. nVidia didn't do that badly with that part, selling several million. They did hitch their train on the wrong technological path though.

The GTS fully supported DirectX 7 and came with more memory and higher speeds than its predecessors.

The GeForce1 had the same feature set as the GF2, and they both were available with the same amount of RAM (both came in 32MB and 64MB versions; the GF2 didn't offer a 128MB part).

The 5800 was heavily criticized for the noise created by its elaborate cooler

A minor point, but this is true about the 5800Ultra, not the 5800.

In conclusion, this essay details how competitive the market is, making it one of the most competitive in the PC industry.

You left out the graphics leader; I didn't see a mention of Intel at all. Perhaps comment that you are focusing solely on the enthusiast market and ignoring the actual larger market? For instance, when you were talking about 3d(actually that should be a 'D' given the time)fx being the market leader, ATi actually dominated the market. 3D/dfx was never a market leader in the broader sense; they were never even remotely competitive there.

Sure, all that 3d accelerators provide is power for video games, but there are many gamers in the world.

This is absolutely not true. Besides gaming, you have the visual effects studios, feature CGI houses, and then the more serious applications. Pretty much everything built today uses a CAD/MCAD program in its design phase, and then there is medical equipment, which increasingly relies on 3D power to aid in diagnosis and, in some cases, actual surgery. It is another interesting story how consumer hardware progressed faster than specialized designs, allowing consumer-based products to become major players in the professional realm. In medical uses of 3D equipment, they actually tend to use straight consumer hardware (the GeForce/Radeon lines versus the Quadro/Fire lines their MCAD counterparts tend to rely on).
 

VIAN

Diamond Member
Aug 22, 2003
I'm impressed. Thanks for the extra info, Ben, and sorry about some typos. And yes, it was based on the higher-end video cards.
 

Pete

Diamond Member
Oct 10, 1999
Originally posted by: Schadenfroh
you fanboys have way to much time on your hands

It's a good thing a non-fanboy like yourself has just enough time for snippy remarks to keep those fanboys in line! :p
 

DefRef

Diamond Member
Nov 9, 2000
The paper is still crippled by your flawed thesis, which isn't even proven. You aren't doing an investigative paper; you're writing a tract based on the premise "ATI makes all good things happen." Because, by your own admission, you have an extremely short frame of reference, you're judging from a very short horizon. (Kids your age prolly think Green Day invented punk because you've never heard The Ramones.) Despite your title that ATI was always the innovator, you don't back it up (that I can tell) beyond the last year finally bringing them to the lead. While that's nice for them, it doesn't prove your thesis.

Another problem is doing the histories of each company separately - there's no feel for empires rising and falling. You can read that 3dfx had problems getting products out the door, but it really means something when you can see that Nvidia is raining death from above in the form of killer products, shipped on time. The role of OEMs is missing AFAIK. ATI makes its money selling cheap chips for office PCs, not l337 pimp rig HW.

Please note that my criticisms have nothing to do with whether you favor ATI over Nvidia and everything to do with writing effective and HONEST articles. Dryly copying and pasting factoids into a semi-linear narrative isn't creative; weaving those facts into a meaningful story is. What you seem to consider innovation is mostly bullet points - cheap PR copy that may or may not have practical effect on the end user experience. T-Buffer = Truform = CinemaFX - what do they actually DO?!?

As I mentioned in the other thread, you MUST learn to take a broad overview of the intellectual terrain you want to cover. Sometimes the end destination will end up a bit different than you thought it would be, but it's better to learn from checking your premises than to put on blinders (ATI is teh r0x0r!) and tear to the finish line, ignoring all the contradictory evidence along the way. If I wrote a paper about how Rancid is the greatest band without knowing of the existence of The Clash, how valid would it be?
 

nRollo

Banned
Jan 11, 2002
(Kids your age prolly think Green Day invented punk because you've never heard The Ramones.)

LOL- my 3.5 year old loves the Ramones. He heard "Blitzkrieg Bop" on that phone company commercial, started jamming, so I dug out my old "Ramones Mania" cd from college days.

If I wrote a paper about how Rancid is the greatest band without knowing of the existence of The Clash, how valid would it be?

Aaaaahhh. Flashback. <remembers buying "The Clash" and "Give 'Em Enough Rope" when they first came out>
 

chsh1ca

Golden Member
Feb 17, 2003
Originally posted by: BenSkywalker
Couple of things-

'Vertexes' isn't a word; vertex is singular, vertices is plural. A minor thing that people regularly get confused about, just to let you know.
Both are acceptable. I couldn't track down an Oxford reference, but both Webster's and The American Heritage Dictionary of the English Language consider "vertexes" to be a valid plural of vertex. It's you Americans and your corruption of English at work again! :p j/k Linkage

You left out the graphics leader; I didn't see a mention of Intel at all.
That's because they aren't part of the accelerated graphics business, which, given the discussions in the article, any reasonably intelligent person who was not nitpicking would see was the overall subject of discussion.

VIAN, the one overall important thing you missed is how graphics accelerators are used in other businesses as Ben posted.
 

VIAN

Diamond Member
Aug 22, 2003
Thanks for the feedback. The point I am trying to make is not that ATI is the innovator at all. My essay clearly states that Nvidia is the innovator and ATI has just messed up Nvidia's rhythm. And this is a simplified paper, for all types of people to understand the overall competition between these companies. I was trying to get away from lots of numbers and not explain every little detail like the T-Buffer; I just summarized it as cinematic effects. Although there is a question about what is considered cinematic effects, I didn't think about that at the time, as time was of the essence. And I hate Green Day. This new thread that I have posted has nothing to do with ATI being the innovator; I just put that name to attract those who have read my last topic, "ATI was always the Innovator." The title of this topic should be "3D Accelerated Competition."
 

chsh1ca

Golden Member
Feb 17, 2003
I'd have to disagree that ATI is just following nVidia. They each have their own specialty area. ATI downright kicks the crap out of nVidia when it comes to AA/AF performance since the 9700 Pro (for competing models). ATI is also the only graphics vendor out there with a fully compliant DirectX 9 card. They also have the single best solution for media centre PCs in their All-In-Wonder line, and have for some time. There's a lot more to video cards than game performance.
On the other side, nVidia has provided great Linux support, and was the speed leader prior to the 9700 Pro's introduction.

If you want to look at another (as of yet unmentioned) innovative company, check out Matrox, they have done a lot of work with developing multimonitor solutions that are capable of handling 3d games.