Nvidia Launches The G80 Graphics Chip; An Interview With Jen-Hsun Huang

Gstanfor

Banned
Oct 19, 1999

Here is a quote apoppin and others are sure to find interesting:
Jen-Hsun: We don't need to talk about Fusion in order to get excited. We absolutely have more than we can do right now with GPUs and will stay focused there. We are just not working on CPUs. There are a lot of things I am working on. Without any exception, I'm not working on CPUs.

Wednesday, November 08, 2006

Nvidia Launches The G80 Graphics Chip; An Interview With Jen-Hsun Huang

Dean Takahashi, 11:01 AM in Dean Takahashi, Gaming

I took some time to interview Nvidia CEO Jen-Hsun Huang about the company's latest graphics chip, a "general purpose GPU" that looks and smells suspiciously like a CPU. But it's really not an attempt to bump Intel aside. It's Nvidia's way to find new markets for the latent potential of the processing power of its PC graphics chips. Read on to hear more.

What is this non-graphics-oriented processing that can be done on the G80, and what is the notion that you can assume more responsibilities from the CPU? Can you explain that?

Jen-Hsun: Andy Keane is the general manager of our computing business unit. His job is to work with the architects to create products and take them to market to serve the need for high-performance computing or general purpose applications. You've seen the part of the presentation that explains that the G80 is the world's first DX10 unified architecture. We used to have geometry processors that were in the CPU and vertex shaders that were in the front end of the chip, etc. We've taken all that functionality and created one unified processor architecture. It's a scalar architecture with a lot of floating point performance and energy performance: 128 processors running in a chip. Altogether, 3 Teraflops or so. All of these different processors can support a whole bunch of threads themselves and run concurrently. There are thousands of threads that run inside the chip concurrently. We created a programming model for the chip, supported by a compiler, and the model is a language extension of C. The architecture for the interface is called "CUDA." It's a general purpose programmable processor that is optimized for very heavyweight, large amounts of data and data-parallel applications. So those are the high-level bits.
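To make that programming model concrete, here is a minimal, hypothetical CUDA sketch; the kernel name, sizes, and launch parameters are illustrative assumptions, not anything from the interview or the G80 launch materials. The per-thread code is ordinary C with a __global__ qualifier; the hardware runs one lightweight thread per array element, thousands of them concurrently.

#include <cuda_runtime.h>

// A kernel is plain C plus a __global__ qualifier. Each thread computes
// one element; the grid launch below spawns thousands of such threads.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                  // a million floats
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));      // memory on the GPU
    cudaMalloc(&y, n * sizeof(float));
    // ... fill x and y via cudaMemcpy (omitted) ...
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();                // wait for the GPU to finish
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The point of the model is that the per-thread code stays ordinary C; all the parallelism comes from launching it across thousands of concurrent threads.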

How do you answer whether or not you can also do non-graphics related processing?

Jen-Hsun: That's right. We are really good at applications and programs where the instruction complexity is not extremely high, but the data is enormous: gigabytes of data and a moderate amount of instruction complexity. Whereas the CPU deals with infinite amounts of instruction complexity. Almost any lousy programmer who can write some C code that compiles can pretty much run that program. On a GPU, or the G80 with compute, you'd still have to be moderately thoughtful about how you structure/factor the program. If you structure/factor it right, the amount of throughput you get out of it is extraordinary. At that point you can utilize the 3 Teraflops of floating point performance inside the chip.
Andy: If you look at what applications naturally divide between the GPU and the CPU: Jen-Hsun talked about varying workloads, irregular data, the structures that exist in operating systems and applications. That naturally migrates to the CPU, because that's what it was built to do. There is another set of problems where the data is large and relatively parallel. That's where things will migrate to the GPU. An extreme example of that is in a game. You have, in a game, AI, input, a lot of these inherently disconnected or random functions, and then the graphics processing, now extending into geometry and physics, migrating to the GPU because it fits the architecture well. Take it to the next step, and find elements of code, or anything that can be decomposed into regular data: this will find a home on the GPU. The architecture now maps to the problem.

When you are running a game on the G80, what do you need the CPU for? Is it less necessary than it was in the last generation?

Jen-Hsun: Creation and destruction of geometry was never done before on a GPU. With DirectX 10 and the GeForce 8800, we can now create and destroy geometry on the graphics processor. The geometry shader opens a whole new class of applications that can move from the CPU to the GPU. That has a number of applications, from hair rendering to shadow rendering, etc. A number of key graphics algorithms have major components that deal with geometry, and as a result a number of games have bottlenecks because of this. You still need the CPU for things like AI and database management; it is still an important component. But now the GPU is no longer bottlenecked by the CPU in terms of physics effects, geometry, and shading.

Jen-Hsun: Now it's boiled down to two things: control and processing. Processing will go on the GPU. Control will remain on the CPU. The more things you can control, the better; there is a huge amount to control. Games are richer, faster, more alive.

What's the state of the tug-of-war between the CPU and the GPU, given AMD's acquisition? Reports on the Inquirer suggest that Intel is doing a stand-alone graphics chip. There are reports of Nvidia using the former Stexar team to create something more CPU-oriented with GPUs. Given all of these things, what is the state of the tug-of-war?

Jen-Hsun: We don't need to talk about Fusion in order to get excited. We absolutely have more than we can do right now with GPUs and will stay focused there. We are just not working on CPUs. There are a lot of things I am working on. Without any exception, I'm not working on CPUs.

But the definition of what is what is now changing. I guess that's the reason for the fog.

Jen-Hsun: The only thing that changed is the AMD acquisition of ATI. The GPU is about to go through a revolutionary time. More than ever before, the GPU will be perceived as a general purpose processing unit to help solve vastly complex data problems. We happen to think there are all kinds of applications, from image processing to computer vision to high-performance computing, to stock option price calculations, to physics simulation, to geometry processing. So many interesting problems that the industry hasn't tackled. We've finally created a revolutionary architecture to tackle that in a thoughtful and elegant way. And in a way that is consistent with the programming model that they understand, which is C.

There are some things moving around with the PC platform, with the Intel Geneseo project and AMD's Torrenza platform as well. How do you build something that will fit in with one of these schemes?

Jen-Hsun: The two architectures are very different at the lower levels. But at the conceptual level they are very similar. They make the GPU share, coherently and for the first time, the memory system of the computer. We love the fact that finally the GPU memory system is coherent with the CPU's. Now that the GPU is a co-processor to the CPU, it's a fabulous advancement in where people see the importance of GPU computation. It's a recognition that the GPU is becoming a more important part of general purpose computing. Whether it's HyperTransport-based or PCI Express-based, we are neutral.

Is the GPU average selling price going to go up relative to the CPU?

Jen-Hsun: I don't know. The market will have to decide that. One observation I've made: the GPU has been the only component to increase consistently over 15 years. The GPU had an ASP of $17 when NVIDIA first started. The lowest price was $5. In the course of the last 10 years, the ASP of the GPU has gone from the low teens to the low 30s. I don't know of any other component in the last 10 years that has gone up by a factor of 3. Or up at all.

I noticed when I did a check with a few folks that the CPU ASP is still higher than the GPU ASP.

Jen-Hsun: It sure is. The CPU ASP, when you weigh in the server price, is in the thousands. And there are 6 million servers built each year. That shifts the ASP of CPUs. I'm not sure what the ASP is excluding servers; I think it's below $100. But nonetheless, certainly still higher.

Is this an important thing to watch? To see who is winning?

Jen-Hsun: Not really. The important thing is that we continue to add value to GPUs, and hopefully the value will go up. It's not important to me who wins. NVIDIA has always been a GPU company. We really believe our purpose in life is to transform the computing experience and solve the most complex visual computing problems. That is our purpose. It does not exclude AMD or Intel. We support both microprocessors. CPUs are a good thing. It also helps us achieve our purpose. Intel and AMD have a war going and we are not part of that war. We are friends with them.

Windows Vista will drive more adoption of GPUs from your point of view?

Jen-Hsun: If you look at OS X as a surrogate for the future of Vista, people will realize that OS X laid down the user interface. Things like the windowing system and icons went to 3D, had drop shadows, etc. PowerPoints, spreadsheets, iTunes will go to 3D. Everything goes to 3D once you have the basic foundation of 3D. The important thing about Vista is that the foundation is in 3D. All application developers, from Adobe to iTunes, will take their various apps to 3D without much concern.

DirectX 10 makes GPGPU possible. What does Microsoft think?

Jen-Hsun: Microsoft sees (and I'm only interpreting from the movements they make) that the next wave of apps is heavily consumer-oriented, heavily weighted toward the experience of the consumer. They believe that the intersection of image, graphics, and database processing could enable a new class of applications. One of my favorites is called Photosynth; do a search. It is an extraordinary application. It is taking photography and 3D graphics and merging them together.

Is this what they talked about last week?

Jen-Hsun: Possibly; I wasn't there. Search for Photosynth; we are working with that group. Dragonfly is the code name for the project. It's really incredible. Definitely next-level. You can't do that without intense image processing to correlate control points and photos and navigate it in 3D.

I guess at 3 Teraflops, you are at least 50 percent faster than the PlayStation 3?

Jen-Hsun: It's basically bigger. Cell has been around 2-3 years. It's later in time, so I'm not surprised it's more.

So that means the PC has come back to the forefront in graphics?

Jen-Hsun: Yeah, I don't think anyone denies that. It's expected that the PC evolves and advances on six-month cycles, and consoles advance on a five-year cycle. In the very beginning, the game console is faster and more powerful than the average install base of 150 million PCs, which is quite an achievement. Over time, the PC continues to evolve, and you sell over 250 million; over the next few years, the PC becomes more powerful than the game console again. Programmers extract every last drop out of a game console. Writing without software overhead, rather than using high-level tools to write programs, gives consoles a factor of 2 advantage. Even so, the PC has the advantage over the game console.

So with the unified shaders: ATI went with that architecture with the Xbox 360. Do you have a different take on it?

Jen-Hsun: They went to a unified shader architecture that unified vertex and pixel shaders. Ours is a unified processor architecture: we unified three types of shaders: geometry, vertex, and pixel. In addition, our processors are C-programmable processors. It's a radically different thing. The G71 had shaders that were programmable, but they were not programmable in C. The G80 is unified and programmable through C.

What would that mean on a strategic level?

Jen-Hsun: The fundamental revolution of the G80 is that we are making the programmable element, the processor, accessible for general purpose computing. GPGPU meant you could still program triangles and texture maps in a clever way, and the numbers you read out of the frame buffer were your answers. To teach someone to do that is weird. Only a few researchers around the world can do that. With the G80, we've given it a programmable architecture of the kind people can understand.

It sounds a lot like the GPU is contending with the CPU. It's taking on more of the CPU's functions. Is there a semantic problem here?

Jen-Hsun: We are not taking on the work of the CPU. We are doing the work that the CPU could not do well before. It's not like people weren't using x86. We are doing the work that wasn't being done before in a satisfactory way. Right now I visualize that there are two axes. The vertical axis is the CPU: designed for any instruction-level complexity. It only needs to be syntactically correct for the CPU to run it. It doesn't matter if it's written well; the CPU will compile it. So there are infinite levels of instruction-level complexity, and frankly a modest amount of data complexity; as long as you can fit it in a four-vector, it's happy. The amount of floating point is 1-2 issues per clock cycle. So not a whole lot of data complexity, but an infinite amount of instruction complexity. In the case of the G80's compute, we have a lesser instruction-complexity capability. But you can pile on gigabytes of data and we'll stream right through it. So those are the two axes: one is data-level complexity, and one is instruction-level complexity.
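As a sketch of the quadrant he is describing (an illustrative kernel of my own, not Nvidia's), the instruction stream below is only a few operations, but a fixed grid of threads can stride through an arbitrarily large array; that is the low-instruction-complexity, high-data-complexity case in his picture.

// Trivial instruction complexity, unbounded data: a grid-stride loop lets a
// fixed number of threads stream through however much data is piled on.
__global__ void scale(float *data, float k, size_t n)
{
    size_t stride = (size_t)gridDim.x * blockDim.x;  // total threads in the grid
    for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < n; i += stride)
        data[i] *= k;   // a couple of instructions, repeated over gigabytes
}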

What's your thought about the low-end?

Jen-Hsun: (Integrating for the) high-end doesn't make sense. Too much innovation left in the GPUs. Integrating worst of either worlds makes high-end product completely unsuccessful. On the low-end the question has got to be asked: integration of two different chips. The Southbridge chip is another chip. The CPU uses fewer transistors, but fast ones as you know. GPUs don't do that. I'd rather have lower-leakage transistors. Frankly that tends to be quite similar to Southbridge. It's either integrated GPU or CPU with MCP. In the low-end no one cares anyway.

AMD is making the argument that it can cut costs out of the system.

Jen-Hsun: Yep, but you still have two chips, unified memory. Maybe they'll save a dollar, but they won't change the world.

On the G80 itself, how long did it take?

Jen-Hsun: Four years. We built it ground-up, brand new.

Do you consider it late?

Jen-Hsun: Yes, we tried to get it out last year. But it was just too big. So it's late by our own standards. In the end, it cost us 4 years. It was important to get it right. There were 600 man-years total. We started with 10 people working on it and grew to 300 eventually.

Competitively, the G80 can keep you ahead for a while. For what reasons?

Jen-Hsun: From an architecture perspective, it opened up the scope of apps we add value to. It's the first GPU where the application space is larger than graphics processing, with physics, high-performance computing, and image processing. We are working with folks like Adobe and Apple. We are bringing image processing to the desktop. It is so important to us. Ultimately, the applications we add value to will reflect what kind of company we are. Up until now, we were known just for video games.

And what about power consumption?

Jen-Hsun: It's two times better than the last generation.

What?s the actual wattage?

Jen-Hsun: 177 watts. The performance per watt of the architecture is twice as good as the last generation's. It is all about efficiency. If I know the architecture has better performance per watt, reducing power is easy.

I saw the frog demo. What will this enable? The quality of games?

Jason: Stream output comes through with the geometry shader: the output of data into memory without going through the rest of the pipeline. You can take the data back in for much more advanced shader calculations and effects. Hair and shadow algorithms leverage the technology for much higher-quality effects. There is a deformation aspect: using objects to actually deform other objects. Mix that up with geometry shaders and you get realistic images. Rigid chunks of triangles and the like will go away. It will create a world that is really soft and malleable and stretchable. When people animate faces, joints, and hair, it will be soft and flowing.

Are you falling in love with the female characters?

Jen-Hsun: I'm still in love with the first three. I love them all.

Is there anything else? Will waiting for Vista impact you?

Jen-Hsun: Microsoft has programs that allow for upgrades. That should help keep the momentum.

What's your favorite PS3 game so far?

Jen-Hsun: Metal Gear Solid is the most beautiful to watch. I like beauty, but also games that you can learn in 30 seconds. I won't learn Metal Gear Solid in 30 seconds.

I want to play a game with you one of these days.

Jen-Hsun: Let's do it!
 

Gstanfor

Banned
Oct 19, 1999
Confirmed: NVIDIA entering the Intel IGP market, preparing large G8x lineup
NVIDIA has just held its quarterly earnings conference call, discussing its financial results and answering questions from analysts. The quarter was strong overall, with record revenue and rapidly increasing market share, although with slightly lower net profit than analysts expected. This was partially due to a confidential patent licensing charge of $17.5M related primarily to old products. But the real shocker is that Jen-Hsun Huang, President and CEO of NVIDIA, announced in the conference call that they're working on Intel IGP solutions, as customer demand is increasing and the merged AMD-ATI is leaving the market, thus creating a void to be filled. They're currently aiming at an introduction date as early as Spring 2007.

Jen-Hsun cited high-definition video and DirectX 10 as key reasons for the increased demand and necessity, but it is unclear whether he implied that this IGP is G8x-based or not. NVIDIA is certainly boasting of its architecture's scalability and increased performance per watt, and some presentation slides are in fact hinting at CUDA, their new GPGPU framework with direct support for C-based programming, extending all the way to embedded markets. It could also be that NVIDIA is working on a G80 derivative for the GoForce product line, potentially with only some parts shared (like the shader core), or a completely different architecture that boasts similar programmability.

No matter what, they're working on at least nine more G8x-based products, that is to say, ones with unique codenames. This is substantially more than the historical average, although if an IGP was included in there and the notebook parts had separate codenames, that'd be roughly four chips for the desktop lineup - the same number ATI's lineup currently sports.
More detail here

Also, Ars Technica has an intriguing snippet in their G80 news article from an earlier conference call:
Some time ago, NVIDIA hosted a conference call in response to a pair of announcements from ATI and Peakstream in the area of stream processing. The company was coy about what they were up to, and all they'd say is that their project wasn't like what either Peakstream or AMD/ATI are now doing. The Peakstream and AMD/ATI approaches were, in the words of the NVIDIA spokesperson, "like putting lipstick on a pig": they were taking the existing GPU architecture and bolting on a software layer of abstraction that's intended to hide its graphics-specific nature while exposing its stream processing functionality.
:D

Some early Mercury Research data also:
Shares of graphics chip producer Nvidia (NVDA) were higher today after the market research firm Mercury Research said that the company gained considerable market share against arch-rival ATI Technologies (ATYT) in the third quarter. ATI is in the process of being acquired by Advanced Micro Devices (AMD). According to a report by Merrill Lynch analyst Sidney Ho, Mercury's data showed Nvidia with 29% of the overall graphics market in the quarter, up from 21% in the second quarter. Intel still dominated the market with 38% share, though that was down two percentage points. ATI's market share declined to 21% from 28%.
Based on the same set of data, Deutsche Bank analyst Pranay Laharia raised the firm's price target on Nvidia shares to $30 from $25. The strong Mercury data, the analyst wrote, raises the possibility that Nvidia could report revenue for the fiscal third quarter ending today above its previous guidance of 8%-10% growth. Nonetheless, Deutsche Bank maintains a Hold rating on the stock, asserting that the shares are "priced for perfect execution."
Nvidia shares today jumped $2.10, to $34.87.
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: Gstanfor
Since when is refuting a claim trolling?

Maybe if you simply stopped trolling we'd quit calling you one.


Just a thought.
 

Matt2

Diamond Member
Jul 28, 2001
What's your thought about the low-end?

Jen-Hsun: (Integrating for the) high-end doesn't make sense. Too much innovation left in the GPUs. Integrating worst of either worlds makes high-end product completely unsuccessful. On the low-end the question has got to be asked: integration of two different chips. The Southbridge chip is another chip. The CPU uses fewer transistors, but fast ones as you know. GPUs don't do that. I'd rather have lower-leakage transistors. Frankly that tends to be quite similar to Southbridge. It's either integrated GPU or CPU with MCP. In the low-end no one cares anyway.
 

dev0lution

Senior member
Dec 23, 2004
Whether you agree with their direction or not, Jen-Hsun definitely has a clear vision of where he wants NVIDIA to go and where the market is heading, and a detailed knowledge of what they're doing at all levels of the organization to get there.

Had the pleasure of meeting him at the G80 launch and besides being an eloquent speaker, he takes the time out to acknowledge his partners and supporters. Not that I wouldn't say that about the 1/3rd of AMD that's formerly known as ATI, but I will say I was thoroughly impressed by his vision and the manner in which he represents his ideas and company.
 
Jun 14, 2003
Originally posted by: Matt2
What's your thought about the low-end?

Jen-Hsun: (Integrating for the) high-end doesn't make sense. Too much innovation left in the GPUs. Integrating worst of either worlds makes high-end product completely unsuccessful. On the low-end the question has got to be asked: integration of two different chips. The Southbridge chip is another chip. The CPU uses fewer transistors, but fast ones as you know. GPUs don't do that. I'd rather have lower-leakage transistors. Frankly that tends to be quite similar to Southbridge. It's either integrated GPU or CPU with MCP. In the low-end no one cares anyway.


so where do i buy engrish dictionaries again? that sounds like a load of jabberwocky. its just loosely relating sentences put end to end lol!
 

Auric

Diamond Member
Oct 11, 1999
Originally posted by: Gstanfor

Is the GPU average selling price going to go up relative to the CPU?

Jen-Hsun: I don't know. The market will have to decide that. One observation I've made: the GPU has been the only component to increase consistently over 15 years. The GPU had an ASP of $17 when NVIDIA first started. The lowest price was $5. In the course of the last 10 years, the ASP of the GPU has gone from the low teens to the low 30s. I don't know of any other component in the last 10 years that has gone up by a factor of 3. Or up at all.

That's a dim statement when it isn't qualified as adjusted for inflation or not; assuming not, inflation devalues the ASP figure by 25-50% over those 10-15 years and thus negates a significant amount of the claimed increase.
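A rough back-of-envelope check of that objection, using assumed figures ($12 to $32 nominal, roughly 2.5% annual inflation over the decade):

nominal increase:   $32 / $12             = ~2.7x
10-year inflation:  1.025^10              = ~1.28
real increase:      $32 / ($12 x 1.28)    = ~2.1x

So on those assumptions the claimed "factor of 3" shrinks to roughly 2x in real terms; smaller, as noted, but not negated entirely.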
 

Gstanfor

Banned
Oct 19, 1999
A pro-quality transcript of the conference call, and an interesting financial take from a long-time follower of ATi.
Umm...I don't know what to say except...OWNED. Nvidia basically melted ATI's face off. ATI used to have north of 70% market share in notebooks and now Nvidia has the dominant share...AMAZING. Nvidia simply mopped the floors with ATI. After reading this transcript and the numbers, anyone who has followed the FINANCIAL side of these companies for as many years as I have can only come to one conclusion...ATI was going to be broken utterly by NVDA over time and they were forced to sell out to AMD.

ATI would have had to report the biggest miss in their history this quarter had they not been bought out by AMD. With the full impact of these numbers now weighing on me, I am starting to reconsider my investment in AMD...I think they paid far too much for ATI considering these numbers. Had ATI been forced to report their numbers they would be trading in the $10-12 range instead of the life preserver thrown out by AMD at $21. You simply can't be late to market for this long of a period of time and still retain your market share. Jen ate Orton's lunch and asked for seconds.

WOW.

EDIT: I expect the first Intel Vista-ready chipsets will be DX10 for the IGP mainstream. Jen is going to try and bring IGP to the mainstream where Intel simply cannot compete. This will preserve his margins and give him a leg up on the eventuality of AMD/ATI doing the same. Again...FIRST TO MARKET.

EDIT: One more insight
Well, you've got to give it up to Orton: he pulled off a deal just at the right time, and to make matters even better, the bulk of it was cash. In other words, if ATI underperformance does drag down AMD's share price, former ATI shareholders would be largely insulated from it. What a great deal; as o_e noted, had it taken a few months more to complete, there is no way ATI shareholders would have gotten a whiff of that kind of money.

BTW, when it was revealed that, should the deal fall through, ATI would be the ones paying AMD the restitution, many people were shocked, the assumption being that it was AMD, not ATI, that was desperate for this to work. Sure explains a thing or two now, doesn't it?
 

Gstanfor

Banned
Oct 19, 1999
A few more insights from ATi observers, "just because":

Text
My biggest knock on Dave Orton, besides the long list of delays and late product launches, is his inability to accurately forecast the market. From PCIe adoption to quarterly revenue and gross margin expectations, Dave Orton hasn't been very good at forecasting...period. This has led to the multiple warnings and misses experienced over the past couple of years. He has always underestimated Nvidia's ability to put pricing pressure on ATI's ASPs and gross margins, because of underestimating Nvidia's ability to hit the market running with hard launches of new products that make ATI's launches look late and like a response to Nvidia.

This last quarter is yet another example of how Orton would have been wrong...AGAIN...on his previous quarterly forecast. Excluding the loss of the Intel chipset business, Nvidia clearly gained enormous SEQUENTIAL share from ATI in desktop and, more importantly, notebook. ATI's internal forecasts haven't been right in a very long time concerning gross margins, HDTV sales, handheld, launch dates or other key metrics. It is no surprise to me that ATI forecast incorrectly, sold the forecast to AMD, and has fallen short, as evidenced by NVDA's numbers. NVDA didn't get to 40%+ gross margins, huge sequential market share gains, and leadership in notebook GPUs for the first time in the company's history at the expense of Intel. This happened at ATI's expense...AGAIN.

In sum, had the deal not closed as quickly as it did, this continued market share loss would have been more evident to AMD. I am of the opinion that ATI's market share loss had finally reached a tipping point last quarter and started to accelerate...again, not anticipated by Orton. This is the second major product launch cycle where ATI is 2-4 months behind. This will further depress margins, market share, revenue and ASPs as ATI tries to move "last generation" high-end product against Nvidia's "current DX10 Vista blah blah blah" products. ATI has no pricing power, no flagship product in the lead, and is simply trying to sell its stack against a superior stack from Nvidia. As a result of this declining market share, which I believe is accelerating, AMD overpaid by about 25%, or $1.3 billion, for ATI IMHO...maybe more.

When R600 does finally come to market, it will be up against a mature G80 that has had months of uncontested pricing power and months of yield tweaking, which should give Nvidia the option of pushing prices down or allow them to tweak G80 to respond in time to any R600 surprises. Again, ATI lacks the initiative and simply won't have any pricing power against G80. In addition, the rest of the G80 stack will begin to appear when R600 launches, and the problem for ATI moves from just the high end down further into the stack. ATI will have 1-3 SKUs against Nvidia's 4-7 SKUs in the February/March time frame. Will ATI be able to sell all that old product against Nvidia's newer stack? Will ATI be forced to do a big write-off again?

If AMD did realize all this BEFORE the deal gained terminal velocity, I would be VERY surprised. I think ATI got lucky, and AMD simply overpaid for a company they could have picked up for at least 25% less once this last quarter saw the light of day. If R600 is some leapfrog technology that gives ATI an R3xx type of advantage of 6 months or more, I will be the first to say I was wrong. The odds of that are slim, as G80 seems to have exceeded everyone's expectations...again. Nvidia simply has the killer instinct that ATI lacks, and they understand the stakes of a zero-sum game.

Text
From having followed this for a while, as an average guy who only knows what he reads and investigates, this is how an average guy sees it.


Orton announces great things to come

stocks rise

Orton sells millions in shares (so of course he's doing all right)

Orton then announces inventory/stock problems and that they must cut margins to move stock

Orton announces 520 delays

stocks plummet, accusations, lawsuits

the 520 comes: great tech, but late; another "lower than expected quarter"

There is no problem on the tech side, but the management is a joke, which was confirmed after reading ATI's 3rd Q statement.

And now, after having the first unified graphics chip (Xbox 360), which should have given them the jump, they are late again and losing more ground.

Of course the shareholders were all for it; they were sick of seeing the management run things into the ground. Hopefully AMD doesn't let Orton make any more statements or decisions.

Text
In sum, had the deal not closed as quickly as it did, this continued market share loss would have been more evident to AMD. I am of the opinion that ATI's market share loss had finally reached a tipping point last quarter and started to accelerate...again, not anticipated by Orton. This is the second major product launch cycle where ATI is 2-4 months behind. This will further depress margins, market share, revenue and ASPs as ATI tries to move "last generation" high-end product against Nvidia's "current DX10 Vista blah blah blah" products. ATI has no pricing power, no flagship product in the lead, and is simply trying to sell its stack against a superior stack from Nvidia. As a result of this declining market share, which I believe is accelerating, AMD overpaid by about 25%, or $1.3 billion, for ATI IMHO...maybe more.

I have been saying this for years. I think ATI even botched the 9700 launch. They should have had an entire line of DX9 cards, from top to bottom, waiting in the wings. Instead they had the 9700 Pro, which killed the competition, then a neutered 9700, a la the 9500 Pro, to fight the 4600 in the mid range. Fine, they figured out something, until the scaled-down 9600 Pro showed up. But their margins took a hit due to sticking a high-end die in a mid-range product. Nvidia survived that initial assault fine, until they handed ATI the silver platter with the NV3.x.

Honestly, the only launch ATI has won in the history of this competition was due to an Nvidia screwup. And even then, ATI doesn't have the planning to really push Nvidia into a hole. 12 months after the NV3.x debacle, Nvidia, bruised but not out, comes out with the NV4.x and is right back in it. What is ATI's response? Parts that take months to show up in the channel, aren't much faster, and fail to hold people's interest. Nvidia has a top-down approach, from the 6800 down to the 6200. What is ATI's answer? X800s, 9800s and the thrice-recycled R200?

The deathblow was the 7800GTX. It showed up fast, it showed up for sale, and people bought it in droves. The X1800XT had a manufacturing glitch that took, what, 6 months to realize was a software problem? Another fine example of poor management. The X1800XT showed up late, slow, and hot. It was replaced by the X1900XT 3 months later. A fine chip, but their mindshare had fallen as Nvidia decided to simply be within the ballpark of that chip with a part that is half the die size. After playing with ATI for a while, they released the GX2, which single-handedly shoved ATI's margins through the floor on the top models. Not to mention another completely lackluster lineup of mid and low-end cards.

Now this latest blow to ATI with the 8800GTX, which is simply a nice GPU. ATI's response is silent, probably late, hot, and may beat this chip by a few percentage points. Unless Nvidia has a revised G80 waiting in the wings to steal their show. And chances are ATI won't have a full complement of cards either.

I said when AMD signed the papers they should have had walking papers ready for ATI's executive and upper management teams. AMD isn't exactly the best-run company either, so they should take all precautions to keep the incompetence at a low.