Well This Can't Be Good: AMD Axes Carrell Killebrew & Other Employees


NIGELG

Senior member
Nov 4, 2009
852
31
91
Good to see people enjoying the 'death throes' of AMD, calling ATi 'value products' and reveling in the apparent struggles of AMD.

If people are wishing for AMD's death, then be prepared to pay a lot more for your hardware after the gloating, 'I told you so' and 'kicking them when they're down' joyfest is over, because Intel AND nVIDIA will relish it.....
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As far as I remember it was a MASSIVE improvement over the HD 2900, enough for it to be very competitive with the 8800 series.

6-10% faster on average than an HD2900XT is massive?
HD3870 only offered about 85% of the performance of the 8800GT.

And how do you know that? What was its die size? As I recall, its die size wasn't bad at all compared to Nvidia's equivalent, so it sold well and didn't cost too much to make.

Why even bring NV into this? Compare HD4870's die size to older high-end ATI cards.

9800XT ($499) = RV360/380 die size = 210 mm^2 (best I could find)
X800XT ($499), X800XT PE ($549) = 257-260 mm^2
X1800XT ($549-599) = 263 mm^2
X1900XT ($549), X1900XTX ($649) = 314.5 mm^2

vs.

HD4870 ($299) = 256 mm^2 (priced at almost half of X800XT/X1800XT despite similar die size!! Can you say giving up huge profits?)

The ATI cards which previously sold for $500-650 were similar in die size to the $299 HD4870... AMD basically left $200-350 of profit per card off the table by pricing it so low and pursuing the small-die strategy instead of the previous performance strategy.
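To put rough numbers on that, here is a minimal back-of-the-envelope sketch, using only the MSRPs and die sizes quoted above (the poster's figures, not official ones), of the launch-price gap and launch price per mm^2:

```python
# Rough sketch of the "money left on the table" argument above.
# All prices and die sizes are the figures quoted in this post (launch MSRPs,
# approximate die areas), not official AMD/ATI numbers.
older_high_end = {
    "9800XT":  {"msrp": 499, "die_mm2": 210},
    "X800XT":  {"msrp": 499, "die_mm2": 258},
    "X1800XT": {"msrp": 549, "die_mm2": 263},
    "X1900XT": {"msrp": 549, "die_mm2": 314.5},
}
hd4870 = {"msrp": 299, "die_mm2": 256}

for name, card in older_high_end.items():
    price_gap = card["msrp"] - hd4870["msrp"]          # what ATI used to charge extra
    old_per_mm2 = card["msrp"] / card["die_mm2"]       # launch $ per mm^2 of die
    new_per_mm2 = hd4870["msrp"] / hd4870["die_mm2"]
    print(f"{name}: ${price_gap} higher launch price; "
          f"${old_per_mm2:.2f}/mm^2 vs ${new_per_mm2:.2f}/mm^2 for the HD4870")
```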

[Regarding your comments on HD5850/5870/6950/6970 cards]

As above, die size not too bad, prices good.

HD5870 ($350) = 334 mm^2 (failed at achieving a small die)
HD6970 ($370) = 389 mm^2 (failed at achieving a small die)

False, on both counts. The die sizes for the HD5800 and HD6900 series were larger than previous high-end ATI cards, and their prices were ~$200+ lower. Using NV's die sizes to make AMD cards look "good" is not a good argument since:

1) We don't know specific die costs for NV, so let's leave them out of the discussion.
2) NV's gross margins far exceeded AMD's even under Fermi generation, just look at the financials.
3) This doesn't address how AMD's GPU strategy, which involved pricing their cards so much lower (i.e., to just gain market share) while increasing die sizes, is actually better than ATI's strategy of the past.

But what makes you say the $150 mid-range is the equivalent of the $300 mid-range?

ATI's previous mid-to-high-end cards sold for far higher prices than $140-160. The X800Pro (~HD5850 equivalent) had an MSRP of $399 vs. $269 for the HD5850 or $299 for the HD6950. Since AMD's high-end cards now sell for $300-370, they essentially had to price all the other cards lower down the food chain, including mid-high end and mid-range.


Seems to me that the prices still have the same range - from $50 or whatever to about $600 for the 6990. A 6990 is AMD's top-end card, whether it is a dual-chip card or not. Doesn't matter.

Not sure how you missed the part that a single HD6990 has 2x 389 mm^2 dies onboard. So while AMD is selling this card for $700, ATI used to sell single high-end GPU cards with die sizes of 210-315 mm^2 for $500-650.

Outside of costs, there is another reason why it matters. The HD6990 is a limited-volume card, not like the 9800XT, X800XT, X1900XT, etc. were. It also has inherent flaws such as the requirement for a much larger power supply, possible CF scaling issues, micro-stutter problems (for some people), and it's loud as a jet engine at load. So it misses a lot of the characteristics that made the previous ATI high-end cards attractive on a much wider scale. Its status is more of a halo card. Also, this doesn't address the business concerns of pricing the HD6850/6870/6950/6970 so low.

Like I said, the pricing for the entire range has pretty much moved down. Since not many HD6990 cards are sold, and since AMD did not create much faster cards than the GTX560/560Ti/570/580, AMD cannot command $100-200 price premiums. If you recall the 9800XT, X800XT, X1800XT/X1900XT generations, ATI cards were actually faster overall, outside of OpenGL games, which were very prominent at the time (i.e., the one thing that kept NV's performance in the game). But in shader-intensive games, and especially with AA modes in DX9, ATI always had a faster high-end card than NV in every generation after GeForce 3 until the HD2900XT.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I thought Apple was no more.

You mean Steve Jobs, the man whose absence from Apple was marked by its decline and brush with obscurity, the same man whose return to Apple brought an explosion in innovation that elevated the company to heights envied by all competitors.

Yes, that man is no more, and it remains to be seen if, the second time around, his successors do better than they did the first time he parted ways with Apple.

From what I have seen in the past weeks, I'd say it's about to be déjà vu all over again.

In regards to Apple's management after he was fired in 1985, Jobs said the company's "corrupt" leaders only sought to enrich "themselves mainly, and also for Apple—rather than making great products." He rejoined the company as CEO in 1997 and turned Apple around soon after.

Source

Apple gives new CEO nearly $400M in stock

New Apple Inc. CEO Tim Cook has been given a million shares of the company stock that vest over the next 10 years, worth an estimated $384 million.

Source

Apple Doles Out $60 Million Stock Grants to Retain Senior Vice Presidents

Apple Inc. (AAPL), seeking to retain its management team in the wake of former Chief Executive Officer Steve Jobs’s death last month, gave $60 million restricted-stock grants to most of its senior vice presidents.

Source
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
How? And what would be the resources here? How much time needed? Intel is a very capable company and they can't make a dent in the discrete market, and with the graphics aspect of AMD's APUs there is differentiation.

Intel doesn't need to make a dent in the discrete market. AMD wouldn't either if they built a Fusion-type processor without buying ATI. You don't think AMD could have built a Fusion processor for less than $5 billion?

This is an integrated-level performance part. Is it faster than Intel's solution? Yes, but so what? $30-50 discrete parts beat it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Nvidia is an enthusiast company while AMD is not so much. Leadership always commands a premium.

The same things were said when ATI had the single-GPU crown and a feature-set and quality advantage with the X1900. Even with nVidia's strengths, AMD did retake the overall discrete leadership last year, which shows how important execution truly is.

AMD has placed more focus on being proactive for gamers by offering Eyefinity and MLAA, which were both very welcome, and on developer relations, which certainly isn't just a marketing exercise. Steps forward -- and hopefully, as AMD understands its customers' needs better, they may do even more.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Intel doesn't need to make a dent in the discrete market. AMD wouldn't either if they built a Fusion-type processor without buying ATI. You don't think AMD could have built a Fusion processor for less than $5 billion?

This is an integrated-level performance part. Is it faster than Intel's solution? Yes, but so what? $30-50 discrete parts beat it.

But Intel did indeed try! I think AMD could have designed a Fusion-type processor in house, but it wouldn't have the differentiation Fusion has. You get what you pay for.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
You mean Steve Jobs, the man whose absence from Apple was marked by its decline and brush with obscurity, the same man whose return to Apple brought an explosion in innovation that elevated the company to heights envied by all competitors.

Yes, that man is no more, and it remains to be seen if, the second time around, his successors do better than they did the first time he parted ways with Apple.

From what I have seen in the past weeks, I'd say it's about to be déjà vu all over again.
I thought Apple was done for around 1990. Although Jobs has passed, the momentum of his work is still there. Whether or not the new blood can maintain this momentum remains to be seen.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
AMD has placed more focus on being proactive for gamers by offering Eyefinity and MLAA, which were both very welcome, and on developer relations, which certainly isn't just a marketing exercise. Steps forward -- and hopefully, as AMD understands its customers' needs better, they may do even more.

Isn't this thread about how they just fired half the guys that did that?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I thought Apple was done for around 1990. Although Jobs has passed, the momentum of his work is still there. Whether or not the new blood can maintain this momentum remains to be seen.

And what does your knowledge of history tell you is the likely outcome for Apple?
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
...Wow, this thread is amazing. I've learned that AMD is giving away their GPUs, is the next 3dfx, and has worse drivers than Nvidia, which hasn't been true for several years.

Let's do some math here (I'll use MSRP):

HD 6970
Die size: 389 mm^2
Price: $369

GTX 580
Die size: 520 mm^2 (+33%)
Price: $499 (+35%)

OH GOSH, assuming equal packaging costs and ignoring yield penalty, Nvidia has 2% higher margins!

HD 6950
Die size: 389 mm^2
Price: $299

GTX 570
Die size: 520 mm^2 (+33%)
Price: $349 (+16%)

Wait...

HD 6870
Die size: 255 mm^2
Price: $239

GTX 560 Ti
Die size: 360 mm^2 (+41%)
Price: $249 (+4%)

?????

HD 6770
Die size: 170 mm^2
Price: $120 (iirc, might be wrong)

GTX 550 Ti
Die size: 238 mm^2 (+40%)
Price: $149 (+24%)

Nvidia might not have the amazing margins people in this thread think it has. And it goes without saying that the last four cards compared sell an order of magnitude more than the first four.
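For what it's worth, the percentage comparison above is easy to reproduce. Here is a minimal sketch using the MSRPs and die sizes as listed in the post (the HD 6770 price is flagged as uncertain by the poster, and rounding may differ slightly from the inline figures):

```python
# Die-area premium vs. MSRP premium for the card pairs listed above.
# Die sizes (mm^2) and MSRPs ($) are the post's figures; rounding may differ
# by a point or two from the percentages quoted inline.
pairs = [
    (("HD 6970", 389, 369), ("GTX 580", 520, 499)),
    (("HD 6950", 389, 299), ("GTX 570", 520, 349)),
    (("HD 6870", 255, 239), ("GTX 560 Ti", 360, 249)),
    (("HD 6770", 170, 120), ("GTX 550 Ti", 238, 149)),
]

for (amd, amd_die, amd_price), (nv, nv_die, nv_price) in pairs:
    die_premium = (nv_die / amd_die - 1) * 100        # how much bigger the NV die is
    price_premium = (nv_price / amd_price - 1) * 100  # how much more the NV card costs
    print(f"{nv} vs {amd}: die +{die_premium:.0f}%, MSRP +{price_premium:.0f}%")
```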
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
But Intel did indeed try! I think AMD could have designed a Fusion-type processor in house, but it wouldn't have the differentiation Fusion has. You get what you pay for.

Intel only made one serious attempt, with the i740. And IMO it was more a marketing gimmick to push AGP graphics on their Pentium II line of processors. It didn't last long, and once AGP adoption became widespread the i740 disappeared from the discrete world and showed up later, modified, in the integrated world.

Larrabee IMO felt like a test bed for HPC that was being touted as a discrete card. Once Intel pissed in Nvidia's Cheerios, the patent war prevented Intel from releasing it. I suspect we will see a derivative of it show up in the HPC world crunching a lot of numbers instead of rendering Crysis 2.

I am not really sure what differentiation you are talking about that they got from ATI for their Fusion chips that they couldn't have done internally, or by working with Nvidia/ATI to build it on their own at significantly less expense.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
...Wow, this thread is amazing. I've learned that AMD is giving away their GPUs, is the next 3dfx, and has worse drivers than Nvidia, which hasn't been true for several years. OH GOSH, assuming equal packaging costs and ignoring yield penalty, Nvidia has 2% higher margins!

Your entire analysis of die sizes vs. GPU selling prices above rests on the assumptions that the cost to manufacture a given die size is similar for NV and AMD, and that selling prices scale almost linearly with die size. I think that's too simplistic. There are too many factors involved, such as NV's vs. AMD's contract terms and negotiating power, economies of scale, and other GPU-specific components (PCB, VRAM costs, coolers, etc.).

Also, you are comparing HD6950 vs. GTX570 and HD6870 vs. GTX560 Ti. Performance-wise, the HD6950 competes with the GTX560 Ti, while the HD6870 competes with the GTX560. So if you wanted to compare prices, the GTX570 (Nvidia's 2nd-fastest single-GPU card) sells for $80-100 more than AMD's 2nd-fastest single-GPU card, the HD6950.

Not a single person in this thread has argued that NV has 2% higher margins than AMD based on die sizes. Like I said, NV die sizes, their packaging costs and R&D spending should be left out of the discussion since the firm sells its Fermi GPUs in Quadro, Tesla and consumer graphics lines. NV's gross margins, 59% desktop discrete market share, etc. speak for themselves.

But when talking about ATI vs. AMD: under AMD, die sizes increased and the average selling price per card fell. At the same time, the performance dominance of the 9800XT, X800XT, X1800XT, X1900XT era is no longer present. Instead, they resort to competing on price, and on price/performance. It's great for us gamers to buy an HD4850 for $199, an HD4870 for $299 or an HD6950 for $299. But I have my doubts the old management at ATI would have been as enthusiastic. After all, they were selling X800 Pros for $399 against a still-faster 6800GT, and X800XTs for $550.

That is the main issue I brought up. I have no idea why you keep talking about NV die sizes since we know nothing about NV's costs.

Care to explain this:

1) If AMD's small-die strategy worked, how come their desktop discrete GPU die sizes increased while prices in the same segment fell? Surely, with over 50% dGPU market share, the CEO wouldn't start axing so many guys on the GPU side if they were making a lot of money? I mean really, lay off the guy behind Eyefinity?

2) Why did NV continue to successfully sell its high-end single-GPU cards for $500, while we haven't seen any single-GPU AMD card for more than $379 in 4 years? When ATI designed the 9800XT, X800XT, X1800XT, X1900XT and their 'xx50' refreshes, ATI knew that part of the allure of those cards was maintaining a high-end enthusiast image. Average Joe walked into a store and he "heard ATI cards were the fastest", so he bought low and mid-range ATI cards too. Now, what's faster? Nvidia. Average Joe hears Nvidia makes the fastest cards, so what does he buy? Nvidia.

AMD thought they didn't need a single high-end GPU card for "halo"/marketing/enthusiast gamer mind-share. But if that were true, how come NV still has ~58-59% desktop discrete market share? AMD consistently tends to price their desktop cards lower than NV for a similar level of performance, and yet is still unable to hold > 50% dGPU market share on the desktop. Talk about a 'small die' strategy that got you neither high profits nor high market share on the desktop. That's supposed to be successful? Most of AMD's success came in the notebook market; the small-die strategy worked much better in that segment.

3) How is selling an HD6990 with 2x 389 mm^2 dies for $700 not "giving it away" vs. ATI selling the high-end single-GPU cards of the past, with die sizes ranging from 210-315 mm^2, for $500-600?

Care to explain how the HD5870 and HD6970, with die sizes larger than previous ATI high-end cards, are selling for $200-250 less? That's not "giving it away" from a business perspective?

4) When NV was in the lead, they released GTX260 for $399 and GTX280 for $649.
When AMD was in the lead, they released the HD5850 for $269 and the HD5870 for $379. The HD5850/5870 were much, much faster than the GTX280/285 at the time, and Fermi was nowhere in sight. Why wasn't AMD aggressive enough to price them at $399 and $650? Does their team not believe enthusiasts will pay those prices for AMD GPUs? Brand value erosion?

5) If the HD7970 launches way ahead of Kepler and smokes the GTX580, is AMD going to price it at $379 again? If yes, then AMD is giving up huge profit margins in favour of larger market share. This strategy often works in certain businesses (Walmart, Costco, etc.). But given that AMD is worth a hair over $4.2B and ATI on its own was worth $4B, it's clear to investors that the ATI part of AMD is no longer worth as much as it was in the past. So clearly, the only way for this to be true is if their future growth prospects have far diminished from ATI's days and/or their profits have fallen by the wayside, resulting in far lower discounted cash flows.....

6) You know you have given up competing in the professional graphics space when you start selling your professional cards for as low as $189..... Talk about competing on price....

It's extremely difficult to argue that AMD's GPU side is doing great when AMD as a whole has a market cap worth less than ATI was as a standalone entity - plain and simple.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,124
623
136
If AMD's small die strategy worked, how come their GPU die sizes increased, while GPU prices fell?

Let me take a shot at a simple answer: competition from Nvidia combined with the market collapse drove prices down across the board for cards from both vendors.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I am not really sure what differentiation you are talking about that they got from ATI for their Fusion chips that they couldn't have done internally, or by working with Nvidia/ATI to build it on their own at significantly less expense.

That's the beauty of conjecture -- one may never know. The executives have to actually make the tough decisions.
 

iCyborg

Golden Member
Aug 8, 2008
1,387
94
91
The things I brought up such as:

- Too high acquisition premium paid for ATI, with questionable synergies
- A history of low average selling prices for CPUs and GPUs, and downward pressure on the CPU side due to lack of competitive product offerings
- Lower than expected profit margins in both divisions
- Declining server and desktop market share for years now
- AMD being burdened with massive debts after ATI buyout
- Poor execution on the CPU side (Phenom I/II/Bulldozer) --> Lower focus on IPC, and power consumption
etc.
- Market growth in tablets/smartphones will far exceed that of laptops/desktops/graphics cards in the next 5-10 years

^^^ All of these things have been described for years by Wall Street analysts, hardware gurus/hardware review websites and equity research reports on AMD. I didn't "invent" these ideas as mine, using some "hindsight bias" as you put it. This information was already stated by professionals before I even brought it up.
If you look at what I was arguing about, you'll see it has nothing to do with more than half of these. And you make it sound like it was a voluntary decision to have lower ASPs. nVidia used to sell the 8800 Ultra for a lot more than they charge for the GTX 580 now. Or the 8800GT that I also mentioned: they could have priced it a lot higher; obviously there's more to pricing than maintaining ASPs, for both companies. Comparing the 9700 Pro to the 4870 also ignores other factors - recession, declining PC sales and general consumer spending, competition, the shift to mobile, etc.

When I say 'hindsight bias' I'm primarily referring to "small die strategy was wrong" and "4870 pricing was wrong". If you can find this stated by the majority of "Wall Street analysts, hardware gurus/hardware review websites and equity research reports" (and in 2008, not now), then I'll accept it, but I don't recall such reports back then (or even now, TBH).
Even the ATI acquisition is arguable; IIRC most of the opinions were of the cautious "we'll see how it pans out" sort, instead of providing strong arguments that it was bad. And maybe it was too high a price, but APUs are currently the best thing AMD has going. Would Bulldozer be better if they had invested more money into it instead of buying ATI? Maybe. Or maybe it wouldn't be much better, and someone would be arguing that if they had bought ATI they could have had Fusion in 2009/2010, which would've been so much better for the mobile market...
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
And what does your knowledge of history tell you is the likely outcome for Apple?
Your question exceeds the scope of my knowledge. On the surface, I'd say Apple is on a doomed course. In fact, it would have been on that course all along if not for Steve Jobs. However, Steve's return to Apple was no coincidence. I can't say whether there will be another future CEO of Apple who is as innovative as him, if not better, and as persuasive, if not more so. In the short term, I don't see Apple's existing CEO as that kind of material. However, I lack the necessary intel on Apple's internals to say anything bold.

I hope that I understood your question correctly.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
...Wow, this thread is amazing. I've learned that AMD is giving away their GPUs, is the next 3dfx, and has worse drivers than Nvidia, which hasn't been true for several years.

Let's do some math here (I'll use MSRP):

HD 6970
Die size: 389 mm^2
Price: $369

GTX 580
Die size: 520 mm^2 (+33%)
Price: $499 (+35%)

OH GOSH, assuming equal packaging costs and ignoring yield penalty, Nvidia has 2% higher margins!

HD 6950
Die size: 389 mm^2
Price: $299

GTX 570
Die size: 520 mm^2 (+33%)
Price: $349 (+16%)

Wait...

HD 6870
Die size: 255 mm^2
Price: $239

GTX 560 Ti
Die size: 360 mm^2 (+41%)
Price: $249 (+4%)

?????

HD 6770
Die size: 170 mm^2
Price: $120 (iirc, might be wrong)

GTX 550 Ti
Die size: 238 mm^2 (+40%)
Price: $149 (+24%)

Nvidia might not have the amazing margins people in this thread think it has. And it goes without saying that the last four cards compared sell an order of magnitude more than the first four.

nVidia has strong margins. And this small-die strategy really didn't work very well, in my view.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Your entire analysis of die sizes vs. GPU selling prices above rests on the assumptions that the cost to manufacture a given die size is similar for NV and AMD, and that selling prices scale almost linearly with die size. I think that's too simplistic. There are too many factors involved, such as NV's vs. AMD's contract terms and negotiating power, economies of scale, and other GPU-specific components (PCB, VRAM costs, coolers, etc.).

Also, you are comparing HD6950 vs. GTX570 and HD6870 vs. GTX560 Ti. Performance-wise, the HD6950 competes with the GTX560 Ti, while the HD6870 competes with the GTX560. So if you wanted to compare prices, the GTX570 (Nvidia's 2nd-fastest single-GPU card) sells for $80-100 more than AMD's 2nd-fastest single-GPU card, the HD6950.

Not a single person in this thread has argued that NV has 2% higher margins than AMD based on die sizes. Like I said, NV die sizes, their packaging costs and R&D spending should be left out of the discussion since the firm sells its Fermi GPUs in Quadro, Tesla and consumer graphics lines. NV's gross margins, 59% desktop discrete market share, etc. speak for themselves.

But when talking about ATI vs. AMD: under AMD, die sizes increased and the average selling price per card fell. At the same time, the performance dominance of the 9800XT, X800XT, X1800XT, X1900XT era is no longer present. Instead, they resort to competing on price, and on price/performance. It's great for us gamers to buy an HD4850 for $199, an HD4870 for $299 or an HD6950 for $299. But I have my doubts the old management at ATI would have been as enthusiastic. After all, they were selling X800 Pros for $399 against a still-faster 6800GT, and X800XTs for $550.

That is the main issue I brought up. I have no idea why you keep talking about NV die sizes since we know nothing about NV's costs.

Care to explain this:

1) If AMD's small-die strategy worked, how come their desktop discrete GPU die sizes increased while prices in the same segment fell? Surely, with over 50% dGPU market share, the CEO wouldn't start axing so many guys on the GPU side if they were making a lot of money? I mean really, lay off the guy behind Eyefinity?

2) Why did NV continue to successfully sell its high-end single-GPU cards for $500, while we haven't seen any single-GPU AMD card for more than $379 in 4 years? When ATI designed the 9800XT, X800XT, X1800XT, X1900XT and their 'xx50' refreshes, ATI knew that part of the allure of those cards was maintaining a high-end enthusiast image. Average Joe walked into a store and he "heard ATI cards were the fastest", so he bought low and mid-range ATI cards too. Now, what's faster? Nvidia. Average Joe hears Nvidia makes the fastest cards, so what does he buy? Nvidia.

AMD thought they didn't need a single high-end GPU card for "halo"/marketing/enthusiast gamer mind-share. But if that were true, how come NV still has ~58-59% desktop discrete market share? AMD consistently tends to price their desktop cards lower than NV for a similar level of performance, and yet is still unable to hold > 50% dGPU market share on the desktop. Talk about a 'small die' strategy that got you neither high profits nor high market share on the desktop. That's supposed to be successful? Most of AMD's success came in the notebook market; the small-die strategy worked much better in that segment.

3) How is selling an HD6990 with 2x 389 mm^2 dies for $700 not "giving it away" vs. ATI selling the high-end single-GPU cards of the past, with die sizes ranging from 210-315 mm^2, for $500-600?

Care to explain how the HD5870 and HD6970, with die sizes larger than previous ATI high-end cards, are selling for $200-250 less? That's not "giving it away" from a business perspective?

4) When NV was in the lead, they released GTX260 for $399 and GTX280 for $649.
When AMD was in the lead, they released the HD5850 for $269 and the HD5870 for $379. The HD5850/5870 were much, much faster than the GTX280/285 at the time, and Fermi was nowhere in sight. Why wasn't AMD aggressive enough to price them at $399 and $650? Does their team not believe enthusiasts will pay those prices for AMD GPUs? Brand value erosion?

5) If the HD7970 launches way ahead of Kepler and smokes the GTX580, is AMD going to price it at $379 again? If yes, then AMD is giving up huge profit margins in favour of larger market share. This strategy often works in certain businesses (Walmart, Costco, etc.). But given that AMD is worth a hair over $4.2B and ATI on its own was worth $4B, it's clear to investors that the ATI part of AMD is no longer worth as much as it was in the past. So clearly, the only way for this to be true is if their future growth prospects have far diminished from ATI's days and/or their profits have fallen by the wayside, resulting in far lower discounted cash flows.....

6) You know you have given up competing in the professional graphics space when you start selling your professional cards for as low as $189..... Talk about competing on price....

It's extremely difficult to argue that AMD's GPU side is doing great when AMD as a whole has a market cap worth less than ATI was as a standalone entity - plain and simple.

I may not agree 100% with everything you said, but this is COMPLETELY true.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
You mean Steve Jobs, the man whose absence from Apple was marked by its decline and brush with obscurity, the same man whose return to Apple brought an explosion in innovation that elevated the company to heights envied by all competitors.

Yes, that man is no more, and it remains to be seen if, the second time around, his successors do better than they did the first time he parted ways with Apple.

From what I have seen in the past weeks, I'd say it's about to be déjà vu all over again.

I always found it ironic that Steve Jobs preached about not caring about what he made, but about how much profit the company pulled in as a whole. VERY different things. As much as I dislike Apple (generally - not everything they do), I do somehow believe he did it because he enjoyed it, like he said. The man was VERY sick and he kept working, when he could have far more easily just left with his billions and taken a break. Very interesting guy, regardless of how you feel about him.

Your article links are telling, and it sounds like a cash-grab while the company is still 'hot'. Sad...
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
nVidia has strong margins. And this small-die strategy really didn't work very well, in my view.

nVidia has tighter control of its board makers and has created enough brand loyalty to muscle itself some nice margins. Being able to lock a company out of guaranteed sales can be quite a strong negotiating tool; just look at the fate of BFG. No telling if they also get a better deal from TSMC, but it would not surprise me if they do.

That, and margins on their HPC products, are supposedly to thank for balancing out the lower end. You can see AMD has been wanting to get a larger piece of that action, but it requires pretty drastic changes, and so they have announced Graphics Core Next as their new architecture.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The small-die strategy did work for AMD - it at least enabled them to break even. If they hadn't followed that route, chances are they would have lost a lot of money.

You've got to remember they are up against it trying to compete with Nvidia, because Nvidia owns the high-margin professional markets. This means Nvidia doesn't need to make half as much money as AMD does in the consumer market.

That's AMD's real problem in the last few years - the high-profit markets are professional for graphics and servers for CPUs. They've been losing out in both, and it's very hard to make much money selling only budget parts. What really dooms them is that this doesn't look like changing - for all the marketing, AMD don't seem any nearer to breaking into the professional markets. Equally, BD looks like a failure and is unlikely to turn around their server decline.

Then you add to that the fact that AMD is missing in the fast-growing ultra-mobile and GPU compute markets and has no money to change that....
 
Feb 19, 2009
10,457
10
76
That and margins on their HPC products

NV has total dominance in the HPC sector. With Fermi cards selling for >$3K each, it's easy to bring in the $$.

Why do they have the HPC advantage? Their edge in DP performance is nonexistent, so it's not hardware, given AMD cards often have a huge lead, like 4x the SP performance. CUDA is more mature and definitely more widespread than Stream. AMD has no major R&D investment on the software side.

Even if AMD develops awesome HPC hardware, their poor software penetration won't allow them to capture that market from NV.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
1) If AMD's small-die strategy worked, how come their desktop discrete GPU die sizes increased while prices in the same segment fell? Surely, with over 50% dGPU market share, the CEO wouldn't start axing so many guys on the GPU side if they were making a lot of money? I mean really, lay off the guy behind Eyefinity?

2) Why did NV continue to successfully sell its high-end single-GPU cards for $500, while we haven't seen any single-GPU AMD card for more than $379 in 4 years? When ATI designed the 9800XT, X800XT, X1800XT, X1900XT and their 'xx50' refreshes, ATI knew that part of the allure of those cards was maintaining a high-end enthusiast image. Average Joe walked into a store and he "heard ATI cards were the fastest", so he bought low and mid-range ATI cards too. Now, what's faster? Nvidia. Average Joe hears Nvidia makes the fastest cards, so what does he buy? Nvidia.

AMD thought they didn't need a single high-end GPU card for "halo"/marketing/enthusiast gamer mind-share. But if that were true, how come NV still has ~58-59% desktop discrete market share? AMD consistently tends to price their desktop cards lower than NV for a similar level of performance, and yet is still unable to hold > 50% dGPU market share on the desktop. Talk about a 'small die' strategy that got you neither high profits nor high market share on the desktop. That's supposed to be successful? Most of AMD's success came in the notebook market; the small-die strategy worked much better in that segment.

3) How is selling an HD6990 with 2x 389 mm^2 dies for $700 not "giving it away" vs. ATI selling the high-end single-GPU cards of the past, with die sizes ranging from 210-315 mm^2, for $500-600?

Care to explain how the HD5870 and HD6970, with die sizes larger than previous ATI high-end cards, are selling for $200-250 less? That's not "giving it away" from a business perspective?

4) When NV was in the lead, they released GTX260 for $399 and GTX280 for $649.
When AMD was in the lead, they released the HD5850 for $269 and the HD5870 for $379. The HD5850/5870 were much, much faster than the GTX280/285 at the time, and Fermi was nowhere in sight. Why wasn't AMD aggressive enough to price them at $399 and $650? Does their team not believe enthusiasts will pay those prices for AMD GPUs? Brand value erosion?

5) If the HD7970 launches way ahead of Kepler and smokes the GTX580, is AMD going to price it at $379 again? If yes, then AMD is giving up huge profit margins in favour of larger market share. This strategy often works in certain businesses (Walmart, Costco, etc.). But given that AMD is worth a hair over $4.2B and ATI on its own was worth $4B, it's clear to investors that the ATI part of AMD is no longer worth as much as it was in the past. So clearly, the only way for this to be true is if their future growth prospects have far diminished from ATI's days and/or their profits have fallen by the wayside, resulting in far lower discounted cash flows.....

6) You know you have given up competing in the professional graphics space when you start selling your professional cards for as low as $189..... Talk about competing on price....


If we consider packaging costs to be equal (which is already in favor of Nvidia, because, unless they really cheap out, they need beefier VRMs and more powerful cooling) and a ratio of price premium to die-size increase of 1, then AMD is already ahead, because yields decrease exponentially (correct me if I'm wrong) with increases in die size. In reality, AMD can afford to compete on price simply because they have better yields AND get more chips per wafer.

//Yes, this is a horrid oversimplification, and die size is just one of the variables in how a chip will yield.
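To illustrate why that oversimplification still points in the right direction, here is a minimal sketch of a crude dies-per-wafer estimate combined with a simple Poisson yield model. The defect density and wafer cost are made-up placeholder values (real TSMC 40nm figures and either vendor's contract terms are not public); the die areas are the ones quoted earlier in the thread.

```python
import math

# Toy model: smaller dies give more candidate dies per wafer AND a higher
# fraction of defect-free dies, so cost per good die drops on both counts.
# Defect density and wafer cost are hypothetical placeholders, not real data.
WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.4     # assumed defect density for a mature 40nm process
WAFER_COST_USD = 5000     # assumed wafer price; actual contract terms unknown

def dies_per_wafer(die_area_mm2):
    """Crude gross-die count: wafer area over die area, minus an edge-loss term."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2):
    """Fraction of dies with zero defects under a simple Poisson defect model."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

for name, area in [("Cayman (HD 6950/6970), 389 mm^2", 389),
                   ("GF110 (GTX 570/580), 520 mm^2", 520)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: ~{gross} gross dies, yield ~{poisson_yield(area):.0%}, "
          f"~${WAFER_COST_USD / good:.0f} per good die")
```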

Now, onto the questions:

1. We aren't in Kansas anymore: a midrange computer doesn't cost $1.5-2k, a lower-enthusiast CPU doesn't cost $800 anymore (the Q6600 comes to mind). Comparing pre-economic-crash prices with current ones is pointless and stupid. As for the die-size increase? TSMC canceling the 32nm node, and having to keep up with Nvidia, might have something to do with it.

2. A huge number of people are still sitting on Nvidia's 8xxx and 9xxx cards, because there is next to no reason to upgrade before the next console gen. And your "Average Joe" analogy is flawed. If the average Joe can see the difference between single- and dual-chip GPUs, he sure as hell can look up reviews and benchmarks, so the GTX580's only point is to affirm Nvidia fanboys (shrug, I hate this word).

3. See 1.

4. The GTX 2xx series was released at the height of the economic bubble, and ATI/AMD had to fight a decade of built-up prejudice against them (still has to, see "worse drivers than Nvidia").

5. Pricing your products sky high while there is no competition, then dropping the price 40-50%, is a great way to burn your consumers. "Gee, AMD is asking for my firstborn in exchange for their GPUs; waiting n months for Nvidia to get their shit together suddenly seems not so bad." AMD needs to press its advantage instead of alienating its consumers. Also see the Osborne effect.

6. That is actually a response to Intel trying to inch into the workstation GPU market. Barest of bones with certified drivers.

This wall of text was created on a cellphone, so please excuse any horrid grammar and spelling mistakes.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
They've been losing out in both, and it's very hard to make much money selling only budget parts. What really dooms them is that this doesn't look like changing - for all the marketing, AMD don't seem any nearer to breaking into the professional markets. Equally, BD looks like a failure and is unlikely to turn around their server decline.

Actually AMD has gained ground in the workstation market:

But AMD has been making significant gains of late, taking control of 18.5% of units in the second quarter

http://www.jonpeddie.com/publications/workstation_report/

Last quarter:

Now for our server business. The third quarter was the beginning of a move in the right direction, with server revenue up 27% sequentially.

http://seekingalpha.com/article/303...sses-q3-2011-results-earnings-call-transcript

The doom-and-gloom is not there.