Is this the end for AMD?

Will AMD still be relevant in 5 years?

  • Yes

  • Yes but not to enthusiasts/gamers

  • No

  • Don't know/Too early to say


Results are only viewable after voting.

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
As far as enthusiasts are concerned, anyway? There was another post somewhere saying that AMD's new strategy sounds like "Intel wins, we give up, we will focus on low-powered stuff and make what VIA makes, whatever that is..." which sounds scarily feasible. It doesn't look good over the past few years: Phenom failing, Phenom II only okay, Bulldozer failing, all the current layoffs and key people leaving. GPUs have been the only success story.

Bulldozer might have been the last shot fired in the CPU performance war. :\

Will AMD still be relevant to us in, say, 5 years?
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
They aren't relevant now, nevermind 5 years from now. :eek:
 

micrometers

Diamond Member
Nov 14, 2010
3,473
0
0
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.
 

gmaster456

Golden Member
Sep 7, 2011
1,877
0
71
AMD is FAR from dead. Bulldozer wasn't THAT big of a failure, and even if it was, it's far from enough to kill AMD. They are still doing very well in the mobile and graphics departments, and I don't expect that to change. Even for enthusiasts, I think they are still relevant in some areas.
 
Last edited:

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
AMD is FAR from dead. Bulldozer wasn't THAT big of a failure, and even if it was, it's far from enough to kill AMD. They are still doing very well in the mobile and graphics departments, and I don't expect that to change.

They are doing well, but the market is shrinking, and the 'new' AMD wants to move on to making different things.
 

iCyborg

Golden Member
Aug 8, 2008
1,353
62
91
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.
I doubt a 5-year-old CPU would be able to utilize a GTX 580 very well. And for SLI/3-way SLI, even the fastest of the current CPUs might be holding it back if run at stock clocks.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
It's already to the point that lower end CPUs need to be crippled and locked to maintain high end relevance.

Market demand and the willingness to spend on a CPU will keep high-end CPUs relevant; crippling them to sell to the low end continues to justify the prices at the high end.

Intel will continue its practice of crippling the low end to be an option for gaming, but the performance will still be there. See BF3 benchmarks of the i3-2100 vs. any i5 for reference: the i3-2100 is there, will play it, but it's not quite perfect. If it could be overclocked to 4.5-5 GHz? It probably would be perfect, but since it's crippled, quads maintain their relevance, and Intel moved a bunch of people who would have bought a $130 CPU into a $200+ CPU and a more expensive motherboard.

Marketing won't let the CPU in general become irrelevant. There's too much money in forcing it to be relevant.

AMD will remain relevant. They're pretty relevant now (to the broader market, but not to "CPU enthusiasts").
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
All rumors point to AMD's board/new CEO wanting to enter "consumer-oriented" designs. Servers, high-end GPUs/high-end CPUs are not what they are talking about when they refer to "consumer devices".

1. AMD's server market share has dwindled from the mid-20s to just under 5% in Q3 2011.

2. AMD's discrete GPU division barely makes any $$ (see Q3 2011 earnings report).

3. APUs now account for 73% of all PC microprocessors shipped. So APUs are clearly more important to AMD than high-end desktop x86 CPUs.

4. There is currently a huge brain drain at AMD, especially at the ex-ATI campuses. For months I have kept reading that some of the higher-level engineers have been leaving: Dr. Gamal Refai-Ahmed, AMD's thermal management guru; Rich Bergman, of the AMD Products Group.

5. The current layoffs targeted a lot of VPs/Fellows on the GPU side. Carrell Killebrew and Patrick Moorhead, corporate vice president of strategy, were laid off. Killebrew, the man credited with Eyefinity, is probably the largest advantage AMD has in the graphics market right now.

6. Not long ago, Paul Struhsaker, Comcast's former Senior VP of Engineering, joined AMD's newly formed Commercial Business Division. Struhsaker's job will be to "oversee product management and roadmap planning for AMD's server, high performance computing and embedded products." AMD spun Struhsaker's arrival as a show of its "commitment to profitability" in the server space. However, prior to his stint at Comcast, Struhsaker worked for Motorola, where he "helped lead development of all handset, modem/stack and application processor platforms."

The writing is on the wall. Some of the most talented/key guys on the GPU team are either leaving or getting laid off. On the CPU side, AMD hasn't been relevant for high-end CPUs since Core 2 Duo, to be honest (I'll even go as far as to say for any CPU above the $130 range), unless you specifically needed a 6-core X6 for your multi-threaded programs. I believe Read won't allow AMD to spend another 4-5 years and hundreds of millions of dollars to try and redeem Bulldozer. He probably saw that AMD's best engineers and technical managers were unable to beat Intel with Phenom I, Phenom II and Bulldozer under previous CEOs. So given 3 consecutive unsuccessful high-end CPU launches, I doubt Read is the gambling type who'll go for a 4th in "hopes" of dethroning Intel.

Since Fusion doesn't need high-end GPU designs to be successful and AMD's GPU division hasn't been exactly profitable, high-end AMD GPUs will become less important.

AMD will need a strong GPU road-map for embedded devices, Fusion, and all-in-one chips for tablets/smartphones.

I still think Dirk was ousted because he was so passionate about producing a winning CPU and the board doesn't believe in that direction for the company.

I believe AMD is committed to a "massive restructuring plan" precisely because none of the previous CEOs was on board with this direction from the Board of Directors. As such, I believe the AMD of tomorrow will look completely different from the AMD of today.

My personal prediction:

1) AMD will completely remove itself from high-end x86 CPU designs,
2) AMD will completely remove itself from high-end GPU designs, and,
3) The savings from #1 and #2 will be redirected to Fusion/Bobcat/low-power devices with a focus on low-power all-in-one CPUs with embedded graphics, and/or possibly a development of a new CPU architecture designed by AMD to compete in smartphones/tablets.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.

CPUs stopped mattering only if you have anything from Nehalem or faster in your rig. There are plenty of games where a 3.0 GHz Core 2 Duo is too slow. The FX-8150, even at 4.8 GHz, still bottlenecks GPUs.

The part about using a 1 GHz Athlon CPU to play modern games is pure FUD. Even a stock Core 2 Duo E6600 at 2.4 GHz severely bottlenecks anything from WoW to StarCraft 2 to Bad Company 2, etc. I remember when I upgraded my Athlon XP 1600+ to a Pentium 4 C 2.6 @ 3.2 GHz: my performance more than doubled with the same GPU in Unreal Tournament 2004.
 
Last edited:
Aug 11, 2008
10,451
642
126
They are doing well, but the market is shrinking, and the 'new' AMD wants to move on to making different things.

I think AMD will survive and be relevant simply because Intel does not want them to go out of business due to anti-trust considerations.

However, I think saying that AMD is doing "quite well" is an overstatement. According to Tom's Hardware, Intel's market-share percentages last quarter were:
overall - 80.2
server - 95.1
mobile - 82.3
desktop - 75.8
Intel gained market share in every category except mobile. A hopeful sign for AMD, though, was that its mobile share was up 2.4 percent. However, it remains to be seen how AMD's mobile chips will do against Ivy Bridge, which will have better graphics and lower power use on top of already-superior CPU performance.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
2. AMD's discrete GPU division barely makes any $$ (see Q3 2011 earnings report).
3. APUs now account for 73% of all PC microprocessors shipped. So APUs are clearly more important to AMD than high-end desktop x86 CPUs.

[...]

Since Fusion doesn't need high-end GPU designs to be successful and AMD's GPU division hasn't been exactly profitable, high-end AMD GPUs will become less important.

AMD will need a strong GPU road-map for embedded devices, Fusion, and all-in-one chips for tablets/smartphones.

Are they really going to throw in the towel on GPUs when graphics is such a large portion of APU performance? They already know they're behind on CPU, but the GPU side shows that this is likely a foundry and manufacturing-technology issue: they compete well with nVidia when they both have equal access to manufacturing technology.

Surely the discrete graphics segment can be seen internally to have significantly higher value than a small profit, given how much of the APU design was leveraged from existing discrete designs.

I'm not so sure. I see it as much more likely that they would work towards competing in an area where other people have equal access to manufacturing technology.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
I doubt a 5-year-old CPU would be able to utilize a GTX 580 very well. And for SLI/3-way SLI, even the fastest of the current CPUs might be holding it back if run at stock clocks.

I really doubt a user with a top-of-the-line multi-GPU setup couldn't upgrade their CPU for 5 years!!!
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,736
156
106
If their business strategy becomes that of VIA or Transmeta, then they are off to a good start with these chip delays, layoffs, and the fact that they went fabless. Looking at the size of VIA and what happened to Transmeta, though, I can't say it's at all a good business decision to downsize, lose in all market segments, and not compete. I voted "YES" because their board of directors are a bunch of clowns and they are driving AMD out of business, IMO. Let's hope I'm wrong... but hope doesn't carry much weight in a free market economy.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Are they really going to throw in the towel on GPUs when graphics is such a large portion of APU performance? They already know they're behind on CPU, but the GPU side shows that this is likely a foundry and manufacturing-technology issue: they compete well with nVidia when they both have equal access to manufacturing technology.

For us gamers, AMD's GPU division is great. Their market-share numbers (overall 50% share) are great. But their profitability is almost non-existent. So as a business case, it makes little sense to pour millions of dollars into getting an HD 8970 to compete with a GTX 780, unless AMD is able to raise prices on discrete GPUs and there is a huge influx of new PC gamers who'll be buying this expensive hardware. Now look at what's happening: APUs are taking over, and fewer and fewer people are interested in buying high-end GPUs. It's becoming too costly to compete with NV on price, and GloFo and TSMC continue to have process-shrink delays generation after generation. So much risk for so little financial reward.

Surely the discrete graphics segment can be seen internally to have significantly higher value than a small profit, given how much of the APU design was leveraged from existing discrete designs.

Why were so many high-end VPs and product-development guys laid off or leaving the firm on that side of the business? :hmm: AMD can have a lean graphics division that focuses almost exclusively on embedded and SoC graphics, can't they? It seems to me that firing Killebrew, who pretty much made Eyefinity happen, is a sign Read doesn't care about us gamers, because he doesn't care to innovate in the high-end graphics card market or have any desire for a graphics holodeck by 2016...

I'm not so sure. I see it as much more likely that they would work towards competing in an area where other people have equal access to manufacturing technology.

From Read's letter: ".....address the needs of our global customer base and stake leadership positions in lower power, emerging markets and the cloud.”

None of that tells me there's any priority on the high-end/premium market for desktop CPUs or discrete GPUs. :'(

Put it this way, how can you have a "drastic" or "massive" restructuring plan without changing anything about the business?

Either high-end CPUs are toast, or high-end GPUs, or some combination. AMD's current strategy has focused on those 2 areas. So how are you changing the strategy if you keep focusing on those 2 areas?

*** I am not stating my ideas as facts, just my predictions of what I think might happen *** Of course, if I had to choose, I'd rather AMD abandon their high-end x86 CPUs than GPUs.
 
Last edited:

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I doubt a 5-year-old CPU would be able to utilize a GTX 580 very well. And for SLI/3-way SLI, even the fastest of the current CPUs might be holding it back if run at stock clocks.

If the game demands a quad core, even a 4-year-old dual like an E8400, for example, can hold back even a mid-range GTX 560 :awe:

That would be BF3, and I know it requires a quad, but yeah, a GTX 580 should be matched only with the latest i7/i5 quads. Any other processor and it's more than likely going to be a bastard case unless you clock your CPU sky-high. Then again, for the same amount of money spent on a GPU paired with a 5-year-old CPU, you could have gotten a 2500K, a new mobo, and 8 GB of RAM ;)
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.
I take it that you do not actually play modern games. A CPU from 5 years ago is hardly any more relevant than a GPU from five years ago, so to say that the CPU stopped mattering is nonsense. A freaking dual-core Pentium cannot even deliver a decent framerate in many games and would not allow a modern high-end GPU to reach half its potential. And a 1 GHz Athlon does not even meet the minimum requirements for most 6-7-year-old games, never mind modern ones. :rolleyes:
 

Gigantopithecus

Diamond Member
Dec 14, 2004
7,664
0
71
My personal prediction:

1) AMD will completely remove itself from high-end x86 CPU designs,
2) AMD will completely remove itself from high-end GPU designs, and,
3) The savings from #1 and #2 will be redirected to Fusion/Bobcat/low-power devices with a focus on low-power all-in-one CPUs with embedded graphics, and/or possibly a development of a new CPU architecture designed by AMD to compete in smartphones/tablets.

I entirely agree with you on this, except it hadn't crossed my mind that they might try to come up with a new architecture for use in mobile devices.

I spent much of last night and this morning benchmarking the A4-3300 APU for the upcoming budget buyer's guide. This is the least capable Llano Fusion APU, and it's impressive what you can actually do with it from a mainstream point of view. I already think Fusion is a game-changer for the mainstream, and even its earliest iterations are impressive.

So, will AMD be relevant in 5 years? At the enthusiast level, I think the answer is no, they won't be. But at the mainstream/consumer level? Yes, definitely.

Intel put a GPU on the die in their latest chips for a reason.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.

Civ 5
SC2
Metro 2033
Flight Simulator X
plus many others are now at least partially CPU-bound. Also, many games that aren't CPU-bound on, say, an i7-950 might be on an X2 555.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
The writing is on the wall. Some of the most talented/key guys on the GPU team are either leaving or getting laid off. On the CPU side, AMD hasn't been relevant for high-end CPUs since Core 2 Duo, to be honest (I'll even go as far as to say for any CPU above the $130 range), unless you specifically needed a 6-core X6 for your multi-threaded programs. I believe Read won't allow AMD to spend another 4-5 years and hundreds of millions of dollars to try and redeem Bulldozer. He probably saw that AMD's best engineers and technical managers were unable to beat Intel with Phenom I, Phenom II and Bulldozer under previous CEOs. So given 3 consecutive unsuccessful high-end CPU launches, I doubt Read is the gambling type who'll go for a 4th in "hopes" of dethroning Intel.

Are "successful" CPUs only those that beat Intel? I thought AMD's GPU strategy was the opposite of that: they wouldn't try to create a bigger, hotter, faster GPU than NV; rather, they would build smaller, cooler-running, scalable GPUs, so that they could take the lead by doubling up GPUs on one card for a flagship.

So by that metric, Phenom II was in fact rather successful, in my mind.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
For us gamers, AMD's GPU division is great. Their market-share numbers (overall 50% share) are great. But their profitability is almost non-existent. So as a business case, it makes little sense to pour millions of dollars into getting an HD 8970 to compete with a GTX 780, unless AMD is able to raise prices on discrete GPUs and there is a huge influx of new PC gamers who'll be buying this expensive hardware. Now look at what's happening: APUs are taking over, and fewer and fewer people are interested in buying high-end GPUs. It's becoming too costly to compete with NV on price, and GloFo and TSMC continue to have process-shrink delays generation after generation. So much risk for so little financial reward.



Why were so many high-end VPs and product-development guys laid off or leaving the firm on that side of the business? :hmm: AMD can have a lean graphics division that focuses almost exclusively on embedded and SoC graphics, can't they? It seems to me that firing Killebrew, who pretty much made Eyefinity happen, is a sign Read doesn't care about us gamers, because he doesn't care to innovate in the high-end graphics card market or have any desire for a graphics holodeck by 2016...



From Read's letter: ".....address the needs of our global customer base and stake leadership positions in lower power, emerging markets and the cloud."

None of that tells me there's any priority on the high-end/premium market for desktop CPUs or discrete GPUs. :'(

Put it this way, how can you have a "drastic" or "massive" restructuring plan without changing anything about the business?

Either high-end CPUs are toast, or high-end GPUs, or some combination. AMD's current strategy has focused on those 2 areas. So how are you changing the strategy if you keep focusing on those 2 areas?

*** I am not stating my ideas as facts, just my predictions of what I think might happen *** Of course, if I had to choose, I'd rather AMD abandon their high-end x86 CPUs than GPUs.

They make almost no profits from their GPU business even in the best of times. Any new CEO is going to see this as a huge red flag.

AMD abandoned the high end for CPUs a long time ago, though not willingly.

If they came out and claimed it was a "strategy" to bail on the high-end CPU market then it would merely be an acknowledgement of reality for the past year or two.

It's telling that the layoffs are on the GPU side.

What I'm curious about is the scenario where AMD just pulls out of x86 across the board and tells Intel to sit on it and rotate when it comes to the x86 IP that Intel needs to license for their existing products.

What degree of exorbitant licensing fees could AMD extract from Intel in such a scenario? Given the volume of Intel's revenue that depends on having access to AMD x86 IP, could AMD stand to generate more x86 revenue just by holding Intel's revenue hostage with licensing fees?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
CPUs stopped mattering around the time the dual-core Pentiums started coming out. The GPU has been the main bottleneck for a verrrry long time. You could probably play most modern games fine on a 5-year-old CPU, many even on the original 1 GHz Athlon.

:rolleyes:

Believe it or not, the world's computing needs do not revolve around gaming.
 

Gigantopithecus

Diamond Member
Dec 14, 2004
7,664
0
71
What I'm curious about is the scenario where AMD just pulls out of x86 across the board and tells Intel to sit on it and rotate when it comes to the x86 IP that Intel needs to license for their existing products.

What degree of exorbitant licensing fees could AMD extract from Intel in such a scenario? Given the volume of Intel's revenue that depends on having access to AMD x86 IP, could AMD stand to generate more x86 revenue just by holding Intel's revenue hostage with licensing fees?

Those are good questions. The bottom line (pun intended) is that those revenues would cost AMD nothing: no R&D expenditures, no marketing expenditures, no nothing. The wealth has already been generated.

...So what would they spend that money on? Use it to redefine themselves and their strategies?