
Will AMD ever be able to compete with Intel again?

I think a shift to GPU processing and other similar technologies could slowly erode Intel's power. One scene in Lightwave 3D takes about 1 hour 30 minutes to render on the CPU, but with GPU rendering on a Titan the same scene takes roughly 14 minutes (six-core 4930K vs. one Titan, both running at stock).

I think GPU processing is only the start, and that devices other than GPUs will be made for processing, slowly shifting the emphasis from needing a fast, powerful CPU to using other devices. AMD could have a huge win if they make excellent GPU processing devices and capitalise on that with developers. OK, AMD cards are great for mining, but 3D rendering with Octane in Lightwave 3D uses CUDA. They really need to get a lot more developers shifting to OpenCL, and to help develop OpenCL further to make it more stable and mature.
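As a quick sanity check on the render times quoted above, the speedup works out to roughly 6.4x. A back-of-the-envelope sketch in Python; only the two times come from the post, the rest is arithmetic:

```python
# Render times quoted above for the same Lightwave 3D scene.
cpu_minutes = 90   # six-core 4930K, CPU rendering (1 h 30 min)
gpu_minutes = 14   # single Titan, GPU rendering via Octane

speedup = cpu_minutes / gpu_minutes
print(f"GPU rendering is about {speedup:.1f}x faster")  # about 6.4x
```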
 
I guess losing is competing too, so there is that.
If any momentum has a chance at shifting the industry away from the x86 standard for general compute, I suppose it is ARM and the ecosystem around Android. However, AMD is locked in the same x86 niche as Intel, so I'd imagine they'll feel the ARM pressure well before Intel does (I know they're moving into ARM, but at what cost?). Besides the console wins and the APU/hUMA strategy, I don't see many moves left for AMD. Maybe a crazy Ivan: do a 180 and make a 250-watt Excavator to regain the single-threaded king-of-the-hill crown. They can't copy Intel with lesser IP and lesser fab tech and expect to win.
 
Potentially yes, but they would need to innovate again, something they haven't really done.

They need to jump the train and take a risk or two and create an overall better product, not try and follow Intel.

We've already reached somewhat of a ceiling with CPU speeds, so yeah, I would say AMD has a chance in the next year to catch up to Intel, since I don't know where the next big performance gain is going to come from.

I mean, Intel has perfected its current architecture and is already running extremely high clock speeds, so where is the next big performance improvement going to come from?

I mean, adding more cache and frequency has diminishing returns, and we are hitting a ceiling there as well. More cores is the obvious choice, but then prices rise, the performance increase still isn't linear, and applications actually have to take advantage of the cores, so there is that problem.
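The non-linear scaling from extra cores described above is essentially Amdahl's law. A minimal sketch (the 90% parallel fraction is an illustrative assumption, not a figure from the thread):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup is capped by the serial fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a 90%-parallel workload gains far less than linearly with cores.
for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

Doubling from 8 to 16 cores here lifts the speedup only from about 4.7x to 6.4x, which is exactly the "not a linear performance increase" problem.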

I really feel they have a chance to catch up to Intel in a year's time, and if they manage to do something really innovative, maybe even pass Intel.
 
I really don't see it being possible at this point, as much as I would like it to happen.

Intel is so far ahead of the game that it's like staring across the Grand Canyon.

Intel has its own fabs, and some of them are dedicated purely to research, so they can keep tweaking the node and innovating.

The only way I see it being possible is if we somehow magically get off the x86/AMD64 architecture and jump into a new form of computing where everything is reset.

Even then it would only be a matter of time before Intel outpaced its competition, thanks to the enormous resources they have at their disposal.

I really feel they have a chance to catch up to Intel in a year's time, and if they manage to do something really innovative, maybe even pass Intel.

Which is not possible, because even if they did innovate, Intel would find a better way to do it, and would also have its node and processes refined even further.
AMD really lost a lot when they sold their fabs way back and decided to outsource manufacturing to TSMC.

AMD has to wait for TSMC to be able to produce a given product, even if they have a working design, whereas Intel just needs to roll out a new fab or convert a retired one to accept a new node.

And Intel can also tweak the process at the fab level without worrying about delays hurting production. Basically, Intel can halt production on a product and refine the process until they feel they've got the most efficiency, whereas a halt at TSMC would cost AMD a lot of money due to the backlog of orders TSMC must fill.
 
Intel has its own fabs, and some of them are dedicated purely to research, so they can keep tweaking the node and innovating.

There are pros and cons to having your own fab. For example, it costs Intel huge amounts of R&D money.

Also, Intel has no alternative to turn to if the development of its next process node fails or becomes seriously delayed, as we've seen with its 14 nm process. (Currently nobody is ahead of Intel that they could turn to anyway, but that may change in the future.)

AMD instead has an external company producing its wafers, which effectively means they share the fab with other companies. That means the R&D costs for new process tech can be shared among more companies, so it can become more cost-efficient. As R&D for new nodes gets ever more expensive, this aspect becomes more and more important.
 
Also, Intel has no alternative to turn to if the development of its next process node fails or becomes seriously delayed, as we've seen with its 14 nm process. (Currently nobody is ahead of Intel that they could turn to anyway, but that may change in the future.)

That's not really a differentiator between IDMs and foundries, though, because the IC development lead time itself is many years as well.

Just because you are a fabless company doesn't mean you have alternatives to turn to if the foundry of your choice is late in bringing its next node to production. It would take you a solid two-plus years to migrate your designs to a competing foundry's process rules.

Remember what happened to AMD with Wichita and Krishna?
 
That's not really a differentiator between IDMs and foundries, though, because the IC development lead time itself is many years as well.

Just because you are a fabless company doesn't mean you have alternatives to turn to if the foundry of your choice is late in bringing its next node to production. It would take you a solid two-plus years to migrate your designs to a competing foundry's process rules.

Remember what happened to AMD with Wichita and Krishna?

Well, taking the penalty of moving your design to another foundry is still better than having nowhere to turn to at all, right?

But in addition, I'm thinking long term too. If one foundry fails and is out of the race completely (or for a very long time), you can move your next design to another one.

Intel could of course do that too if its process tech development eventually runs into trouble. But it would take a huge penalty, closing down its fabs and writing off all the R&D money invested in them.
 
Well, taking the penalty of moving your design to another foundry is still better than having nowhere to turn to at all, right?

But in addition, I'm thinking long term too. If one foundry fails and is out of the race completely (or for a very long time), you can move your next design to another one.

Intel could of course do that too if its process tech development eventually runs into trouble. But it would take a huge penalty, closing down its fabs and writing off all the R&D money invested in them.

Did GloFo take a huge penalty and have to shut down its fabs and write off all the R&D money invested in 14nm-XM?
 
It is interesting to see that so many believe Intel is unbeatable just because of its size.
Big R&D departments at successful companies are very good at what they do, but they are very inflexible and have great difficulty handling major changes.
Look at history: many companies that dominated 10 or 20 years ago are not dominating any more, e.g. IBM, Nokia, Kodak, GM.
The fact that Intel has very strong products basically kills its innovation; it will simply improve what it has instead of innovating. This leaves the field open for more radical innovations by other companies.
 
Just because you are a fabless company doesn't mean you have alternatives to turn to if the foundry of your choice is late in bringing its next node to production. It would take you a solid two-plus years to migrate your designs to a competing foundry's process rules.

Remember what happened to AMD with Wichita and Krishna?
The cancellation of Wichita/Krishna and the Deccan platform had nothing to do with GlobalFoundries. The cancellation was due to timing and competition with the already-completed Brazos 2.0 and Brazos-T.

http://extranotebook.cnews.cz/sites...brazos/amd-roadmapa-wichita-krishna-hondo.jpg
Look at how full 2012 is compared to 2011 and 2013 for "ultrathins".

AMD cancelled Krishna, Samara, Wichita, and Kabini (on the FT2 socket). None of this was because of GlobalFoundries; it was down to AMD's business.
 
Well, taking the penalty of moving your design to another foundry is still better than having nowhere to turn to at all, right?

But in addition, I'm thinking long term too. If one foundry fails and is out of the race completely (or for a very long time), you can move your next design to another one.

Intel could of course do that too if its process tech development eventually runs into trouble. But it would take a huge penalty, closing down its fabs and writing off all the R&D money invested in them.
But that obviously won't ever happen, because no one even comes close to Intel's revenue, as its extensive manufacturing lead proves.
 
It is interesting to see that so many believe Intel is unbeatable just because of its size.
Big R&D departments at successful companies are very good at what they do, but they are very inflexible and have great difficulty handling major changes.
Look at history: many companies that dominated 10 or 20 years ago are not dominating any more, e.g. IBM, Nokia, Kodak, GM.
The fact that Intel has very strong products basically kills its innovation; it will simply improve what it has instead of innovating. This leaves the field open for more radical innovations by other companies.

That's not true. In the semiconductor industry it helps a lot to have money to invest in R&D, and that's just what Intel does. Unless something extraordinary happens, Intel isn't in danger, and certainly not more so than AMD, ARM, Qualcomm, TSMC or anyone else.
 
But that obviously won't ever happen, because no one even comes close to Intel's revenue, as its extensive manufacturing lead proves.


Will their margins always be this good, though? I'm thinking that over the next 10 years their margins will decrease as they attempt to compete with ARM. Within the next 5 years, people will be able to meet the majority of their computing needs on an ARM-powered device. Hell, sometimes I google something on my phone while sitting in front of my PC and just forget that I could do it on my PC.

With the desktop market slowly declining and laptop sales having to compete with ARM devices, Intel's profit margins can't stay the same and will eventually decline. The desktop market will always be an Intel safe haven, and I really don't think AMD can, or has the desire to, really compete there (especially at the high end). But at the low end, with a processor like Kabini and its successors? I can definitely see AMD competing, but they need to get Android support, and they need to fix some of their drivers; AMD always has less driver support than Intel in the SFF market, especially on Linux. Can't even get HD audio with AMD right now on Linux, which sucks.
 
But that obviously won't ever happen, because no one even comes close to Intel's revenue, as its extensive manufacturing lead proves.

I'm talking about pros and cons of having your own fab in general, not about the current specific state and not about some specific company.
 
Will their margins always be this good, though? I'm thinking that over the next 10 years their margins will decrease as they attempt to compete with ARM. Within the next 5 years, people will be able to meet the majority of their computing needs on an ARM-powered device. Hell, sometimes I google something on my phone while sitting in front of my PC and just forget that I could do it on my PC.

With the desktop market slowly declining and laptop sales having to compete with ARM devices, Intel's profit margins can't stay the same and will eventually decline. The desktop market will always be an Intel safe haven, and I really don't think AMD can, or has the desire to, really compete there (especially at the high end). But at the low end, with a processor like Kabini and its successors? I can definitely see AMD competing, but they need to get Android support, and they need to fix some of their drivers; AMD always has less driver support than Intel in the SFF market, especially on Linux. Can't even get HD audio with AMD right now on Linux, which sucks.

Intel, with its manufacturing lead and being an IDM, will have much higher margins on the same products than Qualcomm on an inferior TSMC process. The phone market will eventually consolidate too, and the price wars are then likely to end, as you can see in the HDD market; the desktop market is quite stable, like you say, and Intel will use its proficiency to dominate all meaningful markets of computing.

So within the next five years, unlike what you say, people's phones will to a large extent be powered by Intel Inside: on a 7nm process with post-silicon materials in H1 2018, for example, while TSMC is still shipping its 20nm FinFETs at 5x lower density. Intel will have moved on to 450mm wafers, effectively acting as another node improvement in costs, which, after adding the fabless foundry tax into the equation, adds up to a four-node (8-10 year) cost deficiency. This supremacy will go on for a few months, after which TSMC reduces the gap to three nodes with its 14nm "10nm" node (power/performance will be only three nodes behind).

But 5nm will be worse. If Intel executes flawlessly -- which I doubt -- on a two-year tick-tock cadence, and TSMC on two years plus a few months, Intel's 3-4 node dominance will last even longer at 5nm, on top of III-V materials. So in 2020 Intel might be at 5nm on 450mm wafers with III-V tech for six months, while TSMC still has only 14nm-class transistors (which they conveniently call 10nm) with its 2nd/3rd-generation FinFETs (three nodes behind -- maybe even more if III-V materials really are that stellar and blow silicon away) on 300mm wafers (one node), plus the IDM advantage (half a node or so).
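For what the wafer-size part of this argument is worth, the raw area gain from 300mm to 450mm wafers is simple geometry. This sketch ignores edge loss and yield, so the real cost benefit per die would be smaller:

```python
import math

def wafer_area_mm2(diameter_mm: float) -> float:
    # Area of a circular wafer from its diameter.
    return math.pi * (diameter_mm / 2.0) ** 2

ratio = wafer_area_mm2(450) / wafer_area_mm2(300)
print(f"A 450mm wafer has {ratio:.2f}x the area of a 300mm wafer")  # 2.25x
```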

Intel isn't really afraid I guess, probably just paranoid.
 
That's not true. In the semiconductor industry it helps a lot to have money to invest in R&D, and that's just what Intel does. Unless something extraordinary happens, Intel isn't in danger, and certainly not more so than AMD, ARM, Qualcomm, TSMC or anyone else.

Please read my post again. Big R&D is good if you can follow your own roadmap and have a fine-tuned organisation, but not when unexpected things happen; then you have a huge number of confused managers who must agree on how to change the organisation, and this causes huge problems everywhere in the company.

Look at Intel and tablets. Tablets have existed for four years and already have almost the same sales volume as PCs. Intel's part in this? Well, they are subsidizing tablet SoCs in China in order to reach 15% market share this year. And next year they will let Rockchip/TSMC make many of these chips instead. Not really a success story.

And there are many other examples where Intel has failed despite its big R&D budget.
A few examples:
-Larrabee
-Intel740
-smartphones (Medfield sold very badly; Merrifield has no revealed design wins so far)
-They are still very small in automotive after many years of promotion.
-Rambus
-P4

In fact, Intel is only dominating in the same areas as 15-20 years ago: CPUs for desktops, laptops, workstations and general-purpose servers.

So the examples above show that big R&D often fails when trying to do new things or adapt to external changes.

And I could say more about the general organisational issues in big R&D (>5,000 engineers where I used to work), but this post is already too long.
 
AMD is crushing Intel in GPUs and NVIDIA in CPUs...

That's because Intel really gave up on the dedicated GPU sector back with the i740.

They also don't see the need for anything faster than what the current IGP can do, because they realize that if you needed a faster GPU, you would buy a dedicated one.

Intel could of course do that too if its process tech development eventually runs into trouble. But it would take a huge penalty, closing down its fabs and writing off all the R&D money invested in them.

Do you know how far ahead they are in development?
Let me tell you this:
When I was ES testing, I had my first sample Gulftown six months before the production ES.
Then I had my production ES six months ahead of release.
That's a total of one year that they had samples out to vendors, letting them refine boards for those CPUs before the public even got them.
They could afford NOT to rush these chips out, because they knew they had no competition, and instead decided to tweak yields at the fab.

So how much do you want to bet the same isn't true for the next coming CPUs?

In fact, when I was ES testing, I would get so many ESes that it would literally drive me crazy, because 2-3 months after I thought I had the best CPU, a better one would be mailed to me with a card saying "try to break me" from my sponsor.

In fact, Intel is only dominating in the same areas as 15-20 years ago: CPUs for desktops, laptops, workstations and general-purpose servers.

So the examples above show that big R&D often fails when trying to do new things or adapt to external changes.

Do you know how large the enterprise market is?
The first three categories you listed combined would not equal the volume of the pure enterprise market.

Also, you failed to note that Intel can spread its R&D over many, many areas like a big blanket and just close off the ones it doesn't see as profitable.
The way you worded it makes it sound like Intel has to do things one at a time, which isn't the case.
They have the resources to spread out and diversify over many areas to see what is profitable.

Also, I do agree the P4 was a flop; however, Conroe, which slaughtered AMD, was a product of Yonah, which was heavily researched in the laptop sector and then brought to the desktop.
And the P4 was not all fail: it gave today's CPUs what most high-end CPUs now have, called Hyper-Threading.
 
Please read my post again. Big R&D is good if you can follow your own roadmap and have a fine-tuned organisation, but not when unexpected things happen; then you have a huge number of confused managers who must agree on how to change the organisation, and this causes huge problems everywhere in the company.

Look at Intel and tablets. Tablets have existed for four years and already have almost the same sales volume as PCs. Intel's part in this? Well, they are subsidizing tablet SoCs in China in order to reach 15% market share this year. And next year they will let Rockchip/TSMC make many of these chips instead. Not really a success story.

And there are many other examples where Intel has failed despite its big R&D budget.
A few examples:
-Larrabee
-Intel740
-smartphones (Medfield sold very badly; Merrifield has no revealed design wins so far)
-They are still very small in automotive after many years of promotion.
-Rambus
-P4

In fact, Intel is only dominating in the same areas as 15-20 years ago: CPUs for desktops, laptops, workstations and general-purpose servers.

So the examples above show that big R&D often fails when trying to do new things or adapt to external changes.

And I could say more about the general organisational issues in big R&D (>5,000 engineers where I used to work), but this post is already too long.

Intel just follows the money and the very important parts of Moore's roadmap, from supercomputers to the Internet of Things. Of course Intel can make mistakes, like totally ignoring tablets and smartphones, but that doesn't automatically equal a doom scenario. They're just a bit late, missing the whole price war, consolidation and rapid performance improvements, but they're still able to catch up, as AMD does too. If they're still busy porting the design to their own fabs and found a great way to reach faster TTM by simply putting Silvermont on TSMC's process, and by making an agreement with Rockchip, I think there's nothing wrong with that.

Secondly, they also have priorities. Their $10B R&D budget is split between multiple things, so it isn't exactly clear that all your examples were supported by big R&D budgets.

Thirdly, a product that fails for end users doesn't necessarily mean time and money were wasted. This applies to your examples of the P4 and smartphones (and smartphones haven't failed yet; I highly doubt they will), and I guess also to Larrabee.
 
And there are many other examples where Intel has failed despite its big R&D budget.
A few examples:
-Larrabee
-Intel740
-smartphones (Medfield sold very badly; Merrifield has no revealed design wins so far)
-They are still very small in automotive after many years of promotion.
-Rambus
-P4

Did any of these "failures", as you call them, result in Intel losing its lead in the markets it competes in?

Did any of them affect Intel to the point of hurting its execution on process node development? Or the development of other products?

Sounds like the company you worked for had bad management that couldn't respond to changing conditions, and you assume that's true for all companies.
 
Then why will they release a Skylake GT4 APU with Gen9 graphics?

I am assuming it's because they are a gateway into low-wattage portables.

You don't put dedicated video cards inside low-wattage portables.

And the resolution standards in portables are increasing, with larger screen resolutions.

However, Intel never had the ideology that an IGP would crush a GTX 780 Ti, nor do they ever think that, because they know that if you need a GTX 780 Ti, you will buy a GTX 780 Ti.

Did any of these "failures", as you call them, result in Intel losing its lead in the markets it competes in?


I don't think Intel has made a dedicated video card since the i740, lol...
I think they made that card to see if they could in fact gobble up the video card market, only to lose horribly to the Riva, which later became the GeForce.
 