[Canard PC Hardware] Intel prepares its response to Ryzen behind the scenes

Aug 11, 2008
10,451
642
126
This may come as a shock to you (because it doesn't make sense) but shareholders actually prefer to see Intel lose the sale outright.
Don't think it is that simple, one way or the other. It is a balance between how many sales are lost and how big the price cut has to be to prevent losing the sale.
 
  • Like
Reactions: Drazick

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
People really need to get out of the PC mentality. Desktop is no longer a focus for Intel.

All this doom and gloom that AMD is going to put Intel out of business, or that the fall is coming. Yet look where Intel is investing - Altera, DCG, IoT, Mobile, Deep Learning. These are all growth markets where AMD doesn't play.

GM tried the diversification strategy in the 1980s to get away from weaknesses in its core markets. It didn't work, and wasted many billions of dollars. We've already seen one of Intel's big-name acquisitions (McAfee) turn out to be a total flop. $7.68 billion down the drain.

And I don't see how Intel is going to sustain its current revenue and profitability in the markets you mention. IoT? That's tiny, low-margin chips for the most part. x86 has no advantage here, nor are cutting-edge manufacturing processes usually needed.

Mobile? Intel already left the smartphone and tablet market in disgrace. I doubt a second attempt would be any more successful, as the market has already standardized on ARM. What works in Intel's favor on desktops and servers works against them here. Intel assumed last time that their process advantage would let them steamroll the ARM upstarts, but it didn't happen, and it will be even harder this time with the process gap narrowing (it should close altogether in the next 1-3 years).

Deep Learning? There's no reason to think Intel will be able to beat Nvidia in that market. Both Intel and AMD will have a very hard time overcoming the CUDA lock-in, but AMD is at least trying; Intel seems to expect everyone to come to them. The only reason that Intel's offering is even in the same ballpark as GP100 is a slightly denser 14nm process compared to TSMC 16FF+, and once Nvidia follows TSMC to 7nm, Xeon Phi is dead.

Intel's bread and butter is big server CPUs. That's where the massive profit margins come from. That's where they get the revenue to develop new architectures and processes, and to subsidize the other lines of business. And this is where Zen could be especially disruptive. If Naples undercuts Intel's midrange server offerings, it will make a big difference. Intel can't afford to lose too much of the server market, and can't afford to cut margins, since those margins are the bedrock on which all else rests. Not only that, but it will be easier for AMD to make big inroads here than in the OEM desktop/laptop market, because brand recognition counts for less. There will be a bunch of clueless end users who want "an Intel" in their laptop or desktop even if a similarly priced AMD chip is superior in almost every respect, but server admins know who AMD is and that they were quite competitive in the past. If TCO is lower for AMD Naples compared to Broadwell-EP and its successors, then AMD is going in and Intel is out.
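(For concreteness, a minimal sketch of the kind of TCO arithmetic a server buyer would run when weighing Naples against Broadwell-EP; the model, parameter names, and numbers here are hypothetical simplifications of mine, not anything from AMD or Intel.)

```c
/* Hypothetical per-node TCO model: purchase price plus electricity over
 * the service life. All names and numbers below are illustrative only. */
#include <stdio.h>

static double tco_per_node(double capex_usd,   /* server purchase price     */
                           double avg_watts,   /* average wall power draw   */
                           double pue,         /* datacenter power overhead */
                           double usd_per_kwh, /* electricity price         */
                           double years)       /* service life              */
{
    double kwh = avg_watts / 1000.0 * 8760.0 * years * pue;
    return capex_usd + kwh * usd_per_kwh;
}

int main(void)
{
    /* Two made-up configurations, purely to show the mechanics. */
    double node_a = tco_per_node(8000.0, 350.0, 1.5, 0.10, 4.0);
    double node_b = tco_per_node(9500.0, 400.0, 1.5, 0.10, 4.0);
    printf("A: $%.0f  B: $%.0f per node over 4 years\n", node_a, node_b);
    return 0;
}
```

If per-node performance differs, the fairer comparison is TCO divided by delivered throughput, i.e. cost per unit of work.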
 

lopri

Elite Member
Jul 27, 2002
13,317
691
126
I agree the doom and gloom is highly premature, ironically being propagated by the same contingent that used to be so outraged when it was applied to their favorite team.
I know, right? Some of them are highly amusing and a few of them are downright offensive. I do not think we are at a point of concern yet, however. There was a time when a new thread would appear here on a weekly basis discussing how AMD is doomed, based on the same rehashed concerns over and over again. That went on for so long (read: years) that it was simply not worthwhile to participate in such threads, yet the same concern trolls kept pushing and renewing those threads to the top every freaking week, which was annoying and irritating.

It may be time for AMD to have its moment under the sun. But it is best for the supporters to remember that nothing stays still in technology and that insults do not win minds in the long term (nor in the short term).
 
Mar 10, 2006
11,715
2,012
126
GM tried the diversification strategy in the 1980s to get away from weaknesses in its core markets. It didn't work, and wasted many billions of dollars. We've already seen one of Intel's big-name acquisitions (McAfee) turn out to be a total flop. $7.68 billion down the drain.

You really love the GM comparison. Okay, let's play :)

First off, McAfee was not $7.68 billion down the drain (though it was a boneheaded and utterly stupid acquisition). The division was actually modestly profitable for quite a while, and Intel ended up selling a big chunk of it, getting a partial refund (while maintaining a 49% stake which it can further monetize if it wants to).

And I don't see how Intel is going to sustain its current revenue and profitability in the markets you mention. IoT? That's tiny, low-margin chips for the most part.

I wish you would cite your sources. IoT business for Intel generated $2.6 billion in revenue last year (larger than AMD's entire x86 CPU business, FYI) and it generated $585 million in profit.

And I don't think expensive Xeon/Xeon Phi chips sold into self-driving cars in the future will be "low margin" or "tiny" ;)

x86 has no advantage here, nor are cutting-edge manufacturing processes usually needed.

x86 is not a disadvantage here, and cutting-edge mfg processes will be valuable in the more performance-sensitive markets (e.g. self-driving cars).

Mobile? Intel already left the smartphone and tablet market in disgrace. I doubt a second attempt would be any more successful, as the market has already standardized on ARM.

It's true that they're out, but this little startup in Cupertino, California thinks its modems are good enough to use in the iPhone 7 and they will likely be used in at least two more generations of iPhones. Intel has decent (and improving) cellular modem technology and it's already generating them hundreds of millions of dollars per quarter.


Deep Learning? There's no reason to think Intel will be able to beat Nvidia in that market. Both Intel and AMD will have a very hard time overcoming the CUDA lock-in, but AMD is at least trying; Intel seems to expect everyone to come to them. The only reason that Intel's offering is even in the same ballpark as GP100 is a slightly denser 14nm process compared to TSMC 16FF+, and once Nvidia follows TSMC to 7nm, Xeon Phi is dead.

Intel already dominates the market for machine learning/deep learning with Xeon and they're pretty much throwing every asset they have at it going forward:
[two attached slides]


NVIDIA is awesome and builds really great stuff, but the "CUDA lock in" that you cite -- which is definitely a headache for AMD -- has an effective counter in the form of, "oh you're building your software to take advantage of those shiny AVX-512 units on our Xeon E5 processors anyway? Well, good thing this Xeon Phi is software compatible!"
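(To make the "software compatible" claim concrete, here is a minimal sketch assuming both targets implement AVX-512F, which Skylake-SP Xeons and Knights Landing Xeon Phi both do; whether the same code runs *well* on both is the tuning question raised further down the thread.)

```c
/* The same AVX-512F intrinsics compile unchanged for a Xeon (Skylake-SP)
 * or a Xeon Phi (Knights Landing), e.g. gcc -O2 -mavx512f dot.c */
#include <immintrin.h>
#include <stddef.h>

float dot_avx512(const float *a, const float *b, size_t n)
{
    __m512 acc = _mm512_setzero_ps();
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        acc = _mm512_fmadd_ps(va, vb, acc);   /* acc += va * vb, 16 lanes */
    }
    float lanes[16], sum = 0.0f;
    _mm512_storeu_ps(lanes, acc);             /* horizontal reduction */
    for (int k = 0; k < 16; ++k) sum += lanes[k];
    for (; i < n; ++i) sum += a[i] * b[i];    /* scalar tail */
    return sum;
}
```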

Intel's bread and butter is big server CPUs. That's where the massive profit margins come from. That's where they get the revenue to develop new architectures and processes, and to subsidize the other lines of business.

They make big profits in server CPUs, but PC processors actually generate far more revenue and total overall profit dollars than server CPUs do.

Anyway, profit margins don't "subsidize" other lines of business for Intel, not really. Of Intel's major business units (Client, Data Center, Altera, IoT, and memory), only one is not profitable -- memory. The rest are all immensely profitable, which means that not only do they generate enough gross profit to cover R&D, but they generate far in excess of what Intel invests -- that's profit.

Cutting margins means that Intel's profits will be lower (which is a problem), but there is a long way between "oh crap, can't fund future R&D" and "oh, we reported less profit than we did last year and investors are not happy."

And this is where Zen could be especially disruptive. If Naples undercuts Intel's midrange server offerings, it will make a big difference. Intel can't afford to lose too much of the server market, and can't afford to cut margins, since those margins are the bedrock on which all else rests.

It's a good thing Intel develops really good products.

If TCO is lower for AMD Naples compared to Broadwell-EP and its successors, then AMD is going in and Intel is out.

If I win the lottery tomorrow, I'll be a multi-millionaire.
 

sirmo

Golden Member
Oct 10, 2011
1,014
391
136
The diversifying Intel is doing is smart, in my opinion. Every company needs to grow, and they long ago saturated the CPU performance market. With AMD and others catching up, it's really their only opportunity to grow their business.

I don't even think Intel sees AMD as a threat anymore. Sure, AMD will take some market share back, but there are far bigger markets Intel is going after. If anything, competitiveness in x86 helps both Intel and AMD, as it keeps ARM and POWER8/9 at bay. Customers need choice, and now that they will be getting that choice back, it's ARM and POWER8/9 that get hurt the most, imo.
 
  • Like
Reactions: CHADBOGA

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I wish you would cite your sources. IoT business for Intel generated $2.6 billion in revenue last year (larger than AMD's entire x86 CPU business, FYI) and it generated $585 million in profit.

And I don't think expensive Xeon/Xeon Phi chips sold into self-driving cars in the future will be "low margin" or "tiny" ;)

I guess we're talking at cross-purposes. I wasn't really considering self-driving cars as part of IoT - I see that as being closer to Deep Learning, though it will probably become important enough in the next few decades to be considered its own sector. By IoT, I was thinking along the lines of stuff like Hue lightbulbs and Nest thermostats.

It makes sense for Intel to try to get into this burgeoning market, as it's not too far from their typical core competency and is growing fast. But I think they're going to have a tough fight against Nvidia. It's almost inconceivable that they could get the kind of market share that they have currently in x86 desktops and servers.

It's true that they're out, but this little startup in Cupertino, California thinks its modems are good enough to use in the iPhone 7 and they will likely be used in at least two more generations of iPhones. Intel has decent (and improving) cellular modem technology and it's already generating them hundreds of millions of dollars per quarter.

Intel's modems are clearly inferior to Qualcomm's, which caused some controversy when Apple had to artificially cripple the Qualcomm modems in their iPhones so they wouldn't make the Intel ones look bad.

I don't doubt that this contract is gaining Intel some revenue, but is it enough to offset the whole cost of the largely failed SoFIA program? Serious question. My intuitive guess would be no, but I could be wrong.

NVIDIA is awesome and builds really great stuff, but the "CUDA lock in" that you cite -- which is definitely a headache for AMD -- has an effective counter in the form of, "oh you're building your software to take advantage of those shiny AVX-512 units on our Xeon E5 processors anyway? Well, good thing this Xeon Phi is software compatible!"

I suspect that GPGPU is more likely to be the future of Deep Learning than standard x86 CPUs. If you're doing massively parallel stuff, it makes the most sense. But this is an area of which I don't have extensive knowledge. I guess the future will determine which of us is right.

They make big profits in server CPUs, but PC processors actually generate far more revenue and total overall profit dollars than server CPUs do.

Brand recognition will help in the desktop/laptop market, which will delay the reckoning some. However, AMD pretty much has nowhere to go from here but up - we are going from a situation where AMD has essentially nothing competitive at all in x86, to one where (assuming all the leaks aren't 100% BS) they will be quite competitive with Intel's products on HEDT and mainstream desktop. And AMD has future strategic options not available to Intel (e.g. a solid APU with HBM2 that can really take the place of a discrete card). I don't see Apple staying with Intel's Iris Pro once Raven Ridge hits.

Cutting margins means that Intel's profits will be lower (which is a problem), but there is a long way between "oh crap, can't fund future R&D" and "oh, we reported less profit than we did last year and investors are not happy."

The thing is, managers in large corporations (especially ones with a sense of entitlement) often aren't thinking about these things rationally. They are prone to make stupid decisions when revenue drops or Wall Street panics. This can result in cuts to R&D, termination of existing programs that could have made a profit (or even a strategic difference) if completed, and more petty things such as poor treatment of high-value employees which causes the best to leave for greener pastures. If the CanardPC article is to be believed, we're already seeing some of these symptoms, and Intel's revenue hasn't even suffered yet... imagine how Brian Krzanich will react if profits actually drop. He'll probably start monitoring the engineers' bathroom breaks or something. Again, this sort of thing is a common disease of large, arrogant companies.

If I win the lottery tomorrow, I'll be a multi-millionaire.

Oh, I think the chance of AMD taking back market share with Naples is somewhat better than that. It won't be an overnight decimation, of course (AMD wouldn't have the production capacity to do that even if Naples outright beat Skylake on all metrics, which it won't). But as with desktops, they're bound to gain some decent market share once they have a competitive offering again. My horseback guess would be that in 2 to 3 years, AMD's market share of x86 servers will be more than 10 percent but less than 20 percent. (It's currently under 1 percent.) That won't decimate Intel's revenue by itself, but might well be enough to panic Intel into doing something stupid. And the added revenue will put AMD in a better position to pump money into future R&D to make itself more competitive.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
Will it allow them to regain market share? Absolutely.

Does it mean they will gain significant market share? That's TBD. "Build it and they will come" only works in the movies. It takes more than a good CPU core to be successful - see IBM, DEC, HP, Sun, et al.

I think one can safely predict a range of market share gain for AMD.

It looks like Ryzen will put AMD's competitive position against Intel not quite where it was in 2005/06, but better than where it was in 2010.

So look for AMD to gain market share beyond what it had in 2010, but not as much as it had in the first six months of 2006.
 

Nothingness

Diamond Member
Jul 3, 2013
3,356
2,443
136
NVIDIA is awesome and builds really great stuff, but the "CUDA lock in" that you cite -- which is definitely a headache for AMD -- has an effective counter in the form of, "oh you're building your software to take advantage of those shiny AVX-512 units on our Xeon E5 processors anyway? Well, good thing this Xeon Phi is software compatible!"
Sorry, that's marketing BS at its best. Getting the most out of Xeon Phi is certainly not just a matter of instruction set; it requires a lot of dedicated tuning (even choosing the right topology on Phi is a pain). In the end, the dev cost is very similar.

The "x86 everywhere is the best thing since sliced bread" pitch is a fallacy as soon as performance matters.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I think one can safely predict a range of market share gain for AMD.

It looks like Ryzen will put AMD's competitive position against Intel not quite where it was in 2005/06, but better than where it was in 2010.

So look for AMD to gain market share beyond what it had in 2010, but not as much as it had in the first six months of 2006.

Don't forget that they now have the consoles to boost their share of the x86 market as well.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
You actually believe that Intel has been withholding ipc gains from us?
That's an emphatic YES! What do you think core count is? Hehe.
I'll be quick to propose, however, that the only reason we get the crumbs from Intel is that Intel necessarily needs to segment the market in order to protect its lucrative server and workstation business. In other words, Intel is probably more worried about the upper two tiers moving down to shop for 'cheap' components and parts than about the lower tier moving up to shop for what would essentially be the same chips at the same prices. It is a fine balancing act. HEDT, as 'expensive' as it is, is actually a compromise. It's Intel addressing a niche they created out of the necessity of market segmentation. In effect, what is essentially a well-balanced market strategy was always an Achilles' heel, in that it could be exploited by a resurgent AMD that is all but absent from the x86 server ecosystem.
 
Mar 10, 2006
11,715
2,012
126
That's an emphatic YES! What do you think core count is? Hehe.
I'll be quick to propose, however, that the only reason we get the crumbs from Intel is that Intel necessarily needs to segment the market in order to protect its lucrative server and workstation business. In other words, Intel is probably more worried about the upper two tiers moving down to shop for 'cheap' components and parts than about the lower tier moving up to shop for what would essentially be the same chips at the same prices. It is a fine balancing act. HEDT, as 'expensive' as it is, is actually a compromise. It's Intel addressing a niche they created out of the necessity of market segmentation. In effect, what is essentially a well-balanced market strategy was always an Achilles' heel, in that it could be exploited by a resurgent AMD that is all but absent from the x86 server ecosystem.

None of this is correct.
 
  • Like
Reactions: CHADBOGA
Mar 10, 2006
11,715
2,012
126
Now that I have some time, let me deconstruct this...
That's an emphatic YES! What do you think core count is? Hehe.

IPC (instructions per clock) is a measure of the single-threaded performance of a single CPU core.
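(A rough way to put the distinction into symbols; this is my framing, not anything from the posts above:)

$$\text{single-thread perf} \approx \mathrm{IPC} \times f_{\text{clock}}, \qquad \text{aggregate throughput} \approx N_{\text{cores}} \times \mathrm{IPC} \times f_{\text{clock}} \times \eta$$

where $\eta \le 1$ is how well the workload scales across cores. Adding cores raises the second quantity but leaves the first untouched, which is why core count is not an answer to a question about IPC.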

I'll be quick to propose, however, that the only reason we get the crumbs from Intel is that Intel necessarily needs to segment the market in order to protect its lucrative server and workstation business.

No. The reason that mainstream DT is based on the notebook chips is simple: most desktops are sold either to business customers or to non-enthusiast consumers, and both demographics require integrated graphics and relatively low power consumption.

The HEDT chips, which are derived from server and workstation chips, do have high core counts and are unlocked (since the enthusiast demographic tends to want to overclock).

In other words, Intel is probably more worried about the upper two tiers moving down to shop for 'cheap' components and parts than about the lower tier moving up to shop for what would essentially be the same chips at the same prices.

Could you provide an example of this to illustrate?

It is a fine balancing act. HEDT, as 'expensive' as it is, is actually a compromise. It's Intel addressing a niche they created out of the necessity of market segmentation. In effect, what is essentially a well-balanced market strategy was always an Achilles' heel, in that it could be exploited by a resurgent AMD that is all but absent from the x86 server ecosystem.

HEDT motherboards are expensive because they tend to be more feature rich on average and they need to be designed, at a minimum, to support somebody buying a "140W TDP" chip, strapping a big ol' cooler on it, and overclocking it "to da moon" to the point where it's going to consume 200W+.

The gamer/enthusiast demographic is also likely to be interested in better audio, better NICs, etc. -- all things that add more cost, but that people who are DIY enthusiasts interested in performance will pay for anyway.

And, finally, the "mainstream" boards simply enjoy more scale than the "HEDT" boards, so mobo makers need to price each HEDT unit higher to make sure that they can get a reasonable return. Packing in more features and limiting the # of SKUs to "higher end" price points is one way to do that.
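(The scale point as one line of arithmetic, with symbols of my own choosing: if a board family carries a fixed design/validation cost $F$, ships $V$ units, and has per-board parts cost $B$, then the price roughly has to satisfy)

$$p \gtrsim B + \frac{F}{V}$$

(so the much smaller $V$ of HEDT boards pushes each unit's price up even before the extra VRM, audio, and NIC cost lands in $B$.)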

The validity of this strategy can be debated, for sure; I'm not saying that it's perfect (I feel like I could write a small book on how Intel is failing to take full advantage of the opportunities in HEDT/gaming, but that's another discussion for another thread). But the idea that Intel is just "holding back" performance, especially IPC, that it could bring to the table just doesn't hold water, IMO. You *always* want to move as quickly as you reasonably can in technology; holding back is downright silly.
 
  • Like
Reactions: Phynaz and CHADBOGA

jpiniero

Lifer
Oct 1, 2010
17,026
7,419
136
I am kind of wondering if Intel might release "desktop" versions of the quad-core Kaby Lake Refresh-U parts. Not sure how they would be branded.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Now that I have some time, let me deconstruct this...


IPC is a measure of the single-threaded performance of a single CPU core...... snip

I know what IPC is. This is basic stuff. One strong core doesn't tell the whole story though, does it? The question is how many strong cores you can put together on a package under an acceptable TDP. So yes, I was making a throughput argument as a counter to a throughput threat from Zen. Intel always had more performance on the table, but they kept it away from the mainstream. The main reason the enthusiast community is so excited about Zen is that it promises to change the computing landscape by doubling the cores in mainstream computing at affordable prices. Intel couldn't do this. They made you believe four cores were all you needed, and charged you much more if you tried to move out of that segment. Now that Zen is here, let's see how the 'four cores is enough' argument plays out, shall we?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
JDG,

While the server market requires Intel to be more competitive and is less susceptible to marketing and segmentation tactics, Intel itself is also much more competitive and rational there. Even their branding is quite straightforward.

What Zen will likely affect most is AMD itself.

Their strategy in HPC is that while there's less ISA lock-in in theory, there's still the cost of porting to a new ISA. Everyone tends to look at things in black and white, but it's not that clear-cut. The reason I believe they focus on heavy vector extensions for their "general purpose" CPUs is quite simple: considering the cost and difficulty of porting, faster vector units keep the buyers coming.

With AVX-512, Skylake is rumored to come close to Xeon Phi's performance in FP-intensive applications. And you still get the advantage of it being the fastest CPU, too.

Separately, in networking, Intel put significant work into replacing network-specific processors with Intel cores. With some engineering work on the I/O side and a bit of software tuning, the results came out pretty darn good.

So the server focus isn't just on traditional enterprise, or cloud, or even HPC, but on a lot of fronts. Considering how many markets they are trying to cover, the results are quite amazing.*

Their strategy seems to be working, though eventually they need a direct counter. That we will see.

*Regarding this, it seems the server parts aren't handicapped the way the PC parts are. Yes, they don't tell you this, but there's too much segmentation and inflated pricing on the PC side, which no longer matches consumer requirements.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
Don't forget that they now have the consoles to boost their share of the x86 market as well.

I don't think any aggregator of PC marketshare/sales counts the consoles, nor should they, when the topic at hand is PC marketshare/sales.
 

Spartak

Senior member
Jul 4, 2015
353
266
136
We have this gem, too:
[attached graph: Intel marketing slide comparing process node timing and feature size against competitors]

This graph is factually wrong. First of all, it states Intel's HVM wafer start date for 14nm, but the process was so crappy that they only got enough volume to launch a niche, low-volume ultramobile part in Q1 '15, a whole year later. Samsung's HVM start date is NOT Q1 '17; I believe it already started in Q3 '16. They have a launch product (the GS8) with much, much more volume in Q1/Q2 '17, two years later. Then, finally, they draw Samsung's 10nm as if it were almost the same feature size as Intel's 14nm, but it's about 10% smaller. So instead of a three-year lead, it's a two-year lead and shrinking.
 

jpiniero

Lifer
Oct 1, 2010
17,026
7,419
136
This graph is factually wrong. First of all, it states Intel's HVM wafer start date for 14nm, but the process was so crappy that they only got enough volume to launch a niche, low-volume ultramobile part in Q1 '15

The U parts aren't niche; if anything they are the majority of Intel's mobile sales. Broadwell Core M was a paper launch but that was in Q3 2014. So that doesn't look too far off TBH.
 

Lodix

Senior member
Jun 24, 2016
340
116
116
This graph is factually wrong. First of all, it states Intel's HVM wafer start date for 14nm, but the process was so crappy that they only got enough volume to launch a niche, low-volume ultramobile part in Q1 '15, a whole year later. Samsung's HVM start date is NOT Q1 '17; I believe it already started in Q3 '16. They have a launch product (the GS8) with much, much more volume in Q1/Q2 '17, two years later. Then, finally, they draw Samsung's 10nm as if it were almost the same feature size as Intel's 14nm, but it's about 10% smaller. So instead of a three-year lead, it's a two-year lead and shrinking.
And they "forgot" to include the incoming 7nm from competitors on the graph and left the line to look like they will have lead for some time...
 
Feb 11, 2017
34
3
16
You really love the GM comparison. Okay, let's play :)

First off, McAfee was not $7.68 billion down the drain (though it was a boneheaded and utterly stupid acquisition). The division was actually modestly profitable for quite a while, and Intel ended up selling a big chunk of it, getting a partial refund (while maintaining a 49% stake which it can further monetize if it wants to).



From the news articles I read, I thought Intel was selling McAfee. Am I wrong?
 
Mar 10, 2006
11,715
2,012
126
And they "forgot" to include the incoming 7nm from competitors on the graph and left the line to look like they will have lead for some time...

Yes, that is the biggest issue with the graph. So misleading, but smoke and mirrors with respect to process tech is Intel's game now.
 

2Dtails

Junior Member
Jan 17, 2017
3
2
16
I know what IPC is. This is basic stuff. One strong core doesn't tell the whole story though, does it? The question is how many strong cores you can put together on a package under an acceptable TDP. So yes, I was making a throughput argument as a counter to a throughput threat from Zen. Intel always had more performance on the table, but they kept it away from the mainstream. The main reason the enthusiast community is so excited about Zen is that it promises to change the computing landscape by doubling the cores in mainstream computing at affordable prices. Intel couldn't do this. They made you believe four cores were all you needed, and charged you much more if you tried to move out of that segment. Now that Zen is here, let's see how the 'four cores is enough' argument plays out, shall we?
Don't confuse the enthusiast market with the mainstream market. I personally believe that currently (and for some years to come) 4 cores is more than enough for the mainstream market. The mainstream market just isn't starved for CPU performance. The additional 4 cores aren't going to make Microsoft Office run any better, nor are they going to make Facebook run any smoother. Total CPU performance just isn't the determining factor in the mainstream market right now. Maybe it will be in a few years, but that is hard to tell.

Raven Ridge will be the big mainstream competitor to Intel's mainstream lineup.
 
  • Like
Reactions: Burpo