[WSJ] TSMC is shipping 20nm SoCs to Apple


Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
What do you mean? Microsoft's Surface Pro 3 is also 12". Phones? Sure, Llama Mountain is 7.2mm thick, less than the iPhone 5s's 7.6mm. Core M isn't made for phones, though.

You don't seem to be getting the point here - thickness is meaningless without also comparing the other two dimensions. You were implying that the Core-M reference tablet generates less heat than iPad Air because it's slightly thinner, while ignoring that it has much more surface area to remove heat.
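
For a rough sense of scale, here's a back-of-the-envelope sketch (panel geometries assumed, not official: a 12.5" 16:9 panel for the Core-M reference design vs the iPad Air's 9.7" 4:3 panel):

```python
import math

def panel_dims(diagonal_in, aspect_w, aspect_h):
    """Width/height in inches of a panel, from diagonal and aspect ratio."""
    k = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * k, aspect_h * k

# Assumed geometries (illustrative, not official figures).
w1, h1 = panel_dims(12.5, 16, 9)   # Core-M reference tablet
w2, h2 = panel_dims(9.7, 4, 3)     # iPad Air

print(f'12.5" 16:9 -> {w1 * h1:.0f} sq in')           # ~67 sq in
print(f' 9.7" 4:3  -> {w2 * h2:.0f} sq in')           # ~45 sq in
print(f"area ratio -> {w1 * h1 / (w2 * h2):.2f}x")    # ~1.48x the area
```

Roughly 1.5x the frontal area, which is a lot of extra surface to dissipate the same few watts through.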

Intel has only been promoting 14nm. They only mentioned 10nm for their cost/transistor slides.

14nm products haven't even launched. They'll talk about 10nm when it's appropriate. I'm also not saying power consumption will be an order of magnitude less, but from what I've read it's possible, depending on the materials. My point was that when we're talking about an order of magnitude with those materials, Intel could take a massive lead that would give them an unambiguous leadership position.

That link doesn't say that 10nm won't use III-V. First it seemed that 10nm would use III-V, but then I saw this roadmap from Applied Materials, saying Intel will use Germanium. But those 2 slides, from ASML and Intel, say 10nm will use III-V and Germanium.

http://download.intel.com/newsroom/...esearch_Enabling_Breakthroughs_Technology.pdf
http://www.sokudo.com/event/images/130710/5_ASML.pdf

Okay, so:

Mayberry said that everything up to the 10-nm CMOS node – which is in development at Intel and will ramp production in 2015 - is effectively done. However, he said his job depends on being able to continue to double density and performance every two years beyond that, something for which the way forward is much less clear.

There are numerous ideas that may provide a continuation of silicon such as the introduction of germanium, III-V materials into the transistor channel and the move from fins to vertical wires or dots with gate-all-around (GAA) structures. However, once all of that has been worked through, at great cost, where do you go next, he asked the audience.
An Intel process researcher says that 10nm is done but they're investigating III-V with respect to 7nm and beyond - and you don't see how this is saying they're not using III-V in 10nm? It doesn't matter what ASML says can be used at nodes. I'll bet you think Intel will be using EUV at 10nm too based on what ASML says. And as for the 2011 Intel slides you linked (also by Mike Mayberry!), I can't find anywhere where it says III-V is slated for 10nm.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I don't understand why Apple has dibs on the first 20nm production. They're only paying like $15 per SoC, so that's around $10,000 per wafer. It doesn't seem like good business for TSMC. NVidia would surely be willing to pay much more than that, especially since they can make use of 95% of the wafer rather than 60% like Apple, based on expected yields. NVidia should be able to get 200 dies @ ~260 mm^2 each. In order to outbid Apple, they would only need to pay $60 per die. Sounds like a reasonable price for a 20nm Maxwell that has the performance of an R9 290X and the power consumption of a GTX 760 Ti. Yes? Hell yes. So why did TSMC serve Apple first?
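
For what it's worth, here's a quick sketch of that arithmetic, assuming a 300mm wafer, the ~$10,000 wafer cost implied above, and the classic gross-die approximation (all figures illustrative, none confirmed):

```python
import math

# Assumptions from the post above (none confirmed): ~$10,000 per 20nm wafer,
# a ~260 mm^2 GPU die, and a 95% yield for the GPU vendor.
WAFER_COST = 10_000
WAFER_DIAMETER = 300  # mm, assumed

def gross_dies(die_area_mm2, d=WAFER_DIAMETER):
    """Standard gross-dies-per-wafer approximation, with edge loss."""
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

good_gpu = int(gross_dies(260) * 0.95)          # ~218 good dies
print(f"good GPU dies: {good_gpu}")
print(f"break-even price: ${WAFER_COST / good_gpu:.0f}/die")  # ~$46/die
```

The numbers land in the same ballpark as the post's ~200 dies and $60/die figures, whatever the exact yields really are.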
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't understand why Apple has dibs on the first 20nm production. They're only paying like $15 per SoC, so that's around $10,000 per wafer. It doesn't seem like good business for TSMC. NVidia would surely be willing to pay much more than that, especially since they can make use of 95% of the wafer rather than 60% like Apple, based on expected yields. NVidia should be able to get 200 dies @ ~260 mm^2 each. In order to outbid Apple, they would only need to pay $60 per die. Sounds like a reasonable price for a 20nm Maxwell that has the performance of an R9 290X and the power consumption of a GTX 760 Ti. Yes? Hell yes. So why did TSMC serve Apple first?

The bigger die will have lower yields.

Also, Apple may have booked early 20nm production before anyone else?
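
A minimal sketch of why bigger dies yield worse, using the textbook Poisson yield model with an invented defect density for an immature node:

```python
import math

def poisson_yield(die_area_mm2, d0_per_cm2):
    """Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * d0_per_cm2)

D0 = 0.5  # defects/cm^2: an invented, illustrative figure for a young node
for area in (100, 260):  # ~phone-SoC-sized die vs the ~260 mm^2 GPU above
    print(f"{area:>3} mm^2 die -> {poisson_yield(area, D0):.0%} yield")
# 100 mm^2 -> 61%, 260 mm^2 -> 27%: the bigger die is hit much harder.
```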
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
nVidia has been complaining about TSMC 20nm for years... according to them the wafer cost and/or yield is so much worse that it negates the density advantage. And that the power and performance advantages aren't enough to be compelling otherwise. Apple might feel differently because their products are more power sensitive and their release cycle is more aggressive. Or the process is more favorable to lower power parts in general. Or maybe their price balance is different because they have smaller chips. Or maybe they're better at engineering higher yield.

Apple wants to get away from Samsung, so they had to learn how to develop for TSMC regardless, whereas nVidia would be taking on new costs for double patterning. And going for 20nm ASAP will help them transition better to 16nm; it probably mitigates some risk there...
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
You don't seem to be getting the point here - thickness is meaningless without also comparing the other two dimensions. You were implying that the Core-M reference tablet generates less heat than iPad Air because it's slightly thinner, while ignoring that it has much more surface area to remove heat.
Your assumption, which in my opinion is wrong, is that Intel made their 14nm reference tablet 12.5" because a 10" tablet like the iPad wouldn't be possible. I don't think there's any problem with a 10" Core M tablet, just like Apple is able to put their A7 in a phone, too.


I think you overestimate the heat of a 4.5W TDP. Bay Trail is also about that much with a 2.8W SDP or so, IIRC.


Okay, so:

An Intel process researcher says that 10nm is done but they're investigating III-V with respect to 7nm and beyond - and you don't see how this is saying they're not using III-V in 10nm? It doesn't matter what ASML says can be used at nodes. I'll bet you think Intel will be using EUV at 10nm too based on what ASML says. And as for the 2011 Intel slides you linked (also by Mike Mayberry!), I can't find anywhere where it says III-V is slated for 10nm.

You should really read your quote again, carefully. First he says 10nm is done, which is no surprise because Mark Bohr told us that at IDF 2012.

The second thing he says is just a list of technologies that will be introduced in chips in the future, including Germanium and III-V. There are two possibilities for 10nm, Germanium and/or III-V, and both are mentioned in your quote. If he hadn't mentioned Germanium, you would have had a point. Also note that he might not yet be able to disclose information about 10nm.

About the Intel slide: first they're talking about Tri-Gate at 22nm, and then they're talking about a successor of Tri-Gate, which is III-V. Common logic tells us this is the immediate successor of Tri-Gate.

I don't know if ASML ever said Intel's going to use EUV at 10nm. From what I know, again from IDF 2012, it's going to be close. In my slide ASML says "Only EUV can enable 50% scaling for the 10 nm node", so when I look at the claims they made at their Investor Meeting, they apparently are going to use EUV at 10nm.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Your assumption, which in my opinion is wrong, is that Intel made their 14nm reference tablet 12.5" because a 10" tablet like the iPad wouldn't be possible. I don't think there's any problem with a 10" Core M tablet, just like Apple is able to put their A7 in a phone, too.

No, I didn't assume that, nowhere did I say anything like that. You were the one assuming that it being thinner meant it put out less heat. I was correcting that faulty assumption by drawing attention to the other dimensions.

You should really read your quote again, carefully. First he says 10nm is done, which is no surprise because Mark Bohr told us that at IDF 2012.

The second thing he says is just a list of technologies that will be introduced in chips in the future, including Germanium and III-V. There are two possibilities for 10nm, Germanium and/or III-V, and both are mentioned in your quote. If he hadn't mentioned Germanium, you would have had a point. Also note that he might not yet be able to disclose information about 10nm.

So 10nm is done already, but there are multiple possibilities for what it could be using? The entire context of his talk is technologies beyond 10nm, and you think they're talking about what might be used in 10nm? This is beyond belief.

About the Intel slide: first they're talking about Tri-Gate at 22nm, and then they're talking about a successor of Tri-Gate, which is III-V. Common logic tells us this is the immediate successor of Tri-Gate.

III-V isn't a "successor" to tri-gate in any meaningful sense of the word, it's a totally different and orthogonal improvement. You really read some interesting things into this. The only thing those slides say is that it's one of several potential future technologies.

I don't know if ASML ever said Intel's going to use EUV at 10nm. From what I know, again from IDF 2012, it's going to be close. In my slide ASML says "Only EUV can enable 50% scaling for the 10 nm node", so when I look at the claims they made at their Investor Meeting, they apparently are going to use EUV at 10nm.

Right, so you do think they'll be using EUV so soon...

I guess we'll just have to wait for those 10nm products then, you'll probably insist on this until the day they're out (several months after mid-2016 :p)
 

jpiniero

Lifer
Oct 1, 2010
16,989
7,393
136
EUV isn't going to be realistically doable until 2018, so Intel obviously isn't using it at 10 nm. My guess has been that it's going to be Silicon using Quad Patterning and Intel will just try to make it work.

They could always do 10 nm again with EUV and 450 mm wafers and post-silicon and call it 7 nm; only fair, since TSMC seems to be doing funny things with naming.

Apple might feel differently because their products are more power sensitive and their release cycle is more aggressive. Or the process is more favorable to lower power parts in general. Or maybe their price balance is different because they have smaller chips. Or maybe they're better at engineering higher yield.

Apple's obviously getting a volume discount.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
The expectation is that Apple will stick with a dual core Cyclone-derived CPU in the iPhone 6 (with max CPU clock operating frequencies close to ~ 2 GHz), while the GPU will likely be the G6630 or GX6650 (with max GPU clock operating frequencies close to ~ 600 MHz).
That would probably be the better route to take (for CPU), as it would improve their already strong single threaded performance. Not the best for power efficiency, but going quad comes at a big area penalty.
10nm, EUV, etc.
For what it's worth, you're right. I think what witeken's getting at, however, is that even if Intel's 10nm "recipe" has already been finalized, we enthusiasts don't know what the ingredients are, and therefore it's still open to speculation.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
The expectation is that Apple will stick with a dual core Cyclone-derived CPU in the iPhone 6 (with max CPU clock operating frequencies close to ~ 2 GHz), while the GPU will likely be the G6630 or GX6650 (with max GPU clock operating frequencies close to ~ 600 MHz).

Sounds reasonable. PowerVR G6630 @ 600MHz should offer ~2x the A7's iGPU performance (230 vs 115 GFLOPS), and a 2GHz dual-core Cyclone would be ~50% faster than the A7 (not the performance doubling we're used to, but the A7 is already pretty fast).
If they can hit those clocks with reasonable battery life numbers, then they will have a very competitive chip. Probably quite a bit faster than Snapdragon 805, but Tegra K1 comparisons should be interesting. :)
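
Those GFLOPS figures line up with the usual Series6 napkin math, assuming 64 FP32 FLOPS per cluster per clock, 4 clusters in the A7's G6430 at ~450MHz, and 6 clusters in a G6630 at the rumoured 600MHz (all assumptions, not confirmed specs):

```python
def series6_gflops(clusters, mhz, flops_per_cluster_per_clk=64):
    # 64 FP32 FLOPS/cluster/clock is the commonly cited Series6 figure;
    # treat it, and the clocks below, as assumptions rather than specs.
    return clusters * flops_per_cluster_per_clk * mhz / 1000.0

a7 = series6_gflops(4, 450)  # G6430 at an assumed ~450 MHz -> 115.2 GFLOPS
a8 = series6_gflops(6, 600)  # G6630 at the rumoured 600 MHz -> 230.4 GFLOPS
print(f"{a7:.0f} vs {a8:.0f} GFLOPS ({a8 / a7:.1f}x)")  # 115 vs 230 (2.0x)
```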
 

jpiniero

Lifer
Oct 1, 2010
16,989
7,393
136
The problem with 2 GHz (up from 1.3 GHz on the 5S) is that Apple isn't increasing the battery much (because they are making the 6 thinner) but is also putting in a much larger screen. So the power draw of the CPU can't increase much, or at all, or the battery life will be much worse.
 

Khato

Golden Member
Jul 15, 2001
1,321
391
136
So 10nm is done already, but there are multiple possibilities for what it could be using? The entire context of his talk is technologies beyond 10nm, and you think they're talking about what might be used in 10nm? This is beyond belief.

If you're Intel and want to avoid the 'mistake' of giving your competition any information regarding your future process direction then why precisely would any hints regarding major advancements in the 10nm process be given? Your argument is that an article which is over a year old - http://www.eetimes.com/document.asp?doc_id=1263255 - summarizes his statements in such a way that implies that 10nm is done and there are still all these other methods for advancing process technology beyond that.

But what exactly would you expect him to say if Intel's 10nm process actually did use III-V materials? Would you really expect him to so conspicuously leave the semiconductor materials advancements that the whole industry is looking towards out of his list, and provide a hint at Intel's direction over three years in advance of the likely product release schedule? Or do what he did, and imply that Intel's 10nm node has nothing special, just further scaling over 14nm?
 

ams23

Senior member
Feb 18, 2013
907
0
0
nVidia has been complaining about TSMC 20nm for years... according to them the wafer cost and/or yield is so much worse that it negates the density advantage.

They weren't "complaining" per se (that was an erroneous analysis by the ExtremeTech author); they were just pointing out the reality of the situation: normalized transistor cost is really not significantly lower on more advanced process nodes, once you take into account a fixed wafer size but increasing wafer (and other) costs.

And that the power and performance advantages aren't enough to be compelling otherwise.

I don't think NVIDIA ever said that power and performance advantages of 20nm are not enough to be compelling once the fab process matures. In fact, the expectation is that Tegra Erista will be fabricated using TSMC's 20nm fab process.

Apple might feel differently because their products are more power sensitive and their release cycle is more aggressive. Or the process is more favorable to lower power parts in general. Or maybe their price balance is different because they have smaller chips. Or maybe they're better at engineering higher yield.

Apple's A-series SoCs are not any more power sensitive than NVIDIA's Tegra SoCs, nor are the die sizes any smaller in general. In fact, historically Apple has never been at the bleeding edge of fabrication process with their A-series SoCs. That said, since Apple is taking the time and effort to switch some SoC production to TSMC, it makes sense that they would go for a more advanced fab process node.
 
Last edited:

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
The problem with 2 GHz (up from 1.3 GHz on the 5S) is that Apple isn't increasing the battery much (because they are making the 6 thinner) but is also putting in a much larger screen. So the power draw of the CPU can't increase much, or at all, or the battery life will be much worse.

Agreed. Despite the small clock difference (1.3GHz vs 1.4GHz), iPads are considerably faster than the iPhone 5S in stressful tests due to thermal headroom. Let's see how the A8 manages the rumoured higher clocks when it comes to heat and battery life.


Being the largest device (and the only device with a metal heat spreader and no DRAM stacked on top), the iPad Air obviously maintains the highest frequencies for the duration of the test. The iPhone 5S, with a significant reduction in internal volume (and a PoP SoC) reduces its CPU frequencies early on in order to keep skin temperature down and properly manage thermals. The iPad mini with Retina Display falls between the two, with its performance curve more closely following that of the iPhone 5S.
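
What AnandTech describes there is essentially a skin-temperature-driven frequency governor. A toy sketch, with every constant invented for illustration:

```python
def next_freq(freq_mhz, skin_temp_c,
              target_c=40.0, hysteresis_c=2.0,
              step_mhz=100, f_min=600, f_max=1400):
    """Toy skin-temperature governor: shed frequency when over the target,
    creep back up when there's headroom. Every constant here is invented."""
    if skin_temp_c > target_c:
        return max(f_min, freq_mhz - step_mhz)   # too hot: drop a bin
    if skin_temp_c < target_c - hysteresis_c:
        return min(f_max, freq_mhz + step_mhz)   # cool enough: claw one back
    return freq_mhz                              # in the hysteresis band: hold
```

A roomier chassis (like the iPad Air's) simply keeps skin_temp_c below the target for longer, so the governor never has to back off.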
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Agreed. Despite the small clock difference (1.3GHz vs 1.4GHz), iPads are considerably faster than the iPhone 5S in stressful tests due to thermal headroom. Let's see how the A8 manages the rumoured higher clocks when it comes to heat and battery life.

Apple would probably be okay with battery life. They're going from 28nm gate first to 20nm gate last, and as a result, there should be a pretty hefty frequency boost at the same power draw.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
This thread is a really good example of how corrupted with Intel shills AT has become.

Intel is no more than a footnote in the mobile space.

Why don't you talk about something relevant, like what Qualcomm's answer to the A8 might or will be. The 801 isn't likely to cut it. Multi-threaded performance isn't very useful on a phone; extra cores are just window dressing for the uninformed masses.

Or better yet, how can Apple get enough 20nm SoCs for a big iPhone release? Simple! They'll release 2 phones: one 4.7" with an A7, and one 5.5" with a 20nm A8 (and a +$100 premium).

If you have a problem with the forums, bring it to our attention. Call outs, attacks, and other meta-commentary on other posters is off topic at best and highly disruptive at worst. It is not conducive to a good technical discussion.
-ViRGE
 
Last edited by a moderator:

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
They weren't "complaining" per se (that was an erroneous analysis by the ExtremeTech author); they were just pointing out the reality of the situation: normalized transistor cost is really not significantly lower on more advanced process nodes, once you take into account a fixed wafer size but increasing wafer (and other) costs.

Okay, they weren't complaining, they presented this "reality" in a completely neutral fashion without expressing any kind of dissatisfaction. Gotcha. Not that it really matters.

I don't think NVIDIA ever said that power and performance advantages of 20nm are not enough to be compelling once the fab process matures. In fact, the expectation is that Tegra Erista will be fabricated using TSMC's 20nm fab process.

No, this part is evident by TSMC's own numbers.

Apple's A-series SoCs are not any more power sensitive than NVIDIA's Tegra SoCs, nor are the die sizes any smaller in general. In fact, historically Apple has never been at the bleeding edge of fabrication process with their A-series SoCs. That said, since Apple is taking the time and effort to switch some SoC production to TSMC, it makes sense that they would go for a more advanced fab process node.

This was specifically about nVidia launching dGPUs on 20nm; it has nothing to do with Tegra. Erista may very well be 20nm, I have no idea. What I do know is that it won't be coming out any time soon, so the question of why Apple got priority for 20nm over Erista would make no sense.
 

jdubs03

Golden Member
Oct 1, 2013
1,333
935
136
From everything I have seen, III-V is a distinct possibility at 10nm; they might not use it, but you can't rule it out. Say Mayberry was implying there isn't a materials advancement at 10nm: there still could have been better-than-expected results which make the transition quicker. Or he was concealing his information, like he should. IDF or Analyst Day would be the announcement time.


If the pattern continues, 10nm is the next advancement point. 7nm would be a massive (1.5x the normal length) delay in Intel's process scheduling.

For Nvidia, I hope they just skip 20nm, at least with Erista. Maxwell will probably be 28/20nm, then Pascal at 16FF+.

IMO the A8 should be around the same power draw, but overall battery life could decrease due to the higher-resolution screens.
 
Last edited:

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
This thread is a really good example of how corrupted with Intel shills AT has become.

Intel is no more than a footnote in the mobile space.

Why don't you talk about something relevant, like what Qualcomm's answer to the A8 might or will be. The 801 isn't likely to cut it. Multi-threaded performance isn't very useful on a phone; extra cores are just window dressing for the uninformed masses.

Or better yet, how can Apple get enough 20nm SoCs for a big iPhone release? Simple! They'll release 2 phones: one 4.7" with an A7, and one 5.5" with a 20nm A8 (and a +$100 premium).

I mostly agree with your points about the pro-Intel talk. It's kinda crazy how every single thread gets led back to Intel products and insane claims about their future abilities by a certain member of the boards. In the mobile space Intel is nothing at this point, and my opinion is that unless they get a huge ~100% perf/W advantage over Apple, they will never EVER get in an iPhone. Hell, it will be very hard for them to get a foothold in the phone space at all, considering the CPU/GPU is quickly becoming inconsequential to nearly everyone.

Why would Apple go through the hassle of switching from ARM to x86, and give up some profit margin in the mix, without a HUGE incentive of some sort? People who talk like Apple will switch to Intel eventually are nuts. I think it is far more likely Apple will move its CPUs up rather than essentially cancelling them.

I also think your point about 2 separate SKUs makes sense. When they moved to (I believe) 32nm, they released a separate SKU of the iPad with that updated chip because they had such limited quantities. I think they would be smarter to keep the original-size iPhone on the newest process, since it is most size-constrained, and keep the phablets on the old process, since they can shoehorn in a bigger battery.

Oh well, let's get back to hearing about how Core M will fit in a watch and have 2 weeks of battery life...
 

ams23

Senior member
Feb 18, 2013
907
0
0
Okay, they weren't complaining, they presented this "reality" in a completely neutral fashion without expressing any kind of dissatisfaction. Gotcha.

There is a distinction between having a discussion about future cost trends with key partners in the semiconductor industry (and proposing some ways to lower future cost too) vs. blindly whining about it.

This was specifically about nVidia launching dGPUs on 20nm; it has nothing to do with Tegra. Erista may very well be 20nm, I have no idea. What I do know is that it won't be coming out any time soon, so the question of why Apple got priority for 20nm over Erista would make no sense.

It is pretty unlikely that Apple got "priority" for 20nm. That would imply favoritism on TSMC's part towards a new client over old clients. The reality is probably much simpler than that. Since Apple is a vertically integrated company and only needs to design one or two SoCs to fit inside one or two new phones and one or two new tablets that they design themselves and sell for very high $ profit margins (relatively speaking), they are afforded certain luxuries that most other companies do not have with respect to time to market, risk, and yield. As for dGPUs, the latest midrange and high-end dGPUs have much larger die sizes and many more transistors than an Apple A-series SoC, so obviously it is more complicated, more costly, and more time-consuming to introduce them on a new fab process node in comparison.
 
Last edited:

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
...
I also think your point about 2 separate SKUs makes sense. When they moved to (I believe) 32nm, they released a separate SKU of the iPad with that updated chip because they had such limited quantities. I think they would be smarter to keep the original-size iPhone on the newest process, since it is most size-constrained, and keep the phablets on the old process, since they can shoehorn in a bigger battery.

Oh well, let's get back to hearing about how Core M will fit in a watch and have 2 weeks of battery life...

Two new SKUs is really the only way this makes any sense.

All the leaks over the past 6 months point to a 4.5" or 4.7", and a 5.5". I somewhat doubt the 5.5", though; it's out of character for Apple. Phones over 5" just aren't "elegant" enough to fit their product line. A 4.5" in the same form factor as the 5/5S (i.e. less bezel) plus a 5" would seem more "Apple-like" to me.

I also don't think performance matters tremendously with the new SoCs. The A7 is incredibly fast and well-suited to a phone already, but even with double the performance of the A6 it is difficult to tell the difference. I think we're at "good enough" levels here.

My guess would be the A8 will be a mildly improved A7 with some new "heterogeneous compute" type capabilities/features. Apple is good about coming up with new ways to use these devices, though I think Motorola's X8 was the most ingenious platform of the past year (always-on voice recognition, and notifications on screen even while the phone was in sleep mode).
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
...
It is pretty unlikely that Apple got "priority" for 20nm. That would imply favoritism on TSMC's part towards a new client over old clients. ...

Pretty sure you are wrong on that point.

"In the big picture, it was noted that "To ensure the quality of the microprocessors, hundreds of TSMC engineers were sent to Apple's headquarters last year to work on the project."

For sure Apple's orders are great for TSMC, but Taiwan's Business Weekly is reporting that Apple's iPhone 6 will save Taiwan, not only its lifeless technology industry, but also save the Taiwan stock market because of the orders for the iPhone 6 and its various components."

http://www.patentlyapple.com/patent...phone-processors-to-apple-plans-for-2015.html

Hundreds of engineers is a lot. Clearly TSMC put considerable resources into this. And given the 2nd statement, the deal may actually be part of economic strategy for the entire country.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
No, I didn't assume that, nowhere did I say anything like that. You were the one assuming that it being thinner meant it put out less heat. I was correcting that faulty assumption by drawing attention to the other dimensions.
Intel's 7.2mm tablet is evidence to me that Core M fits in every tablet form factor. I never said thinner equals less heat. I just said 4.5W is low enough.

So 10nm is done already, but there are multiple possibilities for what it could be using? The entire context of his talk is technologies beyond 10nm, and you think they're talking about what might be used in 10nm? This is beyond belief.
No, believing that Intel will only use Germanium and/or III-V at 5nm is beyond belief. Intel already confirmed that 14nm is the last silicon node and all evidence points towards Germanium and/or III-V compound semiconductors.

I gave you 2 possible explanations: he wasn't allowed to disclose 10nm technology, or he simply mentioned all the possibilities that are not yet implemented. Intel could introduce Germanium at 10nm, but they could continue to use that beyond 7nm ;).

III-V isn't a "successor" to tri-gate in any meaningful sense of the word, it's a totally different and orthogonal improvement. You really read some interesting things into this. The only thing those slides say is that it's one of several potential future technologies.
Maybe I didn't explain it well enough, although I think it's clear.

First he's talking about Intel's achievements: they invented strained silicon, HKMG and Tri-Gate. So he was talking about the upcoming FinFET technology. And then he moves on to a new technology; the fourth transistor innovation. I didn't mean successor as replacement for Tri-Gate.

I disagree that it's just one of the possible technologies. As far as I know, there are no other possibilities for 10nm because CNTs, graphene, etc. are planned for much further in the future.

And you can see the 2 slides (pages 11 and 12): the first one, about Tri-Gate, labeled "then", and the second, about III-V, labeled "now".



Right, so you do think they'll be using EUV so soon...
If that's the conclusion that the evidence leads me to, sure.


Intel wants to scale aggressively at 10nm, and EUV seems the most feasible technology to do that.

I guess we'll just have to wait for those 10nm products then, you'll probably insist on this until the day they're out (several months after mid-2016 :p)
If that's where the evidence leads me.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I mostly agree with your points about the pro-Intel talk. It's kinda crazy how every single thread gets led back to Intel products and insane claims about their future abilities by a certain member of the boards. In the mobile space Intel is nothing at this point, and my opinion is that unless they get a huge ~100% perf/W advantage over Apple, they will never EVER get in an iPhone. Hell, it will be very hard for them to get a foothold in the phone space at all, considering the CPU/GPU is quickly becoming inconsequential to nearly everyone.

You don't seem to realize that Intel already has a 100% performance/watt advantage with Silvermont.
 

North01

Member
Dec 18, 2013
88
1
66
It also has a 12.5" display, now why do you think that is? Kind of means there's more to that comparison than thickness. And why look at iPad Air thickness when we're talking about phones, which A7 is also in? If you really think that CPU can fit okay in a phone I've got news for you.

Just to clarify, Intel's Llama Mountain tablet was shown in two sizes:

12.5 inch display | 1.4 lbs | 7.2 mm thick

10 inch display | 1.2 lbs | 6.8 mm thick