
Anand: Apple's A7 Cyclone Microarchitecture Detailed (2014-03-31)

Benchmarking and power profile assessment is far too complicated to be done by anyone other than the OEMs.

We will know what works simply by the market situation. What is used and what is not.

As for Apple, they are hell-bent on using their own solution. That decision was made several years ago. As Otellini said, the worst decision he had been part of at Intel was not saying yes to designing and making the first iPhone SoC, even though Apple approached them.
 
(Note that I'd love it if Apple provided the same level of detail/review assistance as Intel, but it's more likely that Intel would provide a wired-up iPad Air and measurement equipment for reviewers to use than Apple...)

It would be awesome if Intel did the same thing they did 1 year and 3 months ago. That could give us the most objective measurements with very clear conclusions.
 
As Otellini said, the worst decision he had been part of at Intel was not saying yes to designing and making the first iPhone SoC, even though Apple approached them.

Indeed. Otellini was a good CEO for the first few years but had the key failing of being obsessed with margins. He couldn't see a way to obtain what he deemed acceptable margins building an SoC for Apple's phone so he passed on the offer. (There are many other examples of Intel's products falling short due to Otellini's focus on margins adversely affecting performance targets, with the key example being the limited die area devoted to graphics.)

It would be awesome if Intel did the same thing they did 1 year and 3 months ago. That could give us the most objective measurements with very clear conclusions.

Such would indeed be awesome, but I have zero expectation of it happening - I highly doubt Intel wants to do anything that has the possibility of alienating Apple. Either NVIDIA or Qualcomm could of course, but why would they want to if they'd lose in the comparison?
 
Is it really that hard for people to grasp that a chip on Intel's 22nm process, which is superb for mobile, is more efficient than a chip on Samsung's (gate-first, of all things) 28nm process? This isn't rocket science.
 
Indeed. Otellini was a good CEO for the first few years but had the key failing of being obsessed with margins. He couldn't see a way to obtain what he deemed acceptable margins building an SoC for Apple's phone so he passed on the offer. (There are many other examples of Intel's products falling short due to Otellini's focus on margins adversely affecting performance targets, with the key example being the limited die area devoted to graphics.)

He tells the story about the "No" to Apple a little differently: like he thought it was a good idea, but the others persuaded him it was not. Lol. I think we can assume there are different perceptions of that.

As for his obsession with margins, we have to remember he came to Intel at the time the K7 hit the market, and Intel was so desperate they pushed the P3 architecture to 1GHz that it failed with errors. It was a hard time. He sometimes bribed his way through, e.g. saying of Michael at Dell after a phone call with him, "He didn't call for more money" - Dell had to sell their P4 servers and the customers complained. Of course he was obsessed with margins! If it wasn't for his aggressive and sometimes illegal economic and sales approach, Intel would have fared far worse, IMO. He was absolutely the right man at that time. Looking at how he kept profit high even through those hard times, it was very impressive.

And he never let go of that instinct to protect margins. There are always bad decisions. What I think Intel has misunderstood now is that when you make a decision, good or bad, you change the future. It's not like you can come back 4 years later and say, hey, we want the mobile market too. It doesn't work that way. We can just look at what is actually selling now.

Fighting against the economic power of Samsung and Apple head on is doomed. It's exactly the same failure AMD has made for so many years trying to fight Intel head on. They have done it since AMD's inception, and except for a few years they never made any money. It's not going to work for Intel, for the same reasons. It's like lemmings.
 
I don't think even Apple predicted how much of a cash cow the iPhone would be.

Certainly, their first attempt with Motorola on getting phones built with iTunes DRM support was a dismal failure.

Does anyone remember the original ROKR? It was total junk.
 
Fighting against the economic power of Samsung and Apple head on is doomed. It's exactly the same failure AMD has made for so many years trying to fight Intel head on. They have done it since AMD's inception, and except for a few years they never made any money. It's not going to work for Intel, for the same reasons. It's like lemmings.

Did you really just predict that Intel will fail against Apple and Samsung because AMD has failed against Intel?
 
If by mature markets you mean the Mac, it has started to decline even for Apple (cf. page 27 of their 2013 annual report), so it indeed makes little sense to invest heavily in CPU development, especially given that Intel is doing a very good job here.

The entire PC industry is on the decline, mostly because of tablet sales, so it makes more sense than ever for Apple to invest in CPU development.

 
The entire PC industry is on the decline, mostly because of tablet sales, so it makes more sense than ever for Apple to invest in CPU development.
My point is that it doesn't make sense to invest in developing a CPU to replace the CPUs designed by Intel for a declining market, especially when those Intel CPUs are very good. For other markets they certainly should keep investing to better fit those markets.
 
And instead of applauding Intel for providing reviewers a capability that none of the other players in this market offer, what response does Intel get? Yeah...

While I continue to appreciate Notebookcheck's efforts with respect to measuring power consumption, it's pretty clear that their methodology is flawed. Exhibit A being the 3.3W delta for the iPad Air between maximum idle and maximum load, when we have contradictory evidence from AnandTech's review. My suspicion is that their 'load' test for Android, iOS, and maybe even OS X needs some work/isn't comparable to what they run in x86 Windows.

Even if you do take the Notebookcheck delta numbers for Bay Trail at face value and use them for comparison to AnandTech's iPad Air numbers, you're left with ~8W for Bay Trail and, oh yeah, 8W for the A7. The key difference being that that's a maximum of 2W per core for Silvermont and 4W per core for Cyclone. (Note that I'd love it if Apple provided the same level of detail/review assistance as Intel, but it's more likely that Intel would provide a wired-up iPad Air and measurement equipment for reviewers to use than Apple...)
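The per-core arithmetic above can be sketched directly. To be clear, the wattage figures below are the rough numbers being tossed around in this thread, not vendor-published specs, and splitting a platform-level delta evenly across cores is a deliberately naive upper bound:

```python
def per_core_power(load_delta_w: float, cores: int) -> float:
    """Naive per-core upper bound: the platform's load-minus-idle power
    delta split evenly across cores (ignores GPU, memory, and uncore)."""
    return load_delta_w / cores

# ~8W delta on a quad-core Silvermont (Bay Trail) tablet
print(per_core_power(8.0, 4))  # 2.0 W per core
# ~8W delta on the dual-core Cyclone (A7) in the iPad Air
print(per_core_power(8.0, 2))  # 4.0 W per core
```

Same total delta, but half the cores means the ceiling per Cyclone core works out to twice that of a Silvermont core.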

Silvermont in Bay Trail consumes <1W at full load in Cinebench.
 
Silvermont in Bay Trail consumes <1W at full load in Cinebench.
Anand wrote this:
http://www.anandtech.com/show/7314/intel-baytrail-preview-intel-atom-z3770-tested/2
I had Intel measure SoC power at the board level while running a single threaded Cinebench 11.5 run on the Atom Z3770 and saw a range of 800mW - 1.2W.
I guess this will get higher if all 4 cores are used, though the chip might reduce frequency to keep power consumption under control.
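For anyone wanting to sanity-check board-level numbers like these on their own hardware: on Linux systems that expose Intel's RAPL counters through the powercap interface, average package power can be estimated by sampling the cumulative energy counter around a workload. A minimal sketch, assuming the standard powercap sysfs path (whether it exists depends on the kernel and SoC, and the Android/Windows tablets discussed here won't expose it this way):

```python
import os
import time

# Package-domain energy counter exposed by the Linux powercap/RAPL driver.
# Availability varies by kernel and SoC.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj(path: str = RAPL_ENERGY) -> int:
    """Read the cumulative package energy counter, in microjoules."""
    with open(path) as f:
        return int(f.read())

def average_power_w(e_start_uj: int, e_end_uj: int, seconds: float) -> float:
    """Average power over an interval from two energy samples.
    (Ignores counter wraparound, which is fine for short intervals.)"""
    return (e_end_uj - e_start_uj) / 1e6 / seconds

if os.path.exists(RAPL_ENERGY):
    # Sample across a 5-second window while the benchmark is running.
    e0, t0 = read_energy_uj(), time.time()
    time.sleep(5)
    print(f"package: {average_power_w(e0, read_energy_uj(), time.time() - t0):.2f} W")
```

Note this measures the RAPL package domain, not board-level power, so it still isn't directly comparable to the wired-up measurements Anand describes.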
 
Multithreaded performance puts Bay Trail and AMD's Kabini at similar performance levels. Once again, looking at SoC power however the Atom Z3770 pulls around 2.5W in this test.

1 thread is not full load.
 
Is it really that hard for people to grasp that a chip on Intel's 22nm process, which is superb for mobile, is more efficient than a chip on Samsung's (gate-first, of all things) 28nm process? This isn't rocket science.

Nope, not hard at all. But some have an innate need to beat other complexities to death before observing the obvious.
 
The A7 is a great part. And I am looking forward to seeing what they do with the A8.

I must say, I don't understand a lot of the fanboyism in this thread. Apple makes a great part; that doesn't mean Intel is going to drop dead any time soon, since they also make great parts. Further, Apple and Intel don't directly compete. Apple is a big Intel customer, and could have been a bigger one back when Apple approached them to make an SoC.

That point in time is likely when they started planning to make their own CPUs, or perhaps a bit later, when the iPhone took off and could justify the expense.
 
The fact that people have to defend Intel's engineering skillz in the face of the Apple onslaught is a testament to Apple's abilities.

I think the bottom line here is that Cyclone is excellent, and Apple is a new leading power in CPU design, something we wouldn't have thought possible just 5 years ago.

Everything that is wrong with Intel in mobile mostly stems from having the wrong business model. Forget Apple and Samsung for the moment; even Qualcomm, as a major third-party chip provider, earns two-thirds of its revenue from IP licensing alone, without paying a cent to build a fab.
 
Everything that is wrong with Intel in mobile mostly stems from having the wrong business model. Forget Apple and Samsung for the moment; even Qualcomm, as a major third-party chip provider, earns two-thirds of its revenue from IP licensing alone, without paying a cent to build a fab.

When you're making bank, you can invest plenty of cash in engineers and R&D to make the exact processors you need for a complete product.
 
Everything that is wrong with Intel in mobile mostly stems from having the wrong business model. Forget Apple and Samsung for the moment; even Qualcomm, as a major third-party chip provider, earns two-thirds of its revenue from IP licensing alone, without paying a cent to build a fab.
So Intel should lose the fab? Everyone should throw away their fabs because they're too expensive?
 
Intel stopping as an IDM would be the most stupid thing they could do. At the current rate, Intel could make every other foundry obsolete as a serious manufacturing choice within ~6 years. That is their business model.
 
The A7 is a great part. And I am looking forward to seeing what they do with the A8.

The only rumor I've seen was that it is going to be quad-core, although that might have been for the now-delayed iPad Pro. It will probably also have the latest PowerVR GPU.
 
Intel would be nowhere without a state-of-the-art fab; they don't have the design talent to do what Qualcomm does.



Intel was almost done in by AMD just a few short years ago, all the while having a huge process node advantage and more money than AMD could dream of.
 
Intel stopping as an IDM would be the most stupid thing they could do. At the current rate, Intel could make every other foundry obsolete as a serious manufacturing choice within ~6 years. That is their business model.

If they can find a way to fill those fabs, sure... Because it looks like the PC volumes ain't coming back.
 

Last time I checked, my Galaxy S4 isn't using Intel silicon, and neither is any iPhone or Android phone I've ever considered buying.



It's pretty apparent from the fact that Intel kept recycling its awful NetBurst architecture (because they suck at designing) for some 4 years past the time when it was obsolete. Last time I checked, Qualcomm doesn't sit on 10-year-old designs while losing market share to AMD or whomever. It couldn't; mobile is too competitive. If Intel didn't have those fabs and a gargantuan budget, it wouldn't exist.
 
Last time I checked, my Galaxy S4 isn't using Intel silicon, and neither is any iPhone or Android phone I've ever considered buying.
How about them apples in your signature? Didn't it hurt deep inside to know they were essentially flawed by using Intel silicon?
 