
The reason Intel is not producing hot chips anymore

And this is where reality hits Intel, because quad/octa-core Cortex-A7 based phones/tablets are plenty fast enough & have excellent battery life.

Have you used one of these quad-core A7 phones? Take a look at the AnandTech review of the Moto G. The scores that it produces on the browsing benchmarks are truly awful. The 3D scores are equally bad. Maybe it's "fast enough" to run the GUI on the small display, but that's about it. If ARM and its partners continue to believe in stamping out multiple underpowered cores to provide performance, they will fail long term.

This is wrong, as has been proven by how leaky/inefficient the A15 is vs. an A7.
The only thing that's been proven is that the A15 itself is terribly inefficient, not that "hurry up and go idle" is the wrong concept. In fact there are quite a few examples that say it's a good idea. Apple, Qualcomm, and Intel have all avoided any kind of heterogeneous solution, and they get better battery life than any big.LITTLE device available. In fact, the only people pushing big.LITTLE are the ones that are using A15. That's pretty strong evidence that it's a solution to an A15 problem, not a general solution for good power.

Now I've said this before & will say it again: the display is the single most power-consuming component of a phone/tablet, & so whatever Intel puts out there will at best give you half an hour to an hour more of battery life depending on your usage.
The display doesn't consume the most power if you're using A15. :biggrin: A phone display is ~1W and each A15 itself can easily consume that much power when active.

Who doesn't want better battery life? There is mounting evidence that Intel is happy to compete on price. If it can sell SoCs at a competitive price, then why wouldn't OEMs be willing to buy it so they can provide customers with the additional battery life?
 
Have you used one of these quad-core A7 phones? Take a look at the AnandTech review of the Moto G. The scores that it produces on the browsing benchmarks are truly awful. The 3D scores are equally bad. Maybe it's "fast enough" to run the GUI on the small display, but that's about it. If ARM and its partners continue to believe in stamping out multiple underpowered cores to provide performance, they will fail long term.
Why yes I have, & as I mentioned it does "well enough" in terms of performance for up to 1080p screens; the GPU, however, needs some more horsepower. But as I said earlier, the debate around Intel being x times more power efficient than an ARM SoC (especially A7) is simply ridiculous, & something like this will hold its own even in a low/medium budget tablet. Besides, people don't run benches on their portable devices, & a lot of smartphones/tablets are mainly used for connecting to the internet while on the move, their single biggest utility, & I doubt Cherry Trail or anything low(er) power from Intel will shake this segment of the market too much.
The only thing that's been proven is that the A15 itself is terribly inefficient, not that "hurry up and go idle" is the wrong concept. In fact there are quite a few examples that say it's a good idea. Apple, Qualcomm, and Intel have all avoided any kind of heterogeneous solution, and they get better battery life than any big.LITTLE device available. In fact, the only people pushing big.LITTLE are the ones that are using A15. That's pretty strong evidence that it's a solution to an A15 problem, not a general solution for good power.
I was never alluding to the big.LITTLE concept as a panacea here, rather the fact that the A7 & A15 are at the extreme ends of the spectrum as far as efficiency in the ARM realm is concerned, & so all this talk about Intel being super efficient vs ARM in the mobile arena is just a clever ploy (by marketers) to lure people into buying uber-expensive phones. This doesn't work in the mobile/tablet space because "good enough" is still the mantra that sells in a market dominated by sub-$500 devices.
The display doesn't consume the most power if you're using A15. :biggrin: A phone display is ~1W and each A15 itself can easily consume that much power when active.
That isn't true AFAIK, well, certainly not unless you're running benches on your mobile 24x7 :colbert:
Who doesn't want better battery life? There is mounting evidence that Intel is happy to compete on price. If it can sell SoCs at a competitive price, then why wouldn't OEMs be willing to buy it so they can provide customers with the additional battery life?
Did I ever say that battery life isn't one of the main concerns that plagues the mobile industry? The argument, however, that Intel is way more efficient than ARM isn't a sound one. Also, the display still eats 50~70% of battery life (another ~10% goes to the modem), so the single biggest component that matters in all of this is again the screen :whiste:
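For what it's worth, the "display dominates" claim can be put into rough numbers. A minimal back-of-the-envelope sketch in Python, where the 60% display / 10% modem shares and the "SoC power halved" scenario are purely illustrative assumptions, not measurements:

```python
def battery_gain(display_share, modem_share, soc_power_cut):
    """Fractional runtime gained when the SoC's share of the active
    power budget is cut by soc_power_cut while display/modem stay fixed.
    All shares are fractions of the total active power budget."""
    soc_share = 1.0 - display_share - modem_share
    new_total = display_share + modem_share + soc_share * (1.0 - soc_power_cut)
    return 1.0 / new_total - 1.0

# Illustrative: display 60%, modem 10%, SoC power halved
print(f"{battery_gain(0.60, 0.10, 0.50):.1%}")
```

Under these assumptions, even halving SoC power stretches runtime by less than 20%, which is roughly the point being made about the screen dominating.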
 
What is the performance version of Moore's law? I use 2 versions of moore's law:
1) The number of transistors in a chip doubles every 2 years.
2) The number of transistors in the same area doubles every 2 years.

Both mean that transistor size should become 1.4x smaller every 2 years.

I've never seen a "performance version" of it.
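Both versions can be sketched in a few lines: density doubling every 2 years implies linear dimensions shrink by √2 ≈ 1.41x per period. Starting the projection from 22nm in 2012 (my assumption, purely for illustration) also lands near 1nm around 2030:

```python
import math

def node_size(start_nm, start_year, year, cadence=2.0):
    """Feature size if transistor density doubles every `cadence` years,
    i.e. linear dimensions shrink by sqrt(2) per period."""
    periods = (year - start_year) / cadence
    return start_nm / math.sqrt(2) ** periods

for year in (2012, 2016, 2020, 2024, 2030):
    print(year, round(node_size(22, 2012, year), 1))
```

This is only the idealized cadence; real node names and dates have never tracked it exactly.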

At least for the next 10+ years, Moore's law won't be dead (1nm should arrive around 2030), as long as companies like Intel can succeed in getting enough money for R&D, which I don't think will be a problem for the next several years.

The "layman's version" of Moore's law is that computer speed doubles every two years or so. Of course, that's the inaccurate version.
 
And this is where reality hits Intel, because quad/octa-core Cortex-A7 based phones/tablets are plenty fast enough & have excellent battery life.
Lol. Are you serious? There's no such thing as too much battery life with these devices.
The other thing to consider is that, with bigger-resolution phones/tablets entering the mass market, the GPU is getting more attention, & this is again where ARM is advancing at a faster pace than Intel, even with a node's (or two) disadvantage. So any amount of perf/watt lead that Intel had in the desktop arena, against AMD, is at best 1.1~1.3x the most efficient ARM SoC in the mobile market.
This is an absolutely terrible argument. First, you are seemingly ignorant of Intel's Gen8 graphics. Second, Intel has been licensing graphics from Imagination Technologies. Even if ARM's graphics were better than Intel's (and are you just talking about vanilla Mali GPUs, or what?), Intel has been licensing and is still licensing graphics from Imagination. You can't argue that Intel's opponents are faster when they're both using the same GPUs.
This is wrong, as has been proven by how leaky/inefficient the A15 is vs. an A7.
Another terrible argument. Slim's already covered why.
Now I've said this before & will say it again that the display is the single most power consuming component of a phone/tablet & so whatever Intel puts out there will at best give you one half to an hour more of battery life depending on your usage.
Perhaps you should be aware that LCD development isn't standing still. Their power draw lessens with every passing year.
Again, rather subjective, because cheaper tablets will still sell a heck of a lot more, like the Nexus or Kindle Fire, unless of course we're talking about Apple, where it's the brand that's selling & not necessarily the device or its performance.
Well that's a non sequitur if I've ever seen one. That has what, exactly, to do with what I was saying there?
 
Why yes I have, & as I mentioned it does "well enough" in terms of performance for up to 1080p screens; the GPU, however, needs some more horsepower. But as I said earlier, the debate around Intel being x times more power efficient than an ARM SoC (especially A7) is simply ridiculous, & something like this will hold its own even in a low/medium budget tablet. Besides, people don't run benches on their portable devices, & a lot of smartphones/tablets are mainly used for connecting to the internet while on the move, their single biggest utility, & I doubt Cherry Trail or anything low(er) power from Intel will shake this segment of the market too much.

So you've already gone from good enough to "well enough". There is no way anyone could consider web browsing on any A7 device to be a pleasant experience. You're talking close to original iPhone speeds ... remember, the device where the CPU was so slow that Apple didn't bother to put a 3G modem in? Just flipping home screens back and forth and seeing that it "doesn't lag" is not the same thing as "good enough."

An A7 isn't even as fast as the *original* Atom that came out ... I refuse to believe a 1.6GHz Atom from 2008 was as much CPU performance as anyone needs ... no matter how efficient the software is.
 
An A7 isn't even as fast as the *original* Atom that came out ... I refuse to believe a 1.6GHz Atom from 2008 was as much CPU performance as anyone needs ... no matter how efficient the software is.
An A7 is roughly as fast as Intel's top tablet bin. It's certainly faster than Saltwell.
 
An A7 is roughly as fast as Intel's top tablet bin. It's certainly faster than Saltwell.

Source? STW was a 2-wide in-order pipeline machine with a low-latency L2 cache and a memory subsystem capable of maxing out the available DRAM bandwidth. It has higher DMIPS/MHz, higher MHz, higher SPEC scores, higher SunSpider/browser scores... and a much higher frequency (it has been shipping at over 2.1GHz for a long time) than the A7.

Intel's "top tablet bin" is a 2.4GHz OoO processor that is dramatically faster than ARM's A7... or maybe you are referring to Cyclone, the CPU in Apple's A7 SoC? I don't think that's what is being discussed here.
 
Lol. Are you serious? There's no such thing as too much battery life with these devices.
So lemme get this straight, what you're saying is that an SoC (from Intel) which will save 5~10% of total battery life is going to make a world of difference to this industry, when in fact we have rumors of 4K displays coming to portable devices next year? Yeah, right 😱
This is an absolutely terrible argument. First, you are seemingly ignorant of Intel's Gen8 graphics. Second, Intel has been licensing graphics from Imagination Technologies. Even if ARM's graphics were better than Intel's (and are you just talking about vanilla Mali GPUs, or what?), Intel has been licensing and is still licensing graphics from Imagination. You can't argue that Intel's opponents are faster when they're both using the same GPUs.
Reread what I posted then. What I said was that ARM graphics in the mobile/tablet industry are advancing at a faster pace than Intel's own solutions, & unless you're inferring that Gen8 graphics are gonna be present even on low/medium budget smartphones/tablets, I don't see how your argument is less valid than what you accuse me of, i.e. it's pure & utter gibberish. Also, MediaTek & other low-end ARM vendors are going with PowerVR's Rogue come 2014, so my point still stands, which is that Intel doesn't enjoy that big a lead in the GPU department as far as ultra-low-power devices are concerned.
Another terrible argument. Slim's already covered why.
What, did you even read what I said? The Cortex-A7 is more efficient than the A15 & that's a terrible argument? Now I see where you're going with this :awe:
Perhaps you should be aware that LCD development isn't standing still. Their power draw lessens with every passing year.
How much more efficient do you think the average smartphone display has become over the last two years :hmm:
You really have nothing to corroborate your haughty claims, do you?
Well that's a non sequitur if I've ever seen one. That has what, exactly, to do with what I was saying there?
Well then I guess I'll let you continue with your one-liners & wayward rebuttals, since you don't seem to be getting my point; whether deliberately or not is totally up to you.
Just to clarify, what I said was that it's the total all-round experience which sells a(ny) device & not necessarily just the performance, as has been proven by Apple's immense success over the years.
 
So lemme get this straight, what you're saying is that an SoC (from Intel) which will save 5~10% of total battery life is going to make a world of difference to this industry, when in fact we have rumors of 4K displays coming to portable devices next year? Yeah, right 😱
No, that's not what I'm saying at all. Try again, this time without the straw man.
Reread what I posted then. What I said was that ARM graphics in the mobile/tablet industry are advancing at a faster pace than Intel's own solutions, & unless you're inferring that Gen8 graphics are gonna be present even on low/medium budget smartphones/tablets, I don't see how your argument is less valid than what you accuse me of, i.e. it's pure & utter gibberish.
It's only gibberish because it contradicts your beliefs.

And yes, it's very likely that Gen8 is present from top to bottom across Intel's lineup.
Also, MediaTek & other low-end ARM vendors are going with PowerVR's Rogue come 2014, so my point still stands, which is that Intel doesn't enjoy that big a lead in the GPU department as far as ultra-low-power devices are concerned.
You're still unaware that Intel licenses PowerVR? Even after I just told you?

GPUs absolutely have no place in this discussion. If Intel's graphics are lacking, they'll just license from others.
What, did you even read what I said? The Cortex-A7 is more efficient than the A15 & that's a terrible argument? Now I see where you're going with this :awe:
Like Slim said, all that the A15's inefficiency proves is that it's less efficient than an A7. It doesn't prove race-to-sleep is an inherently poor concept. Your poor attempt to assert it as such is laughable.
How much more efficient do you think the average smartphone display has become over the last two years :hmm:
Quite a lot. Ever heard of IGZO?
You really have nothing to corroborate your haughty claims do you ?
Interesting load of garbage coming from a guy that has posted zero citations himself.

Looks like I just ruined your day.
Well then I guess I'll let you continue with your one-liners & wayward rebuttals, since you don't seem to be getting my point; whether deliberately or not is totally up to you.
Keep living in your fantasy world.
Just to clarify, what I said was that it's the total all-round experience which sells a(ny) device & not necessarily just the performance, as has been proven by Apple's immense success over the years.
And it had absolutely nothing to do with the statement you had quoted.
 
However, desktops will not go away. Apple is one of the smartest companies around, and they just released a new Mac Pro. They wouldn't have put the money into developing a high-end desktop if they didn't see a good enough market for it.

A Mac Pro is just that...a PRO computing device. It never has and never will be a mainstream product. Desktops will be around for a while for those who need lots of horsepower.

It's the average user base who previously bought the cheapest Wintel they could find at Best Buy who are no longer buying desktops. But that's the largest percentage of the PC market.
 
As for the future process nodes. Intel said at IDF this year that they see a clear path to 7nm in 2017. I wouldn't hold my breath waiting for 1nm. That is going to be extremely difficult and may be impossible without a major shift or change in the manufacturing tech.
The latest info that I saw from Intel (investor meeting 2012) states that 5nm is in research: http://images.anandtech.com/doci/6253/it_photo_178222_52-580x434.jpg

There are a few possible successors to silicon, which won't be the best material below 14nm. One of them is indium antimonide, which would use 10x less power and reach 1.5x higher clock speeds. There are also some other options in the III-V class, but I think indium antimonide is the best I've seen (other options would use 8x or 6x less power). Intel has been running these chips in their labs since at least 2009. Paul Otellini only said "it's cool."

The first non-silicon CPUs should come after Cannonlake, the 2016 tick (die shrink + probably new GPU arch) of Skylake, thus in 2017. Note that this forecast was made by Otellini in 2009, so reality could be different.

Source: http://apcmag.com/intel-looks-beyond-silicon-for-processors-past-2017.htm

Another possibility is silicon-germanium or pure germanium. The source, a presentation by Applied Materials, gives SiGe as a possibility for 5 and 3nm.

Source: http://www.extremetech.com/computin...that-will-take-us-to-the-limits-of-moores-law

3nm as the end of node shrinks should keep Intel busy till 2024, but I'm confident that 3nm isn't the end.

Another informative article: http://www.extremetech.com/computin...t-iii-v-cmos-wafers-roll-off-production-lines

...
Also, Gen8 looks very promising (or "hot"), with possibly a 40% performance increase: http://www.extremetech.com/computin...ape-indicates-major-improvements-over-haswell
But the TS wants enthusiast chips with high CPU power, so probably not something he cares about 😉.
 
Another possibility is silicon-germanium or pure germanium. The source, a presentation by Applied Materials, gives SiGe as a possibility for 5 and 3nm.
Actually, SiGe is labeled as 10/7nm in that presentation.

From what I'm reading, new channel materials are undoubtedly on their way soon. All signs point to 10nm, and to be frank, I'm excited as hell for it.
 
Why are people still buying Intel's marketing slides?
Baytrail on 22nm is not better than A15 on TSMC's 28nm process - compare the Surface 2 to whatever Baytrail tablet you want.

Performance and perf/watt are in the same league.
 
Why are people still buying Intel's marketing slides?
Baytrail on 22nm is not better than A15 on TSMC's 28nm process - compare the Surface 2 to whatever Baytrail tablet you want.

Performance and perf/watt are in the same league.

Silvermont can run @ max frequency for an almost limitless amount of time. The A15 can run for, what, about 6 seconds (estimated 😉)? I wouldn't say that's the same perf/watt. It can also run @ 2.4GHz ... the A15 runs how fast (in a shipping product & for how long)?

Apples & oranges.

Displays, battery sizes, $$ spent on power delivery ... all of these factor into "battery life". 22nm >> 28nm for frequency as well as perf/watt.
 
Why are people still buying Intel's marketing slides?
Baytrail on 22nm is not better than A15 on TSMC's 28nm process - compare the Surface 2 to whatever Baytrail tablet you want.

Performance and perf/watt are in the same league.
Reviews do not back up your claims.
 
Actually, SiGe is labeled as 10/7nm in that presentation.

From what I'm reading, new channel materials are undoubtedly on their way soon. All signs point to 10nm, and to be frank, I'm excited as hell for it.

What would the cost penalty of other materials be? Silicon is the most common element in earth's crust.

I'm also excited for those new materials. Some could give huge power improvements, so Intel could use their process advantage very well.
 
Why are people still buying Intel's marketing slides?
Baytrail on 22nm is not better than A15 on TSMC's 28nm process - compare the Surface 2 to whatever Baytrail tablet you want.

Performance and perf/watt are in the same league.

No. Silvermont has by far the best performance/watt.

Example:

Apple A7 consumes about 2.5W/core
Quadcore Silvermont consumes 2.4W or 0.8W/core (@2.4GHz)

The A7 has about 40% faster single-threaded performance. That means Silvermont has 2x more performance/watt.
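Taking those figures at face value (they are the claims above, not independent measurements), the arithmetic can be checked in a couple of lines, and it actually comes out slightly above 2x:

```python
# All inputs are the poster's claimed figures, normalized so that
# Silvermont single-threaded performance = 1.0.
a7_perf, a7_watts = 1.4, 2.5    # Apple A7: ~40% faster, 2.5W/core (claimed)
slm_perf, slm_watts = 1.0, 0.8  # Silvermont @ 2.4GHz: 0.8W/core (claimed)

perf_watt_ratio = (slm_perf / slm_watts) / (a7_perf / a7_watts)
print(round(perf_watt_ratio, 2))  # ~2.2x in Silvermont's favor
```

Of course, the conclusion is only as good as the per-core power figures fed in.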
 
Why are people still buying Intel's marketing slides?
Baytrail on 22nm is not better than A15 on TSMC's 28nm process - compare the Surface 2 to whatever Baytrail tablet you want.

Performance and perf/watt are in the same league.

So you want us to stop buying Intel's marketing slides... and start buying NVidia's? :colbert:
 
And this is where reality hits Intel cause quad/octa core cortex A7 based phones/tablets are plenty fast enough & have excellent battery life.

Regarding the cheap phone chips, I'm really interested to see how they will run this:

[image: ubuntu-edge-desktop.jpg]


Intel SoFIA (or even Merrifield) vs. inexpensive quad core ARM A7/A53/whatever cores. What is going to give the most utility bang for the buck?

...then, of course, there is potential for a Windows comparison also. If MS starts packaging Office for free (or low cost) on future handsets like it does with the ASUS T100 (and other small screen devices) there is always that ARM vs x86 to consider also.
 
What would the cost penalty of other materials be? Silicon is the most common element in earth's crust.

I'm also excited for those new materials. Some could give huge power improvements, so Intel could use their process advantage very well.
The higher raw material costs are offset, at least partially, by the performance benefits. For example, you could keep performance flat, but lower costs by using other materials.

The main benefit is their extension to scaling. In the long run, costs go down, because Intel and others are able to shrink transistors further without having leakage go through the roof. If we were still using plain old silicon, and none of this HKMG, strained silicon magic, Moore's Law would have died ages ago.
 
@R0H1T - "plenty fast enough" is really a subjective thing to say. I am no power user and I found my old ARM A7 to be slow when playing even basic games. And there is also the issue of OEMs not releasing the latest Android for those phones.
 
When we talk about tablets we should talk about using tablets as... tablets... not as a slave device for our desktop work. What I mean by that is: how do you foresee people using Office on their tablets with touch? Notebooks have long been powerful enough to do it, and aside from your company I have seen very few use notebooks as their primary office workstation. When on the move, sure, but not in the office.

I think the phone being plugged into something else (Lapdock or Deskdock, etc) is probably a better "slave device" for basic office work.

If you think about it such a device doesn't need much in the way of GPU either (other than to display a higher resolution to a larger screen, if necessary).

.....so the way I see things (at this time) it's kind of an interesting predicament these chip companies have:

1. Do they spend the extra xtor budget from a die shrink on more iGPU (and/or cpu cores)?

or

2. Do they forgo extra iGPU (and/or extra CPU cores) and make an attempt to increase integration to obtain the highest utility-per-dollar "basic office" computer/phone chip?
 
Regarding the cheap phone chips, I'm really interested to see how they will run this:

[image: ubuntu-edge-desktop.jpg]


Intel SoFIA (or even Merrifield) vs. inexpensive quad core ARM A7/A53/whatever cores. What is going to give the most utility bang for the buck?

...then, of course, there is potential for a Windows comparison also. If MS starts packaging Office for free (or low cost) on future handsets like it does with the ASUS T100 (and other small screen devices) there is always that ARM vs x86 to consider also.

An ARM A7 is just not fast enough for a desktop, even a quad. And about their IGPs: most current ARM IGPs only support OpenGL ES, which makes things difficult on Linux desktops. That's no longer a problem with full-OpenGL-capable ARM IGPs, but do you have any idea how poor ARM vendors' Linux support is, especially for IGPs? It took the Allwinner A10 community (an ARM CPU with open sources and a Mali-400 IGP) 2 years, yes, 2 years, to get a proper Linux driver running, something that ARM's Mali was "supporting" from day 1.

On top of that, Ubuntu is way too heavy and "common" people are just not used to Linux; it's just a recipe for disaster. I'd say it's only viable with a fast quad A15 or a quad A53, and that is already not cheap.

I think Intel's option with Windows may have a better chance. It's also more elegant than the Android/Ubuntu solution, which is already like having to run 2 OSes side by side; even if they share a kernel, the root filesystems are not the same.

You can do a field test if you want: stop random people on the street and ask them if they would like an Android phone that also has Ubuntu for use with a monitor/keyboard/mouse. I'm sure most of the replies will be "What is Ubuntu?", "I don't know how to use it"... etc.
 