The ARM vs. Intel Thing - Let's Discuss It

Page 11 - AnandTech Forums

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
Also like Clover Trail vs. Tegra 3?
Intel needs to get their time to market together. They can't come to market a year late. They seem on track for the smartphone market, but in tablets and the low end they are too far behind the competition.

???

It clearly beats Tegra 3 in performance/watt even though it's an old, crappy uArch. And now we also know what was obvious: the A15 will consume much more power under load than previous designs. I mean, Clover Trail doesn't even clearly lose to the A15 in performance/watt (even with a process disadvantage!), and the power the A15 uses to get its higher performance is heading into ULV-Core uArch territory, and we all know which one of those is faster...

Silvermont will blow A15 out of the water:
- tri-gate 22nm
- completely new uArch
- out-of-order

Yes, it will be another year till it shows up, but how often does ARM come up with a new uArch? Every 3 years? So the A15 will compete with Silvermont and later even its successor.

So Intel will surely not fail in terms of performance/watt. Of course the question of price and platform remains.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
How do you get that "Krait has a much better perf/watt ratio than Atom"?

Krait consumes less power, but Atom performs better. To do "performance per watt", you need to factor in the "performance" part.

Look at the section with the Nexus 10. There are graphs showing the energy used for the benchmark.
For example, Kraken:
[chart: total energy consumed during the Kraken benchmark]
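A toy illustration of how such an energy comparison works (all numbers hypothetical, not taken from the review): energy to finish a fixed task is average power times runtime, so a lower-power chip can still burn more energy if it is much slower.

```python
# Hypothetical numbers, purely illustrative (not measured data).
def task_energy(avg_power_w: float, runtime_s: float) -> float:
    """Energy in joules to finish a fixed benchmark: average power x time."""
    return avg_power_w * runtime_s

energy_slow = task_energy(avg_power_w=1.0, runtime_s=300)  # slow, low-power chip
energy_fast = task_energy(avg_power_w=2.0, runtime_s=120)  # fast, higher-power chip

# Despite drawing more power, the faster chip finishes sooner and
# uses less total energy for the benchmark.
assert energy_fast < energy_slow
```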
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Perf/W isn't linear, but a lot of people seem to treat it as if it is. If processor A uses twice as much power as processor B, but B takes 20% more time to complete a task (so A is 1.2x faster), people will say A is only (1.2 / 2) = 60% as efficient. But that's not really fair, because the faster processor is exercising performance headroom the slower processor doesn't actually have, headroom that you don't actually have to use. What would give you an accurate perf/W comparison would be to measure the power consumption after max clock speed has been capped to normalize the performance on both devices.

I'd be really curious to see some breakdowns of power consumption at different capped max clock speeds. Here's an example of how very non-linear it can be:

http://beyond3d.com/showpost.php?p=1676823&postcount=200
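A minimal sketch of that normalization argument, with made-up numbers (the 0.9 W figure for the capped chip is an assumption, not a measurement):

```python
def perf_per_watt(perf: float, power_w: float) -> float:
    """Higher is better: performance delivered per watt consumed."""
    return perf / power_w

# Naive comparison: chip A is 1.2x as fast as chip B but draws 2x the power.
naive_ratio = perf_per_watt(1.2, 2.0) / perf_per_watt(1.0, 1.0)
assert round(naive_ratio, 2) == 0.6  # "A is only 60% as efficient"

# Normalized comparison: cap A's clock so it matches B's performance.
# Because power falls super-linearly with clock (frequency AND voltage drop),
# suppose capped A now draws 0.9 W -- a hypothetical but plausible value.
capped_ratio = perf_per_watt(1.0, 0.9) / perf_per_watt(1.0, 1.0)
assert capped_ratio > 1.0  # A can come out ahead once unused headroom is ignored
```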
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
It's because only the 32-bit version of Windows 8 supports Connected Standby. Since you're not allowed to simultaneously support ACPI S3 and Connected Standby, Bay Trail-T would be at a major disadvantage in 64-bit mode.

Really? What's the long-term plan for this, then? They can only stay in 32-bit land for so long...
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
It's because only the 32-bit version of Windows 8 supports Connected Standby. Since you're not allowed to simultaneously support ACPI S3 and Connected Standby, Bay Trail-T would be at a major disadvantage in 64-bit mode.

That's no good reason to make the chip 32-bit only, especially if MS decides to change this (as far as I'm aware) arbitrary restriction.
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
Samsung CPU and GPU cores are straight-up CA15/ARM designs. Qualcomm cores are custom. Nvidia's are CA15s + "custom GPU and stuff" (I think).

What I mean is, it's not just ARM vs. Intel.

I think Intel might be in a bad position later on, and ARM will take some of the market where Intel usually is. Nvidia will probably ditch Intel after it's done with its own custom 64-bit ARM chips. Apple will probably do the same. And AMD, who knows what they'll do, HSA?
The next consoles will probably go AMD; well, it would be very smart for AMD to strike a deal with them. Though rumor says the PS4 has APU-inspired AMD hardware, rumors, rumors.

I personally wouldn't mind seeing an ARM-based console though. And don't even say Ouya, obsolete! They should have waited for Tegra 4. Said it from the start. That way, at least it would have been "current" for 2-3 years.

Imagine a tri- or quad-core SoC (Tegra 4/SnappyS4Prime/whatever) and 8GB of RAM with a 5-year product cycle for about $200 or less?

I see, fragmentation...****! This sucks! ='/
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Perf/W isn't linear, but a lot of people seem to treat it as if it is. If processor A uses twice as much power as processor B, but B takes 20% more time to complete a task (so A is 1.2x faster), people will say A is only (1.2 / 2) = 60% as efficient. But that's not really fair, because the faster processor is exercising performance headroom the slower processor doesn't actually have, headroom that you don't actually have to use. What would give you an accurate perf/W comparison would be to measure the power consumption after max clock speed has been capped to normalize the performance on both devices.

Perf/watt is the only thing which matters in the mobile world. Having faster processors is great, but sacrificing battery life will make the SoC uncompetitive.
And that is Intel's problem: with ARM there are tens of OEMs offering different SoCs and ARM implementations: faster with worse perf/watt, or slower with much better perf/watt.

I'd be really curious to see some breakdowns of power consumption at different capped max clock speeds. Here's an example of how very non-linear it can be:

http://beyond3d.com/showpost.php?p=1676823&postcount=200

That is not really new. Higher clocks need higher voltage, and power scales with the square of voltage, so a voltage increase raises power quadratically rather than linearly.
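That scaling can be sketched with the standard CMOS dynamic-power approximation, P ≈ C·V²·f (leakage ignored; the numbers are illustrative only):

```python
def dynamic_power(c: float, v: float, f: float) -> float:
    """Approximate CMOS dynamic power: P = C * V^2 * f (leakage ignored)."""
    return c * v**2 * f

base = dynamic_power(c=1.0, v=1.0, f=1.0)

# A 10% voltage bump alone raises power ~21% (1.1^2), not 10%.
assert round(dynamic_power(1.0, 1.1, 1.0) / base, 2) == 1.21

# A 10% clock bump that also needs 10% more voltage costs ~33% more power.
assert round(dynamic_power(1.0, 1.1, 1.1) / base, 3) == 1.331
```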
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,086
3,928
136
Perf/watt is the only thing which matters in the mobile world. Having faster processors is great, but sacrificing battery life will make the SoC uncompetitive.
And that is Intel's problem: with ARM there are tens of OEMs offering different SoCs and ARM implementations: faster with worse perf/watt, or slower with much better perf/watt.

That is not really new. Higher clocks need higher voltage, and power scales with the square of voltage, so a voltage increase raises power quadratically rather than linearly.

I think SoC perf/watt importance is way overblown. The phone manufacturers need enough performance to deliver the features they want, and the SoC, under normal user conditions, needs to consume a given amount of power relative to the rest of the device. Personally I think the importance of perf/watt for SoCs is very much overstated for phones. Tablets have the opportunity for it to matter more, but if tablets are just e-readers/big phones, then not really.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I don't know if it's a Z2580. It seems there are two different Z2580s, an L and a W model; the L has 2 graphics cores, at least that's what an article I just read said. But god is this thing messed up. A lot of people are saying it's 14nm, which I'll believe when hell freezes over. There seems to be a lot of misinformation about Intel Atoms out there. Some of the reviews are really, really bad, nowhere near the results others are getting. ARM, it seems, has an army of soldiers spreading misinformation, which makes sense. Apple fanboys can be hardcore, along with other nameless fans of one company or another.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Perf/watt is the only thing which matters in the mobile world. Having faster processors is great, but sacrificing battery life will make the SoC uncompetitive.

But like I said, perf/W isn't linear so it's not just a fixed number you can compare. This is going to be especially true for Tegra SoCs with companion cores and big.LITTLE configurations using the cores the same way - perf/W is going to be even less linear when it crosses from the little to big CPU.

I disagree that it's the only thing which matters, too, especially if you're only looking at the CPU. Peak perf of the CPU, peak perf and perf/W of other components (especially the GPU), additional integration options, price, and yes, political standing all matter a lot.

That is not really new. Higher clocks need higher voltage, and power scales with the square of voltage, so a voltage increase raises power quadratically rather than linearly.

It isn't new, but it's something people should think about more. The voltages you need to reach given clocks vary a lot with design and process. Intel deserves a lot of credit for being able to hit 2GHz at the voltages they have; the power consumption of Atom at this point seems very good.

Voltage/frequency isn't necessarily linear either. I think my example is pretty illuminating; look at how much power consumption goes up for that measly 200MHz at the end. Apple kept their 45nm A9s clocked at only 1GHz in tablets and 800MHz in phones, and from this we can see why (it's reasonable to assume the power consumption was basically the same as on Exynos 42xx; die shots reveal the physical implementation was the same, as you'd expect). Samsung was obviously willing to allow huge power consumption at high clock speeds and let the phones get pretty hot, although the 32nm Exynos 44xx brought a huge improvement.

It really does make me feel like the question that should be asked is, "what's the power consumption like with proper voltage tuning at lower clock speeds?", instead of just instantly declaring that the power consumption sucks. What if you get half the power consumption at 1.3GHz? Will it still totally suck then?
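One way to frame that question is with a toy DVFS table. The operating points below are entirely made up, but shaped like real curves, where voltage climbs steeply near the top bin:

```python
# Hypothetical (frequency GHz, voltage V) operating points -- not real chip data.
POINTS = [(1.3, 0.90), (1.6, 1.00), (1.8, 1.10), (2.0, 1.25)]

def task_energy(f_ghz: float, v: float, c: float = 1.0, work: float = 1.0) -> float:
    """Energy for a fixed task: P = C*V^2*f and time = work/f, so E = C*V^2*work."""
    power = c * v**2 * f_ghz
    time = work / f_ghz
    return power * time

energies = [round(task_energy(f, v), 4) for f, v in POINTS]

# Energy per task tracks V^2, so the last 200 MHz is by far the most expensive.
assert energies == sorted(energies)
assert energies[-1] / energies[0] > 1.9  # ~1.9x the energy at the top bin
```

Note that in this model the energy per task depends only on V² (frequency cancels out of power × time), which is exactly why voltage tuning at lower clocks matters so much.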

It also makes me wonder what power consumption will be like on TSMC 28nm, particularly on the G process instead of LP, if we get to see such a thing (maybe on Tegra 4?), since that should allow high clock speeds to be hit at lower voltages. Samsung's move to 28nm will help too, although not as much as their move to 32nm did for Exynos 4.

ARM, it seems, has an army of soldiers spreading misinformation, which makes sense. Apple fanboys can be hardcore, along with other nameless fans of one company or another.

Fake information about upcoming products circulates all the time. It's really unfair to automatically blame the competing companies for this. It's not like Intel fanboys can't be extreme too. But a lot of wrong information probably comes from people misinterpreting or miscommunicating information, and attributing more authority to people than they should have. Passed from person to person, it ends up worse at each step. I've seen cases where random forum posts from totally unconnected people end up as sources, and the post was just someone's wishes or estimations posed in an over-confident fashion. Part of the misinformation could also just be from people looking for attention and loving the impact they have.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
If this is true, it's really not right. Intel has been trying to get its processors inside more smartphones and tablets. However, the company isn't forgetting that it has a long and healthy relationship with Microsoft and its Windows division. This week, as part of its announcements at the Mobile World Congress trade show, Intel announced plans for a new processor and also a chip that could be used inside Windows 8-based tablets.

As reported by News.com, Intel's new smartphone processor is the Z2580 Atom chip. It's got two cores, a clock speed of up to 1.8 GHz, and the PowerVR SGX 544MP2 for graphics. That chip has the code name Clover Trail L, but it likely won't be made available for smartphones until sometime in 2013.

However, Intel also announced a variant of that chip called Clover Trail W. That chip could be released by the end of 2012, but it only has a single-core graphics engine. The story speculates that the chip could be used by PC makers to power a Windows 8-based tablet product.

Intel has not officially commented on its plans for Clover Trail W. However, there are rumors that HP will have at least two x86 tablets with Windows 8 by the end of the year. It's possible that one or both could be using Intel's Clover Trail W processor.
 


MightyMalus

Senior member
Jan 3, 2013
292
0
0
And Vizio is about to launch an AMD Z-60 based Win8 tablet soon.
Things are getting interesting.
Yes, I know the Z-60 is an "old" chip, but it will show us something.
Temash will be quite an upgrade from the Z-60, and it's coming out soonish.

And we all know GPUs are king in entertainment.

A cheap x86 tablet with a worthy GPU: plug it into a TV, dock it with a mouse and keyboard. Oh yeah! Can't wait!

EDIT: Oh snap, I took forever to post this! Jajajaja!
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
Well, it might cost more than other AMD tablets, if any more show up before Temash arrives, and because of the brand name. But probably cheaper than an Intel Atom.

Still, it's x86, and benefits come with that. Heck, if most of my programs, games and IDEs ran on ARM, I'd be all over one.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That's from nVidia's press statement:
Unprecedented Power Efficiency
Designed for maximum energy efficiency, Tegra 4 includes a second-generation battery saver core for low power during standard use, and PRISM 2 Display technology to reduce backlight power while delivering superior visuals.
Tegra 4 consumes up to 45 percent less power than its predecessor, Tegra 3, in common use cases. And it enables up to 14 hours of HD video playback on phones.
http://nvidianews.nvidia.com/Releases/NVIDIA-Introduces-World-s-Fastest-Mobile-Processor-8ed.aspx


But I'm a little disappointed by the lack of information from Nvidia:
no release date for products, nothing on the GPU, nothing on the site.

I hope they will fill in all the blanks in the next few hours.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I didn't like it much when NV compared to a dual-core product and stated they were 2x faster when it was 2 cores vs. 4, but nowadays AMD is constantly comparing their craptastic 8 cores against Intel's 4. Now in tablets we're going to see the same dang thing, with the Z2580 against all-quad-core competition. Silvermont will change it to 4 cores against 4 cores, and no HT.
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
In small setups, it's not about the cores, it's about the GPU.
And if Tegra 4 is as performant as the claims say, omg.
By the way, Project Shield! *drools* I hope that comes out for sure!
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, I just watched the podcast: Silvermont will be out in stores by the 2013 holidays. From the horse's mouth. So that takes care of that made-up story. Windows 8 is actually looking pretty good; I like the presentation. Haswell is also no longer 10 watts; it's been moved to 7 watts, which I figured based on last year's demo. Intel managed 1 extra watt. So there you have it: in the tablet space Intel will dominate the high end, all others are cream puffs. Silvermont will likely also beat the others, giving us two GREAT Intel choices.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Well, I just watched the podcast: Silvermont will be out in stores by the 2013 holidays. From the horse's mouth. So that takes care of that made-up story. Windows 8 is actually looking pretty good; I like the presentation. Haswell is also no longer 10 watts; it's been moved to 7 watts, which I figured based on last year's demo. Intel managed 1 extra watt. So there you have it: in the tablet space Intel will dominate the high end, all others are cream puffs. Silvermont will likely also beat the others, giving us two GREAT Intel choices.

What made-up story are you talking about?

The slide we saw says early 2014 for Bay Trail (all versions). Silvermont is not just Bay Trail; it doesn't include Merrifield. Given that Medfield came out (well) before Clover Trail, it wouldn't be surprising to see a repeat for Silvermont SoCs. Of course this isn't even considering the server chips.
 
Mar 10, 2006
11,715
2,012
126
What made-up story are you talking about?

The slide we saw says early 2014 for Bay Trail (all versions). Silvermont is not just Bay Trail; it doesn't include Merrifield. Given that Medfield came out (well) before Clover Trail, it wouldn't be surprising to see a repeat for Silvermont SoCs. Of course this isn't even considering the server chips.

Bay Trail, according to the presentation at CES, will be available in products on the shelves in "Holiday 2013".

They pulled it in. Nemesis 1 was not "making it up".
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
What made-up story are you talking about?

The slide we saw says early 2014 for Bay Trail (all versions). Silvermont is not just Bay Trail; it doesn't include Merrifield. Given that Medfield came out (well) before Clover Trail, it wouldn't be surprising to see a repeat for Silvermont SoCs. Of course this isn't even considering the server chips.

Watch it, fella. I already told you where I got the info. I wasn't the only one who watched the presentation. So what I said was true, little fanbois spreading rumors. Intel haters. God, it's hard being the best of the best, but that's what Intel is. This matches what Intel said last year.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I think you two are misunderstanding me here. When I said "what made-up story are you talking about," I was asking what Nemesis 1 was referring to when he said this:

"So that takes care of that made-up story."

I'm not saying HE made anything up.

Can't watch the podcast yet, will get to it later today. Just going by what he said: "Silvermont by 2013" by itself doesn't contradict the slides seen earlier.
 

wlee15

Senior member
Jan 7, 2009
313
31
91
Bay Trail, according to the presentation at CES, will be available in products on the shelves in "Holiday 2013".

They pulled it in. Nemesis 1 was not "making it up".

Cedar Trail was supposed to launch in September 2011 but ultimately launched around March 2012.