
Apple A6X


ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
And point 5 brings us back to A6X. Now... disregarding how fast A6 is compared to A15 or S4, how fast do we think A6X would be compared to A6?

We'll have to wait and see on this one since Apple hides their cards so well. I do suspect the primary difference will be the GPU (and the memory controller) but who knows if Apple upgraded the ARM core significantly as well.

As for _how_ Apple did it, perhaps it's time for a new SGX core? Series 6 even maybe? The MP4 is pretty huge so adding more cores seems pretty crazy.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I think we should wait to see how the S4 Pro does in the LG Nexus on Android 4.2 before we can say which is faster, A6 or S4.

There is a possibility the Exynos 5250 will be in the Nexus 10, which should be shown next week if the rumor is true. It will have to be if they don't want that 2560x1600 display to be a lagfest. I don't know at what clocks, though.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Because the original thing that sparked all of this was a simple mention that the Chromebook is faster than the iPhone 5 at SunSpider.

And the rest followed.

I think this is what sums things up best:

1) A15, S4, and A6 are all faster than A9, but only under certain circumstances. Hence the "up to" qualifier. I'm sure you can break things down in Geekbench and find certain tests where A6 and S4 don't really show any advantage over A9 at all (other than clock speed). It remains to be seen how A15 fares.

2) Since A6 is only available on iOS, A15 is only available in one device that's basically running a glorified web browser, and S4 is only available on Android, God knows how we can compare them in a more comprehensive manner. So the three chips are inherently not comparable for now.

3) ARM has made it clear that A15 is a high-performing part designed without much regard for power consumption, so it remains to be seen whether OEMs will be able to "downsize" A15 to fit into phones and tablets without resorting to companion A7 cores. In that regard, either A15 performance will suffer or power consumption will. It's likely we won't see as much of a difference as people expect when A15 actually hits phones. I think we are making A15 look too... magical.

4) It's clear Apple's A6 is at least leading in power consumption compared to other SoCs, and that's for the chip as a whole: CPU, GPU, and RAM, among other things. It may or may not have to do with how Apple implemented their Swift core compared to Krait or A15; basically, I'm saying the power-saving benefits of the A6 may not come from the CPU alone. It remains to be seen whether the A6X will follow suit, because it's evident the A6X will feature increased clock speeds over the A6, at least for the GPU.

And point 5 brings us back to A6X. Now... disregarding how fast A6 is compared to A15 or S4, how fast do we think A6X would be compared to A6?

S4 is in Windows Phone 8 as well.
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
I think the problem isn't that A15 is expected to be faster than S4; it's obvious Apple's Swift core is also faster than S4 where it counts. The problem is that S4 isn't the performance beast we all expected it to be.

If we take A9 as the baseline, then Swift is actually either as fast as A15 or faster. Why? Because ARM could only claim a 40% performance improvement over A9 for A15.

http://www.itproportal.com/2011/03/14/exclusive-arm-cortex-a15-40-cent-faster-cortex-a9/

Apple's Swift showing a 50% performance improvement over A9 clock-for-clock is actually quite awesome.
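To make the "clock-for-clock" comparison concrete, here is a minimal sketch of how a per-GHz normalization works; the scores below are invented purely to match the ~50% figure above and are not real Geekbench results.

[CODE]
# Hypothetical Geekbench-style scores, chosen only to reflect the ~50%
# clock-for-clock claim above; the per-GHz ratios are the point, not the values.
chips = {
    "Cortex A9 @ 1.2 GHz": {"score": 1200, "clock_ghz": 1.2},
    "Swift @ 1.3 GHz":     {"score": 1950, "clock_ghz": 1.3},
}

base = chips["Cortex A9 @ 1.2 GHz"]
base_per_ghz = base["score"] / base["clock_ghz"]

for name, chip in chips.items():
    per_ghz = chip["score"] / chip["clock_ghz"]
    print(f"{name}: {per_ghz:.0f} points/GHz ({per_ghz / base_per_ghz:.2f}x A9 clock-for-clock)")
[/CODE]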



ARM writes "up to", not "definitely". I think the performance advantage roughly depends on how many cores are configured with the SoC, as well as the clock speed.

Unless Samsung runs the Mali T-604 at its absolute maximum performance configuration, I don't think you'll see actual performance reaching that point. In which case, Anand is not completely off base.

Here is an AnTuTu benchmark of a developer device with an Exynos 5 clocked at 1.5 GHz:

http://www.antutu.com/static/attachment/articles/201210/17073045-content.jpg

The GPU part scores more than 2x the Optimus G with the Adreno 320, and that one is already neck and neck with the iPad 3 and iPhone 5. And this is on development drivers and lower clocks.

Anand is wrong, sorry.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
S4 is in Windows Phone 8 as well.

Windows Phone 8 does not exist as of yet, right?

Here is an AnTuTu benchmark of a developer device with an Exynos 5 clocked at 1.5 GHz:

http://www.antutu.com/static/attachment/articles/201210/17073045-content.jpg

The GPU part scores more than 2x the Optimus G with the Adreno 320, and that one is already neck and neck with the iPad 3 and iPhone 5. And this is on development drivers and lower clocks.

Anand is wrong, sorry.

And here is the AnTuTu score of the Galaxy S III running Mali-400...

[Attached screenshot: AnTuTu score of the Galaxy S III]


Incidentally, it's about 1/3 that of Mali T-604.

Apple's A5X is just about 3x faster than Mali-400, so in that regard, Mali T-604 is not going anywhere near 2x A5X performance. It would need to be 6x faster than Mali-400 to get anywhere near that.

Either the Adreno 320 scores in the Optimus G are bugged, or the Adreno 320 sometimes doesn't benchmark that high, which is the norm for every Adreno solution.
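Spelling that arithmetic out (the baseline score is arbitrary; the 3x, 3x, and 2x factors are the ones claimed in this post and by Apple, not measured data):

[CODE]
# Back-of-the-envelope check of the ratios above. Only relative factors matter.
mali_400 = 1.0                # Galaxy S III GPU score, taken as the baseline
mali_t604 = 3.0 * mali_400    # "about 1/3 that of Mali T-604" => T-604 ~3x Mali-400
a5x = 3.0 * mali_400          # "A5X is just about 3x faster than Mali-400"
a6x_claim = 2.0 * a5x         # Apple's claimed 2x GPU uplift over A5X

print(f"T-604 vs A5X:           {mali_t604 / a5x:.2f}x")                # ~1.0x, roughly on par
print(f"Needed to reach 2x A5X: {a6x_claim / mali_400:.0f}x Mali-400")  # 6x
[/CODE]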
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
Windows Phone 8 does not exist as of yet, right?



And here is the AnTuTu score of the Galaxy S III running Mali-400...

[Attached screenshot: AnTuTu score of the Galaxy S III]

Incidentally, it's about 1/3 that of Mali T-604.

Apple's A5X is just about 3x faster than Mali-400, so in that regard, Mali T-604 is not going anywhere near 2x A5X performance. It would need to be 6x faster than Mali-400 to get anywhere near that.

Either the Adreno 320 scores in the Optimus G are bugged, or the Adreno 320 sometimes doesn't benchmark that high, which is the norm for every Adreno solution.

http://images.anandtech.com/graphs/graph6330/50939.png

A5X is 2x faster than Mali-400. Since A6X is supposedly twice as fast, that would mean it's 4x faster than Mali-400. Now add in the fact that the Exynos 5 is actually underclocked here and running unfinished drivers.
 

bearxor

Diamond Member
Jul 8, 2001
6,605
3
81
I think you're forgetting just how mediocre A9's memory controller was. It would be very reasonable for an SGX543MP4 at roughly 333MHz, coupled with Swift's memory controller (expanded to 128 bits), to double A5X's graphics performance in certain situations, primarily those that are raster/ROP-bound as opposed to shader-bound.

But if we look at the A6 in the iPhone 5, it took the better memory controller, a clock speed bump, and one extra GPU core to double the performance of the A5 in the iPhone 4S.

Does it not make sense that it would take similar efforts to double the GPU performance of the A5X?
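As a rough illustration of how those factors could multiply out to ~2x: the core counts (SGX543MP2 in the A5, SGX543MP3 in the A6) follow the post above, while the GPU clocks are assumptions for the sake of the example, not confirmed figures.

[CODE]
# How "one extra core + a clock bump (+ a better memory controller)" compounds.
a5_cores, a6_cores = 2, 3                # SGX543MP2 -> SGX543MP3
a5_gpu_mhz, a6_gpu_mhz = 200, 266        # assumed clocks, illustrative only

core_factor = a6_cores / a5_cores        # 1.5x
clock_factor = a6_gpu_mhz / a5_gpu_mhz   # ~1.33x
print(f"Cores x clock alone: {core_factor * clock_factor:.2f}x")
# ~2.0x before counting memory-bandwidth gains, which help sustain it.
[/CODE]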
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
http://images.anandtech.com/graphs/graph6330/50939.png

A5X is 2x faster than Mali-400. Since A6X is supposedly twice as fast, that would mean it's 4x faster than Mali-400. Now add in the fact that the Exynos 5 is actually underclocked here and running unfinished drivers.

I don't think 1.5GHz is "underclocked" considering the A5X is only clocked at 1GHz and the A6 at 1.3GHz.

But even if you were to run the Exynos 5250 at 2GHz, that's only a 33% clock bump. Multiply the 3x advantage by the 1.33x clock increase and you get close to 4x. So even at 2GHz, the Exynos 5250 is only barely touching A6X performance.
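That multiplication, written out (a sketch only: it assumes GPU performance scales linearly with clock and reuses the rough 3x and 2x factors quoted earlier in the thread):

[CODE]
# Scaling the dev-board result up to a hypothetical 2GHz part, linearly.
t604_vs_mali400 = 3.0          # T-604 advantage over Mali-400 in the AnTuTu run above
clock_scale = 2.0 / 1.5        # hypothetical 2GHz vs the 1.5GHz developer device

scaled_t604 = t604_vs_mali400 * clock_scale
a6x_claim = 2.0 * 2.0          # A5X ~2x Mali-400 (previous post), A6X claimed 2x A5X

print(f"T-604 scaled to 2GHz: {scaled_t604:.2f}x Mali-400")  # ~4x
print(f"A6X claim:            {a6x_claim:.2f}x Mali-400")    # 4x
[/CODE]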
 

amdhunter

Lifer
May 19, 2003
23,332
249
106
What's the point of faster hardware in an iOS device again? I am being 100% serious. Just to move a bunch of icons left and right? Or to run games that already run fast -- even faster?

Give us new features already. Widgets, the ability to drag and drop media files, etc.

Apple lost a fan today.

Did anyone catch that comment where that jerkface said it's amazing how this makes everything else look old?

WTF at the mind games.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
What's the point of faster hardware in an iOS device again?

Well, for starters:

- Run higher resolution games (the iPad 3 has to upscale some)
- Render websites faster (scrolling is smooth but rendering is not)
- Edit more high resolution photos (Photoshop Touch and iPhoto)
- Maybe allow us to edit more than 8 tracks in GarageBand
- Display high resolution PDFs without freezing, lagging, or stuttering
- Allow Apple to push OS features in the future (so yeah, widgets and such)
- Make Google Chrome run as fast as Mobile Safari (yeah, I like Chrome better so sue me)

There are all sorts of reasons why Apple should push hardware.

I agree that this update came too abruptly, though. Plus the make-everything-else-look-old comments were uncalled for. In fact, many of the Android references and comparisons this time were uncalled for. It was like they got fed up with investors hounding them to make a smaller iPad, so they flamed Android for the heck of it.

But that doesn't really undermine the significance of their new hardware IMO.
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
I don't think 1.5GHz is "underclocked" considering the A5X is only clocked at 1GHz and the A6 at 1.3GHz.

But even if you were to run the Exynos 5250 at 2GHz, that's only a 33% clock bump. Multiply the 3x advantage by the 1.33x clock increase and you get close to 4x. So even at 2GHz, the Exynos 5250 is only barely touching A6X performance.

It's underclocked because the finished chip is running at 1.7 GHz in the Chromebook. It's also been officially announced at 1.7 GHz by Samsung. Not sure what Apple's clock frequencies have to do with the matter.

You are also ignoring the driver issue. T604 is a brand-new architecture, so the difference between a beta driver and a release driver can be substantial. The Galaxy S II got a 30-40% increase in GPU performance just from a driver update this year. Doubling frequency like Apple is doing won't double things like fill rate; those advantages come from more efficient architectures.

So again, things are far from settled the way you seem to think.
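A toy model of the "doubling the clock won't double fill rate" point: peak fill rate scales with clock, but the measured number gets capped once memory bandwidth runs out. Every figure here (pixels per clock, bandwidth, clocks) is made up for illustration and is not a spec of any chip in this thread.

[CODE]
# Illustrative only: fill rate is the lesser of what the ROPs can push and
# what the memory bus can feed.
def effective_fillrate(clock_mhz, pixels_per_clock, bandwidth_gbs, bytes_per_pixel=4):
    peak = clock_mhz * 1e6 * pixels_per_clock             # pixels/s if never stalled
    bandwidth_cap = bandwidth_gbs * 1e9 / bytes_per_pixel  # pixels/s the bus can feed
    return min(peak, bandwidth_cap)

base = effective_fillrate(250, pixels_per_clock=8, bandwidth_gbs=12.8)
doubled = effective_fillrate(500, pixels_per_clock=8, bandwidth_gbs=12.8)
print(f"Speedup from doubling clock alone: {doubled / base:.2f}x")  # 1.60x here, not 2x
[/CODE]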
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
The S4 is lacking in SunSpider because of the browser and OS used. I got 1200 on my old Nexus, which uses a POS old A9 clocked at 1.2GHz, running official Jelly Bean.

I ran a leaked Jelly Bean ROM on my GS3 with the S4 and it went from 1600 to 1300, and that was a bloated leaked ROM running Samsung's browser.

Wait for official Nexus S4 hardware with the official Android build and browser for official SunSpider marks.

Once we get a Nexus device with S4 hardware, you will see scores under 500 with tweaked kernels and overclocked CPUs.

So to wrap this up: my old Galaxy Nexus has an old A9 dual-core CPU clocked at 1.2GHz and it beats the living hell out of my GS3 in SunSpider. That sums up how much that benchmark depends on OS and browser optimization, when an old junk CPU can outscore a monster S4 or S4 Pro with the right OS and browser tweaks.
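Taking the poster's own numbers at face value (SunSpider reports milliseconds, lower is better, and these are uncontrolled single runs), the software swing works out to roughly:

[CODE]
# SunSpider times quoted in the post above (ms, lower is better); anecdotal.
gs3_stock_rom = 1600
gs3_leaked_jb_rom = 1300
galaxy_nexus_official_jb = 1200

rom_gain = (gs3_stock_rom - gs3_leaked_jb_rom) / gs3_stock_rom
print(f"GS3 gain from the ROM/browser change alone: {rom_gain:.0%}")  # ~19%
print(f"GS3 (leaked ROM) vs old-A9 Galaxy Nexus: "
      f"{gs3_leaked_jb_rom / galaxy_nexus_official_jb:.2f}x the Nexus's time")
[/CODE]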
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
It's underclocked because the finished chip is running at 1.7 GHz in the Chromebook. It's also been officially announced at 1.7 GHz by Samsung. Not sure what Apple's clock frequencies have to do with the matter.

You are also ignoring the driver issue. T604 is a brand-new architecture, so the difference between a beta driver and a release driver can be substantial. The Galaxy S II got a 30-40% increase in GPU performance just from a driver update this year. Doubling frequency like Apple is doing won't double things like fill rate; those advantages come from more efficient architectures.

So again, things are far from settled the way you seem to think.

So you are saying increasing clock frequency doesn't improve fill rate? Please explain this, then:

[Attached graph: http://images.anandtech.com/graphs/graph6324/49977.png]


[Attached graph: 49979.png]


Also bear in mind A6 doesn't have as much memory throughput as A5X.
 

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0
Is the A6 what's in the new iPods? I was checking out the new ones yesterday and almost bought one because it seemed so fast, well, until I realized I basically have no use for it...
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
My guess is that it's two Swift cores with the same SGX543MP4 and a 128-bit memory bus. Remember, the iPad 3's SoC is still 45nm, so a shrink should allow them to clock the GPU much higher, giving them the 2x graphics performance.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Is the A6 what's in the new iPods? I was checking out the new ones yesterday and almost bought one because it seemed so fast, well, until I realized I basically have no use for it...

No. The new iPod touch has the A5. The A6 is a lot faster than the A5.

My guess is that it's two Swift cores with the same SGX543MP4 and a 128-bit memory bus. Remember, the iPad 3's SoC is still 45nm, so a shrink should allow them to clock the GPU much higher, giving them the 2x graphics performance.

The 128-bit memory bus seems like a sure thing. I'm just wondering whether Apple will bump the clock speed for both the CPU and GPU or just the GPU. Their iPad CPUs have consistently been clocked about 20% higher than their iPhone counterparts.

But to really achieve the 2x GPU performance they claim, it takes more than a 20% clock bump for the GPU.
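To put a number on that last sentence: a minimal sketch assuming nothing else changes and GPU performance scales linearly with clock. The 250MHz A5X GPU clock is the commonly cited figure and is treated here as an assumption.

[CODE]
# If the A6X kept the same SGX543MP4 and leaned on clock alone, a straight 2x
# would need roughly double the clock, far beyond an iPad-style ~20% bump.
a5x_gpu_mhz = 250                        # assumed A5X GPU clock
ipad_style_bump = 1.2                    # ~20%, like the CPU side
target = 2.0                             # Apple's claimed 2x GPU uplift

print(f"20% bump gets you:   {ipad_style_bump:.1f}x ({a5x_gpu_mhz * ipad_style_bump:.0f} MHz)")
print(f"Clock needed for 2x: {a5x_gpu_mhz * target:.0f} MHz (under linear scaling)")
[/CODE]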
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
No. The new iPod touch has the A5. The A6 is a lot faster than the A5.



The 128-bit memory bus seems like a sure thing. I'm just wondering whether Apple will bump the clock speed for both the CPU and GPU or just the GPU. Their iPad CPUs have consistently been clocked about 20% higher than their iPhone counterparts.

But to really achieve the 2x GPU performance they claim, it takes more than a 20% clock bump for the GPU.

So probably a 1.56GHz CPU and a 400MHz GPU. I guess that's okay, but the PowerVR Series 5 is getting rather long in the tooth. I think the highest the 5XT series has been clocked in a shipping product is 533MHz in Intel Clover Trail chips, so we really need PowerVR Series 6 to keep clock speeds down and get better performance-per-watt ratios.
 

Mopetar

Diamond Member
Jan 31, 2011
8,496
7,753
136
So probably a 1.56GHz CPU and a 400MHz GPU. I guess that's okay, but the PowerVR Series 5 is getting rather long in the tooth. I think the highest the 5XT series has been clocked in a shipping product is 533MHz in Intel Clover Trail chips, so we really need PowerVR Series 6 to keep clock speeds down and get better performance-per-watt ratios.

Any idea when the 6 series is going to start shipping? The most recent information doesn't have an expected launch date. Looks to be a monster, whenever it comes out.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
So probably a 1.56GHz CPU and a 400MHz GPU. I guess that's okay, but the PowerVR Series 5 is getting rather long in the tooth. I think the highest the 5XT series has been clocked in a shipping product is 533MHz in Intel Clover Trail chips, so we really need PowerVR Series 6 to keep clock speeds down and get better performance-per-watt ratios.

Well, it'll likely be one of the fastest mobile GPUs (if not the fastest) for a good while.

There is no indication that PowerVR Series 6 GPUs will be ready any time soon. If we end up having to wait another 12 months, then I think I'd rather have the best mobile GPU that's available right now.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Any idea when the 6 series is going to start shipping? The most recent information doesn't have an expected launch date. Looks to be a monster, whenever it comes out.
Some time in 2013. We'll probably see the first announcements at CES, though it's anyone's guess how long after that it will be until those products actually ship.
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
So you are saying increasing clock frequency doesn't improve fill rate? Please explain this, then:

No, that's you putting words in my mouth. I said simply doubling frequency won't double fill rate, not that it won't improve at all.


[Attached graph: http://images.anandtech.com/graphs/graph6324/49977.png]

[Attached graph: 49979.png]


Also bear in mind A6 doesn't have as much memory throughput as A5X.

Adding one extra core with more ALUs/TMUs is going to help fill rate a hell of a lot more than frequency alone, so I'm not sure what exactly you think you are showing me.

At this point you have painted yourself into a corner with your absolute statements and might end up looking pretty foolish in a few months.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
No, that's you putting words in my mouth. I said simply doubling frequency won't double fill rate, not that it won't improve at all.

Adding one extra core with more ALUs/TMUs is going to help fill rate a hell of a lot more than frequency alone, so I'm not sure what exactly you think you are showing me.

At this point you have painted yourself into a corner with your absolute statements and might end up looking pretty foolish in a few months.

I'm simply showing hard data.

If I'm proven wrong in a few months, I'll gladly accept it. It's not like I gain or lose anything for being "right" anyway. I was proven "wrong" with the A5X, after all.

On the other hand, I'm wondering... why the insistence that Mali T-604 MUST somehow be faster than, or comparable to, a double-clocked SGX543MP4? It doesn't look that way to me... unless there is that magical 30% improvement with drivers and the chip runs at higher clocks. Like you said, the final parts are 1.7GHz, and 1.5GHz to 1.7GHz is hardly a big jump in clock speed. Plus, the drivers on the sample might not have been that bad, so a 30% improvement may not be likely at all.
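Compounding the two best-case factors in that paragraph, and assuming linear scaling with clock (the 3x and 6x reference points are the ones this poster used earlier; the competing post used 2x and 4x instead):

[CODE]
# Best case for the dev-board T-604 result: production clock plus a 30% driver gain.
dev_clock, final_clock = 1.5, 1.7
driver_gain = 1.30                       # the "magical" 30%, taken as an upper bound

best_case = (final_clock / dev_clock) * driver_gain
print(f"Best-case uplift over the dev-sample score: {best_case:.2f}x")  # ~1.47x
print(f"T-604 vs Mali-400 in that best case: {3.0 * best_case:.1f}x "
      f"(vs ~6x needed to match a 2x A5X)")
[/CODE]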
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
I already said in my first post that it's entirely possible the A6X is faster (GPU-wise), just not by the same lead the A5X has enjoyed over its competitors. You are the one who has taken an absolute stance, not me.

As for Cortex A15 vs Swift, the 40% increase ARM claimed is for DMIPS alone. Samsung's paper says a 1.7 GHz Cortex A15 is 2x faster than a 1.4 GHz Cortex A9.
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
A15 is certainly faster than Swift, but Swift is here now in a smartphone, whereas A15 isn't until Q1 2013.

If Apple hadn't designed their own chip, we wouldn't have the iPhone 5 right now. I think with the A6 and A6X, Apple has shown that they're very good at designing their own chips.