Apple A5X SoC

smartpatrol

Senior member
Mar 8, 2006
870
0
0
Seems pretty obvious that Infinity Blade 2 is not running at the full 2048x1536. Not complaining though, it still looks damn good.
 

smartpatrol

Senior member
Mar 8, 2006
870
0
0
[Attached screenshots: 5008photo2.PNG, 1639photo.PNG]


Here's a couple I took. The second is from Mass Effect Infiltrator.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Yeah. I don't know what the deal is with Infinity Blade 2. There are some scenes that look like they were blown up from the iPad 2, and then there are some scenes that look legitimately good. And it varies. I mean... Look at the character model. There are some lines on there that look razor sharp, and there are some others that look obviously pixelated.

And overall, image quality is just bad in IB2. Most textures look obviously low res, and a lot of interface elements are still in low res.

In terms of consistency, I find the other games do a much better job.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Well, the verdict is in: the A5X is indeed faster than or equal to Tegra 3.

http://liliputing.com/2012/03/are-ipads-a5x-graphics-really-4x-faster-than-nvidia-tegra-3.html

It's still a bunch of static benchmarks, but it shows just how poorly threaded mobile applications are. Tegra 3, with 4 CPU cores, should burst ahead, but instead it only matches the A5X in CPU performance, and it gets totally massacred (though not by a full 4x) in GPU performance.

So I guess A5X will remain king for now.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
I'm actually amazed: four 1.3GHz Cortex-A9 cores weren't a lot faster than two 1GHz Cortex-A9 cores. In which case, I guess I have to agree that Apple wouldn't want to move to four CPU cores or slightly faster cores at all: the difference shown there is so minimal, but the thermal difference would have fried the iPad 3.
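
A back-of-the-envelope Amdahl's law check shows why (the 30% parallel fraction below is just my assumption for a poorly threaded mobile workload): speedup over a single core is S(N) = 1 / ((1 - p) + p/N). With p = 0.3, two cores give S(2) = 1/0.85 ~ 1.18 and four cores give S(4) = 1/0.775 ~ 1.29. Even after folding in the 1.3x clock advantage, the quad works out to roughly (1.29 x 1.3) / 1.18 ~ 1.4x the dual, nowhere near the 2.6x you'd naively expect from doubling the cores and raising the clocks.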
 

Steelbom

Senior member
Sep 1, 2009
455
22
81
runawayprisoner said:
I'm actually amazed: four 1.3GHz Cortex-A9 cores weren't a lot faster than two 1GHz Cortex-A9 cores. In which case, I guess I have to agree that Apple wouldn't want to move to four CPU cores or slightly faster cores at all: the difference shown there is so minimal, but the thermal difference would have fried the iPad 3.
The reason the Tegra 3 is quad-core is to make up for the weaker GPU. Games can take advantage of the CPU for some tasks as well. There are some benefits to quad-core CPUs but... not many right now.

I'm hoping Apple goes for a dual-core Cortex A15, and not a quad-core A15.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Steelbom said:
I'm hoping Apple goes for a dual-core Cortex A15, and not a quad-core A15.

I'm hoping for the same thing for the A6 as well. Since iOS doesn't really allow you to multi-task much, Apple doesn't really need a ton of cores. Multiple cores have become rather popular in computers because we tend to multi-task a lot. I remember when I first got my Athlon 64 X2, and it was heaven over my older Athlon 64! While quad-core wasn't the same heavenly experience, it does offer a lot cleaner performance in certain applications or heavier multi-tasking.
 

Steelbom

Senior member
Sep 1, 2009
455
22
81
Aikouka said:
I'm hoping for the same thing for the A6 as well. Since iOS doesn't really allow you to multi-task much, Apple doesn't really need a ton of cores. Multiple cores have become rather popular in computers because we tend to multi-task a lot. I remember when I first got my Athlon 64 X2, and it was heaven over my older Athlon 64! While quad-core wasn't the same heavenly experience, it does offer a lot cleaner performance in certain applications or heavier multi-tasking.
That's true, but iOS is quite heavily threaded. There are plenty of APIs available for multi-threading; simply put, though, there just aren't a whole lot of uses for a quad-core aside from games, and those uses can be pushed onto a better GPU.
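
For anyone curious, here's roughly what that plumbing looks like. A minimal Grand Central Dispatch sketch in C (the chunk count and the toy workload are placeholders, not anything from a real engine):

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    /* Toy worker: a real game would decode textures, run physics, etc. */
    static void do_chunk(void *ctx) {
        printf("processing chunk %ld\n", (long)ctx);
    }

    int main(void) {
        /* GCD maps queued work onto however many cores are actually there. */
        dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        for (long i = 0; i < 4; i++)
            dispatch_group_async_f(group, q, (void *)i, do_chunk);

        /* Block until every chunk has finished. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        dispatch_release(group);
        return 0;
    }

The OS-level plumbing is there; whether app code actually splits its work up like this is another matter, which is why extra cores so often sit idle.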
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Looks like all that nonsense about games looking bad on the iPad didn't come true.

1024x768 on a 10" screen already looked pretty good. Combine that with the easy scaling, and I really doubt any developers will use the full resolution for high-end games.
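
The scaling math is trivial: 2048x1536 is exactly double 1024x768 in each dimension (3,145,728 pixels vs. 786,432, i.e. 4x), so a 1024x768 frame upscales with every logical pixel mapping to a clean 2x2 block, no fractional filtering needed.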
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Steelbom said:
The reason the Tegra 3 is quad-core is to make up for the weaker GPU. Games can take advantage of the CPU for some tasks as well. There are some benefits to quad-core CPUs but... not many right now.

The Tegra 3's GPU is the best available for Android right now. So what is there to make up for? Android games can't expect better.

Also, the quad-core can help a lot: when I force all the cores on manually, it FLIES! The problem is that it also eats the battery alive in that mode, which is why it doesn't do that most of the time.

The Tegra 3, at its full 1.6GHz, eats any other mobile SoC alive on the CPU side.
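
For reference, "forcing all the cores on" on a rooted Tegra 3 tablet boils down to writing the kernel's CPU hotplug files in sysfs. A minimal sketch in C, assuming the stock /sys/devices/system/cpu layout and root access (exact paths can vary by kernel build):

    #include <stdio.h>

    /* Bring a secondary core online (or take it offline) via the
       Linux CPU hotplug interface. Needs root. */
    static int set_cpu_online(int cpu, int online) {
        char path[64];
        snprintf(path, sizeof path, "/sys/devices/system/cpu/cpu%d/online", cpu);
        FILE *f = fopen(path, "w");
        if (!f) return -1;
        fputs(online ? "1" : "0", f);
        fclose(f);
        return 0;
    }

    int main(void) {
        for (int cpu = 1; cpu < 4; cpu++)  /* cpu0 stays online by default */
            set_cpu_online(cpu, 1);
        return 0;
    }

The battery hit follows directly: with all four cores pinned online, the hotplug logic can never park them, or fall back to the low-power companion core, when the load is light.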
 

Steelbom

Senior member
Sep 1, 2009
455
22
81
poofyhairguy said:
The Tegra 3's GPU is the best available for Android right now. So what is there to make up for? Android games can't expect better.

Also, the quad-core can help a lot: when I force all the cores on manually, it FLIES! The problem is that it also eats the battery alive in that mode, which is why it doesn't do that most of the time.

The Tegra 3, at its full 1.6GHz, eats any other mobile SoC alive on the CPU side.
As in, compared to the PowerVR SGX543MP2, it's somewhat lacking. However, the CPU is able to handle some tasks a GPU might normally do as well, like dynamic lighting, etc.

So it's in a sense making up for the GPU not being as powerful as other solutions. It allows it to do more than it could if it only had a dual-core CPU.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
That's not a comparison, the Android version is essentially a paid ad for nVidia. =/

This is a very valid point. It is much like GLBenchmark, only nVidia pays for stupid things like games while PowerVR pays for a benchmark. Who needs games when you have benchmarks to score high on!
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
Do you have proof that PowerVR paid for GLBenchmark?

"Paid" is a bit of a stretch, maybe. But Kishonti, the company behind GLBenchmark, is a key partner of Imagination Technologies (IMG). You can find various press releases about them working together.

Either way, it's clear beyond a shadow of a doubt that the benchmarks are inconsistent and contradict each other, regardless of whether it's Adreno vs. Nvidia or IMG vs. Nvidia. We need better tools to measure performance.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I'm sure that if GLBenchmark was so biased, we wouldn't see Anand use it for his reviews.

With limited choice, and manufacturers most likely pushing reviewers to use certain benchmarks in their "reviews", there's not much surprise that it's used a lot.

GLBenchmark is widely known to favor the PowerVR architecture, which makes it just as dubious as using Tegra-optimized titles. Gotta complain about the use of both if you want to be fair, or neither.
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
311
1
76
I'm sure that if GLBenchmark was so biased, we wouldn't see Anand use it for his reviews.

Have you been skipping the parts in his reviews where he constantly complains that the benchmarks aren't accurate and says he would like to see better tools to benchmark GPU performance?
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
DeathReborn said:
With limited choice, and manufacturers most likely pushing reviewers to use certain benchmarks in their "reviews", there's not much surprise that it's used a lot.

GLBenchmark is widely known to favor the PowerVR architecture, which makes it just as dubious as using Tegra-optimized titles. Gotta complain about the use of both if you want to be fair, or neither.

I keep reading that GLBenchmark favors the PowerVR architecture, yet I'm not sure I "know" anything like that. Mind enlightening me?

ITHURTSWHENIP said:
Have you been skipping the parts in his reviews where he constantly complains that the benchmarks aren't accurate and says he would like to see better tools to benchmark GPU performance?

http://www.anandtech.com/show/4216/...ance-explored-powervr-sgx543mp2-benchmarked/2

I think he wrote otherwise...

Anand said:
GLBenchmark 2.0 is the best example of an even remotely current 3D game running on this class of hardware—and even then this is a stretch. If you want an idea of how the PowerVR SGX 543MP2 stacks up to the competition however, GLBenchmark 2.0 is probably going to be our best bet (at least until we get Epic to finally release an Unreal Engine benchmark).

Note that he wrote about the same thing for the SGS2 review:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17

I haven't read anywhere that he complained about GLBenchmark being a biased benchmark. The most I have read was this:

Anand said:
It's obvious that GLBenchmark is designed first and foremost to be bound by shader performance rather than memory bandwidth, otherwise all of these performance increases would be capped at 2x since that's the improvement in memory bandwidth from the 4 to the 4S. Note that we're clearly not overly bound by memory bandwidth in these tests if we scale pixel count by 50%, which is hardly realistic. Most games won't be shader bound, instead they should be more limited by memory bandwidth.

In his iPhone 4S review:
http://www.anandtech.com/show/4971/apple-iphone-4s-review-att-verizon/6
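
(To unpack the logic there: the 4S has twice the memory bandwidth of the 4, so a purely bandwidth-bound benchmark could speed up by at most 2x. The GLBenchmark gains he measured are bigger than that, which is only possible if shader throughput, not bandwidth, is the limiter.)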

It's possible that GLBenchmark is limited in that it doesn't fully take into account other parts of the whole system, yes, but I have not read anywhere that Anand specifically states he thinks GLBenchmark is pro-PowerVR.