Why don't more people use the PowerVR SGX543MPX?

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,027
0
76
Like, it's been obvious for a long time that the SGX543 is a great GPU. Apple's been using them since forever, but practically nobody else is using a PowerVR GPU, and if they are, they're using the SGX540 in their crappy, lower end phones. Does ARM have a thing where their licensees can't use these GPUs or does PowerVR have a weird licensing agreement with Apple?

I know the Adreno 320 is good, but the 543 apparently came out in Jan 2009 according to Wiki, that's like the Dark Ages in terms of mobile computing. So why has nobody been using this GPU, or at least making something that works faster, for so long?
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
I know the Adreno 320 is good, but the 543 apparently came out in Jan 2009 according to Wiki, that's like the Dark Ages in terms of mobile computing. So why has nobody been using this GPU, or at least making something that works faster, for so long?

Because the SGX543MPX series still offers plenty of performance, and all those years have given it stable drivers and implementations.

When something works, you don't just replace it.

And Sony uses the 543 in their Vita as well. That handheld kicks arse.
 

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,027
0
76
Because the SGX543MPX series still offers plenty of performance, and all those years have given it stable drivers and implementations.

When something works, you don't just replace it.

And Sony uses the 543 in their Vita as well. That handheld kicks arse.

Exactly. So:

1) Why has nobody (except Apple and apparently Sony) been using it?
2) Why can't anyone come up with anything better?

Wiki says it can go up to 16-module configurations, I'm assuming nobody's going to use that for mobile devices because of die size and power consumption. But as we've seen, even the SGX543MP3 in the iPhone 5 is pretty decent. It keeps up with Adreno 320 mostly, and basically wipes the floor with everything else.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Because if you look at things realistically, everybody else makes only Android handsets and sometimes tablets.

Android doesn't make extensive use of the GPU yet (yeah yeah, I know what Google advertises...). Disregarding the interface, there aren't a lot of games or 3D applications.

So the problem is not that nobody is gunning for more GPU power, but that having more GPU power on an Android handset doesn't really make much sense.

It's different for iOS (iPhone, iPod and iPad) because they have plenty of games in the App Store, and many of them can bring even the iPad 3 (SGX543MP4) to its knees. It's absolutely necessary for Apple to keep GPU performance up.

It's also different for Sony because their handheld device is made for... gaming.

So I think it's more of a use case problem rather than that nobody uses it. Maybe the landscape will change when Google introduces the Nexus 10. If and only if that device comes with a decent GPU.
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
(1) It's huge. Apple uses a *far* larger die size budget than anyone else to get their GPU speed. There's nothing inherently faster about the design.

(2) Actually, TI does. It's just that Qualcomm's LTE dominance has kept Android releases on their silicon much of this year. And Nvidia obviously has their own design, and Samsung has gone fully with ARM.

But (1) is the key.
 

Jodell88

Diamond Member
Jan 29, 2007
8,762
30
91
Like, it's been obvious for a long time that the SGX543 is a great GPU. Apple's been using them since forever, but practically nobody else is using a PowerVR GPU, and if they are, they're using the SGX540 in their crappy, lower end phones. Does ARM have a thing where their licensees can't use these GPUs or does PowerVR have a weird licensing agreement with Apple?

I know the Adreno 320 is good, but the 543 apparently came out in Jan 2009 according to Wiki, that's like the Dark Ages in terms of mobile computing. So why has nobody been using this GPU, or at least making something that works faster, for so long?
Intel Atoms use it, but not for long. Intel is moving to an Ivy Bridge-based graphics core for the next release of their Atoms. :)
 

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,027
0
76
Because if you look at things realistically, everybody else makes only Android handsets and sometimes tablets.

Android doesn't make extensive use of the GPU yet (yeah yeah, I know what Google advertises...). Disregarding the interface, there aren't a lot of games or 3D applications.

So the problem is not that nobody is gunning for more GPU power, but that having more GPU power on an Android handset doesn't really make much sense.

It's different for iOS (iPhone, iPod and iPad) because they have plenty of games in the App Store, and many of them can bring even the iPad 3 (SGX543MP4) to its knees. It's absolutely necessary for Apple to keep GPU performance up.

It's also different for Sony because their handheld device is made for... gaming.

So I think it's more of a use case problem rather than that nobody uses it. Maybe the landscape will change when Google introduces the Nexus 10. If and only if that device comes with a decent GPU.
Chicken vs. egg.

Prior to my SGS3, I never wanted to play 3D games on my Android phone because it simply couldn't handle a 3D game. My SGS could barely handle Tetris without lag.

(1) It's huge. Apple uses a *far* larger die size budget than anyone else to get their GPU speed. There's nothing inherently faster about the design.

(2) Actually, TI does. It's just that Qualcomm's LTE dominance has kept Android releases on their silicon much of this year. And Nvidia obviously has their own design, and Samsung has gone fully with ARM.

But (1) is the key.
Didn't realize. So how big is it, actually? And how big are competitors like the Mali-400? And why does Apple tolerate larger die sizes than other mobile players?

Intel Atoms use it, but not for long. Intel is moving to an Ivy Bridge-based graphics core for the next release of their Atoms. :)

Actually, Medfield used the SGX540.
 

ITHURTSWHENIP

Senior member
Nov 30, 2011
310
0
0
ST-Ericsson has been working on their A9600 chip for a while now. 2 GHz Cortex A15 with Series 6 "Rogue" at 210 gflops. Hopefully someone buys those chips
 

stormkroe

Golden Member
May 28, 2011
1,550
97
91
(1) It's huge. Apple uses a *far* larger die size budget than anyone else to get their GPU speed. There's nothing inherently faster about the design.

(2) Actually, TI does. It's just that Qualcomm's LTE dominance has kept Android releases on their silicon much of this year. And Nvidia obviously has their own design, and Samsung has gone fully with ARM.

But (1) is the key.

Totally agree. Keep in mind that the 543 isn't ahead of anything on its own; they have to double, triple, or quadruple them to get those results. Adreno 320 often matches the MP4, and it's a single GPU. I imagine the same will happen with the Mali T604.
Brute force works great, but not everyone has the die/battery budget to go for it.
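That brute-force trade-off can be sketched with a rough back-of-the-envelope model. All numbers below are illustrative assumptions (a hypothetical 8 GFLOPS / 8 mm² core and a made-up scaling-efficiency factor), not measured figures:

```python
# Rough sketch: multi-core GPU "brute force" scaling.
# Aggregate throughput grows roughly linearly with core count, but so
# do die area and power -- which is why not everyone can afford it.

def scale(per_core_gflops, per_core_mm2, cores, efficiency=0.9):
    """Estimate aggregate throughput and die area for an MPx config.
    `efficiency` models imperfect scaling from shared memory bandwidth
    (an assumed value, applied once per additional core)."""
    gflops = per_core_gflops * cores * (efficiency ** (cores - 1))
    area = per_core_mm2 * cores
    return gflops, area

# Hypothetical single core: 8 GFLOPS, 8 mm^2
for n in (1, 2, 3, 4):
    gflops, area = scale(8.0, 8.0, n)
    print(f"MP{n}: ~{gflops:.1f} GFLOPS, ~{area:.0f} mm^2")
```

The point of the sketch: the MP4 buys its lead with roughly 4x the silicon, and even then scaling is sub-linear, so a newer single-core design can close the gap at a fraction of the area.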
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Chicken vs. egg.

Prior to my SGS3, I never wanted to play 3D games on my Android phone because it simply couldn't handle a 3D game. My SGS could barely handle Tetris without lag.

Except it isn't chicken vs egg. Even with good GPUs on the market (Tegra 3, Exynos 4, and soon Adreno 320), developers still aren't interested in making high-end 3D games for Android. They were pushing hard-core OpenGL ES 2.0 games even on the iPhone 3GS with a puny SGX 535...

Didn't realize. So how big is it, actually? And how big are competitors like the Mali-400? And why does Apple tolerate larger die sizes than other mobile players?

A5X is 162.94mm^2
A5 (45nm) is 120mm^2
A5 (32nm) is 69mm^2
A6 is 92mm^2
Tegra 3 is 80mm^2
Exynos 4210 (dual-core) is 118mm^2
Exynos 4412 (quad-core) is approximately 144mm^2
Intel Ivy Bridge quad is 160mm^2
 

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,027
0
76
Except it isn't chicken vs egg. Even with good GPUs on the market (Tegra 3, Exynos 4, and soon Adreno 320), developers still aren't interested in making high-end 3D games for Android.
There are plenty of high-end 3D games for Android, given a quick search. It's just that most of them are of no interest to me at all. But if I see a game I like, I still want to play it.

A5X is 162.94mm^2
A5 (32nm) is 69mm^2
A6 is 92mm^2
Tegra 3 is 80mm^2
Intel Ivy Bridge quad is 160mm^2
Didn't realise quad-core Ivy was so small.
 

bearxor

Diamond Member
Jul 8, 2001
6,605
3
81
(1) It's huge. Apple uses a *far* larger die size budget than anyone else to get their GPU speed. There's nothing inherently faster about the design.

(2) Actually, TI does. It's just that Qualcomm's LTE dominance has kept Android releases on their silicon much of this year. And Nvidia obviously has their own design, and Samsung has gone fully with ARM.

But (1) is the key.

I agree/disagree with you.

I think both reasons are correct. But I think 2 is the key.

Let's look at who designs their own chips in this space (big players): Apple, TI (well...), Qualcomm, Samsung, and nVidia.

So, Samsung has a GPU based off ARM's GPU design. nVidia has their own take. Qualcomm has their own graphics core. That leaves TI and Apple as the last big players that don't develop their own GPU core, so they need to license it. Both chose PowerVR.

So what happens is that Qualcomm's LTE chipsets have been the standard in the states for a while, at least the past two years starting with the S3 and now the S4. So, basically, if you're going to buy the Qualcomm chipset regardless, you're getting the Qualcomm GPU.

Now that Qualcomm is starting to deliver the LTE radio on its own at 28nm, I think this is likely to change over the next year. Just look at the iPhone 5 and the Note II as examples.

Didn't realize. So how big is it, actually? And how big are competitors like the Mali-400? And why does Apple tolerate larger die sizes than other mobile players?

GPUs are huge regardless.
This is the A5X (MP4)

This is the A6 (MP3)


One thing to note about the A6 is the hand-laid-out ARM core. That shows Apple is willing to give up the larger die space for performance. Just compare the size of the dual-core in the A6 to the A5X!

As to why Apple seems more willing to do this, I have no idea.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
There are plenty of high-end 3D games for Android, given a quick search. It's just that most of them are of no interest to me at all. But if I see a game I like, I still want to play it.

Yeah, but they actually have different versions for different devices.

And there are still a lot more on Apple's side... even counting the ones that Android has as multiplatform games.

The truly taxing games look really good on Apple's side. Some of them are clearly console-level (as in PS3 and 360 level).

Didn't realise quad-core Ivy was so small.

Yeah. So the chip Apple uses in their rMBP 15" is actually smaller than the chip they use in their iPad 3. And the quad-core Ivy is probably more than 10x faster.

Imagine that...
 

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,027
0
76
Your question is geared towards manufacturers, not consumers! :p
Yeah, that's why I asked it here. Given it is a tech forum...

If I wanted to ask consumers, I'd ask myself. :)

Yeah, but they actually have different versions for different devices.

And there are still a lot more on Apple's side... even counting the ones that Android has as multiplatform games.

The truly taxing games look really good on Apple's side. Some of them are clearly console-level (as in PS3 and 360 level).
Well, I played some of those games when I had an iPhone, and when borrowing other people's iPhones. They're not that great. Phones don't really lend themselves to gaming, since any space you use for controls takes away from screen space.

Yeah. So the chip Apple uses in their rMBP 15" is actually smaller than the chip they use in their iPad 3. And the quad-core Ivy is probably more than 10x faster.

Imagine that...
Yeah, never thought about that.

Looks like Intel's going to do well in the mobile space. And in every other space, it seems :p

I agree/disagree with you.

I think both reasons are correct. But I think 2 is the key.

Let's look at who designs their own chips in this space (big players), Apple, TI (well...), Qualcomm, Samsung, and nVidia.

So, Samsung has a GPU based off ARM's GPU design. nVidia has their own take. Qualcomm has their own graphics core. That leaves TI and Apple as the last big players that don't develop their own GPU core, so they need to license it. Both chose PowerVR.
Yeah, but TI doesn't have an SoC using the SGX543 yet, if I remember correctly; they're all using the SGX540.

And everyone else can still license from PowerVR. For example, Intel licensed an SSD controller from SandForce for the 520 (I think?) while they were developing their next controller. Why can't they do that for mobile GPUs?

GPUs are huge regardless.
This is the A5X (MP4)

This is the A6 (MP3)

One thing to note about the A6 is the hand-laid-out ARM core. That shows Apple is willing to give up the larger die space for performance. Just compare the size of the dual-core in the A6 to the A5X!

As to why Apple seems more willing to do this, I have no idea.

Why does die size matter anyway?
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
Yeah, but TI doesn't have an SoC using the SGX543 yet, if I remember correctly; they're all using the SGX540.

And everyone else can still license from PowerVR.
Pretty sure the OMAP 4470 that's in the new Nook and Kindle tablets is SGX543/544.

Remember that Samsung *did* use PowerVR in the Hummingbird. They switched to Mali on purpose. Again, there's no inherent advantage to the former.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Yeah. So the chip Apple uses in their rMBP 15" is actually smaller than the chip they use in their iPad 3. And the quad-core Ivy is probably more than 10x faster.

Imagine that...

... and it probably has over 10x the power consumption. ;)

It's good to keep in mind that Ivy Bridge is built on a smaller manufacturing process node than any other chip you listed. Sandy Bridge 4C is 216 mm².

EDIT:

Pretty sure the OMAP 4470 that's in the new Nook and Kindle tablets is SGX543/544.

That looks to be the case:
http://en.wikipedia.org/wiki/TI_OMAP#OMAP_4

Although, it is interesting to see that it's a single-core instead of the usual dual or quad configurations (I guess I should include triple now too. :p).
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
Lead times for SoC designs are so long that you can't really "react" to the competition in anything less than 2 years. Keep in mind that up until the A5, Apple was pretty much with the rest of the pack in GPU selection. The A4 had the PowerVR SGX535, whereas the Exynos 3 Single had an SGX540 in the Samsung Galaxy S (Samsung has always used top-end hardware in its flagship devices). The A5 actually caught nVidia, TI, and Qualcomm by surprise. Also, most of the SoC vendors are not building chips for their own products, but create designs that are ordered by phone OEMs. No one was requesting a part with so much die space, and no one was building one, for cost reasons.

It should also be noted that because Apple doesn't source its chip design from an outside supplier, there is no direct reason for SoC vendors to "beat" what Apple offers. In fact, because Apple and Samsung design their own SoCs (and conveniently make most of the smartphone profits as well), competition has only recently started to heat up.

Anyway, die space and the scalability of GPU configurations are the two big reasons, and since most of the other SoC vendors have a GPU team in-house, there's no reason for them to use PowerVR.
 

tangey

Junior Member
Jul 25, 2011
14
0
0
Www.goingonrewards.com
Only three SoC vendors have in-house graphics: Qualcomm, Nvidia, and Broadcom.

Imagination Technologies has loads of licensees for its graphics IP.

SGX543 has been licensed by Apple, Renesas, and Sony.

SGX544 (similar to the 543 but adding DX9 compliance) has been licensed by TI for its OMAP5 series and also the OMAP4470 (used in the Kindle Fire/Fire HD and the Barnes & Noble Nook); Samsung is using the OMAP4470 in a new phone, and Huawei is also using the OMAP4470 in a phone.

SGX544 is also licensed by ST and will appear mid this year in the A9540/8540 chips. It is also licensed by Intel and will appear in the next-gen Medfield (again mid this year). It has also been licensed by MediaTek and will appear in low-cost chips in Q1 2013; MediaTek currently uses the SGX531. Finally, SGX544 has also been licensed by Samsung, but its use at this time is unknown.

SGX540 has been used by Samsung in the Hummingbird, Intel in Medfield, and TI in OMAP3.

SGX545 has been licensed by Intel and is currently showing up in Clover Trail.

The next-gen cores, codenamed Rogue, have been licensed by 10 licensees; some are known and some can be guessed at.

Some new partners have recently announced licensing agreements with IMG, including Fujitsu, LG, and Huawei (HiSilicon).

The above is not an exhaustive list, but it includes most of the major partners.
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Have you guys seen the GPU in the upcoming Samsung A15 SoC? It's got 4x the performance of its last-gen cores.
 

tangey

Junior Member
Jul 25, 2011
14
0
0
Intel Atoms use it, but not for long. Intel is moving to an Ivy Bridge-based graphics core for the next release of their Atoms. :)

It's true that Intel is shifting to in-house graphics for tablet designs. However, Intel will continue to use SGX for smartphone SoCs for many years. It is widely assumed that Intel is one of the unknown Rogue licensees, and certainly Intel's next-gen Medfield will include a dual-core SGX544.
 

tangey

Junior Member
Jul 25, 2011
14
0
0
Have you guys seen the GPU in the upcoming Samsung A15 SoC? It's got 4x the performance of its last-gen cores.

Don't read too much into marketing. Samsung quotes the Exynos 5250 as getting up to 5x the graphics performance of the Exynos 4 series. Which is true, as the first-gen Exynos 4 graphics core was clocked at 266MHz. Exynos 4 has since been implemented at 32nm and is clocking its graphics core at 533MHz in some tablets. The Exynos 5250 also clocks its next-gen graphics core (T604) at 533MHz, and in early benchmarks it is seeing around twice the performance of the similarly clocked Exynos 4 (there are no end-user devices out with Samsung's T604 yet).

Those benchmarks show that the Exynos 5250 gets around 25-30% more performance than the SGX543MP4 running in the iPad 3. The iPad 4 is out tomorrow with double the iPad 3's GPU clock (now around 500MHz), and hence double the GPU performance. It is a bit damning that a brand-new next-gen GPU is going to be substantially beaten by a two-year-old GPU architecture that was simply upclocked, before it has even seen an end-user device.
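As a quick sanity check on those ratios (using only the rough figures quoted in this thread, normalized to the iPad 3):

```python
# Back-of-envelope relative GPU performance, normalized to the iPad 3's
# SGX543MP4 at its original clock (ratios taken from the figures above).

ipad3_mp4 = 1.0
exynos_5250 = ipad3_mp4 * 1.27   # ~25-30% faster in early benchmarks
ipad4_mp4 = ipad3_mp4 * 2.0      # same architecture at roughly double the clock

# Even against the upclocked two-year-old design, the new core trails:
print(f"iPad 4 vs Exynos 5250: ~{ipad4_mp4 / exynos_5250:.2f}x")
```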

It will also be interesting to see how big the T604 is; all that extra compliance they have included beyond the Mali-400MP4, such as OpenCL and DX9, is going to take up more die space.

Someone up-thread asked why die size is important. The bigger a die is, the fewer dies you get from a wafer, and that is the major production cost.
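To make that concrete, here is a sketch using the standard gross-dies-per-wafer approximation; the square die dimensions are hypothetical, chosen only to roughly match the areas quoted up-thread:

```python
import math

def gross_dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300.0):
    """Rough gross die count for a round wafer using the classic
    approximation (ignores scribe lines, edge exclusion, and defect
    yield): wafer area / die area, minus a perimeter-loss term."""
    die_area = die_w_mm * die_h_mm
    r = wafer_d_mm / 2.0
    return int(math.pi * r**2 / die_area
               - math.pi * wafer_d_mm / math.sqrt(2.0 * die_area))

# Hypothetical square dies close to the sizes quoted up-thread:
print(gross_dies_per_wafer(12.8, 12.8))  # ~163 mm^2 die, like the A5X
print(gross_dies_per_wafer(9.6, 9.6))    # ~92 mm^2 die, like the A6
```

An A6-sized die yields nearly twice as many candidates per wafer as an A5X-sized one, before defect yield (which also favors smaller dies) is even considered.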
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Yes, but Samsung makes its own chips, so there is no middleman in production costs.
 

tangey

Junior Member
Jul 25, 2011
14
0
0
Www.goingonrewards.com
Yes, but Samsung makes its own chips, so there is no middleman in production costs.

Why should that make a difference?

Just because you make your own garden chairs doesn't mean your costs don't drop if you can get 20 garden chairs out of a set piece of wood instead of 16.
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Why should that make a difference?

Just because you make your own garden chairs doesn't mean your costs don't drop if you can get 20 garden chairs out of a set piece of wood instead of 16.

You brought up die size and cost. Samsung can eat the small amount of extra cost because they make their own SoCs and don't need to pay someone else to make them.

Also, the 5250 will go in cell phones, while I don't think Apple has even put the iPad 3's GPU in any phone yet.