The real reasons Microsoft and Sony chose AMD for consoles [F]


NTMBK

Lifer
Nov 14, 2011
10,433
5,771
136
A15 and Krait400 have 2x to 3x the perf/watt of Jaguar.
In the 3DMark physics test a 1.8GHz A15 gets 13774 points and Krait 14828 points.
That is 82%/88% of the A4-5000 Kabini SoC.

Wow. So much garbage.

Let's take a look at the benchmark you pointed out:

[attached image: 54810.png (3DMark Physics scores)]


Notice the Exynos 5250 way down the chart, with less than half the score of the Kabini? That's an A15. I wonder what the TDP of that part is? Oh right, 8W. http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13 So the A15 part, with over half the TDP of Kabini, gets less than half the performance.
 

NTMBK

Lifer
Nov 14, 2011
10,433
5,771
136
Your logic is severely flawed. Replace MS with Intel. And then ask yourself why Intel would jack up prices.

When Intel has a competitor, the consumer has 3 choices- Windows + Intel, Windows + AMD, or a totally different platform entirely. Without AMD, the choice is Intel+Windows or a different platform entirely.

Most users are familiar with Windows, are heavily invested (own lots of Windows software) and want to stick with Windows. They are a lot less likely to ditch the platform entirely than they would be to buy an AMD PC. As such without AMD, the price elasticity for Intel processors falls dramatically- they can increase prices with greater freedom, without losing customers at the same rate they would have if there was a viable competitor for them to move to. The prices would have to be high enough to force a significant number of people to ditch Windows entirely, or not buy a computer at all, which is a much higher threshold than the price required to force them to move to a cheaper but almost equivalent competitor (AMD).

In short- I'd expect some price rise, but not the apocalyptic "so expensive that only the 5 richest princes in Europe can afford it" that some people around here are predicting!
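The elasticity argument can be sketched with a toy linear-demand model. Every number here is hypothetical, purely to illustrate how a higher walk-away threshold raises the profit-maximising price:

```python
# Toy model of the argument above. All prices/quantities are hypothetical.

def units_sold(price, walk_away_price, base_units=100):
    """Linear demand: sales fall to zero as price approaches walk_away_price."""
    return max(0.0, base_units * (1 - price / walk_away_price))

def best_price(walk_away_price, cost=50):
    """Profit-maximising price, found by brute-force search over whole prices."""
    prices = [cost + i for i in range(1, int(walk_away_price - cost))]
    return max(prices, key=lambda p: (p - cost) * units_sold(p, walk_away_price))

# With AMD around, customers defect at a modest premium; without AMD they only
# walk away when forced off Windows entirely (a much higher threshold).
with_competitor = best_price(walk_away_price=300)     # -> 175
without_competitor = best_price(walk_away_price=600)  # -> 325
```

Prices rise, but bounded by the (higher) platform-switching threshold rather than exploding without limit, which is the point of the post.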

EDIT: Let's break out the graphs :D ...

EDIT 2: Let's put away the graphs, I realise my explanation of the graph was complete crap :p My economics professor would be ashamed. Good job I only did it for one semester...
 
Last edited:

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
But using a Jaguar core makes sense? Right...

A15 and Krait400 have 2x to 3x the perf/watt of Jaguar.
In the 3DMark physics test a 1.8GHz A15 gets 13774 points and Krait 14828 points.
That is 82%/88% of the A4-5000 Kabini SoC.

Wow, how many times do you want to repeat that they really wanted x86? :rolleyes:
If x86 were so great, MS would not have dumped Intel for IBM, and the mobile world would be dominated by x86, not ARM.

It is quite interesting that the world is moving away from x86, and yet the console makers think using x86 will help them survive.

You are right. They wanted a junk SoC with the CPU performance of a "low-end SoC", except with superb perf/watt. :awe:

Wow, one benchmark really proves anything! Yes, the highest-end ARM can almost compete (1/2 the score) with a lower-clocked Kabini with HALF the cores of the XB1/PS4 SoCs :rolleyes: So, by the benchmark posted above, that puts the XB1/PS4 SoC at approx. 4x the A15's score. Before adding clockspeed (rumoured ~2GHz?).
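The arithmetic behind the "approx. 4x" figure, with the chart numbers normalised. The 0.48 reading ("less than half") is eyeballed from the chart, and perfect 2x scaling from doubling cores is an idealisation:

```python
# Ballpark estimate only; scores normalised to the quad-core Kabini = 1.0.
kabini_quad = 1.0
exynos_a15  = 0.48             # dual A15, "less than half" the Kabini score
console_8c  = kabini_quad * 2  # 8 Jaguar cores, idealised linear scaling

ratio = console_8c / exynos_a15  # ~4.2x, before any clock-speed bump
```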

Seriously, ARM is amazing for tablets and phones, but it really ain't anywhere CLOSE to desktop performance. The best Nvidia can do is power a handheld Shield. They don't have the ability to make a medium/high end SoC. Only Intel and AMD have that ability, and only AMD can deliver the graphics performance needed for games.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
Your logic is severely flawed. Replace MS with Intel. And then ask yourself why Intel would jack up prices.

That's the nice thing about having a virtual monopoly on x86 CPUs. You can charge whatever you want for them (within reason... you still have antitrust lawyers watching) and people still have to buy them from you.

I'm sure that Intel's marketing guys are smart enough with pricing to maximize their profit without hurting sales too much. Without competition from AMD, I could easily see premium desktop and laptop performance CPUs costing over $1,000. Hell... they are getting close to that already thanks to the failure of Bulldozer.
 
Last edited:

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Wow, one benchmark really proves anything! Yes, the highest-end ARM can almost compete (1/2 the score) with a lower-clocked Kabini with HALF the cores of the XB1/PS4 SoCs :rolleyes: So, by the benchmark posted above, that puts the XB1/PS4 SoC at approx. 4x the A15's score. Before adding clockspeed (rumoured ~2GHz?).

Seriously, ARM is amazing for tablets and phones, but it really ain't anywhere CLOSE to desktop performance. The best Nvidia can do is power a handheld Shield. They don't have the ability to make a medium/high end SoC. Only Intel and AMD have that ability, and only AMD can deliver the graphics performance needed for games.

There may not be any ARM cores out today that are anywhere close to desktop performance but the upcoming consoles are nowhere close to desktop performance either.

8-core (2x4 cluster) Cortex-A15 @ 2.5GHz is perfectly implementable on 28nm SoCs today, and it would run within a power budget that's suitable for consoles. That would have been performance competitive with Jaguar (with equal quality code of course). Performance was not the problem.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Wow. So much garbage.

Let's take a look at the benchmark you pointed out:

[attached image: 54810.png (3DMark Physics scores)]


Notice the Exynos 5250 way down the chart, with less than half the score of the Kabini? That's an A15. I wonder what the TDP of that part is? Oh right, 8W. http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13 So the A15 part, with over half the TDP of Kabini, gets less than half the performance.

Here is the quad-core Exynos A15 scoring ~9500; granted, its TDP limit is probably lower than the tablet dual-core A15's. It doesn't look like an 8-core A15 will compare that favorably to an 8-core Jaguar, and that's before considering that A15 is 32-bit only. When a design is already up there in TDP, saving wattage at the expense of some performance can be important, but when the TDP and power draw are well within the design envelope, performance is usually more important.

http://community.futuremark.com/hardware/mobile/Samsung+Galaxy+S+IV+(Exynos+5+Octa)/review
 
Last edited:

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
There may not be any ARM cores out today that are anywhere close to desktop performance but the upcoming consoles are nowhere close to desktop performance either.

8-core (2x4 cluster) Cortex-A15 @ 2.5GHz is perfectly implementable on 28nm SoCs today, and it would run within a power budget that's suitable for consoles. That would have been performance competitive with Jaguar (with equal quality code of course). Performance was not the problem.

Well Jaguar is definitely not Haswell or Piledriver, but it seems to provide adequate performance for what devs want.

Lol, some mishmash of franken-ARM cores versus an 8-core design that was actually made to be run as 8 cores. Why not make 16 Jaguar cores @ 2.5GHz, it's just as plausible :rolleyes:

There has been no consumer level ARM CPU that can match Jaguar on a core-for-core basis, there has been no consumer level ARM CPU with 8 cores, and there has been no consumer level ARM CPU paired with anything remotely close to mid-range graphics.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Well Jaguar is definitely not Haswell or Piledriver, but it seems to provide adequate performance for what devs want.

Yeah and Cortex-A15's perf/MHz isn't that far behind it. If you think that 2.5GHz Cortex-A15 is far behind 2GHz Jaguar on a core-by-core basis then let's see your benchmarks.

Lol, some mishmash of franken-ARM cores versus an 8-core design that was actually made to be run as 8 cores. Why not make 16 Jaguar cores @ 2.5GHz, it's just as plausible :rolleyes:

Please, educate yourself just slightly before saying anything else about this. 2x4-core A15 is a standard configuration option from ARM. 2.5GHz is an advertised clock target from ARM. If you think 16 Jaguar cores at 2.5GHz is just as plausible then you know nothing about Jaguar, 8 cores is a design limit and most likely so is clocking it around 2GHz, IF the consoles even clock at that. So no, it's not just as plausible.

And it doesn't even matter because that'd exceed the TDP budget. Point is that you can get similar performance to the current console setup with a realistic A15 setup, and it'd probably use a similar amount of power.

You all think that "lolz there's only weak ARM SoCs" is a valid argument. That has nothing to do with what you can do with licensed IP and a custom design. This argument is as good as saying that AMD couldn't have done an 8-core 18CU console chip because Kabini is only 4-core 2CU.

If nVidia wanted to they could have done an 8-core Cortex-A15 with Kepler in similar scale to console chips. But yes, being stuck with 32-bit and ARM would be two big disadvantages, before considering whatever else AMD would be willing to bring to the table that nVidia wouldn't.
 
Last edited:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Yeah and Cortex-A15's perf/MHz isn't that far behind it. If you think that 2.5GHz Cortex-A15 is far behind 2GHz Jaguar on a core-by-core basis then let's see your benchmarks.

The Exynos 5250 dual A15 is hitting ~6-8W at 1.7GHz. On what basis do you think A15 at 2.5GHz would have any power efficiency advantage over Jaguar cores with similar performance?
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
The Exynos 5250 dual A15 is hitting ~6-8W at 1.7GHz. On what basis do you think A15 at 2.5GHz would have any power efficiency advantage over Jaguar cores with similar performance?

I said competitive, I didn't say "power efficiency advantage" - I'm sure you can understand and appreciate the difference.

Now this stuff with using so-called "TDP" of an entire SoC like Exynos 5250 is not a methodology I agree with. The power consumption of the CPU cores themselves is about 1.5-2W @ 1.7GHz. That's on Samsung's 32nm leakage-optimized process. It's a bit behind their own 28nm in power consumption. It's also operating way past the ideal efficiency knee voltage-wise: a big part of that is because it's on a leakage-optimized process. Things would be pretty different on TSMC 28HP or even 28HPM, which seems to me like a no-brainer for a console chip.

Give me a power number for what you think the Jaguar chips use per-core and we can talk more. But if 8x2.5GHz Cortex-A15s can use ~30W then I think that'll fit, yes.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
Yeah and Cortex-A15's perf/MHz isn't that far behind it. If you think that 2.5GHz Cortex-A15 is far behind 2GHz Jaguar on a core-by-core basis then let's see your benchmarks.
[attached image: 54812.png]

[attached image: 54810.png (3DMark Physics scores)]



Please, educate yourself just slightly before saying anything else about this. 2x4-core A15 is a standard configuration option from ARM. 2.5GHz is an advertised clock target from ARM. If you think 16 Jaguar cores at 2.5GHz is just as plausible then you know nothing about Jaguar, 8 cores is a design limit and most likely so is clocking it around 2GHz, IF the consoles even clock at that. So no, it's not just as plausible.

And it doesn't even matter because that'd exceed the TDP budget. Point is that you can get similar performance to the current console setup with a realistic A15 setup, and it'd probably use a similar amount of power.

Both are silly, yes. Considering hitting ~1.8-2GHz is already pushing ARM cores past the point of power efficiency, how is 2.5GHz going to make the situation any better? :rolleyes:

You could get an A15 setup that uses tons of power (2.5GHz is WAY beyond the optimal power range), scratches the bottom end of Jaguar performance, etc.

The quad-core A4-5000 hits 11W system power consumption at full load. A dual-core A15 hits a very similar figure (8W while gaming).

You all think that "lolz there's only weak ARM SoCs" is a valid argument. That has nothing to do with what you can do with licensed IP and a custom design. This argument is as good as saying that AMD couldn't have done an 8-core 18CU console chip because Kabini is only 4-core 2CU.

If nVidia wanted to they could have done an 8-core Cortex-A15 with Kepler in similar scale to console chips. But yes, being stuck with 32-bit and ARM would be two big disadvantages, before considering whatever else AMD would be willing to bring to the table that nVidia wouldn't.

Well, frankly the fact that there has never been a solid high-performance ARM CPU, yes, there are essentially only weak ARM SoC's. Sure, maybe it could be done with lots of work, but why not just go to AMD, who already has a design for a core that performs the way they want, and use that?

I said competitive, I didn't say "power efficiency advantage" - I'm sure you can understand and appreciate the difference.

Now this stuff with using so-called "TDP" of an entire SoC like Exynos 5250 is not a methodology I agree with. The power consumption of the CPU cores themselves is about 1.5-2W @ 1.7GHz. That's on Samsung's 32nm leakage-optimized process. It's a bit behind their own 28nm in power consumption. It's also operating way past the ideal efficiency knee voltage-wise: a big part of that is because it's on a leakage-optimized process. Things would be pretty different on TSMC 28HP or even 28HPM, which seems to me like a no-brainer for a console chip.

Give me a power number for what you think the Jaguar chips use per-core and we can talk more. But if 8x2.5GHz Cortex-A15s can use ~30W then I think that'll fit, yes.

Jaguar uses approx. 11W total power consumption (entire system) under CPU load, for 4 cores at 1.5GHz.

Exynos 5 Dual appears to use ~4-8W under load (8W TDP) with 2 cores at 1.7GHz, so between 1-2.5W per core. Now, let's say they use 1.5W per core at 1.7GHz. Without ANY voltage increase (which is essentially impossible), moving to 2.5GHz would increase power consumption approx. 40-50%; then add some voltage to hit the target (I'm assuming 1.4V or so, since 1.7GHz takes 1.2-1.3V according to AnandTech), and you're probably touching 3-5W load power consumption per core (voltage increases raise power consumption roughly quadratically). Now, multiply that by 8 and you have an optimistic 24W, at minimum, not including the leakage losses of more than 4x (4x as many cores, more interconnects between the cores, larger uncore, etc.)
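The estimate above can be sketched with the classic CMOS dynamic-power relation, P proportional to f*V^2 (leakage ignored; the 1.5W/1.25V/1.4V figures are the assumptions stated in the post):

```python
def scale_power(p0, f0, v0, f1, v1):
    """Scale dynamic power using P proportional to f * V^2 (ignores leakage)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# 1.5W per A15 core at 1.7GHz/1.25V, pushed to 2.5GHz/1.4V:
per_core_w = scale_power(p0=1.5, f0=1.7, v0=1.25, f1=2.5, v1=1.4)
eight_core_w = 8 * per_core_w

# per_core_w comes out around 2.8W; eight cores land around 22W, in the same
# ballpark as the ~24W estimate above, before extra uncore and leakage.
```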

I could give more reasons why pushing ARM to 2.5GHz is not a power-efficient idea, but I think you get the point.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Please, educate yourself just slightly before saying anything else about this. 2x4-core A15 is a standard configuration option from ARM. 2.5GHz is an advertised clock target from ARM. If you think 16 Jaguar cores at 2.5GHz is just as plausible then you know nothing about Jaguar, 8 cores is a design limit and most likely so is clocking it around 2GHz, IF the consoles even clock at that. So no, it's not just as plausible.

There are already faster Jaguars in the pipeline according to cpu world - http://www.cpu-world.com/news_2013/2013062701_AMD_leaks_model_numbers_of_future_Kabini_APUs.html

Judging by the numbers you would expect the 5350 to be at least 2150 MHz, which granted isn't far above 2 GHz.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I'm not even looking at that 3DMark physics one because it's totally useless w/o the same core count (and no that Exynos 5 Octa score that performs much worse than the S600 isn't valid either because it's clearly not running at 1.6GHz). I'll take the JS bench as one datapoint, although I'd like to see it in other browsers too.

In stuff like Geekbench, 7-Cpu, native benches like that Cortex-A15 gets close to but not quite the same perf/MHz as Bobcat. I expect 2.5GHz (25% more MHz than 2GHz) to be reasonably competitive with Jaguar.
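As a quick sanity check on that claim: with an assumed per-clock deficit (the 0.85 figure is illustrative; the post only says "close to but not quite" Bobcat's perf/MHz), a 25% clock advantage roughly closes the gap:

```python
# Normalised sketch; the 0.85 perf/MHz figure is an assumption for illustration.
jaguar_perf_per_mhz = 1.00   # baseline
a15_perf_per_mhz    = 0.85   # assumed: slightly behind per clock

jaguar_at_2_0ghz = jaguar_perf_per_mhz * 2000
a15_at_2_5ghz    = a15_perf_per_mhz * 2500

relative = a15_at_2_5ghz / jaguar_at_2_0ghz  # ~1.06: roughly competitive
```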

Both are silly, yes. Considering hitting ~1.8-2GHz is already pushing ARM cores past the point of power efficiency, how is 2.5GHz going to make the situation any better? :rolleyes:

2GHz is pushing Cortex-A15 past the point of power efficiency for tablets, when designed on leakage-optimized processes, which need more voltage for higher clocks than performance-optimized processes do. The rules are TOTALLY DIFFERENT for consoles. This is obvious.

When ARM advertises 2.5GHz capability for the A15, what applications did you think they had in mind?

Figure it out.

The quad-core A4-5000 hits 11W system power consumption at full load. A dual-core A15 hits a very similar figure (8W while gaming).

That 8W number is a total load of crap. It doesn't use "8W while gaming", it uses 8W while running a game and running a multithreaded benchmark in the background. Before throttling, anyway.

The 1.5GHz Kabini chip tested uses 11W while JUST exercising the CPU.

I don't know how you think this flies as a valid comparison.

Well, frankly the fact that there has never been a solid high-performance ARM CPU, yes, there are essentially only weak ARM SoC's. Sure, maybe it could be done with lots of work, but why not just go to AMD, who already has a design for a core that performs the way they want, and use that?

The core design is not the problem. There's not a ton of new work involved in making a console-targeted SoC with 8x 2.5GHz Cortex-A15s. That no one has done this for mobile platforms (duh) is totally irrelevant.

Jaguar uses approx. 11W total power consumption (entire system) under CPU load, for 4 cores at 1.5GHz.

Exynos 5 Dual appears to use ~4-8W under load (8W TDP) with 2 cores at 1.7GHz, so between 1-2.5W per core. Now, let's say they use 1.5W per core at 1.7GHz. Without ANY voltage increase (which is essentially impossible), moving to 2.5GHz would increase power consumption approx. 40-50%; then add some voltage to hit the target (I'm assuming 1.4V or so, since 1.7GHz takes 1.2-1.3V according to AnandTech), and you're probably touching 3-5W load power consumption per core (voltage increases raise power consumption roughly quadratically). Now, multiply that by 8 and you have an optimistic 24W, at minimum, not including the leakage losses of more than 4x (4x as many cores, more interconnects between the cores, larger uncore, etc.)

You clearly don't understand the difference in voltage at peak clocks needed for a process + layout optimized for leakage vs one optimized for dynamic power consumption at high clock speeds. Yes, with the latter over the former you could clock higher at the same voltage.

So your estimated 24W minimum is hardly infeasible for this. So how much do these 8x2GHz Jaguar cores use again?

Whatever the case is, this is arguing over what most likely amounts to a few W in either direction. When compared to the several dozen needed by the GPU it's not necessarily going to make or break the design.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Why would Microsoft and Sony pour money into pushing A15 to 2.5GHz (something no current ARM vendor provides or will provide in the next 3-6 months) just to get within the ballpark of AMD's design's performance, while losing out on the extra features Jaguar has over A15? In fact, A15 looks like it will mirror most previous ARM cores and not show off its highest clocks until another node shrink; we might see 2+ GHz models at TSMC 20nm.
 
Last edited:

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
I'm not even looking at that 3DMark physics one because it's totally useless w/o the same core count (and no that Exynos 5 Octa score that performs much worse than the S600 isn't valid either because it's clearly not running at 1.6GHz). I'll take the JS bench as one datapoint, although I'd like to see it in other browsers too.

In stuff like Geekbench, 7-Cpu, native benches like that Cortex-A15 gets close to but not quite the same perf/MHz as Bobcat. I expect 2.5GHz (25% more MHz than 2GHz) to be reasonably competitive with Jaguar.



2GHz is pushing Cortex-A15 past the point of power efficiency for tablets, when designed on leakage-optimized processes, which need more voltage for higher clocks than performance-optimized processes do. The rules are TOTALLY DIFFERENT for consoles. This is obvious.

When ARM advertises 2.5GHz capability for the A15, what applications did you think they had in mind?

Figure it out.

... etc etc...

Whatever the case is, this is arguing over what most likely amounts to a few W in either direction. When compared to the several dozen needed by the GPU it's not necessarily going to make or break the design.

Okay, then you show me the benchmarks where A15 beats/ties Jaguar. I can't find benchmarks that both platforms run (Android vs. Windows, which, as an aside, goes to show the different segments the processors target).

And you show me your power consumption numbers too. Jaguar is proven to run at ~11W for 4 cores + SoC; even at ~5W for 2 cores + SoC, A15's performance is still worse for the same amount of power.

ARM makes excellent phone and tablet CPUs. Like I said before, they are not PROVEN to be as excellent for desktops/servers. Your speculation isn't very helpful, considering it is just that - speculation.

If 2.5GHz is a simple process change, then why haven't we seen anyone actually make 2.5GHz ARM processors? :rolleyes:

It all boils down to the fact that ARM would have to jump through hoops and over the moon to achieve the same CPU performance AMD already is offering, and then they need the GPU performance on top of that.... More and more hoops for the same result.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
What? nVidia was not rejected. They declined to sell something for low gross margins.

As shown in the OP, Microsoft and Sony rejected Nvidia because it lacked the technology. Only after losing the contract to power the next consoles did Nvidia begin its funny margins campaign. A campaign that nobody believes.

But using a Jaguar core makes sense? Right...

Sony and Microsoft ran benchmarks with the hardware offered (I bet Nvidia sent them some ARM Tegra SoC), and the benchmarks showed that ARM lacked the performance offered by AMD's x86 SoC.

This part of the report linked in the OP is clear:

ARM-based architectures will soon get as powerful as AMD's Jaguar cores, but not when Sony or Microsoft needed them for their new consoles.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,433
5,771
136
The 1.5GHz Kabini chip tested uses 11W while JUST exercising the CPU.

No, the entire platform uses 11W. Anand's quote on the subject (emphasis mine):

I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W).
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Okay, then you show me the benchmarks where A15 beats/ties Jaguar, I can't find benchmarks that both platforms run (Android vs. Windows, which PS goes to show the different segments the processors target).

I said that there are benchmarks where Cortex-A15 performs near the perf/MHz of Bobcat to make a case for a 2.5GHz A15 being reasonable. Don't distort it into something else. I gave two benches, here's links:

Bobcat 1.6GHz vs Exynos 5250 7-cpu: http://www.7-cpu.com/
Bobcat 1.6GHz vs Exynos 5 Octa Geekbench: http://browser.primatelabs.com/geekbench2/compare/1950660/1159110 (look at the single threaded tests only, ignore total scores)

And you show me your power consumption numbers too, Jaguar is proven to run at ~11w for 4 cores + SoC, even at ~5w for 2 cores + SoC A15's performance is still worse for the same amount of power.

It's 1.5W to 2W per core, that is what I said. That's clearly evident in AT's power analysis of Exynos 5250. Exynos 5 Octa would be lower. We don't have much idea what power consumption of a Cortex-A15 is like on TSMC's 28nm, HPM or HP.

ARM makes excellent phone and tablet CPUs. Like I said before, they are not PROVEN to be as excellent for desktops/servers. Your speculation isn't very helpful, considering it is just that - speculation.

We're not talking about desktops or servers. We're talking about consoles. Whatever desktop or server processors AMD made doesn't have to do with a Jaguar vs Cortex-A15 comparison.

If 2.5GHz is a simple process change, then why haven't we seen anyone actually make 2.5GHz ARM processors? :rolleyes:

Because using a dynamic power optimized process means your idle power consumption gets a lot worse, making it a very bad choice for phones and tablets. So far A15 has only been released in a small handful of phones and tablets.

But we're talking about what options there'd be if someone tried to make a console SoC with this. Something we'll probably never have the answer to because a variety of other things precluded this from being the right console design for Sony or MS. Maybe someone else will make a device that targets performance instead of low clock characteristics.

It all boils down to the fact that ARM would have to jump through hoops and over the moon to achieve the same CPU performance AMD already is offering, and then they need the GPU performance on top of that.... More and more hoops for the same result.

Look at the press releases that have announced 2.5-3GHz+ spins of Cortex-A9 on performance-optimized processes with GF or TSMC. Do you think ARM jumped through hoops and over the moon to achieve that? For a press release? More likely it was the foundries pimping pretty normal existing designs. ARM says that it can run at 2.5GHz on 28nm, do you think they're making it up? They could very well already have a hard macro that does this. They had one for Cortex-A9; it did 2GHz on 40nm back when everyone else was stuck at around 1.5GHz max. It wasn't used in an SoC because it wasn't a good fit for what people wanted SoCs for, although NuFront was doing one that I don't think ever got released.

Once again. A lack of released SoC doesn't mean it's a huge engineering challenge to do it now. Maybe it is. But saying that they would necessarily have to go through that would be speculating on YOUR part.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
No, the entire platform uses 11W. Anand's quote on the subject (emphasis mine):

Yeah, I know that, I'm sorry there was a misunderstanding here - what I meant to say is that this Kabini platform uses 11W while exercising the CPU fully but not exercising the GPU at all. The power numbers going around for Exynos 5250 are for CPU AND GPU at full load, you can't compare the two.

And suffice it to say that I don't agree with Anand's conclusion; 15W only looks conservative because it's not factoring in the GPU.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
The GPU definitely draws more power than the CPU cores, but isn't it the CPU cores we're discussing?
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
The GPU definitely draws more power than the CPU cores, but isn't it the CPU cores we're discussing?

That's what we should be discussing, but when people insist on making comparisons using power consumption numbers for Exynos 5250 that include the GPU running at full tilt... well, you figure it out.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
I said that there are benchmarks where Cortex-A15 performs near the perf/MHz of Bobcat to make a case for a 2.5GHz A15 being reasonable. Don't distort it into something else. I gave two benches, here's links:

Bobcat 1.6GHz vs Exynos 5250 7-cpu: http://www.7-cpu.com/
Bobcat 1.6GHz vs Exynos 5 Octa Geekbench: http://browser.primatelabs.com/geekbench2/compare/1950660/1159110 (look at the single threaded tests only, ignore total scores)

On 7-CPU, the A4-5000 wins by approx. 50-60% with 4 threads, and that's at a lower clockspeed for the A4 as well. Extrapolate to 8 threads and the conclusion is easy: Jaguar is immensely faster.

On Geekbench, Bobcat wins every single-threaded test at 1.6GHz, and Jaguar adds a 10-20% IPC boost over Bobcat. Extrapolate to 8 threads and ARM again loses by a large margin.

It's 1.5W to 2W per core, that is what I said. That's clearly evident in AT's power analysis of Exynos 5250. Exynos 5 Octa would be lower. We don't have much idea what power consumption of a Cortex-A15 is like on TSMC's 28nm, HPM or HP.

We're not talking about desktops or servers. We're talking about consoles. Whatever desktop or server processors AMD made doesn't have to do with a Jaguar vs Cortex-A15 comparison.

1.5 to 2W, essentially the same power consumption as Jaguar, for worse performance. No thanks.

Desktop, server, game console, fax machine, call it whatever you want: ARM has never made a processor at the performance level that an 8C Jaguar is expected to be at.


Because using a dynamic power optimized process means your idle power consumption gets a lot worse, making it a very bad choice for phones and tablets. So far A15 has only been released in a small handful of phones and tablets.

But we're talking about what options there'd be if someone tried to make a console SoC with this. Something we'll probably never have the answer to because a variety of other things precluded this from being the right console design for Sony or MS. Maybe someone else will make a device that targets performance instead of low clock characteristics.

So, what you're saying is that ARM could make a processor with worse power consumption, to compete against a processor that already has better power consumption. Lol, that's excellent logic. They're definitely going to make their situation BETTER with that.

Look at press releases that have announced 2.5-3GHz+ spins of Cortex-A9 on performance optimized processes with GF or TSMC. Do you think ARM jumped through hoops and over the moon to achieve that? For a press release? More likely it was the foundries pimping pretty normal existing designs. ARM says that it can run at 2.5GHz on 28nm, do you think they're making it up? They could very well already have a hard-macro that does this. They had one for Cortex-A9, it did 2GHz on 40nm back when everyone else was stuck at around 1.5MHz max. It wasn't used in an SoC because it wasn't a good fit for what people wanted the SoCs for, although NuFront was doing one that I don't think ever got released.

Once again. A lack of released SoC doesn't mean it's a huge engineering challenge to do it now. Maybe it is. But saying that they would necessarily have to go through that would be speculating on YOUR part.

Ok, fine, sure they can MAKE 2.5GHz ARM processors. Sure, they could make billions! Oh, wait, they consume more power, and still aren't very fast...

Even assuming ARM could easily make a 2.5GHz A15 core, the power consumption numbers make it look absolutely atrocious in comparison to Jaguar.

More for less.

That's what we should be discussing, but when people insist on making comparisons using power consumption numbers for Exynos 5250 that include the GPU running at full tilt... well, you figure it out.

The numbers don't look much better when you compare CPU+GPU for Kabini. The A4-5000 draws 15W at full CPU and GPU load, and performs over TWICE as fast as the Exynos 5250 in GPU-based benchmarks. Look at 3DMark: over double the CPU score, over triple the GPU score. That's better performance/watt for damn sure.
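Those claims reduce to a quick perf/W ratio. The scores are normalised from the "over double"/"over triple" figures quoted, so the exact ratios are illustrative:

```python
# Normalised perf/W comparison; Exynos 5250 scores set to 1.0.
kabini_w, exynos_w = 15.0, 8.0          # full CPU+GPU load power figures
kabini_gpu, exynos_gpu = 3.0, 1.0       # "over triple the GPU score"
kabini_cpu, exynos_cpu = 2.0, 1.0       # "over double the CPU score"

gpu_perf_per_w_ratio = (kabini_gpu / kabini_w) / (exynos_gpu / exynos_w)  # ~1.6x
cpu_perf_per_w_ratio = (kabini_cpu / kabini_w) / (exynos_cpu / exynos_w)  # ~1.07x
```

Even charging Kabini for its full 15W, these illustrative numbers leave it ahead on perf/W, more clearly on the GPU side than the CPU side.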
 
Last edited: