More Llano leaks (A8 APU extensively benchmarked)


wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Just posted by Anand on Twitter:

[attached image: ddr3scaling-16x10.png]

So what if the Llano graphics were overclocked to 750MHz? That would be fairly interesting. I hope Anand overclocks Llano extensively.



EDIT: BTW, can Anand benchmark this Llano in Bitcoin mining?
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The difference between DDR3 1333 and DDR3 1866 is about the same as the difference between a 6450 and a 5570.

More importantly, the price difference between DDR3 1333 and DDR3 1600 is negligible, and that also provides a nice boost.

The gap between 6450/5570-level and 5770-level performance is $50-70.

We are discussing future iterations of Fusion.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
We are discussing future iterations of Fusion.

I remember having arguments with some members who claimed it was impossible to fit 400 cores and achieve performance near that of 400 discrete-card cores.

From AMD's comments, it seems Trinity will have more cores, and Cayman-derived ones at that.

RAM will also get cheaper and faster.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Unfortunately you can't compare them that way, or I should say you can't attribute the results of such a comparison to the process integration as you are attempting.

Over half of the Llano die is composed of a transistor block that needs only operate at ~400-500MHz. When your clockspeed requirements are that low, your Idrive requirements are equally lowered.

GloFo's 32nm xtors are electrically 2-dimensional; they have a length and a width (as do Intel's). "Gate density" is determined by the gate length, but you must make the xtors as wide as needed for Idrive purposes (hitting your clockspeeds).

This is the same basic device physics at play when you see SRAM cell sizes differing between full-speed L2$ cache and 1/2-speed (or slower) L3$. The slower L3$ is more dense; it can be more dense because the clockspeed is intentionally reduced, meaning the xtor widths can be reduced as well, leaving room to pack more xtors into the same area.

SB benefits from this as well, as its GPU likewise runs at lower clocks; xtors can be intentionally slower (i.e. smaller) in the GPU logic versus the CPU logic, but the relative area is smaller than that in Llano.

To get a feel for the normalized xtor density benefits of gate-first versus gate-last between these two processes we need to compare IC circuits that are nearly identical (including the clockspeeds and the operating voltages).

You are right, but 47% more transistors (995M vs 1.45B) for only 5.5% (216mm2 vs 228mm2) more die area is a lot of difference.

http://www.realworldtech.com/page.cfm?ArticleID=RWT021511004545&p=3
(Last paragraph)
While IBM did not present any information on actual array density, it is possible to make some inferences. Comparing IBM’s eDRAM in the POWER7 to comparable SRAMs from Intel yields a roughly 2X density advantage at the same node. Equivalently, IBM’s 45nm eDRAM slightly exceeds the density of Intel’s 32nm SRAM. Based on the results demonstrated and IBM comments, the overall array area should scale by 60% at 32nm. This suggests that IBM can expect roughly a 2X advantage for their storage arrays and possibly some further upside with innovations in the overall array architecture.

And from the following table, the IBM/AMD/Freescale 32nm process (IFA d) Lgate is at 25nm, while Intel's 32nm process Lgate is at 30nm.

http://www.realworldtech.com/includes/images/articles/iedm10-10.png

Although a large portion of the die is only operating at 444MHz (fGPU and more), I still believe that GloFo's 32nm gate-first SOI HKMG process played a bigger role in that high transistor density.

I could be wrong ;)
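The density comparison is easy to sanity-check with quick arithmetic. A sketch (the transistor counts and die areas are the figures quoted in this thread; nothing else here is measured data):

```python
# Sanity check of the transistor-density comparison quoted above.
# Counts and die areas are the figures from the post (xtors, mm^2).
llano_xtors, llano_area_mm2 = 1.45e9, 228.0   # Llano
snb_xtors, snb_area_mm2 = 995e6, 216.0        # Sandy Bridge

llano_density = llano_xtors / llano_area_mm2 / 1e6  # Mxtors/mm^2
snb_density = snb_xtors / snb_area_mm2 / 1e6

print(round(llano_density, 2))  # ~6.36 Mxtors/mm^2
print(round(snb_density, 2))    # ~4.61 Mxtors/mm^2
print(round(llano_density / snb_density, 2))  # Llano ~1.38x denser overall
```

So on raw numbers Llano packs roughly 38% more transistors per mm2; the open question in this thread is how much of that comes from the process and how much from the low-clock GPU block.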
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
You are right, but 47% more transistors (995M vs 1.45B) for only 5.5% (216mm2 vs 228mm2) more die area is a lot of difference.

http://www.realworldtech.com/page.cfm?ArticleID=RWT021511004545&p=3
(Last paragraph)


And from the following table, the IBM/AMD/Freescale 32nm process (IFA d) Lgate is at 25nm, while Intel's 32nm process Lgate is at 30nm.

http://www.realworldtech.com/includes/images/articles/iedm10-10.png

Although a large portion of the die is only operating at 444MHz (fGPU and more), I still believe that GloFo's 32nm gate-first SOI HKMG process played a bigger role in that high transistor density.

I could be wrong ;)

Just coming at this from a position of experience: we built a node at TI that was intended to span the range from the lowliest (clockspeed-wise), densest ICs (our dirt-cheap cellphone chip for NOK) to the highest-clocked, least dense ICs (Sun SPARC chips).

Within the same node, optimized designs that targeted ~300-400MHz operating speeds versus those that targeted 2-2.5GHz netted nearly a 3x delta (you read that right, 300%) in xtor density.
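To put rough numbers on that trade-off, here is a toy model (entirely illustrative, not TI data): assume the required drive strength, and hence xtor width and cell area, grows with target clock raised to some exponent below 1, so density falls sublinearly as the clock target rises. The exponent here is picked purely to land near the ~3x delta quoted above:

```python
# Toy model of the clockspeed-vs-density trade-off described above.
# Assumption (illustrative only): required xtor width scales as
# f**alpha with alpha < 1, and density is inversely proportional
# to width. alpha = 0.6 is chosen to match the ~3x quoted delta.
def relative_density(f_slow, f_fast, alpha=0.6):
    """Density of a block tuned for f_slow, relative to one tuned
    for f_fast, under the toy width ~ f**alpha model."""
    return (f_fast / f_slow) ** alpha

# A ~350MHz block vs a ~2.25GHz block (midpoints of the ranges above):
print(round(relative_density(350e6, 2.25e9), 1))  # ~3.1x denser
```

The point of the sketch is only that a large density delta falls out naturally once the clock target drops by ~6x; no process "pixie dust" is needed.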

I look at Llano and its xtor density, and to me it's just a no-brainer, no magical process-node pixie dust here. They optimized the GPU for low clockspeeds and high xtor density, QED.

That's not to say that GloFo's gate-first process tech did not help out across the board, but we need to see some teardown analyses of devices that make it into the channel before we let ourselves make too much over the differences in xtor density. (IC Insights, Chipworks, etc)

IEDM is great for benchmarking, been there and done that for years, but it is not a contract. What gets published can and does vary significantly from what actually goes into production, for a lot of very good (technically-facing) reasons...and the bottom line is that IEDM pubs are every bit as much for marketing purposes as the materials that come from the marketing dept directly.

I must admit I was expecting a tad more, uhm, fanfare (?) to be made by GloFo and AMD regarding the technological prowess of their 32nm process tech, given that this is the first product to come to market that was designed and birthed by not only the vision to buy ATI but also the vision to spin off the fabs. (Bobcat counts, but it wasn't done at GloFo, is what I mean.)

I expected a lot of sizzle, and what happened was a lot of fizzle. Leaves me a sad panda :(
 

gorobei

Diamond Member
Jan 7, 2007
4,042
1,537
136
Maybe they are too busy prepping for Trinity to spend money trumpeting 32nm. After all, AMD's classic Achilles' heel has been capacity to produce. Historically, nothing has ruined more of their rollouts than not being able to produce enough chips. They sold out of Zacate pretty quickly, and nothing pisses off Dell/HP/Toshiba/etc. more than not being able to get a product out because of short supply.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I don't know myself, but according to my wife, who bought an MSI FR720 with a 17" screen, 6GB of memory, and an i7-2630QM for work (company reimbursed) for $850: along with the gain in productivity, she says she gets 3x better battery life than with her older model from last year. I forget what it was, but it was an Intel CPU.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I expected a lot of sizzle, and what happened was a lot of fizzle. Leaves me a sad panda :(

I will remember this quote for a long time :awe:

I think Llano will do pretty well in the marketplace, as long as the machines with it are reasonably priced. A 14" machine with a 4-core/400 SP Llano would need to be under $600 for me to even consider it, though. Even still, I think I'd rather shoot for a dedicated-graphics machine at $800 instead.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
I must admit I was expecting a tad more, uhm, fanfare (?) to be made by GloFo and AMD regarding the technological prowess of their 32nm process tech, given that this is the first product to come to market that was designed and birthed by not only the vision to buy ATI but also the vision to spin off the fabs. (Bobcat counts, but it wasn't done at GloFo, is what I mean.)

I expected a lot of sizzle, and what happened was a lot of fizzle. Leaves me a sad panda :(

On the desktop side, yeah, I was expecting AMD to release faster APUs, on both the CPU clock and GPU clock side - but apparently they are only targeting the laptop and HTPC markets.

On the other hand, Llano should be the last run of the Stars cores, so...

But I got happy about one thing - it was about time the fastest IGPs got past frigging 6600GT-to-7600GT speeds!
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
You're only getting 10% more fps for each step up in RAM speed. Going from 1333 to 1866 only buys 20% performance. Isn't that what we'd expect to see? I'm not sure how that equates to being bandwidth limited. It looks balanced to me.

Sure, we'd like it to be faster, but 1866 RAM is still rather expensive. Typical OEM pricing on 2x2GB 1066 DDR3 is about $30. It is an extra $5 to bump up to 1333, and about an extra $12 to bump from 1066 to 1600. To go to 1866, the price increase jumps all the way to $25. It's not worth it to go to 1866. It definitely is worth it to go to 1600, and that is where OEMs should settle.

It is up to the consumer to choose the 1600 systems and not the 1333 systems. But I bet we're going to see plenty of 1333 systems on sale in a few months for under $400. And they'll be a hell of a deal.

IMO, increasing your RAM speed should have no impact on fps. If it does, then you are bandwidth limited.

On the other hand, since this is an IGP, I guess it is no different than just overclocking your VRAM.
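The scaling numbers in the quote are easy to check with quick arithmetic. A sketch (the ~10%-per-step gain is the figure quoted above; the baseline fps is made up for illustration):

```python
# Sanity check of the memory-scaling argument above: compounding a
# ~10% fps gain per memory-speed step, as quoted in the post.
def fps_after_steps(base_fps, steps, gain_per_step=0.10):
    """fps after `steps` memory-speed bumps of ~10% each."""
    return base_fps * (1.0 + gain_per_step) ** steps

base = 30.0  # hypothetical fps at DDR3-1333
# 1333 -> 1600 -> 1866 is two steps:
gain_pct = (fps_after_steps(base, 2) / base - 1.0) * 100
print(round(gain_pct))  # 21, i.e. the ~20% quoted for 1333 -> 1866
```

Against the quoted upgrade prices (+$12 for 1600, +$25 for 1866 over the 1066 baseline), 1600 buys most of the fps gain for roughly half the 1866 premium, which is the "sweet spot" argument being made.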
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Yep, he just didn't get that one was 35 watts and the other was 25 watts. But the truly funny thing is the replies to the video.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
I expected a lot of sizzle, and what happened was a lot of fizzle. Leaves me a sad panda :(

Doesn't surprise me at all. Think of what the media coverage of that would be.

<AMD and GF bragging about the new process>
...
Intel, lead CPU designer and manufacturer in the world, is planning on transitioning to OMG WTF BBQ 22nm 3D!!11!! transistors later this year.


I think JFAMD said it best: customers don't care about the manufacturing process; they buy on performance, power usage, and price. Any media coverage of the process AMD is using is going to mention that it isn't as advanced as Intel's, even if it doesn't matter...
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
I will remember this quote for a long time :awe:

I think Llano will do pretty well in the marketplace, as long as the machines with it are reasonably priced. A 14" machine with a 4-core/400 SP Llano would need to be under $600 for me to even consider it, though. Even still, I think I'd rather shoot for a dedicated-graphics machine at $800 instead.

The only time these types of savings would be worth it is if you're buying in bulk for a company/school. Even a $50 savings is nice if you're buying 1,000 laptops. I'm with you though; for personal use, I'd rather spend a little extra and get a dedicated graphics card with it.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I notice the label says "2.9GHz". Is that the stock clock or the turbo clock?

I think that I've suggested this before, but perhaps CPU makers (with the advent of turbo modes) will start selling CPUs on the basis of their "max clocks". So CPUs would be labeled "2.9GHz max" or "up to 2.9GHz". Just like DVD burners.

Edit: Huh. I thought Llano had AMD's "Turbo Core" technology. No? That's disappointing.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I notice the label says "2.9GHz". Is that the stock clock or the turbo clock?

I think that I've suggested this before, but perhaps CPU makers (with the advent of turbo modes) will start selling CPUs on the basis of their "max clocks". So CPUs would be labeled "2.9GHz max" or "up to 2.9GHz". Just like DVD burners.

From Anandtech preview article:
The 3850 has four cores running at 2.9GHz and doesn't support Turbo Core.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126

I don't get it. That article says that the 2.9GHz Llano desktop chip doesn't have Turbo Core.

But this mobile review has a big page all about it:
http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/4

So which is it? Does Llano have Turbo Core, or doesn't it?

Anand mentioned BIOS immaturity on that ASRock A75 Extreme6 board he was using; possibly that was what was limiting Turbo Core?

I'm kind of confused about the issue.

Hopefully, AMD will release a 35W Llano with some sort of Turbo Core.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
So which is it? Does Llano have Turbo Core, or doesn't it?

The high-end desktop part (A8-3850) does not have turbo functionality. (We do not know the full desktop lineup as of now, IIRC.)

All mobile parts that have been announced do have turbo.
 

Gigantopithecus

Diamond Member
Dec 14, 2004
7,664
0
71
Hopefully, AMD will release a 35W Llano with some sort of Turbo Core.

They will. Unfortunately it will be a laptop/mobile SKU that might or might not be readily available, even as a bare OEM CPU, to individuals through retail channels.

To clear up any confusion: none of the announced desktop retail Llano APUs have Turbo Core, while all of the announced mobile OEM Llano APUs do have Turbo Core.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I don't get it. That article says that the 2.9GHz Llano desktop chip doesn't have Turbo Core.

But this mobile review has a big page all about it:
http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/4

So which is it? Does Llano have Turbo Core, or doesn't it?

Anand mentioned BIOS immaturity on that ASRock A75 Extreme6 board he was using; possibly that was what was limiting Turbo Core?

I'm kind of confused about the issue.

Hopefully, AMD will release a 35W Llano with some sort of Turbo Core.

TBH it is perplexing to me as well: they have a 100W TDP budget to play with, there are only four cores, and the GPU is not exactly a monster either, all on 32nm SOI w/HKMG, and yet they aren't taking the core clocks over 2.9GHz?

For a Stars-core derivative shrunk to 32nm, I expected a lot more clockspeed/watt than what these early indications are showing.

I'm starting to wonder if this isn't going to be another 90nm->65nm type of transition, where the 65nm chips could hardly clock as high as their 90nm older siblings.

We already know Bulldozer was officially delayed because of lackluster clockspeed yields; maybe a 2.9GHz quad core with a low-end GPU bolted on really does suck up 100W on GloFo's 32nm process. It's a shame if that is true, because it would likely mean we are looking at another 12 months or so before GloFo tweaks 32nm to get the clockspeeds up to where they need to be now.

I don't know; it would be nice to have something concrete to refute the picture that is shaping up from all the pieces of the puzzle we are collecting.

Between the Bulldozer delay, Llano core clocks, and the timing of the BAPCo withdrawal...it's hard to see where the upside surprise is going to come from.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
TBH it is perplexing to me as well: they have a 100W TDP budget to play with, there are only four cores, and the GPU is not exactly a monster either, all on 32nm SOI w/HKMG, and yet they aren't taking the core clocks over 2.9GHz?

For a Stars-core derivative shrunk to 32nm, I expected a lot more clockspeed/watt than what these early indications are showing.

I'm starting to wonder if this isn't going to be another 90nm->65nm type of transition, where the 65nm chips could hardly clock as high as their 90nm older siblings.

We already know Bulldozer was officially delayed because of lackluster clockspeed yields; maybe a 2.9GHz quad core with a low-end GPU bolted on really does suck up 100W on GloFo's 32nm process. It's a shame if that is true, because it would likely mean we are looking at another 12 months or so before GloFo tweaks 32nm to get the clockspeeds up to where they need to be now.

I don't know; it would be nice to have something concrete to refute the picture that is shaping up from all the pieces of the puzzle we are collecting.

Between the Bulldozer delay, Llano core clocks, and the timing of the BAPCo withdrawal...it's hard to see where the upside surprise is going to come from.

Really? How do we know that, exactly? All these subliminal messages really bring your impartiality into question.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
The only time these types of savings would be worth it is if you're buying in bulk for a company/school. Even a $50 savings is nice if you're buying 1,000 laptops. I'm with you though; for personal use, I'd rather spend a little extra and get a dedicated graphics card with it.

But not everyone has a little extra money to burn. Also, even for enthusiasts this has legs: it cuts down the cost of a secondary computer, one that doesn't need all the bells and whistles but still has some room for more than basic work. A system for the kids, or the wife who wants to play WoW with you but doesn't need to play Crysis.