My Llano summary in two graphs and four sentences.


TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
WoW will use as many cores as you have if you set it up correctly. I can't find a decent guide at the moment, but for anyone interested, the info is out there. I quit WoW a while back, but after seeing a guide in a PC mag I'd bought, I confirmed you can change a command line in the setup folder to manually adjust exactly how many threads you want WoW to run on, and also make it run in DX11 with another command. I checked and confirmed this after Cata dropped, btw.

Well, I got it running in DX11. I thought that was supposed to do it for the quad core too.
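For anyone searching for the commands mentioned above: in the Cataclysm-era client these were plain-text settings in the game's WTF/Config.wtf file. The exact variable names below are from memory, not from the post, so treat them as an unverified sketch:

```
SET processAffinityMask "15"
SET gxApi "d3d11"
```

`processAffinityMask` is a CPU bitmask (15 = binary 1111, i.e. the first four cores); `gxApi` selects the DirectX 11 renderer instead of the DX9 default.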
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
On desktops Llano has a narrow window of opportunity (~6 months?) before 28nm low-end discrete cards come to market, which will be cheaper to manufacture and have even better performance/watt metrics.

Not likely to happen... 32nm SOI+HKMG is surely better in terms
of perf/watt than 28nm bulk HKMG silicon.
 

RobertPters77

Senior member
Feb 11, 2011
480
0
0
So the leaked prices were pretty close... If I can get an octo-core BD for $300 I'll definitely jump on that (if performance per clock is better than Llano).
Not likely to happen... 32nm SOI+HKMG is surely better in terms
of perf/watt than 28nm bulk HKMG silicon.

I think what he meant was that Nvidia's 28nm cards will be priced accordingly, to make Llano a less attractive option.

GeForces For Everyone!
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Not likely to happen... 32nm SOI+HKMG is surely better in terms
of perf/watt than 28nm bulk HKMG silicon.

If you consider that an AMD 7650 or so will probably be able to play BF3 and Skyrim at reasonable framerates with reasonable details, along with stuff like Witcher 2, and that Llano will probably choke even at low resolutions with those newer more demanding titles, it may be infinitely better performance/watt if you measure playable vs. unplayable.

That assumes a similar increase as 5650 was to 4650, and makes some assumptions about BF3 and Skyrim taking similar power to run as Witcher 2, which may be a stretch. Who knows.

I doubt anyone cares about the performance/watt aspect when considering very low-power cards anyway. If a Llano box uses 100 W and a PhII X4 + 7650 box uses 145 W, what difference does that make to most users, really? Even with 24/7 use and expensive electric rates, that's a paltry cost difference.
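As a rough check on that claim (the electric rate here is my assumption, not from the post):

```python
# Annual running cost of a constant 45 W power-draw difference, 24/7.
delta_w = 145 - 100                    # watts between the two example boxes
hours_per_year = 24 * 365              # 8760 hours in a year
kwh_per_year = delta_w * hours_per_year / 1000
rate = 0.15                            # assumed $/kWh
print(round(kwh_per_year, 1), round(kwh_per_year * rate, 2))  # 394.2 kWh, $59.13/yr
```

Even at 24/7 full-load use that's about $59/year at the assumed rate, and real boxes idle well below that delta, so the practical difference is smaller still.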
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Not likely to happen... 32nm SOI+HKMG is surely better in terms
of perf/watt than 28nm bulk HKMG silicon.

Considering the perf/watt of 32nm SOI+HKMG is on par with Intel's 32nm bulk HKMG, I don't see the justification for asserting that the half-node shrink from 32nm to 28nm won't provide benefits over 32nm SOI+HKMG.

That's why AMD put Zacate on bulk-Si 40nm instead of going with SOI 45nm? And the Zacate shrink?
 

RobertPters77

Senior member
Feb 11, 2011
480
0
0
Considering the perf/watt of 32nm SOI+HKMG is on par with Intel's 32nm bulk HKMG, I don't see the justification for asserting that the half-node shrink from 32nm to 28nm won't provide benefits over 32nm SOI+HKMG.

That's why AMD put Zacate on bulk-Si 40nm instead of going with SOI 45nm? And the Zacate shrink?

You seem educated about this sort of stuff. Let me ask you something...

When a transistor's volume is reduced by half, what is the average power savings and/or TDP reduction? E.g., 100nm to 75nm.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
I'd like to see how Llano does with a decent GPU like a GTX 460. I'm impressed with the built-in GPU as it is, but I'd like to see how it does compared to other CPUs.

Llano would be an easy, cheap upgrade for me, but it paired with a decent GPU is what I want to see.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
I'd like to see how Llano does with a decent GPU like a GTX 460. I'm impressed with the built-in GPU as it is, but I'd like to see how it does compared to other CPUs.

Llano would be an easy, cheap upgrade for me, but it paired with a decent GPU is what I want to see.

Should be fairly close to a PhII X4, IIRC. It has more L2 but lacks the L3, so pretty close, probably.

Given that, it wouldn't be inconceivable to start using Llano at low res/medium detail for gaming, and then add a real GPU later as need be. That assumes CPU requirements won't get too extreme with newer games, which seems doubtful in the short term.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
I could always get a Llano combo and play all the older games I missed out on; by the time I get caught up game-wise I could buy a better GPU. I'm gonna wait it out and see how it benches with a decent GPU.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Considering the perf/watt of 32nm SOI+HKMG is on par with Intel's 32nm bulk HKMG

Not at all, despite GloFo's process being their first iteration of 32nm
while Intel is already at its third iteration of the same node...

A 4C Llano with 1.45bn transistors has a power consumption comparable
to a 2C SB with 0.624bn transistors...

Do the math...

That's why AMD put Zacate on bulk-Si 40nm instead of going with SOI 45nm? And the Zacate shrink?

A low-cost/low-perf CPU is logically made on a low-cost/low-perf process...
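Taking the figures in the post at face value, the "math" being gestured at is just the transistor ratio (whether board-level power is truly comparable between the two parts is the contested point in the thread; the ratio is the only hard number here):

```python
# Transistor-count ratio behind the "do the math" remark:
# comparable power budgets, very different transistor budgets.
llano_xtors = 1.45e9    # 4C Llano, GloFo 32nm SOI+HKMG
sb_xtors = 0.624e9      # 2C Sandy Bridge, Intel 32nm bulk HKMG
ratio = llano_xtors / sb_xtors
print(round(ratio, 2))  # ~2.32x the transistors at comparable power
```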
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
<- process development engineer @ TI for 0.5um to 32nm

You seem educated about this sort of stuff. Let me ask you something...

When a transistor's volume is reduced by half, what is the average power savings and/or TDP reduction? E.g., 100nm to 75nm.

It's an effect, not a cause, so the answer is "it depends": it is usually an engineering target to "hit" a specific level of power-consumption reduction as part of creating the node itself.

It is NOT the case that we build these things (the nodes) and then, after the fact, go "oh wow, good job fellas, turns out the node we just designed has a 40% reduction in power consumption at the same clockspeed compared to the older node".

The general expectation is to deliver anywhere from 30% to 50% reduction in the power consumption when a whole bunch of other parameters are normalized (voltage, xtor width, operating temps, etc etc).

A significant portion of the 4yr development timeline is spent in the pursuit of ensuring the node delivers the targeted power-consumption reduction.

Since most products are TDP-limited in their respective markets nowadays (GPUs need to be no more than 400W, desktop CPUs no more than 150W, laptop CPUs no more than 65W, etc.), the power consumption for each successive node is pretty much required to drop 30-50% so as to enable the near-doubling in xtor count that comes with the new product generations.

Think of AMD's 4870 to 5870 transition (55nm -> 40nm). Essentially doubled the xtors (956M -> 2.15B), bumped up the core clocks a bit (750 -> 850MHz), and increased load power by a little bit (160W -> 188W).

^ that's all made possible because the power-consumption on a xtor-normalized basis (and voltage normalized, clockspeed normalized, temp normalized...in other words "all else being equal") has been reduced by ~50% with 40nm over that of 55nm.

Not sure if that helps answer the question you had in mind, but it gets a little onerous to try to define it any more explicitly without the aid of some specific examples to speak to.
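As a sanity check on the 4870 → 5870 example, a back-of-envelope normalization (assuming power scales roughly linearly with transistor count and clock, which is the simplification the post itself makes):

```python
# Per-transistor, per-clock power of the 5870 (40nm) relative to the 4870 (55nm).
xtor_ratio = 2.15e9 / 956e6    # ~2.25x transistors
clock_ratio = 850 / 750        # ~1.13x core clock
power_ratio = 188 / 160        # ~1.18x load power
relative = power_ratio / (xtor_ratio * clock_ratio)
print(round(relative, 2))      # ~0.46, i.e. roughly the ~50% reduction cited
```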
 

RobertPters77

Senior member
Feb 11, 2011
480
0
0
@Idontcare

It clarified and helped a lot. Thank you.

Also, you work at Texas Instruments? Can you divulge any details about their CPUs for Windows 8? And are you excited that TI is finally getting into the consumer space? (Apart from calculators, I mean.)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I'm not there anymore, but yes, I am quite excited for my past coworkers who are still there and are getting ready. TI is continually reinventing itself; they'll do fine in whatever they find themselves doing in 10 years.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Can you post a link of that info please??
i have the reviewer's guide. :p

Here is the image from my article
[Image: graphicsCards.jpg]
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
It doesn't. There is a short list of not-so-powerful Radeons (including HD 6670) that it will pair up with for "up to 120%" improvement.

It got 45 fps average in MW2 at 1280x720, which is the res I play at. Say I stick with the stock setup, then get a 6670 for under $100 when I get the cash, and I'd be getting over 100 fps. I should be able to use max settings with AA and AF and still get 60 fps or more, which will be great.

I missed out on a lot of good PC games; I played some on consoles that I would get again on PC. I just really like the idea of it being a quad core and being able to get smooth playable framerates at my res. Llano is perfect for someone like me.
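For reference, AMD's "up to 120%" figure applied to the quoted 45 fps (a marketing ceiling, not a measured result):

```python
base_fps = 45                    # MW2 at 1280x720 on the APU alone
scaled = base_fps * (1 + 1.20)   # "up to 120%" improvement
print(round(scaled, 1))          # 99.0 fps at the advertised best case
```

So even the advertised ceiling lands just under 100 fps, and real Dual Graphics scaling varies a lot by title.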
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
Am I missing something? You said up to 120% performance increase but I see no numbers actually showing that...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Am I missing something? You said up to 120% performance increase but I see no numbers actually showing that...
*AMD* said, up to 120% increase :p

If you feel you are still missing something, do a search - there were several brand new reviews released at midnight when the NDA ended that cover gaming performance.
 

khon

Golden Member
Jun 8, 2010
1,318
124
106
What I can't figure out is this: why do they sell Llano motherboards with support for discrete graphics cards?

I mean, what's the point? If you're going to get a discrete graphics card anyway, why on earth would you pick a Llano processor to go along with it?

That's like buying a Prius, and then trying to tune it up for racing.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
What I can't figure out is this: why do they sell Llano motherboards with support for discrete graphics cards?

I mean, what's the point? If you're going to get a discrete graphics card anyway, why on earth would you pick a Llano processor to go along with it?

That's like buying a Prius, and then trying to tune it up for racing.
It's AMD's idea of future-proofing.

Intel CPUs are mostly overkill for gaming. When new games come out you rarely need to upgrade your CPU - you get a new video card (unless you are an RTS fan). You could toss an HD 6950 into the (by-then) old Llano box in a couple of years for $100 and play any game from 2011-2012 with decent results at 1920x1080.
:\
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
[Image: i97251_dektop-roadmap-2012-2.jpg]


Does anyone know if the x in FMx is a placeholder for 1, 2, 3, etc., or if it is a set name?

Lately AMD socket name conventions have been 2 letters followed by a number.

This investment in motherboards for Llano only seems weird, so I'm assuming FMx could be FM1, which would mean these Llano motherboards could later take either a Trinity APU or a Komodo CPU.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
What I can't figure out is this: why do they sell Llano motherboards with support for discrete graphics cards?

I mean, what's the point? If you're going to get a discrete graphics card anyway, why on earth would you pick a Llano processor to go along with it?

That's like buying a Prius, and then trying to tune it up for racing.

I can see why you'd get, say, a single new video card. Say you had a Llano A6 and wanted to upgrade to a 6850. Seems perfectly reasonable; the 6850 isn't super high end, but it's way, way faster than 320 cores at 443 MHz.

But the ones with CrossFire just boggle my mind.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
*AMD* said, up to 120% increase :p

If you feel you are still missing something, do a search - there were several brand new reviews released at midnight when the NDA ended that cover gaming performance.

I can't find anything that I haven't read yet.