Nvidia Q4 2013 results


mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I agree, if there was a significant change it was the jump to SNB. But the point was, the transition to notebooks without midrange GPUs took longer than - at least I - expected.

Intel faces three problems to overcome in the iGPU area:

- First is die size. For economic reasons Intel didn't follow the asinine AMD APU strategy, where a huge GPU is blended together with a huge CPU, making a huge package choked by memory bandwidth and thermals. Intel went for incremental increases each generation, devoting more and more die area to the iGPU while reducing TDP.

- Second is efficiency of the iGPU. Intel's iGPU architecture wasn't efficient enough in the first generations, so not only did the iGPU have to grow in size, the EUs also had to become more efficient.

- Third is driver support. Intel's driver release cadence is nowhere near what ATI and Nvidia manage, which means delayed optimizations for games and apps. They are improving here, but more slowly than in the other two areas.

The conundrum posed by the first two points will be solved with Broadwell: 14nm will give the die area for another decent jump in the area devoted to the iGPU, whatever bandwidth problems are posed by slow DDR3 memory can be mitigated by a beefed-up Crystalwell solution, and Broadwell should have a reworked GPU architecture. This is where things go south for Nvidia.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
I agree. The drivers are the weak point; the rest will get there with Broadwell.

We might, for example, see impressive 3DMark benchmarks with Haswell, but the moment the notebooks hit real-world gaming, the problems will start, both quality- and performance-wise. And inconsistent quality is a no-go on the market; it's a killer for marketing.

We tend to focus a lot on the hardware side in these forums, but software (compilers/drivers in this business) has always had huge importance.

From another business doing high-tech development, I can see software playing an even greater part each year. Patents, and what really differentiates you from the competitors, are more and more embedded in software. There are good reasons for that: besides software being relatively more important for performance and efficiency, it can be encrypted and guarded from competitors. It's a safer investment. But it's still undervalued inside the corp., imho.

What we also tend to get wrong is how long it takes to build the competence to do this world-class programming, and how extremely broad the competence is that you need to bring into action when you do it.

On the other side, I have seen the software people being very good at seeing themselves as a multidisciplinary profession, and their management and methods in general are often more modern and dynamic than on the hardware side, which is still thinking a little like "next time we will get the requirements spec spot on, then there will be no problems".

Intel writes some of the best, if not the best, compilers in the world, so why their GPU drivers take so long to get off the ground I don't quite understand. They might have underestimated the importance 5 years ago, or perhaps some key employees left?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Intel writes some of the best, if not the best, compilers in the world, so why their GPU drivers take so long to get off the ground I don't quite understand. They might have underestimated the importance 5 years ago, or perhaps some key employees left?

Classic project management triangle situation, right?

[Image: the project management triangle - scope, schedule, cost]


^ pick any two of the three, but only two, as your priorities.

When it comes to compilers one could surmise that Intel apparently values both the scope (performance/capability) and the schedule, and as such we might conclude they pump a fair amount of money into that effort.

To be contrasted with their GPU driver team, where the schedule is not all that much of a priority but they do try and keep the scope (DX10, DX11, etc) nearly state of the art (but obviously lagging the leading edge). As such we might conclude they prioritize maintaining the effort as a low-cost affair.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Is that really any different from the justification for adding ISA extensions to CPUs? Who cares about SSE2 the day the first processor with SSE2 hits the market? No one, really, except a handful of programmers. But a couple of years later, as the market is seeded with the hardware, people do care.



"Market corrections" do happen, and they happen for reasons the rest of the "stupid market" failed to foresee. You can't hold up "the market" as some infallible benchmark.

QCOM is over-priced, waiting to crash, right?

AAPL was at P/E = 16 in September, then "crashed" to P/E = 11. More like a bubble that shouldn't have existed in the first place has been corrected.

QCOM's P/E = 17 right now. At some point that P/E is going to become ~11.

Indeed, Qualcomm is set to have some very good quarters in Q1, Q2, and maybe even Q3, but as Nvidia and Broadcom roll out their LTE modems, and when Nvidia gets Tegra Grey out, Qualcomm will not be able to maintain its enormously high share and sales.
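To make the multiple-compression arithmetic above concrete, here is a small sketch (the $68 price is purely hypothetical; the point is that at constant earnings, a P/E falling from 17 to 11 takes the share price down by the same ratio):

```python
def implied_price(price: float, current_pe: float, target_pe: float) -> float:
    """Share price implied by a change in P/E multiple,
    holding earnings per share constant (EPS = price / P/E)."""
    eps = price / current_pe
    return eps * target_pe

# Hypothetical $68 stock at P/E 17, repriced to P/E 11:
print(implied_price(68.0, 17.0, 11.0))  # 44.0 -> roughly a 35% drop
```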
 

Piroko

Senior member
Jan 10, 2013
905
79
91
For economic reasons Intel didn't follow the asinine AMD APU strategy, where a huge GPU is blended together with a huge CPU, making a huge package choked by memory bandwidth and thermals. Intel went for incremental increases each generation, devoting more and more die area to the iGPU while reducing TDP.
Sandy Bridge was 5% smaller than Llano, and the transistor increase from Llano to Trinity was much smaller than the one from SB to IB. The only advantage for IB is 22nm, and by all means, IB cores are huge compared to Trinity's. Also, Trinity goes up to 35W only in notebook space; that supposedly huge package may be choked by memory bandwidth, but thermals?
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
But I wouldn't put my money on Qualcomm either. Qualcomm doesn't have the fundamentals for a 100 billion CAPEX.
I would put my money on both Intel and Qualcomm in the medium term.

The only thing that concerns me is the macro environment.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Classic project management triangle situation, right?

[Image: the project management triangle - scope, schedule, cost]


^ pick any two of the three, but only two, as your priorities.

When it comes to compilers one could surmise that Intel apparently values both the scope (performance/capability) and the schedule, and as such we might conclude they pump a fair amount of money into that effort.

To be contrasted with their GPU driver team, where the schedule is not all that much of a priority but they do try and keep the scope (DX10, DX11, etc) nearly state of the art (but obviously lagging the leading edge). As such we might conclude they prioritize maintaining the effort as a low-cost affair.

Agree, that's the likely scenario, but why do they maintain it as a low-cost affair? Considering how they have invested in capacity - very much fixed cost and risk - I simply don't understand why they don't prioritize software development on the GPU side more. What do they think is going to save them? Some future AVX2?

The moment some popular games crash or have very visible deficits, their entire GPU brand is down the drain - again. I think Anand was - very - nice about the choppy HD 3000 video graphics; the consumers don't judge the same way.

It's like they are stuck in the Larrabee failure. But hey, man up, that's the game of development. Perhaps top management's self-esteem cannot take it, though. They gladly go on stage with wood screws, but when the trouble arrives, they blame the people who worked on it.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Sandy Bridge was 5% smaller than Llano...

I think you are referring to SNB 4C, which offers *a lot* more CPU performance than Llano but was in a market bracket above it. Llano was in the same market bracket as SNB 2C, which had roughly half the die area of Llano.

This is one of the reasons why the AMD APU strategy is a failure. AMD doubled the die size of their chips by putting in a huge iGPU but could not command any price premium for the better iGPU. They ballooned their costs and got nothing in return.
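A back-of-the-envelope sketch of the cost side of that argument, using the classic dies-per-wafer approximation (the wafer cost is a made-up illustrative figure, and yield is ignored; only the die areas come from the discussion above):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic dies-per-wafer approximation: usable wafer area divided
    by die area, minus an edge-loss correction term. Ignores yield."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

WAFER_COST = 5000.0  # hypothetical cost per 300mm wafer, illustration only

for area in (120, 220):  # ~SNB 2C vs ~Llano die area, in mm^2
    n = dies_per_wafer(area)
    print(f"{area}mm^2: {n} dies/wafer, ~${WAFER_COST / n:.2f} per die")
```

With these assumed numbers the ~220mm^2 die comes out close to twice the per-die cost of the ~120mm^2 one, before any yield difference widens the gap further - which is why selling both at the same price point crushes margins.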

...and the transistor increase from Llano to Trinity was much smaller than the one from SB to IB.

And this is the other. Unless AMD believed GLF press releases, it was obvious that GLF would be behind Intel in the process race for the foreseeable future. Intel would have advantages in both performance *and* die area budget to play with. It was a very big risk going with >240mm^2 parts that needed sky-high clocks to get close to Intel's performance levels. They took it, they failed, their margins are in the toilet and their market share is plunging.

Also, Trinity goes up to 35W only in notebook space; that supposedly huge package may be choked by memory bandwidth, but thermals?

35W with anemic performance. I don't doubt AMD could go all the way down to 10W if they wanted, but that would yield a SKU with subpar performance. The 35W and 17W parts are just like that. And this assumes they didn't have another 8350 moment when labeling those units.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Agree, that's the likely scenario, but why do they maintain it as a low-cost affair? Considering how they have invested in capacity - very much fixed cost and risk - I simply don't understand why they don't prioritize software development on the GPU side more. What do they think is going to save them? Some future AVX2?

IMHO the model is in transition from compromising on cost (because it wasn't worth fixing drivers for the Westmere iGPU, for example) to a model compromising on features (a generation of delay behind the bleeding-edge features NV and AMD introduce). I just think this transition is too slow for us consumers. But when we look at the financials from AMD and Intel, it is clear which decision was right.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
The investment in GPU software must be extremely slim compared to the other activities - peanuts in Intel's enormous R&D budget (isn't it ~10x AMD's?). It isn't so visible now that it was wrong, because of AMD's complete failures, but that doesn't make it the optimal decision. It was wrong on Intel's part. Dead wrong.

They are putting Atom CPUs on the market that stand no chance because of GPU functionality from another age. They absolutely need something of a different quality for the 2014 Atoms, or they will seriously hurt their own ability to battle Temash/Kabini in the market.

Their treatment of consumers with their GPU tech is just a failure and seriously hurts their brand value. It doesn't make sense. It was just bad decisions.

Considering the extreme competences, IP, and technology Intel masters, there is obviously something they don't do optimally given their market cap. This was one of those decisions. And I think the reason for it is straightforward:
they think of themselves as producing CPUs. It's their mindset hindering their potential.

I think both AMD and Intel need a crazy, visionary, intelligent ego as their CEO. Imagine the power and potential JHH could release in both companies.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
You might notice that GPU tech is mainly restricted by IP. But Intel has made a lot of IP purchases and licensing deals lately.

Also, IGPs are simply value-added add-ons. You get zero return on them.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The investment in GPU software must be extremely slim compared to the other activities - peanuts in Intel's enormous R&D budget (isn't it ~10x AMD's?). It isn't so visible now that it was wrong, because of AMD's complete failures, but that doesn't make it the optimal decision. It was wrong on Intel's part. Dead wrong.

I agree with you that Atom graphics is a significant handicap and that Intel could have invested more in their graphics, but this is going to change, and in 20/20 hindsight Intel should have invested more in graphics, even if only for the sake of their mobile lineup. But I don't think you are seeing what comes ahead.

We consumers are always behind the curve in the IC business. What we buy today is what someone conceived 3-4 years ago and finished designing 18 months ago. So whatever changes Intel made in their GPU business model won't be perceived now but 18-24 months ahead. I'm betting your perception will change, especially after Broadwell.

It's easy to see why SNB and IVB iGPUs didn't get top-notch software support, and I see two main reasons for that:

- IVB and SNB iGPUs weren't bleeding edge. How much performance is left on the table for those iGPUs that current drivers haven't already tapped? I doubt it's something that would be perceived by users. Haswell GT3, mainly the Crystalwell parts, is another performance level entirely. I'd wait to see how Intel treats this specific SKU before making a judgement on Intel's iGPU support level.

- The other reason is how to monetize this investment. As AMD clearly showed us, better graphics aren't able to command a huge price premium. Consumers do want better graphics, but not at any price, so whatever money you sink into the iGPU must be carefully considered.

This of course refers to the Core line. If we go to Atom, yes, better graphics are badly needed, but Intel's mistakes there weren't limited to graphics. The core badly needs an update too.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
I think you are referring to SNB 4C, which offers *a lot* more CPU performance than Llano but was in a market bracket above it. Llano was in the same market bracket as SNB 2C, which had roughly half the die area of Llano.
I won't doubt that it ended up competing against dual-core SBs, but during its design process Llano probably looked much more sound. The die size isn't wtfbbq massive in comparison to Intel's top chips, and apart from GLF's 32nm process everything else was a known quantity.

This is one of the reasons why the AMD APU strategy is a failure. AMD doubled the die size of their chips by putting in a huge iGPU but could not command any price premium for the better iGPU. They ballooned their costs and got nothing in return.
... Unless AMD believed GLF press releases, it was obvious that GLF would be behind Intel in the process race for the foreseeable future. Intel would have advantages in both performance *and* die area budget to play with. It was a very big risk going with >240mm^2 parts that needed sky-high clocks to get close to Intel's performance levels. They took it, they failed, their margins are in the toilet and their market share is plunging.
Well, Intel isn't able to capitalize much on their iGPUs either, but without them consumers would run to tablets even quicker and Ultrabooks would still be a niche served by Sony and Apple.
And AMD has taken risks with larger parts quite often in the past; it's one of the few possibilities to counter Intel's process tech advantage. Neither outspending nor sitting it out is an option. But with Trinity they did manage to tackle efficiency quite well; going from 45W to 35W while delivering typically more than 20% more performance in games and programs was very much needed.

35W with anemic performance. I don't doubt AMD could go all the way down to 10W if they wanted, but that would yield a SKU with subpar performance. The 35W and 17W parts are just like that. And this assumes they didn't have another 8350 moment when labeling those units.
I disagree on the 35W part. It's a very compelling product in its market segment, but Intel completely drowned the press with (back then mostly shitty) Ultrabooks during the Trinity launch. Just compare the number of articles AT has posted on uninspiring Ultrabooks to the number of Trinity reviews. The lower Trinity parts aren't as compelling, but could still serve underrepresented markets very well.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I won't doubt that it ended up competing against dual-core SBs, but during its design process Llano probably looked much more sound. The die size isn't wtfbbq massive in comparison to Intel's top chips, and apart from GLF's 32nm process everything else was a known quantity.

If you recognize that Llano could not compete with 4C SNB but only with SNB 2C, which is half Llano's die size, how can you not consider Llano a failure? You are pointing out exactly the reason why the APU concept is a failure: the price premium AMD expected for their iGPU didn't pan out, so they ended up selling 220mm^2 processors for the same price as Intel's 120mm^2 processors. Higher costs, lower margins. Do you want a bigger failure than that?

Well, Intel isn't able to capitalize much on their iGPUs either, but without them consumers would run to tablets even quicker and Ultrabooks would still be a niche served by Sony and Apple.

I don't disagree with that, but the answer isn't AMD's asinine APU strategy. Nvidia, with power-efficient low-end/mainstream dGPUs and Optimus, is making more gains than AMD in mobile.

And AMD has taken risks with larger parts quite often in the past; it's one of the few possibilities to counter Intel's process tech advantage. Neither outspending nor sitting it out is an option. But with Trinity they did manage to tackle efficiency quite well; going from 45W to 35W while delivering typically more than 20% more performance in games and programs was very much needed.

I'm not really surprised that the APU concept was conceived under Ruiz and Dirk, the terms where AMD's arrogance and hubris were most pronounced.

AMD always took the risk of making some things better than Intel. Athlon ran P3 code better than the P3, Athlon64 ran most code better than the P4, and Brazos runs code better than Atom. Whenever AMD followed this simple recipe - an efficiency focus - they had good products and could make money in the market. It was during the Ruiz/Dirk terms that the company went to new levels of arrogance (delaying 65nm, the APU, Bulldozer, GCN, etc etc etc) and things went downhill.


I disagree on the 35W part. It's a very compelling product in its market segment, but Intel completely drowned the press with (back then mostly shitty) Ultrabooks during the Trinity launch.

The Ultrabook market isn't about the chip only; Intel is flexing a lot of financial muscle in the supply chain: chassis, displays, SSDs, just to name a few components. AMD cannot compete here. And 35W isn't enough. Even 17W isn't enough. That's why Intel is pushing Broadwell to sub-10W levels.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
I agree with you that Atom graphics is a significant handicap and that Intel could have invested more in their graphics, but this is going to change, and in 20/20 hindsight Intel should have invested more in graphics, even if only for the sake of their mobile lineup. But I don't think you are seeing what comes ahead.

We consumers are always behind the curve in the IC business. What we buy today is what someone conceived 3-4 years ago and finished designing 18 months ago. So whatever changes Intel made in their GPU business model won't be perceived now but 18-24 months ahead. I'm betting your perception will change, especially after Broadwell.

It's easy to see why SNB and IVB iGPUs didn't get top-notch software support, and I see two main reasons for that:

- IVB and SNB iGPUs weren't bleeding edge. How much performance is left on the table for those iGPUs that current drivers haven't already tapped? I doubt it's something that would be perceived by users. Haswell GT3, mainly the Crystalwell parts, is another performance level entirely. I'd wait to see how Intel treats this specific SKU before making a judgement on Intel's iGPU support level.

- The other reason is how to monetize this investment. As AMD clearly showed us, better graphics aren't able to command a huge price premium. Consumers do want better graphics, but not at any price, so whatever money you sink into the iGPU must be carefully considered.

This of course refers to the Core line. If we go to Atom, yes, better graphics are badly needed, but Intel's mistakes there weren't limited to graphics. The core badly needs an update too.

I think you have to look at the benefit of a better GPU and better support for Intel more in context:

Intel had good success with their 945G. You could, for example, play The Sims 2 on it :)

No matter what enthusiasts had to say about it, it put Intel on the gfx market and into millions and millions of notebooks.

I am quite sure Intel primarily looked at it as a way to use depreciated equipment, giving the customers something nearly free, but the move was essential for moving the entire market towards mobile - the market today outside servers.

IMHO it was a huge success that should have told Intel to speed up development here. Had they done that, today they would have an effective weapon to sell the more expensive CPUs. It would have been a cheap way to give the customers more benefit, and especially really good cost/benefit for the brand.

The comparison to AMD is not the best. Intel has the good CPU tech, and they had the resources to add the GPU tech if they had addressed it back in, say, 2006. AMD bought ATI at the only moment in history they made any real profit, and it drained the company, putting an interest burden on it. What Intel is doing now is using third-party IP and software for the GPU in the Atoms (!!!), while speeding up 14nm tech to enter the mobile market. It's way unbalanced.

Of course it's always easy to judge in hindsight, but Intel is often a little bit slow-moving, albeit when it changes, it does so with momentum.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
If you recognize that Llano could not compete with 4C SNB but only with SNB 2C, which is half Llano's die size, how can you not consider Llano a failure? You are pointing out exactly the reason why the APU concept is a failure: the price premium AMD expected for their iGPU didn't pan out, so they ended up selling 220mm^2 processors for the same price as Intel's 120mm^2 processors. Higher costs, lower margins. Do you want a bigger failure than that?
Llano sold quite well and probably earned them a decent profit. It could have been a lot better if Llano itself had been a better performer, but I wouldn't consider it as much of a failure as Phenom 1 or Bulldozer. It was clear early on that Llano would only be a stopgap, and thus it probably didn't get as many resources to polish the chip's rollout.

The Ultrabook market isn't about the chip only; Intel is flexing a lot of financial muscle in the supply chain: chassis, displays, SSDs, just to name a few components. AMD cannot compete here. And 35W isn't enough. Even 17W isn't enough. That's why Intel is pushing Broadwell to sub-10W levels.
I agree. But Jaguar could be a decent performer in the Ultrabook segment as well. Those machines don't have to be lightning fast; battery time is more important. And so far the current crop of Ultrabooks is a lot worse than the Sony Vaios from 2008, where "all day battery life" wasn't just 5 hours while doing nothing, but actually 8-10 hours with the extra battery pack in the CD tray mount.

(desperate attempt to get back on topic)
Actually, ARM chips could be just as good (no active cooling required) if only the ecosystem were there.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Llano sold quite well and probably earned them a decent profit. It could have been a lot better if Llano itself had been a better performer, but I wouldn't consider it as much of a failure as Phenom 1 or Bulldozer. It was clear early on that Llano would only be a stopgap, and thus it probably didn't get as many resources to polish the chip's rollout.

What's your definition of selling "quite well"? Because Llano didn't improve AMD's margins or provide significant added volume. Quite the opposite: AMD's volumes and margins shrunk with the APU concept (albeit the entire debacle cannot be blamed on the APU only). Sure, it opened some new markets for AMD, but it did so competing at the very bottom, and that's too little. You don't design and manufacture >220mm^2 chips to sell them only in the $50-$100 market, and this is exactly what AMD got with Llano and even more so with Trinity.

The AMD APU concept is a failure because the TAM is far smaller, and far different, from what was predicted, and this results in a smaller ROI than was predicted, but by no means am I giving a judgement on the products individually.

I agree. But Jaguar could be a decent performer in the Ultrabook segment as well. Those machines don't have to be lightning fast; battery time is more important. And so far the current crop of Ultrabooks is a lot worse than the Sony Vaios from 2008, where "all day battery life" wasn't just 5 hours while doing nothing, but actually 8-10 hours with the extra battery pack in the CD tray mount.

Ultrabook is all about shrinking the performance delta between a notebook and a desktop, all in a shiny and light package with a lot of top-end parts (good displays, SSDs, etc). It's a premium platform, not the kind of platform you'd put Jaguar inside, as the BoM will be too high regardless of the processor.

But if we are talking about cheap thin notebooks, yes, Jaguar will be a competent contender there. The remaining question is how competent, and how HSW Celerons and Pentiums will be priced. If priced properly, Celeron and Pentium SoCs can spoil AMD's party here. It's a pity that the Brazos die shrink was cancelled; AMD could have made *a lot* of money with it.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
IMHO it was a huge success that should have told Intel to speed up development here. Had they done that, today they would have an effective weapon to sell the more expensive CPUs. It would have been a cheap way to give the customers more benefit, and especially really good cost/benefit for the brand.

I was seeing things as "more money with the current model", while you were arguing for a different model for Intel graphics. Yes, I think you are correct here. Intel should have gone for a different model for their GPU development years ago, and that is biting them badly in mobile. Had the x86 market a decent competitor, it would bite them there too.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
But if we are talking about cheap thin notebooks, yes, Jaguar will be a competent contender there. The remaining question is how competent, and how HSW Celerons and Pentiums will be priced. If priced properly, Celeron and Pentium SoCs can spoil AMD's party here. It's a pity that the Brazos die shrink was cancelled; AMD could have made *a lot* of money with it.

Agree, the Celerons are way underrated as competitors - I just saw a dirt-cheap Packard Bell notebook with Intel tech; it was so cheap I examined what the CPU was - a Celeron 1000n or whatever, but it was made on 22nm IB, with HD 2500 graphics, at 1.7GHz.

That is a formidable competitor to Jaguar, as it will clearly have better CPU performance where it matters - compared to a 4-core Jaguar - plus usable gfx. Man, in all fairness, nearly the same as the ULV IB processor in my über Ultrabook with something disabled. I think for normal usage the extra performance lies in the SSD, and then attention will turn elsewhere.

The basic problem for AMD is that Intel will start to dump expensive process nodes, and plain and simply expensive CPUs, on the market with very strong performance. The reason: marginal cost. Very bad for Intel's business, devastating for AMD because it forces margins down. Jaguar will have advantages because it's highly integrated, meaning lower cost for the OEM, and it has excellent battery life. Nice - and that will make it the preferred product over a 22nm Celeron. But dumping basically today's high-end product on the market in late H1, while introducing Haswell, is a very strong lineup.

But it also begs the question: what is Intel going to do with their old equipment in the future?
 

Piroko

Senior member
Jan 10, 2013
905
79
91
What's your definition of selling "quite well"? Because Llano didn't improve AMD's margins or provide significant added volume. Quite the opposite: AMD's volumes and margins shrunk with the APU concept (albeit the entire debacle cannot be blamed on the APU only).
Without knowing the actual margin split we can't prove or deny that Llano had decent margins. But considering that for a while their margins were "fairly stagnant" while Opteron market share took a dive and Bobcat ramped up (low margin by definition), I would think that Llano wasn't doing as terribly as you depict.

The AMD APU concept is a failure because the TAM is far smaller, and far different, from what was predicted, and this results in a smaller ROI than was predicted, but by no means am I giving a judgement on the products individually.
That depends on how much investment was needed to create the current APUs and how much ROI you could have gotten by investing in CPU tech only. Not much, and even less, would be my answers. Desktop sales are probably shrinking no matter the investments, and notebook chips will probably keep integrating more and more.

Ultrabook is all about shrinking the performance delta between a notebook and a desktop, all in a shiny and light package with a lot of top-end parts (good displays, SSDs, etc). It's a premium platform, not the kind of platform you'd put Jaguar inside, as the BoM will be too high regardless of the processor.
Uh, I disagree. The 17W TDP chips from Intel are quite a bit slower than even the CPUs in $400 notebooks. The mandatory SSD gives a decent boost, but SSDs are pushing into notebooks and desktops as well. Ultrabooks grew the performance delta to desktops; their sales only show that performance isn't as much of a buying decision anymore.

But yeah, Jaguar will probably be seen in cheaper notebooks. I do see a hole for cheap and not-as-bad-as-Atom 11" and 13" notebooks, which could even eat into the $800 Ultrabook share, considering those have to make some annoying tradeoffs to reach that price point.

But if we are talking about cheap thin notebooks, yes, Jaguar will be a competent contender there. The remaining question is how competent, and how HSW Celerons and Pentiums will be priced. If priced properly, Celeron and Pentium SoCs can spoil AMD's party here.
If priced properly, and if system builders can put them into non-Ultrabooks. It would hurt my brain if Intel decided to keep all 17W chips in Ultrabooks only...
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Without knowing the actual margin split we can't prove or deny that Llano had decent margins. But considering that for a while their margins were "fairly stagnant" while Opteron market share took a dive and Bobcat ramped up (low margin by definition), I would think that Llano wasn't doing as terribly as you depict.

In the end the only victory that matters is the commercial victory. It is the amount of money a given product makes that separates winners from losers. I could give you plenty of names - Commodore, Transmeta, DEC - companies that had technically innovative products but couldn't monetize their technical prowess.

And when I say that Llano was a failure, I'm not saying that the product is a failure. I'm saying that they didn't hit the projected ROI, just that. From a financial standpoint, if I have two products that both aim for 50% gross margins and one achieves only 30% and the other 40%, and both miss volume targets, both are failures (one far bigger than the other) no matter how fond the few buyers are of them.
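The two-products illustration above, sketched with made-up numbers (a hypothetical $100 selling price against unit costs that land the two SKUs at 30% and 40% realized margin):

```python
def gross_margin(price: float, unit_cost: float) -> float:
    """Gross margin as a fraction of selling price."""
    return (price - unit_cost) / price

TARGET = 0.50  # both SKUs were planned for 50% gross margin

for name, price, cost in [("SKU A", 100.0, 70.0),   # realizes 30%
                          ("SKU B", 100.0, 60.0)]:  # realizes 40%
    gm = gross_margin(price, cost)
    print(f"{name}: {gm:.0%} gross margin, {TARGET - gm:.0%} below target")
```

Both miss the plan, so both count as failures in ROI terms; one just misses by far more.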

I doubt that AMD reached the target gross margins for the Llano lineup having to sell >220mm^2 parts only in the sub-$100 segment. Deneb had a comparable die size and started selling for almost $300, spending much of its time on the market above the $200 mark. The Athlon X4 had comparable prices but a die 30% smaller than Llano's, manufactured on a cheaper process, so Llano probably didn't get better returns than the Athlon X4 either. So, from a business standpoint, Llano was a failure.

The other point is integration. Integration per se isn't wrong; the entire industry is heading there. But the AMD APU concept is wrong. An AMD APU is an anemic CPU coupled with a big iGPU. In the end the price is determined by the anemic CPU, and the iGPU becomes only a cost burden. I do think they would be better off with a stronger CPU and a smaller GPU, something analogous to what Intel has been doing the last few years.

And don't get me wrong, I'm a Llano fan myself. I think Stars is a much better starting point than Derpdozer. Had AMD not invested all the money and the mother in developing Derpdozer, and instead improved what was already a very solid architecture and then put smaller iGPUs on the die, things could be different.
 

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
The best part is AMD still doesn't have a smartphone strategy, and no viable tablet strategy either (Windows 8, lol).

But the AMD APU concept is wrong. An AMD APU is an anemic CPU coupled with a big iGPU.

But that was the point - to use the GPU in HSA mode to beat Intel. They probably won't last long enough to see it through, though.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Jaguar will have advantages because it's highly integrated, meaning lower cost for the OEM, and it has excellent battery life. Nice - and that will make it the preferred product over a 22nm Celeron. But dumping basically today's high-end product on the market in late H1, while introducing Haswell, is a very strong lineup.

The "highly integrated" Kabini advantage goes kaput with Haswell, as the mobile SKUs will be SoCs too.
 