Intel may dismiss tick-tock after Haswell, no performance CPUs (LGA based) anymore?


Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
It will be sad if the enthusiast sector dies out, though. There won't be much to differentiate on if they take all of that away.

Maybe Intel will be kind enough to sell us Xeons and server-based motherboards for a little while longer. If they are very kind, they may unlock a couple of SKUs and make them available to end customers only, not OEMs. Of course, these new "enthusiast" systems will probably be affordable only to single people and doctors :|
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Maybe Intel will be kind enough to sell us Xeons and server-based motherboards for a little while longer. If they are very kind, they may unlock a couple of SKUs and make them available to end customers only, not OEMs. Of course, these new "enthusiast" systems will probably be affordable only to single people and doctors :|

The new $1,000 enthusiast CPU will be made on old, rusty, depreciated equipment, several nodes behind. And we will gladly put our huge expensive coolers on top, buy the K version, and overclock it to 150 W while the rest of the world is on 5 W. We will be looked upon like those hi-fi fanatics buying exotic loudspeaker cables at $1,000 a pair. We will be even more excluded than we are - yes, it is possible!
 

happysmiles

Senior member
May 1, 2012
340
0
0
I think our high-end CPUs will be relevant for the next six years; it's not about raw performance but about better coding, CPU+GPU integration, resource management, and new instructions (with energy efficiency being king of them all).

Programs run better now on my Atom than they did on my AMD 5200+ desktop.
 

pablo87

Senior member
Nov 5, 2012
374
0
0
Thank you, Pablo. It might be rough around the edges, but it shows the need for action.

I know a lot of people don't like to hear talk like that.

What makes Intel a very profitable company is that, over the years, they have been very agile in adapting to change.

And they will adapt here too, but this is the worst strategic situation they have been in for years. And again, it is reflected in the share value.

The days when expensive x86 was needed are nearly over. The A15 and Windows 8 are the start of the end.

The consolidation is accelerating, and Intel is fighting the Apple brand and Samsung's production muscle and huge vertical and horizontal span, be it in products or technology. Add to that, everyone and his brother is buying from TSMC.

The ARM costs we hear are often blown out of proportion. Zacate E-350/E-450 production cost, excluding packaging, was $9 for approximately 70 mm² on TSMC 40nm - almost two years ago. I wouldn't be surprised if you could get a dual-core A15 on 28nm for the same in H1 2013. Any new numbers on that?
Add $2-3 for packaging and logistics. I don't remember ARM's licensing fee, but is it even a double-digit percentage? And all that without taking on the risk. (Rough tally in the sketch below.)

Let's see what Intel does - if they react and change plans now, it just shows they are very much alive.
But don't count on your old desktop Sandy Bridge 2500K being rendered useless for high-end gaming even in the next six years :)
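
A quick back-of-the-envelope tally of those quoted figures (a sketch only - the ASP and royalty rate below are placeholder assumptions, not known numbers):

Code:
# Rough ARM SoC cost tally using the figures quoted above.
# asp and royalty_rate are placeholder assumptions, not known numbers.
die_cost = 9.00             # USD, ~70 mm^2 on TSMC 40nm, ex packaging
packaging_logistics = 2.50  # USD, midpoint of the quoted $2-3
asp = 20.00                 # USD, hypothetical selling price of the chip
royalty_rate = 0.02         # assumed ~2% ARM royalty on the chip price

total_cost = die_cost + packaging_logistics + asp * royalty_rate
print(f"Cost: ${total_cost:.2f}; margin at ${asp:.0f} ASP: ${asp - total_cost:.2f}")

Whatever the exact royalty, the total stays in the low teens of dollars - the contrast with x86 pricing is the point being made above.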

Krumme, you and I are on the same page. I don't know the cost of this stuff; I used to get tidbits from Taiwan chip vendors years ago, FWIW.

Intel has a stubborn culture, though. They stayed on the NetBurst program well past its expiry date; the mantra at the time was GHz, GHz, GHz. I have a similar feeling about Haswell, the "graphics monster". Their most profitable markets - server, power users, enthusiasts, gamers - don't need it. Most value customers don't care - it's really an oxymoron to put so much energy into trying to make "good enough" "much better". Even the ultrabook market, and certainly tablets, would probably be better served by an Imagination core.

That's what made Grove so brilliant - he welcomed the feedback from "Cassandras" and could lead an otherwise groupthink (is it bad to say this? If so, I apologize) culture in a different direction. To me, Otellini doesn't have that same quality.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
What's the point of that? Then why not use a socket for the CPU to begin with, as before?

I think the idea is to increase the upgrade potential of the mainboard. But I'm sure the depth of the upgrade path will depend on how much stuff gets transferred over to the CPU AIB.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
I have a similar feeling about Haswell, the "graphics monster". Their most profitable markets - server, power users, enthusiasts, gamers - don't need it. Most value customers don't care - it's really an oxymoron to put so much energy into trying to make "good enough" "much better". Even the ultrabook market, and certainly tablets, would probably be better served by an Imagination core.

I strongly disagree, for one reason: high pixel density screens. Laptops are already woefully lagging behind tablets and phones in terms of pixel density. Only Apple's Retina MacBook Pros have really pushed it - and in the 13" MacBook Pro, the Intel integrated graphics are slow enough to make the system feel laggy even in daily use, not just gaming. From the Verge review:

The 2.5GHz Core i5 in the 13-inch Pro offers terrific raw CPU performance in benchmarks, running Geekbench at a solid 6700-6800 range, but the integrated Intel HD 4000 graphics chip can struggle driving such a high-resolution display. Oddly, it showed up for me more during day-to-day usage than under any crazy test situation I came up with: RAW files in Aperture scroll around just fine while QuickTime is playing back 1080p movie trailers, but Safari and Chrome both stutter a little while scrolling simple web pages. And they stutter a lot with image-heavy sites like The Verge and Polygon.

Hopefully Haswell's graphics will make the UI buttery smooth.
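
For a rough sense of the pixel-pushing load, here is a toy calculation assuming the 13-inch Retina panel's 2560x1600 resolution; the overdraw factor is a guess, and real compositing is far more involved:

Code:
# Toy framebuffer-traffic estimate for a 2560x1600 panel at 60 Hz.
# The overdraw factor is an assumed value for illustration only.
width, height = 2560, 1600
bytes_per_pixel = 4   # 32-bit RGBA
refresh_hz = 60
overdraw = 3          # assumed average layers composited per pixel

pixels_per_frame = width * height
traffic = pixels_per_frame * bytes_per_pixel * refresh_hz * overdraw
print(f"{pixels_per_frame / 1e6:.1f} MP per frame, "
      f"~{traffic / 1e9:.1f} GB/s of compositing traffic")

That traffic competes with the CPU for the same shared memory bus on an integrated GPU, which may be part of why scrolling stutters.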
 

pablo87

Senior member
Nov 5, 2012
374
0
0
I strongly disagree, for one reason: high pixel density screens. Laptops are already woefully lagging behind tablets and phones in terms of pixel density. Only Apple's Retina MacBook Pros have really pushed it - and in the 13" MacBook Pro, the Intel integrated graphics are slow enough to make the system feel laggy even in daily use, not just gaming. From the Verge review:



Hopefully Haswell's graphics will make the UI buttery smooth.

That's a good comeback, IF that was their plan, and IF the industry standardizes on higher pixel density panels (assuming there is sufficient capacity). If so, I stand corrected. :$
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
The new $1,000 enthusiast CPU will be made on old, rusty, depreciated equipment, several nodes behind. And we will gladly put our huge expensive coolers on top, buy the K version, and overclock it to 150 W while the rest of the world is on 5 W. We will be looked upon like those hi-fi fanatics buying exotic loudspeaker cables at $1,000 a pair. We will be even more excluded than we are - yes, it is possible!

FWIW, they'll probably be only one node behind, and I wouldn't be surprised if the price per CPU is more like $2K - after all, they'll be EP processors :rolleyes:
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Lots of discussion on capex, that's for sure.

According to this Intel PDF, there are supposed to be five 22nm fabs: D1D (22nm), D1C (32nm, 22nm), Fab 32 (32nm, 22nm), Fab 28 (45nm, 22nm), and Fab 12 (65nm, 22nm).

[Image: Intel Global Manufacturing Fact Sheet]


So I'm wondering how they plan on juggling 22nm, given the high Ivy Bridge inventory mentioned in the 2012 Q3 conference call.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Well, the same way as usual, I guess. They will just postpone Haswell a month or two??

What they badly need is the new Atom, at least to show some presence in the market, when you can have an A15 for next to nothing, running ARM Windows 8 with Office preinstalled, wrapped in cheap plastic.

No wonder Broadwell comes with no socket.

We are soon back to the days when the CPU needed no fan. Except now the CPU (the APU plus all the rest) is 10% of the cost. Hell, even the smallest cooler will disappear eventually :(
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
What it all boils down to is the ability to give the consumer something that makes a difference. CPUs and GPUs have a harder and harder time doing that.

The future business will be in areas like sensor tech (look at Sony's investment here on the camera side) and screen tech - areas that interact with the consumer, factors that facilitate new experiences for the senses.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
So I'm wondering how they plan on juggling 22nm, given the high Ivy Bridge inventory mentioned in the 2012 Q3 conference call.

They have said it's quite easy to adjust those plans with minimal impact if such a situation arises.

But I don't think they are planning five fabs JUST for their CPUs. They probably have some other plans, but that's the part we don't know.

Of course, the alternative may be that five is the TOTAL. Maybe two or three of them keep converting to the cutting-edge node, while the remaining two follow later, once 22nm becomes a legacy node.
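
As a toy sketch of that split (fab names are from the thread's fact sheet, but which fabs lead is purely an assumption):

Code:
# Toy model of the "some fabs lead, some lag" idea: leading fabs move
# to each new node as it arrives; lagging fabs stay one node behind.
nodes = ["32nm", "22nm", "14nm"]
leading = ["D1D", "D1C", "Fab 32"]   # assumed leading fabs
lagging = ["Fab 28", "Fab 12"]       # assumed lagging fabs

for i, node in enumerate(nodes):
    lag_node = nodes[max(i - 1, 0)]
    print(f"{node} era: {', '.join(leading)} on {node}; "
          f"{', '.join(lagging)} on {lag_node}")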
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
A fab that becomes qualified to run a given process node virtually never stops producing that node. All fabs run multiple nodes, even brand new fabs will have older technology qualified to run in the fab. You must do this in order to have a diverse enough product mix within the fab to elegantly manage fab utilization and loadings.

Except for a handful of new tools required to enable critical steps on a new node, the tools themselves are repurposed, and the same tool is used in different ways across multiple process nodes.

Think of the oven in your kitchen: you can use that same oven in the morning to bake biscuits for breakfast at one temperature and time, use it again at lunch to warm up leftover pizza from the prior evening's dinner, and use it yet again in the evening to cook some fish or chicken (at yet another temperature).

Same tool, different products, etc.

To give you an indication of just how closely this analogy applies to the reality inside a fab: there are tools which we literally call "ovens", and the sequence of processing steps that a wafer experiences on any given tool is called a "recipe".
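
To put the analogy in code, a toy model might look like this (all names and parameter values are invented; real fab automation is vastly more complex):

Code:
# Toy model of "same tool, different recipe": qualifying a new recipe
# is what lets an existing tool serve an additional process node.
class Tool:
    def __init__(self, name):
        self.name = name
        self.recipes = {}  # (node, step) -> process parameters

    def qualify(self, node, step, params):
        self.recipes[(node, step)] = params

    def process(self, wafer, node, step):
        params = self.recipes[(node, step)]
        print(f"{self.name}: {step} on {wafer} ({node}) with {params}")

oven = Tool("Oven-07")                                          # invented name
oven.qualify("32nm", "anneal", {"temp_c": 450, "minutes": 30})  # assumed values
oven.qualify("22nm", "anneal", {"temp_c": 410, "minutes": 25})  # assumed values

oven.process("wafer-A", "32nm", "anneal")  # same tool...
oven.process("wafer-B", "22nm", "anneal")  # ...different recipe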

A 32nm fab that is "converted" to running 22nm does not literally/physically have tools removed, nor is the 32nm process itself stopped, as if the fab suddenly produced only 22nm wafers from then on.

A conversion means the handful of new tools required to enable 22nm are brought in and qualified, and the fab is then qualified to run 22nm product using a combination of the newly installed 22nm-dedicated tools plus the pre-existing 32nm tools - tools that were intentionally used in the development of the 22nm process precisely so that the installed 32nm toolbase could be repurposed and reused for the production of 22nm wafers.

The 14nm conversion will work the same way. An existing 22nm fab will have a few new tools brought in and released to production after a routine EE/PE (equipment engineering, process engineering) qualification process. In the meantime, the necessary 14nm recipes are copied to the other existing tools in the fab that are concurrently being used for 22nm and 32nm WIP. A qualification run of 14nm wafers is conducted, the results are compiled and analyzed, and the fab is deemed "14nm qualified" if it hits all the internal yield and reliability benchmarks.

If it is a brand new fab then it will be fitted with all the tools needed to support 14nm along with at least 22nm (and possibly 32nm depending on the global loadings strategy) and qualifications will occur on a per-node/per-product basis as the fab ramps in total wafer volume.

Then the loading manager sits back and queues up the various fabs with the desired product loadings, depending on demand projections and operating margins on a site-by-site basis. Geography factors into this as well, as demand itself is regional, and so are distribution costs and consumables costs (electricity, water, electronic chemicals, personnel compensation, taxes, etc.).
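
A cartoon version of that loading decision (greedy by margin; the capacity, margin, and demand numbers are invented for illustration, and real planning weighs all the regional factors above):

Code:
# Toy fab-loading pass: fill demand at the highest-margin sites first.
fabs = [              # (name, wafer starts per week, margin per wafer in USD)
    ("D1D",    5000, 3000),
    ("Fab 32", 7000, 2600),
    ("Fab 28", 6000, 2400),
]
demand = 12000        # wafers per week, hypothetical

for name, capacity, margin in sorted(fabs, key=lambda f: -f[2]):
    loaded = min(capacity, demand)
    demand -= loaded
    print(f"{name}: load {loaded} wafers/week at ${margin}/wafer margin")
print(f"Unmet demand: {demand} wafers/week")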

Now, it is true that sometimes tools really are removed from a fab, but that is the rare event rather than the norm. Once a company has gone to all the trouble and expense of bringing a tool into a fab, qualifying it, and releasing it to production, there is a huge barrier to removing it.

Instead, the fab's engineers and managers will spend copious amounts of time finding ways to repurpose the tool for years and years rather than replace it and take the capex hit of doing so.
 

pablo87

Senior member
Nov 5, 2012
374
0
0
According to this Intel PDF, there are supposed to be five 22nm fabs: D1D (22nm), D1C (32nm, 22nm), Fab 32 (32nm, 22nm), Fab 28 (45nm, 22nm), and Fab 12 (65nm, 22nm).

[Image: Intel Global Manufacturing Fact Sheet]


So I'm wondering how they plan on juggling 22nm, given the high Ivy Bridge inventory mentioned in the 2012 Q3 conference call.

CB, they just announced 22nm-based Pentium and Celeron parts. If you look at the 32nm parts they're replacing, those are excellent and don't need replacing for a while yet, IMO :oops: