Intel intentionally delayed release of Broadwell?


ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
That is exactly what they are doing.

http://www.extremetech.com/computin...-42-in-arizona-but-its-nothing-to-worry-about

They are putting the only 14nm production (Oregon?) into mobile Intel Atom / Quark / Avoton chips to better compete with the ARM army while the PC market as a whole slows down, and are instead coming out with Haswell Refresh chips still at 22nm. Demand no longer justifies the PC being on the leading-edge process node.

This isn't something new. If we rewind to the 22nm generation, we had 22nm for server, mobile and desktop, while Atom was on 32nm. The only thing that changed is that Atom and desktop swapped places.

Intel is not letting 14nm sit idle. The Arizona fab doesn't have any equipment yet; right now Fab 42 is simply an empty shell ready for 450mm wafers. And it had to be built because of the grants. Remember those.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
That is exactly what they are doing.

That's not a fab, that's a building. They didn't furnish it with all the necessary equipment for production.

Plus they have other fabs manufacturing 14nm for them. And getting these 14nm fabs ready for production only to let them rot, so that 22nm can reach its targeted ROI, would be asinine; no executive worth their salt would do it.
 

jdubs03

Golden Member
Oct 1, 2013
1,305
907
136
Like I said in the dedicated fab thread, delaying Fab 42 doesn't really impact anything. Intel spent about one-third of the total cost of a functional fab, will depreciate it accordingly, and can shift the machinery to 450mm and/or 10nm. That additionally lowers the shorter-term bring-up costs.
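
To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The ~$5B total and the one-third split are illustrative assumptions for scale, not Intel's actual books:

```python
# Back-of-the-envelope sketch of the argument above. All figures are
# illustrative assumptions, not Intel's actual numbers: leading-edge fabs
# were commonly reported in the ~$5B range, with the building shell a
# fraction of that and the tooling the bulk of the cost.

TOTAL_FAB_COST_B = 5.0        # assumed cost of a fully tooled fab, in $B
SHELL_FRACTION = 1.0 / 3.0    # per the post: roughly one-third spent so far

shell_cost_b = TOTAL_FAB_COST_B * SHELL_FRACTION
deferred_b = TOTAL_FAB_COST_B - shell_cost_b

print(f"Sunk so far (shell):  ${shell_cost_b:.1f}B")
print(f"Deferred (tooling):   ${deferred_b:.1f}B")
# The deferred ~2/3 is the equipment spend that can later target 450mm
# wafers and/or 10nm, which is why delaying the tool install lowers the
# shorter-term bring-up costs.
```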
 

deputc26

Senior member
Nov 7, 2008
548
1
76
FWIW, I spoke with an Intel engineer yesterday who is working on 14nm Atom. He claimed that yields at 14nm really have been the issue; he also claimed that the solutions worked out for 14nm will apply almost directly to 10nm, and that 10nm will be a relatively easy transition for Intel.
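
For anyone wondering why yield would be the bottleneck, here's a minimal sketch using the textbook Poisson yield model; the die area and defect densities are made-up example values, not Intel data:

```python
import math

# Hedged illustration of why yield drives a node transition, using the
# standard Poisson yield model Y = exp(-A * D0). The die area and defect
# densities below are made-up example values, not Intel data.

def poisson_yield(die_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Fraction of dies with zero killer defects under a Poisson model."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

die_area = 1.0  # cm^2, hypothetical die

for d0 in (0.5, 0.2, 0.1):  # defects/cm^2: immature -> mature process
    print(f"D0 = {d0:.1f}/cm^2 -> yield = {poisson_yield(die_area, d0):.0%}")

# Driving D0 down at 14nm is exactly the kind of learning that would carry
# over to 10nm, per the engineer's claim.
```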
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
FWIW, I spoke with an Intel engineer yesterday who is working on 14nm Atom. He claimed that yields at 14nm really have been the issue; he also claimed that the solutions worked out for 14nm will apply almost directly to 10nm, and that 10nm will be a relatively easy transition for Intel.

If I were an engineer @ Intel and happened to be your best friend, and gave you information like that, which could cost me my job......

...and then you went and posted what I said on a public message board frequented by industry insiders of all types.....

:eek:
 

jdubs03

Golden Member
Oct 1, 2013
1,305
907
136
Interesting, that may imply that 14nm and 10nm have the same transistor structure. So InGaAs/SiGe TFETs at 7nm? Seems more likely now.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
What do you base that assumption on?

Note the word "may" in his sentence, so it isn't really an assumption. But I can see why 10nm might still be silicon: moving away from silicon after 40+ years doesn't sound like a "relatively easy transition" to me.
 

jdubs03

Golden Member
Oct 1, 2013
1,305
907
136
What do you base that assumption on?

My assumption was based mainly on the idea of applying lessons learned at 14nm to make an easier transition to 10nm.

Meaning it's either a new transistor structure at 14nm or, more likely, at 7nm.
 

deputc26

Senior member
Nov 7, 2008
548
1
76
If I were an engineer @ Intel and happened to be your best friend, and gave you information like that, which could cost me my job......

...and then you went and posted what I said on a public message board frequented by industry insiders of all types.....

:eek:

The dude was clearly speculating. I didn't know him at all; I met him at work, he mentioned he worked for Intel, so we got into a lengthy conversation. You don't lose your job for personal speculation.
 

DigDog

Lifer
Jun 3, 2011
14,674
3,020
136
*COUGH*
Moore's Law is [rubbish]
**COUGH**

Here is a simplified version of Moore's law, with the maths taken out:
"'puters, they get better"

Also known as the Law Of Obvious

No profanity in the tech forums, please
-ViRGE

sorry :(
 

DigDog

Lifer
Jun 3, 2011
14,674
3,020
136
People tend to quote Moore's law as if it were fact: that the number of transistors doubles every two years. All Moore did was *observe* that, roughly every two years, the number of transistors did in fact double, because that is what CPU design was focusing on.
But essentially the doubling of transistors is just a manifestation of the constant improvement of industrial CPU design; maybe in the near future the interest will veer away from that point. Also, it was observed while heading towards (but still quite far from) a hard scaling barrier: you get to the point where features become too small, and doubling is not only not feasible, but no longer in the interest of design.
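
For reference, here's the doubling claim reduced to arithmetic, with arbitrary example numbers for the starting count and horizon:

```python
# The observation reduced to arithmetic: a fixed doubling period implies
# exponential growth, N(t) = N0 * 2**(t / T) with T ~ 2 years. The starting
# count and horizon are arbitrary example values.

N0 = 1_000_000_000  # transistors at year 0 (example value)
T = 2.0             # assumed doubling period in years

for year in range(0, 11, 2):
    n = N0 * 2 ** (year / T)
    print(f"year {year:2d}: {n:,.0f} transistors")
```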

Obviously computers will get better; they might also get worse in some respects, as has happened with many tech products. Flatscreens are only now approaching the image quality that CRTs used to have; of course other factors have changed, the weight for example, and, well, "flatness". We don't know yet what our "flatness" will be: which element of our search for more power will be sacrificed to improve something we haven't even noticed to be a defect yet. When CRTs were king, nobody said "oh boy, it's just too big". Or at least not for a long while; but once the CRT became perfected, people sought other areas to improve, and size came to mind.

And with that, many sacrifices were made. So if Moore's law under the classic interpretation were to hold true, we would *never* sacrifice our pixel response for flatness, but we will. That's why it's not a law: it's open to interpretation. With the switch to LCD, and the subsequent research to get the same quality back with added features, have we taken a step back, or forward?

Well, neither; we took a step sideways, because going forward didn't interest us anymore. And CPUs will follow suit.
 