A CPU architect from Intel does an AMA on Reddit. A few interesting excerpts.

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
AMA = Ask Me Anything.

Link: http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/

Q: Was GlobalFoundries' split from AMD a wise move? Purchase of ATI wise?

A: They didn't have a choice if they wanted to stay in business. They do not have enough silicon revenue to sustain it. In retrospect the ATI purchase was necessary, the sad part is they did overpay by a large margin. Also execution missteps in coming out with their "APUs" allowed us to come very close.

Q: The Pentium 4 era was obviously a cluster-f. Interestingly enough, AMD has started to wander down this path in their latest processors -- hoping to get to higher frequencies with longer pipelines. There has to be SOME technical justification for AMD to 'repeat' the past's mistakes. What technical detail is elusively close yet fails to be reached time and again?

A: In my mind, Netburst, much as it's maligned, brought some very good things internally for Intel design teams. First, unbelievable circuit expertise (the FP logic was running at 8GHz in Prescott stock!). Next, the trace cache, which you can see reimplemented in Sandy and Ivy Bridge. Also, SMT. Building a validation team that could validate the beast pre- and post-silicon. The power-perf thinking, i.e. frequency through power savings. Finally, the development of tools and project management required to do that kind of extreme design. All of these learnings continue to this day, and it's a very large contributor to why, in client and server CPUs, Intel can sustain the roadmap we have.

Q: How well do you think journalists (Anandtech in particular) cover your latest architecture? It seems to go over a majority of the tech-news writers' heads...

A: Anandtech and Real World Tech (sometimes The Tech Report) are the best sites with the most accurate information. Especially with Real World Tech, we are sometimes surprised at the accuracy of many of the inferences. Anandtech's latest Haswell preview is also excellent; it's missing some key puzzle pieces to complete the picture, answer some open questions, and correct some details, but otherwise it's great.


They get close. There are a couple of things to note here: sometimes the architectural information is not enough; the circuit implementation is incredibly important, and that is not often discussed. I guess it's lower on the totem pole. Sometimes we do keep some information from the press that ends up in patents, conference papers, etc. But eventually we disclose everything; I think it's because we try to outdo ourselves every generation, as well as being proud and wanting to share our accomplishments. Ask Apple for a disclosure of Swift.

I like Real World Tech the most and find that Anandtech and The Tech Report do good jobs too. I also read Semiaccurate for its humor value and to level set.

Q: Does Intel have any plans to make graphics chips this millennium?

A: On-die, are you willing to pay for the die area? I suggest you look at the perf/mm² and perf/W of our Gen graphics. We're working very hard to improve the Windows and Linux drivers to complement the hardware. If you're expecting discrete graphics, then you'll be disappointed.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,624
745
126
Yes, indeed very interesting. I'll add some more highlights:

Q: As a person who just bought an Ivy Bridge-based system, is there anything you can tell me to convince myself to save up for a Haswell or Broadwell system?

A: What do you usually do with your system? If you like to overclock, Haswell is worth it (can't tell you why, but read the Haswell Anandtech preview very carefully for buried treasure). On-die graphics is improving quite a bit as well. If you're into energy efficiency or even more graphics, Broadwell. I think the tech community will be very pleasantly surprised with Broadwell. But I'm biased, so we're just going to have to prove it the hard way.

Q: I'm sure you heard about all the Intel-is-going-to-BGA news about a month ago. Any reactions?

A: It's wrong. As I mentioned above, the people who leak the rumors assume too many things incorrectly. That said, we are definitely going very BGA-heavy for a variety of reasons.

Q: Intel 3000 graphics were a disappointment; Intel 4000 is slightly better, but not much. Any plans to make any bigger leaps in terms of improving graphics? (Not sure if this applies to you.)

A: Haswell will improve over Ivy Bridge. Broadwell will be a bigger jump.

Q: Will desktop PCs all move to ARM processors? Will Intel ever be relevant in the mobile space?

A: It's my job to make all form factors of computing machines run on our processors. To be completely candid (personal opinion disclaimer), with what I know is coming from Intel, the likelihood of ARM getting into the PC space is very low. Nevertheless, we're not taking our foot off the gas. For mobile, we have no choice but to become relevant. As for when, my personal prediction for substantial growth is Q1 2014, domination in 2015.
 
Last edited:
Mar 10, 2006
11,715
2,012
126
"It's my job to make all form factors of computing machines run on our processors. To be completely candid (personal opinion disclaimer), with what I know is coming from Intel, the likelihood of ARM getting into the PC space is very low. Nevertheless, we're not taking our foot off the gas. For mobile, we have no choice but to become relevant. As for when, my personal prediction for substantial growth is Q1 2014, domination in 2015."

Boom. Goodbye, ARM. Not only is Intel's focus now on low power, but it's got many years of experience in high-end CPU design to leverage here.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,525
6,050
136
"It's my job to make all form factors of computing machines run on our processors. To be completely candid (personal opinion disclaimer), with what I know is coming from Intel, the likelihood of ARM getting into the PC space is very low. Nevertheless, we're not taking our foot off the gas. For mobile, we have no choice but to become relevant. As for when, my personal prediction for substantial growth is Q1 2014, domination in 2015."

Boom. Goodbye, ARM. Not only is Intel's focus now on low power, but it's got many years of experience in high-end CPU design to leverage here.

People were saying the same things about nVidia when Larrabee was announced... ;)
 
Mar 10, 2006
11,715
2,012
126
People were saying the same things about nVidia when Larrabee was announced... ;)

Except Medfield is already really, really competitive with the ARM compatible stuff. And it's based on a 5 year old Atom core.

Brand new core. Process technology lead. Desperation to win in the market.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Q: Does Intel have any plans to make graphics chips this millennium?

A: On-die, are you willing to pay for the die area?

:cough: What I've been saying for the past year :cough: An iGPU will never compete with discrete because it would make for a die that costs too much. If a CPU is 1X area and a GPU is 2X area, you have to realize that you can't just add them together and think that 3X costs the same as 1X + 2X... each unit of area costs more than the last because of its effect on yield. That, and the cost of GPUs is so high (because they are huge).
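The "each unit of area costs more than the last" point can be illustrated with the classic Poisson yield model, where yield falls exponentially with die area, so cost per good die grows superlinearly. This is only a first-order sketch; the defect density and die areas below are made-up numbers for illustration, not real process data.

```python
import math

def cost_per_good_die(area_cm2, defect_density=0.2, wafer_cost_per_cm2=1.0):
    """First-order die cost: silicon cost divided by Poisson yield.

    yield = exp(-D0 * A), so cost per *good* die grows faster than
    linearly with area A (assumed D0 and cost constants are illustrative).
    """
    die_yield = math.exp(-defect_density * area_cm2)
    return wafer_cost_per_cm2 * area_cm2 / die_yield

cpu = cost_per_good_die(1.0)  # 1X-area CPU die
gpu = cost_per_good_die(2.0)  # 2X-area GPU die
apu = cost_per_good_die(3.0)  # 3X-area fused CPU+GPU die

# The fused 3X die costs more than the 1X and 2X dies put together.
print(f"CPU {cpu:.2f} + GPU {gpu:.2f} = {cpu + gpu:.2f} vs fused {apu:.2f}")
```

For any positive defect density the inequality goes the same way: gluing the two dies into one always costs more than the sum of the parts, which is the poster's point about 3X ≠ 1X + 2X.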
 
Last edited:

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
There was a great line in there also where someone asked him if he could make a particular instruction faster, and he replied (paraphrasing): "Sure, if you tell me what I can make slower".
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Except Medfield is already really, really competitive with the ARM compatible stuff. And it's based on a 5 year old Atom core.

Brand new core. Process technology lead. Desperation to win in the market.

Yeah, the problem is only that the A9 is also old (more than three years), and Intel can't compete with ARM on the same process node.

They're even so desperate that they compared their latest Atom SoC to a one-year-old A9 SoC. :awe:
 
Mar 10, 2006
11,715
2,012
126
Yeah, the problem is only that the A9 is also old (more than three years), and Intel can't compete with ARM on the same process node.

They're even so desperate that they compared their latest Atom SoC to a one-year-old A9 SoC. :awe:

The current Atom core is also well over 3 years old. That same Atom competes very favorably with the very new Krait. You know, on a 28nm process node.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Krait is based on the A9 and is not in the league of the A15.
Samsung has a straight A9 quad-core design and competes "very favorably with the very new Krait". Krait is overhyped for a 28nm SoC.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
"For mobile, we have no choice but to become relevant. As for when, my personal prediction for substantial growth is Q1 2014, domination in 2015."

I have no doubt that will come to fruition. Intel has a lot of R&D money to burn through; Intel overtaking ARM is inevitable.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
:cough: What I've been saying for the past year :cough: An iGPU will never compete with discrete because it would make for a die that costs too much. If a CPU is 1X area and a GPU is 2X area, you have to realize that you can't just add them together and think that 3X costs the same as 1X + 2X... each unit of area costs more than the last because of its effect on yield. That, and the cost of GPUs is so high (because they are huge).

I get excited from time to time too, but typically not for "figuring out" the obvious. Intel not getting into the discrete GPU business and big chips = more money... Which one of these was ever a hot topic to the contrary?
 

IlllI

Diamond Member
Feb 12, 2002
4,929
11
81
Wish I could read the whole Q&A, but I can't stand the layout of that website. It's too cluttered and muddled to figure out who said what :(
 

Haserath

Senior member
Sep 12, 2010
793
1
81
Q: As a person who just bought an Ivy Bridge-based system, is there anything you can tell me to convince myself to save up for a Haswell or Broadwell system?

A: What do you usually do with your system? If you like to overclock, Haswell is worth it (can't tell you why, but read the Haswell Anandtech preview very carefully for buried treasure). On-die graphics is improving quite a bit as well. If you're into energy efficiency or even more graphics, Broadwell. I think the tech community will be very pleasantly surprised with Broadwell. But I'm biased, so we're just going to have to prove it the hard way.
I noticed this and looked back at the Anand preview.

I didn't realize Haswell actually has three clock domains again. I wonder if the L3 holds it back at all?
 
Mar 10, 2006
11,715
2,012
126
I noticed this and looked back at the Anand preview.

I didn't realize Haswell actually has three clock domains again. I wonder if the L3 holds it back at all?

Caches are way more sensitive than other parts of the chip when it comes to overclocking. I would not be surprised if the decoupled cache will lead to very significant core clock speed improvements.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Caches are way more sensitive than other parts of the chip when it comes to overclocking. I would not be surprised if the decoupled cache will lead to very significant core clock speed improvements.

That looks good on paper, but it lacks practical improvement.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Except Medfield is already really, really competitive with the ARM compatible stuff. And it's based on a 5 year old Atom core.

Brand new core. Process technology lead. Desperation to win in the market.

Well, how many phones are sold using Medfield?

Have you wondered why Medfield is not selling at all, when Anand shows you browser graphs telling you it's very fast?

Are the rest of the world's specialists stupid?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Except Medfield is already really, really competitive with the ARM compatible stuff. And it's based on a 5 year old Atom core.

Brand new core. Process technology lead. Desperation to win in the market.

Yeah. Larrabee was a failure because Intel never even got it to the point where they could put a GPU on the market. Intel already has a low-power CPU that fits right into a cell phone. Now it's only a matter of tweaking the architecture to make a really appealing product. I've never expected Intel to take off on phones until they brought their ultra mobile architecture in line with the desktop architecture (or rather, the opposite). Once that happens though, competitors are going to feel it.

Intel is in a strong enough position that they're not going to lose the notebook market, and they're going to be pushing hard into the tablet market, especially now that Windows 8 has arrived to bring x86 backwards compatibility to the tablet space (as long as it's not RT...). ARM doesn't have a prayer of touching the desktop. The only thing that seems to be out of Intel's reach is the mobile phone market, and only just; not because of inferior technology, but because they don't have a foothold in the market. If Intel adopts the policy that this employee indicated -- "we have no choice but to become relevant" -- that means they will probably end up breaking the business restraints they've placed on themselves and their OEM customers that have kept them from gaining market share. If Intel can flood the phone market with low-cost Atoms and the tablet market with low-cost, low-power Cores by 2014, OEMs will be hooked, and "domination" will be within reach.
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
A: Haswell will improve over Ivy Bridge. Broadwell will be a bigger jump.

Intel consistently uses designs that will improve graphics big time - not on the next arch, but the one after it. At least on 22nm we are able to play high-res video content without stuttering. Drivers are improving dramatically with each generation.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Intel consistently uses designs that will improve graphics big time - not on the next arch, but the one after it. At least on 22nm we are able to play high-res video content without stuttering. Drivers are improving dramatically with each generation.

Intel claimed it was Haswell that was supposed to be the big guns in GPU.

Now it says Broadwell. Or was it Larrabee? I get confused with all the hyperbole.

Dominate the mobile market in 2015? LOL

They can't make a decent GPU, but apparently they can predict the future.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Yeah. Larrabee was a failure because Intel never even got it to the point where they could put a GPU on the market. Intel already has a low-power CPU that fits right into a cell phone. Now it's only a matter of tweaking the architecture to make a really appealing product. I've never expected Intel to take off on phones until they brought their ultra mobile architecture in line with the desktop architecture (or rather, the opposite). Once that happens though, competitors are going to feel it.

Intel is in a strong enough position that they're not going to lose the notebook market, and they're going to be pushing hard into the tablet market, especially now that Windows 8 has arrived to bring x86 backwards compatibility to the tablet space (as long as it's not RT...). ARM doesn't have a prayer of touching the desktop. The only thing that seems to be out of Intel's reach is the mobile phone market, and only just; not because of inferior technology, but because they don't have a foothold in the market. If Intel adopts the policy that this employee indicated -- "we have no choice but to become relevant" -- that means they will probably end up breaking the business restraints they've placed on themselves and their OEM customers that have kept them from gaining market share. If Intel can flood the phone market with low-cost Atoms and the tablet market with low-cost, low-power Cores by 2014, OEMs will be hooked, and "domination" will be within reach.

Name one long-term successful company that moved into a market by flooding it with cheap products. Last time I checked, it's the worst business practice to destroy the value and profit in a market that you want to make money in.

And don't forget who you're competing with: Samsung, who make 500m smartphones and will never buy Intel; Apple, who design their own SoCs and now own the biggest single smartphone and tablet in the world; and Qualcomm, who are kicking ass with the S4, also design their own SoCs, and own significant IP when it comes to the radio tech used in every phone. All of them pay ARM an 'arm and a leg' to use its designs, while ARM itself has no production overheads and very limited costs and risks.

So remind me again... what exactly does Intel have?