CPU Technology - Ahead? Behind? Or right on track?

Hulk

Diamond Member
Oct 9, 1999
5,168
3,786
136
I know there is quite a bit of disappointment at each successive Intel release, and I'm wondering how the engineers and scientists of, say, 20 years ago would look at Intel's fabrication technology of today?

For example, you walk into the Intel lab of 1994 with a 14nm Broadwell part and provide tech specs and a demonstration. Those engineers were still a few years away from shipping the first-generation PII (Klamath) at 350nm and 300MHz.

Would they be stunned by the advancements that had been made in 20 years?

Would they be disappointed?

Or would they be thinking that's about what they expected?

To borrow a famous Star Trek line, I have a feeling all of these incremental advances we've been making would make those old processes look like "stone knives and bearskins!"
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I suspect they might be disappointed in that there have been no real revolutionary breakthroughs, but the complexity of chips today is on an entirely different scale.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Surely they would be stunned by 20 years of evolution at the exponential pace of Moore's Law. From their vantage point, it's hard to conceive what would be possible after those 20 years.

After the initial excitement, they'd probably notice how clock speeds have hit a wall, and maybe other things you'd need a PhD to spot. The people doing the semiconductor research, on the other hand, would likely be amazed by the technologies used in those transistors at such small geometries.

More interesting is the question of how Intel engineers from a decade ago would look at Broadwell. Those engineers might be more disappointed, but the biggest accomplishment they would recognize might be Intel's 4+ year process lead.

I'd say Intel has done a pretty good job at staying on track, since you simply can't get around quantum mechanics.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think they would be quite amazed at how much potential has been extracted, not to mention the node itself, because I am sure the engineers can tell the difference between low-hanging fruit and hard-to-reach goals.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
As long as they didn't see what these chips were running, they'd be impressed. But as soon as they saw a version of Windows that took even longer to boot than Win 95, and then web browsers full of terabytes' worth of fail videos, they'd want to shoot themselves and throw their corpses into the machines.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I know there is quite a bit of disappointment at each successive Intel release, and I'm wondering how the engineers and scientists of, say, 20 years ago would look at Intel's fabrication technology of today?

For example, you walk into the Intel lab of 1994 with a 14nm Broadwell part and provide tech specs and a demonstration. Those engineers were still a few years away from shipping the first-generation PII (Klamath) at 350nm and 300MHz.

Would they be stunned by the advancements that had been made in 20 years?

Would they be disappointed?

Or would they be thinking that's about what they expected?

To borrow a famous Star Trek line, I have a feeling all of these incremental advances we've been making would make those old processes look like "stone knives and bearskins!"

The first node I worked on was the 0.5um node, and as one of those "process engineers from 20 yrs ago", I can tell you the answer is unquestionably one of being "stunned and impressed" with today's 14nm process technology.
 

kimmel

Senior member
Mar 28, 2013
248
0
41
But as soon as they saw a version of Windows that took even longer to boot than Win 95

I don't really understand this comment. I can barely walk over to my chair after pushing the power button before my system is fully ready, which is hugely different from even 10 years ago. I could probably fully reboot my Windows machine five times before iOS/Android boots up once. Frankly, you'll know that mobile performance has arrived when they stop using spinner animations, swipe-to-unlock, and loading screens everywhere to hide latency.

That being said, the progress in the last 20 years has been amazing, not only in the fabrication of logic but in the density and performance of storage as well.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Their first reaction would be one of disappointment over the mediocre uncore bandwidth. Only after extended study would they see the limitations and compromises that led to our current designs. And they would hate it.

But they would be fascinated by 14nm FinFETs.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
The first node I worked on was the 0.5um node, and as one of those "process engineers from 20 yrs ago", I can tell you the answer is unquestionably one of being "stunned and impressed" with today's 14nm process technology.
Very interesting, as always (although I certainly wouldn't mind less superficial comments).

I'm glad 2is made that thread: http://forums.anandtech.com/showthread.php?t=2377778
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Intel need someone to push them again, like AMD did with the Athlon.

What result do you expect? Some 50% IPC increase?

Software is the main thing holding hardware back, and there is no solution in sight on the software side.
 

Berliner

Senior member
Nov 10, 2013
495
2
0
www.kamerahelden.de
What result do you expect?

Nothing specific, just some noticeably bigger steps.

I just gave an example: when Intel abandoned the P4 ship and went back to the PIII/P-M/Core architecture with improved IPC, pushed by AMD, who did it first. That was an exciting time, when you had an actual choice and could really see improvements being made.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
You are all forgetting that both Conroe/Wolfdale and Nehalem brought more than a 100% IPC increase over Pentium 4-based uarchs. More than 100% IPC in four years.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Nothing specific, just some noticeably bigger steps.

I just gave an example: when Intel abandoned the P4 ship and went back to the PIII/P-M/Core architecture with improved IPC, pushed by AMD, who did it first. That was an exciting time, when you had an actual choice and could really see improvements being made.

The P4 was a speed-racer design that didn't work. AMD repeated the exact same mistake with Bulldozer. You can turn it around and say that failed uarchs like the P4 and Bulldozer are due to competition.

I have a feeling you think there are massive gains waiting around the corner, but that they're somehow being held back due to lack of competition. That's simply wrong.

Competition isn't something that magically fixes everything. It's, for example, the reason why we are stuck with x64. See the quote in my signature as well.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Competition isn't something that magically fixes everything. It's, for example, the reason why we are stuck with x64. See the quote in my signature as well.

And for an example of an industry hamstrung by the lack of competition, look at broadband. If your theory were correct, then with very little competition infrastructure upgrades should be very efficient, and we should all have 1Gbit fiber-optic lines by now.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
And for an example of an industry hamstrung by the lack of competition, look at broadband. If your theory were correct, then with very little competition infrastructure upgrades should be very efficient, and we should all have 1Gbit fiber-optic lines by now.

You're comparing apples and oranges. Also, the case for your country doesn't apply to other countries.

And I can guarantee you that you don't magically get fiber due to competition either.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
What result do you expect? Some 50% IPC increase?

Software is the main thing holding hardware back, and there is no solution in sight on the software side.

As a hardware design guy, I definitely share that sentiment. You have no idea how many performance problems start with the sentence "Well... so there's this old FORTRAN library..."
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Nothing specific, just some noticeably bigger steps.

I just gave an example: when Intel abandoned the P4 ship and went back to the PIII/P-M/Core architecture with improved IPC, pushed by AMD, who did it first. That was an exciting time, when you had an actual choice and could really see improvements being made.

I still need a citation showing that it is actually still possible to extract a meaningful amount of IPC out of a CPU (without a hugely disproportionate increase in power). If it were so easy to still get big gains, why hasn't AMD released such a chip, and why hasn't Apple doubled CPU performance again with the A8(X)?
 

kimmel

Senior member
Mar 28, 2013
248
0
41
I still need a citation showing that it is actually still possible to extract a meaningful amount of IPC out of a CPU (without a hugely disproportionate increase in power). If it were so easy to still get big gains, why hasn't AMD released such a chip, and why hasn't Apple doubled CPU performance again with the A8(X)?

There are plenty of people here who think magic pixie dust makes their electronic devices go and expect companies to 2x the pixie dust every generation. Those who think that any of this progress is easy or automatic need to really think critically about that point of view.
 

Socio

Golden Member
May 19, 2002
1,732
2
81
CPU Technology - Ahead? Behind? Or right on track?

Correct me if I am wrong, but considering most software being released today is still not written to make full use of multi-core processors, I would say CPUs are ahead.
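
A rough sketch of what "written to make full use of multi-core" means in practice (a toy example with made-up names and a trivial array-sum workload, not code from any real application): the serial function below leaves all but one core idle, while the parallel one spreads the same work across however many hardware threads the CPU reports, and only because it was explicitly written to.

// Toy illustration only: serial vs. explicitly parallel summation in C++.
#include <algorithm>
#include <future>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

double serial_sum(const std::vector<double>& data) {
    // Runs on a single core no matter how many cores the CPU has.
    return std::accumulate(data.begin(), data.end(), 0.0);
}

double parallel_sum(const std::vector<double>& data) {
    // Splits the range into one chunk per hardware thread and sums the
    // chunks concurrently; the extra cores help only because we asked.
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = data.size() / n;
    std::vector<std::future<double>> parts;
    for (unsigned i = 0; i < n; ++i) {
        auto first = data.begin() + i * chunk;
        auto last = (i + 1 == n) ? data.end() : first + chunk;
        parts.push_back(std::async(std::launch::async,
            [first, last] { return std::accumulate(first, last, 0.0); }));
    }
    double total = 0.0;
    for (auto& p : parts) total += p.get();  // wait for and combine all chunks
    return total;
}

int main() {
    std::vector<double> data(20000000, 1.0);
    std::cout << "serial:   " << serial_sum(data) << '\n';
    std::cout << "parallel: " << parallel_sum(data) << '\n';
}

Most application code still looks like the first function, which is part of why the extra cores so often sit idle.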
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You're comparing apples and oranges. Also, the case for your country doesn't apply to other countries.

And I can guarantee you that you don't magically get fiber due to competition either.

Before, I only had Comcast in my area. It was great, I had high-speed internet. However, I had constant dips in my connection due to Comcast not having enough throughput for everyone. Then this magical thing called competition came to my area: Verizon came in with a FiOS line.

Now?

I have amazing throughput with Verizon, Comcast has upgraded their lines in the area MULTIPLE times, and Verizon keeps upping their speeds to offer a "competitive" rate/speed against the "competition".

I can't believe you'd seriously advocate against competition.....
 

Hulk

Diamond Member
Oct 9, 1999
5,168
3,786
136
The first node I worked on was the 0.5um node, and as one of those "process engineers from 20 yrs ago", I can tell you the answer is unquestionably one of being "stunned and impressed" with today's 14nm process technology.

Well that pretty much answers my question!
It's so easy to look at the latest and greatest from Intel (or any manufacturer) and think "bah, less power usage, smaller process, 5% IPC improvement, that's it?!"

Meanwhile the technology that goes into those "meager" improvements is the culmination of human evolution!
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
As long as they didn't see what these chips were running, they'd be impressed. But as soon as they saw a version of Windows that took even longer to boot than Win 95, and then web browsers full of terabytes' worth of fail videos, they'd want to shoot themselves and throw their corpses into the machines.

I don't know if it's because you haven't used Windows for a while, but Windows 8.1 is faster than Windows 8, which is faster than Windows 7, which in turn is faster than Windows Vista.

The newer versions of Windows have their problems, but speed and efficiency aren't really among them anymore.
 

Spawne32

Senior member
Aug 16, 2004
230
0
0
Correct me if I am wrong, but considering most software being released today is still not written to make full use of multi-core processors, I would say CPUs are ahead.

Depends on how you look at it, really. I would say the reason for that is how far behind programming is compared to the hardware. If you look at shit operating systems like Windows 8, it's things like this that hold back the potential of what companies like Intel and AMD are actually producing. I would also say hardware technology is actually behind what companies predict in their own roadmaps, as it just gets increasingly harder to produce new technology and smaller chips than they originally anticipate on paper, leading to delays and setbacks on the order of 1-4 years sometimes. Look at Intel roadmaps from 2010 and they showed us being on a 14-12nm process 6 months ago.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Very interesting, as always (although I certainly wouldn't mind less superficial comments).

I'm glad 2is made that thread: http://forums.anandtech.com/showthread.php?t=2377778

Happy to make anything less superficial, but mind you, my PhD dissertation is >400 pages... I'm not exactly what you'd call a "guy of few words" when it comes to things I consider worth expanding upon.

So what are you curious about? Narrow it down for me so my post will fit within the 10,000-character limit (yes, it has been a problem for me in the past, multiple times). :whistle: