Imec Tech Forum: When it's only engineers, not marketing

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
GlobalFoundries' CTO says it all:

“The reality is scaling has slowed dramatically, these node names are a marketing tool and the value proposition has reduced considerably,” Patton told attendees.

The days of 35% lower costs and 20% greater performance every two years “ended at the 20nm node when the need for double patterning created significant cost increases, then we added FinFETs and called it 14/16nm, but really it’s a retrofit,” he added, debunking today’s leading-edge processes.

Not stopping there, Patton called the 10nm process his rivals TSMC and Samsung are racing to deliver “more of a half node.” By contrast, “at 7nm there’s hope for a full node, measured from 14/16,” he said.

And he [a different speaker than the one quoted above] acknowledged the out-of-whack marketing trying to paint a pretty face on the reality. “When you look at a 14nm process, it’s hard to find anything that’s 14nm there,” he joked.

[Attached image: 2-EUV-poll.jpg]


http://www.eetimes.com/document.asp?doc_id=1329777
 

know of fence

Senior member
May 28, 2009
555
2
71
Certainly Intel's 14nm does nothing for the desktop, except perhaps reduce manufacturing cost, but TSMC's FinFET node certainly seems very promising. I'm also not sure how the value proposition changes for the consumer as the cycles get longer. Doesn't more use time/years also equal increased value, after all?
For servers, improved performance at lower power quickly recoups the cost, but that barely matters for a PC that only runs a few hours a day.

Typically, as pressure mounts, people start to lie and cheat, so I worry that the tech press in its current state does not have the means to critically assess the new reality of improvements inching slowly forward, paired with humongous hyperbole.
 

DrMrLordX

Lifer
Apr 27, 2000
22,883
12,939
136
My word why do all the men sit like that on stage? Did they take an etiquette class where they learned that pose? Oh wait the guy on the end is breaking the trend . . .

but yeah, nothing to see here folks, move right along.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
My word why do all the men sit like that on stage? Did they take an etiquette class where they learned that pose? Oh wait the guy on the end is breaking the trend . . .

but yeah, nothing to see here folks, move right along.

Every guy I see in informal meetings sits like this. It's even more awkward looking when they lean back while still holding their knee.
 

DrMrLordX

Lifer
Apr 27, 2000
22,883
12,939
136
Weird. Sometimes I have to cross my legs when sitting in an uncomfortable chair to take stress off my back (it shifts weight onto the lower leg) but the hands thing is just creepy.

Proper etiquette demands that you sit like Al Bundy.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
So FinFET is a scam that only slightly improves chip performance, and the real deal, the true successor to SOI, is EUV?
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Every guy I see in informal meetings sits like this. It's even more awkward looking when they lean back while still holding their knee.

Nobody actually finds corporate culture comfortable, so we call it "informal" so we can keep pretending people find corporate culture comfortable.
 

bononos

Diamond Member
Aug 21, 2011
3,936
190
106
The boxes they are sitting on have no armrests, so it's more comfortable for them to sit the way they are; otherwise they'd have to sit hunched over.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
So FinFET is a scam that only slightly improves chip performance, and the real deal, the true successor to SOI, is EUV?

No. Marketers and engineers tell you that FinFET is the biggest improvement in a decade, century, whatever.

But in reality FinFET is merely an enabler for Moore's Law to continue. Without FinFET, you'd get near zero benefits from process. With it, you get some portion of the benefits you used to get.

The GF guy is claiming the 35% reduction in costs and 20% better performance ended at 20nm. Intel said something similar: that a process gen gives you 30% reduction in power OR 20% better performance.

That seems quite nice, but don't forget we didn't need all these fancily named technologies to get that sort of improvement as far back as 65nm. Remember when Pentium 4 went from 0.18µm to 0.13µm? Clock speed improved by 50%. A straight shrink gave you a 40% power reduction. A 100W CPU on 0.18µm turned into 60W on 0.13µm.
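For anyone who wants to sanity-check those numbers, here's a rough back-of-envelope using idealized constant-field (Dennard-style) scaling; the figures above are from memory and real chips never hit the ideal numbers exactly.

Code:
# Back-of-envelope Dennard (constant-field) scaling check for 0.18um -> 0.13um.
# Idealized model: C ~ s, V ~ s, f ~ 1/s, so P = C*V^2*f ~ s^2 for the same design.

old_node, new_node = 0.18, 0.13
s = new_node / old_node                 # linear shrink factor, ~0.72

freq_gain = 1 / s - 1                   # ideal frequency headroom, ~+38%
power_scale = s ** 2                    # ideal power for a straight shrink, ~0.52x

print(f"shrink factor s     = {s:.2f}")
print(f"ideal clock gain    = +{freq_gain:.0%}")
print(f"ideal power scaling = {power_scale:.2f}x (100 W -> {100 * power_scale:.0f} W)")
# Prints roughly +38% clock and ~52 W, in the same ballpark as the 50% / 60 W figures above.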

After about 45nm, you needed various things to get that 30%/20%: strained silicon, copper interconnects, high-k, SOI, FinFET.

Now with FinFET, despite ALL that work, you seem to get that improvement only in certain segments, like teeny IoT chips and smartphones.

Smart guy from GF said:
node names are a marketing tool

If you isolate it to a company and product class, you can reasonably say node names represent something. So Intel 14nm is a generation from Intel 22nm. TSMC 16 is a generation from TSMC 20 and TSMC 28. Samsung 14 is a generation from Samsung 20.

Intel 14nm means that, compared to Intel 22nm, it's somewhat faster, somewhat lower power, somewhat lower cost, and roughly 2x transistors per area. Of course, with TSMC and GF and Samsung, the 20nm names were really for density and nothing else. But with 14/16 you get the performance, right?

As long as node X is faster, lower cost, and lower power than the node before it, who cares? It's only when you bring in competitors that everything becomes stupid, like how Intel can't compare their 14nm to everyone else's, because in practice they don't compete in the same market, period.
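To put rough numbers on "roughly 2x transistors per area": a true full node is about a 0.7x linear shrink, since 0.7 squared is roughly 0.5x the area per transistor. Quick sketch (idealized math only, ignoring design-rule realities):

Code:
# A "full node" is conventionally ~2x density, i.e. ~0.7x linear scaling,
# because area per transistor goes as the square of the linear dimension.

density_gain_per_node = 2.0
linear_shrink = density_gain_per_node ** -0.5   # ~0.71x per full node
print(f"linear shrink per full node ~ {linear_shrink:.2f}x")

# Half a node's worth of density (what a "half node" roughly implies):
half_node_density = 2.0 ** 0.5                  # ~1.41x transistors per area
print(f"density gain of a half node ~ {half_node_density:.2f}x")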
 
Last edited:

know of fence

Senior member
May 28, 2009
555
2
71
As long as node X is faster, lower cost, and lower power than the node before it, who cares? It's only when you bring in competitors that everything becomes stupid, like how Intel can't compare their 14nm to everyone else...

So the dimensions are out there for the various processes, but it's rarely said that, now that TSMC and Samsung are delivering FinFETs, they seem pretty close in terms of performance per watt, if not area, even though there are no direct comparisons yet.

14nm/16FF has been a huge leap for graphics, which led console makers to contemplate a doubling and tripling of TFLOPs and a never-before-seen mid-generation hardware upgrade, whereas Intel's 14nm was just a refinement. However, momentum seems to reverse again going to 10nm: Intel has been promising real improvements with their 10nm, while that EE Times article tempers the expectations for Samsung's 10nm process, calling it "more of a half-node".

[Attached image: 14nmFinfet3_575px.png]
 

Sheep221

Golden Member
Oct 28, 2012
1,843
27
81
At this point I think they're going to continue to improve the IGP at a faster rate than the IPC of the compute cores. Although doing away with discrete video cards in basic computers and mediocre gaming machines would be a huge leap forward.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
No. Marketers and engineers tell you that FinFET is the biggest improvement in a decade, century, whatever.

But in reality FinFET is merely an enabler for Moore's Law to continue. Without FinFET, you'd get near zero benefits from process. With it, you get some portion of the benefits you used to get.

The GF guy is claiming the 35% reduction in costs and 20% better performance ended at 20nm. Intel said something similar: that a process gen gives you 30% reduction in power OR 20% better performance.

That seems quite nice, but don't forget we didn't need all these fancily named technologies to get that sort of improvement as far back as 65nm. Remember when Pentium 4 went from 0.18µm to 0.13µm? Clock speed improved by 50%. A straight shrink gave you a 40% power reduction. A 100W CPU on 0.18µm turned into 60W on 0.13µm.

After about 45nm, you needed various things to get that 30%/20%: strained silicon, copper interconnects, high-k, SOI, FinFET.

Now with FinFET, despite ALL that work, you seem to get that improvement only in certain segments, like teeny IoT chips and smartphones.
FinFETs also have an issue with EM (electromigration), and that translates to more heat, as much as 30% more, and it seems they also have a reliability issue.
I am wondering if that is one of the reasons why clocks are so turbulent, and whether it means that the higher the base clock, the more sporadic the clocks become, in order to try to mitigate the EM?

Or, looking at it another way, just how long will FinFET designs last in the real world compared to previous designs? Are we looking at video cards (and CPUs) that may only last 5 or so years now?
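For what it's worth, EM lifetime is usually estimated with Black's equation, MTTF = A * J^-n * exp(Ea / kT). Here's a toy sketch with made-up constants, just to show how sensitive lifetime is to current density and temperature; it's not a claim about any real FinFET process.

Code:
import math

# Black's equation for electromigration lifetime: MTTF = A * J**(-n) * exp(Ea / (k*T)).
# The constants below (A, n, Ea) are illustrative placeholders, not measured values.

K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV/K

def em_mttf(current_density, temp_kelvin, a=1.0, n=2.0, ea_ev=0.9):
    return a * current_density ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_kelvin))

# Relative lifetime for the same wire run 20 K hotter at 1.3x the current density:
baseline = em_mttf(1.0, 358)   # ~85 C
stressed = em_mttf(1.3, 378)   # ~105 C, 30% more current
print(f"relative MTTF: {stressed / baseline:.2f}x of baseline")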
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
A bit related: BK (Brian Krzanich) has stated their performance targets for 10nm and 7nm: 10-20% per node.

We plan to have consistent performance improvements of 10% to 20% in each one of these nodes.

http://seekingalpha.com/article/397...d-c-bernstein-strategic-decisions?part=single

So, again, this gets back to what level of performance you are really providing with each one of those nodes. So if you come out with a new node but you have very little improvement, and you say, hey, I went from 14 or 16 nanometers to 10 nanometers but I provided a 5% to 10% improvement, that's not really a node. What we'll do is stay out a little bit longer, but yes, we're going to provide a standard 20% improvement, 50% scaling factor. So we're going to keep all those numbers very consistent as we go, and that's actually what we believe continues to keep us in our leadership.
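Taking that at face value (20% performance and a 0.5x area scaling factor per node, which is just my reading of the quote, not an official roadmap), the compounding works out roughly like this:

Code:
# Rough compounding of the stated per-node targets: +20% performance and
# 0.5x area ("50% scaling factor"). The node list is only for illustration.

perf, area = 1.0, 1.0
for node in ["14nm", "10nm", "7nm"]:
    print(f"{node}: performance {perf:.2f}x, area {area:.2f}x (vs 14nm)")
    perf *= 1.20
    area *= 0.50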

Edit: Also interesting is his comment about tablets. They were afraid tablets would take over PCs. From that POV, if the 30B PC market had shifted to a 30B tablet market (to be simplistic), then the 10B that Intel spent to make sure they didn't lose their volume, just in case, isn't all that bad, maybe.

So it depends on how far back in time, how far the wayback machine works. Let's say it actually goes back three years, so only a small way back. So three years ago, right, we were all worried to death about the tablet and its 30% growth, and I guess if I could go back and look at it I'd say, you know what, take your beatings, you'll be right. The tablets are going to fade away, so don't make the investments. I don't think any of us could have known, but sure, I would have loved to have had that foresight and say, yes, tablets are going to die within a year and start declining 10% to 20%. Would it have saved us a lot of effort and all?
 
Last edited:

24601

Golden Member
Jun 10, 2007
1,683
40
86
A bit related: BK (Brian Krzanich) has stated their performance targets for 10nm and 7nm: 10-20% per node.



http://seekingalpha.com/article/397...d-c-bernstein-strategic-decisions?part=single



Edit: Also interesting is his comment about tablets. They were afraid tablets would take over PCs. From that POV, if the 30B PC market had shifted to a 30B tablet market (to be simplistic), then the 10B that Intel spent to make sure they didn't lose their volume, just in case, isn't all that bad, maybe.

"Nobody could have predicted" "Nobody could have known"

Redacted. Everyone and their mother who knew what they were talking about saw the tablet and smartphone crash coming ages ago.

The only people who say they didn't are the same people who do news stories and simple trash information injection to pump and dump stocks.

The second someone knew internal information on the Pentium III derivatives that Intel was planning to release to the laptop market (Yonah), they would have immediately known that they should have used those cores in tablets and smartphones, as opposed to Atom, which was designed to fail while costing billions in design costs. That was always the most braindead possible thing they could have done, and basically anyone who knew of Atom knew that.

And don't give me that crap that taking a laptop chip and bolting SoC elements onto it takes 5 years + 7 years of fore-planning or some other bull.

The R&D and design of chips was always the major cost, and Atom saved die space that didn't matter, since Intel has been a foundry since time immemorial.




No profanity in tech.


esquared
Anandtech Forum Director
 
Last edited by a moderator:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
And don't give me that crap that taking a laptop chip and bolting SoC elements onto it takes 5 years + 7 years of fore-planning or some other bull.

The R&D and design of chips was always the major cost, and Atom saved die space that didn't matter, since Intel has been a foundry since time immemorial.

The Atoms did one thing that Core chips have never been able to do (even today): get into platform power levels competitive with ARM ones.

The Skylake Core M chips are barely there with 10-inch+ designs while Atom scaled even lower to 7-8 inch devices.

Now, it's a fault of Intel that they couldn't scale it down. I honestly think that has to do with their PC-mentality strategy. Atom was able to get it to that level.

Also, it may be that some of their design philosophies were applied too rigidly? Remember the rule that circuits could only be implemented if a 1% performance improvement brought only a 1% increase in power usage? And they changed that to 2% performance per 1% power use? What if, during that, they lost sight of the big picture? Perhaps in their likely frenzy to catch up to ARM competitors in power, they did less than they would have otherwise.
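Just to make that rule concrete, here's a rough sketch assuming it literally means "a circuit feature is only accepted if it gains at least X% performance per 1% added power" (the real criteria are Intel-internal, so the numbers below are made up):

Code:
# Sketch of the (assumed) performance-per-power acceptance rule described above.

def feature_passes(perf_gain_pct, power_cost_pct, ratio_required=2.0):
    """Accept a candidate circuit feature only if it delivers at least
    `ratio_required` percent of performance per 1 percent of added power."""
    if power_cost_pct <= 0:   # free or power-saving features always pass
        return True
    return perf_gain_pct / power_cost_pct >= ratio_required

# Hypothetical feature: +3% performance for +2% power.
print(feature_passes(3, 2, ratio_required=1.0))  # True  -> passes the old 1%:1% rule
print(feature_passes(3, 2, ratio_required=2.0))  # False -> rejected under 2%:1%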

Regardless of the reason, the conclusion being:
-Core can't scale in platform power, meaning it's relegated to big devices
-They are simply not competitive at all in the mobile space, period
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
The Atoms did one thing that Core chips have never been able to do (even today): get into platform power levels competitive with ARM ones.

The Skylake Core M chips are barely there with 10-inch+ designs while Atom scaled even lower to 7-8 inch devices.

Now, it's a fault of Intel that they couldn't scale it down. I honestly think that has to do with their PC-mentality strategy. Atom was able to get it to that level.

Also, it may be that some of their design philosophies were applied too rigidly? Remember the rule that circuits could only be implemented if a 1% performance improvement brought only a 1% increase in power usage? And they changed that to 2% performance per 1% power use? What if, during that, they lost sight of the big picture? Perhaps in their likely frenzy to catch up to ARM competitors in power, they did less than they would have otherwise.

Regardless of the reason, the conclusion being:
-Core can't scale in platform power, meaning it's relegated to big devices
-They are simply not competitive at all in the mobile space, period

They could have easily done so by taking a single-core Yonah and bolting on SoC parts that weren't 2 nodes behind (which is what Intel loves to do for their platform chips).

Clock the thing at Vmin and you now have an Apple-style chip.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
The Atoms did one thing that Core chips have never been able to do (even today): get into platform power levels competitive with ARM ones.

The Skylake Core M chips are barely there with 10-inch+ designs while Atom scaled even lower to 7-8 inch devices.

Now, it's a fault of Intel that they couldn't scale it down. I honestly think that has to do with their PC-mentality strategy. Atom was able to get it to that level.

Also, it may be that some of their design philosophies were applied too rigidly? Remember the rule that circuits could only be implemented if a 1% performance improvement brought only a 1% increase in power usage? And they changed that to 2% performance per 1% power use? What if, during that, they lost sight of the big picture? Perhaps in their likely frenzy to catch up to ARM competitors in power, they did less than they would have otherwise.

Regardless of the reason, the conclusion being:
-Core can't scale in platform power, meaning it's relegated to big devices
-They are simply not competitive at all in the mobile space, period

Forget raw CPU perf/W, Intel are leagues behind ARM when it comes to overall SoC integration. Which isn't helped by the fact that OEMs outside the traditional PC business had 2 decades of hindsight to wisely place their bets against Intel and against yet another possible CPU supplier monopoly.