Question Intel Q3: Ouch


Hitman928

Diamond Member
Apr 15, 2012
5,372
8,196
136
The process roadmap has always been blank when you go 10 years out; it isn't as if back in 1985 they had a roadmap with everything planned out to 1995, let alone 2020.

The industry doesn't need to know where it is going in 10 years for companies like ASML and Canon to build the next generation of equipment. The ITRS roadmap serves that industry, so they don't need to look beyond that horizon, even though there are obviously some people at places like IBM or in academia doing blue-sky research further out. Even with the roadmap, we end up seeing detours, like how EUV was delayed and delayed and delayed, and we ended up doing double and even quadruple patterning with 193nm sources. EUV sat near the end of the roadmap for at least a decade until they were finally able to make it work, but that didn't halt progress in the meantime.

I'm sure there will be similar detours if high-NA EUV is delayed, and we'll have to see whether they can go to an even shorter "beyond EUV" wavelength or make e-beam fast enough for mass production. As long as there are companies willing to fund that exploration, they'll find a way, like they always have in the past.
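For reference, the single-exposure limits behind the detours described above can be sketched with the Rayleigh resolution criterion, CD = k1 * λ / NA. This is a rough back-of-envelope only, assuming a commonly cited practical k1 floor of about 0.28 for single exposure:

```python
def min_half_pitch_nm(wavelength_nm: float, na: float, k1: float = 0.28) -> float:
    """Smallest printable half-pitch (nm) for a single exposure,
    per the Rayleigh criterion CD = k1 * wavelength / NA."""
    return k1 * wavelength_nm / na

# 193 nm ArF immersion lithography (NA ~1.35 with water immersion):
arf = min_half_pitch_nm(193, 1.35)    # ~40 nm -> hence double/quadruple patterning
# 13.5 nm EUV at NA 0.33:
euv = min_half_pitch_nm(13.5, 0.33)   # ~11.5 nm in a single exposure
# high-NA EUV at NA 0.55:
hina = min_half_pitch_nm(13.5, 0.55)  # ~6.9 nm

print(f"ArF: {arf:.1f} nm, EUV: {euv:.1f} nm, high-NA EUV: {hina:.1f} nm")
```

The gap between ~40 nm and ~11.5 nm is why 193nm tools needed multiple patterning passes to keep shrinking while EUV was late, and why high-NA EUV buys another single-exposure generation.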

Even if TSMC ended up the last man standing at the leading edge of logic, there would still be the big DRAM firms, who need to keep shrinking (their "process name" metrics are different, but they work in the same process domain; Samsung has just begun making DRAM using EUV). So there will be companies willing to fund this, because demand for more RAM has never stopped, even during shortages when prices shoot up.

Before the switch to FinFETs, fabs knew for years that it would be needed below 20 nm and planned accordingly. EUV delays have caused some delays/shifts in plans, but EUV has been the known path forward for over a decade now; it was just a matter of getting it ready for volume production. These techniques are in R&D for many years before the general public ever hears of their existence. There are additional things coming down the pipe that will get the industry through the next 5-8 years with a clear path forward (though of course delays may happen). Further out, it gets a lot less clear. I have a pretty good relationship with one of my old university professors who specializes in device physics and process engineering, so I chat with him every once in a while to see what is coming, and I'm just relaying what I hear from him, since he is heavily involved in the research side of process engineering. Again, I'm not saying there aren't things in development that will extend advancements 10+ years out, just that it's not as clear as it has been in the past what will actually allow that to happen.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
731
126
My lord, you don't have a clue.

Have you seen the projected cost of fabs as we go forward?
Do you realize the chip output needed to finance R&D into further node advancement?

Many foresee a future where only the largest players with high utilization will be able to afford the expense.
Have you seen how much revenue and how much net income Intel makes compared to TSMC?
TSMC has to use very expensive cutting-edge nodes, while Intel makes twice the revenue and net income because they use very cheap old nodes and are their own only customer, so they don't care.
You are right in theory with what you say, but the reality is that Intel is making big bucks, so why would they stop making big bucks? What is the motivation in using a more expensive node and losing some of the margin?

 

moinmoin

Diamond Member
Jun 1, 2017
4,965
7,706
136
Before the switch to FinFETs, fabs knew for years that it would be needed below 20 nm and planned accordingly. EUV delays have caused some delays/shifts in plans, but EUV has been the known path forward for over a decade now; it was just a matter of getting it ready for volume production. These techniques are in R&D for many years before the general public ever hears of their existence. There are additional things coming down the pipe that will get the industry through the next 5-8 years with a clear path forward (though of course delays may happen). Further out, it gets a lot less clear. I have a pretty good relationship with one of my old university professors who specializes in device physics and process engineering, so I chat with him every once in a while to see what is coming, and I'm just relaying what I hear from him, since he is heavily involved in the research side of process engineering. Again, I'm not saying there aren't things in development that will extend advancements 10+ years out, just that it's not as clear as it has been in the past what will actually allow that to happen.
There are many possibilities. As costs rise, exploring those possibilities becomes more costly as well (prohibitively so for purely academic research), which keeps the path ahead less clear than in the past. Previously that exploration was often covered by basic research, with foundries picking up the pieces. Now the foundries increasingly need to make it work themselves, which is what caused the repeated delays with EUV (and may also be a factor in Intel's process node cadence falling apart).
 

maddie

Diamond Member
Jul 18, 2010
4,762
4,728
136
Have you seen how much revenue and how much net income Intel makes compared to TSMC?
TSMC has to use very expensive cutting-edge nodes, while Intel makes twice the revenue and net income because they use very cheap old nodes and are their own only customer, so they don't care.
You are right in theory with what you say, but the reality is that Intel is making big bucks, so why would they stop making big bucks? What is the motivation in using a more expensive node and losing some of the margin?

Stunning, is all I'll say.
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
Have you seen how much revenue and how much net income Intel makes compared to TSMC?
TSMC has to use very expensive cutting-edge nodes, while Intel makes twice the revenue and net income because they use very cheap old nodes and are their own only customer, so they don't care.
You are right in theory with what you say, but the reality is that Intel is making big bucks, so why would they stop making big bucks? What is the motivation in using a more expensive node and losing some of the margin?


I'm just going to leave this here..

[Attached image: tsmc-2019-2018-revenue.png]
 

moinmoin

Diamond Member
Jun 1, 2017
4,965
7,706
136
Have you seen how much revenue and how much net income Intel makes compared to TSMC?
TSMC has to use very expensive cutting-edge nodes, while Intel makes twice the revenue and net income because they use very cheap old nodes and are their own only customer, so they don't care.
You are right in theory with what you say, but the reality is that Intel is making big bucks, so why would they stop making big bucks? What is the motivation in using a more expensive node and losing some of the margin?

Thanks, you outline excellently why, at this rate, Intel is destined to become a commodity manufacturer far from the competitive cutting edge.
 

Scarpozzi

Lifer
Jun 13, 2000
26,389
1,778
126
I bet these losses are related to white-collar COVID work-from-home. My assumption is that most mobile employees are working off laptops and offices aren't following their typical update cycles this year. My office had me slated to upgrade to a new i7 desktop back in March, and it never happened; I've been remote, so I didn't push for it. This is a lot of lost revenue, but I'm guessing a lot of orders will eventually come in to replace out-of-warranty systems. It will just equate to lost revenue for 4-6 quarters as business stabilizes and people maybe return to desktop computing.
 

ondma

Platinum Member
Mar 18, 2018
2,725
1,288
136
Have you seen how much revenue and how much net income Intel makes compared to TSMC?
TSMC has to use very expensive cutting-edge nodes, while Intel makes twice the revenue and net income because they use very cheap old nodes and are their own only customer, so they don't care.
You are right in theory with what you say, but the reality is that Intel is making big bucks, so why would they stop making big bucks? What is the motivation in using a more expensive node and losing some of the margin?

The motivation to use a more expensive node is to make a superior (or even competitive) product so you can maintain market share without drastic price cuts. Yes, Intel is making a lot of money right now on 14nm, but it can't continue indefinitely.
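The trade-off described here can be put in rough numbers. All figures below are purely hypothetical, chosen only to illustrate how a cheap node can still lose if an uncompetitive product forces discounts:

```python
def gross_margin(price: float, unit_cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (price - unit_cost) / price

# Old, cheap node: low unit cost, but the inferior product only sells
# at a steep discount to stay competitive (assumed figures).
old_price, old_cost = 180.0, 100.0
# New, expensive node: ~1.5x the unit cost, but the competitive
# product holds full price (assumed figures).
new_price, new_cost = 300.0, 150.0

print(f"old node: {gross_margin(old_price, old_cost):.0%} margin, "
      f"{old_price - old_cost:.0f} profit per unit")
print(f"new node: {gross_margin(new_price, new_cost):.0%} margin, "
      f"{new_price - new_cost:.0f} profit per unit")
```

With these assumed numbers the discounted product on the old node yields a 44% margin and 80 per unit, versus 50% and 150 per unit on the pricier node: the extra wafer cost is more than paid back by avoiding the price cut.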
 

Nereus77

Member
Dec 30, 2016
142
251
136
Intel is screwed as long as they cannot get their nodes working properly. Look at what being stuck on 28nm did to AMD. If they can fix 10nm and get 5nm and 3nm working on time, then maybe they can stabilize their market share at 50%. If not, they will find their market share shrinking even more rapidly than it currently is...
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
The motivation to use a more expensive node is to make a superior (or even competitive) product so you can maintain market share without drastic price cuts. Yes, Intel is making a lot of money right now on 14nm, but it can't continue indefinitely.

Yeah, TheELF seems to think that Intel can just keep falling farther and farther behind both AMD and TSMC and the money will keep falling off trees for them. Intel as we know them is done; you can stick a fork in them. Their big advantage is gone, never to come back.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
Intel is screwed as long as they cannot get their nodes working properly. Look at what being stuck on 28nm did to AMD. If they can fix 10nm and get 5nm and 3nm working on time, then maybe they can stabilize their market share at 50%. If not, they will find their market share shrinking even more rapidly than it currently is...
The momentum is certainly gaining on them and it will eventually be like a dam bursting.
 

Nereus77

Member
Dec 30, 2016
142
251
136
The motivation to use a more expensive node is to make a superior (or even competitive) product so you can maintain market share without drastic price cuts. Yes, Intel is making a lot of money right now on 14nm, but it can't continue indefinitely.
Another 14nm+++++ CPU from Intel is going to look really silly against a 5nm Zen 4, yet it seems that is what is going to happen....
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Another 14nm+++++ CPU from Intel is going to look really silly against a 5nm Zen 4, yet it seems that is what is going to happen....
I must say I'm excited to see what one more Intel 14nm+ iteration can actually do, mainly in gaming. For me, today's desktop CPUs from 200+ EUR already have enough throughput for home/office productivity tasks.
 