Why do AMD/Intel/Nvidia/ATI etc. have to go step by step when shrinking die size?

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
This is not even near my area, but I notice all these companies go 110nm, 90nm, 65nm, and so on... why not just go straight to a lower number? Does it really require hands-on experience with the previous size to understand how to go smaller? Or what?
 

PingSpike

Lifer
Feb 25, 2004
21,758
603
126
I'd imagine they have to change all their tools every time they do a die shrink. It's probably a costly move you don't want to make more often than you need to.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Yeah, but why not jump to 45nm or 10nm or whatever and boost performance by a ton?
 

DBSX

Senior member
Jan 24, 2006
206
0
0
It also probably has to do with the fact that smaller die sizes are not easy to achieve. It's not like they are holding out on smaller die sizes because they want to; it's that they can't do it yet. Each increment represents a step that they could make, and did.

At least that is my understanding.

- Dan
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Originally posted by: Frackal
Yeah, but why not jump to 45nm or 10nm or whatever and boost performance by a ton?

Because the technology isn't developed yet. Each step down in process technology takes years and billions of dollars to develop and perfect.

One of the members on this board is/was a process technician for Intel - I can't remember his name right off the bat, but you should be able to find one of his posts in the Highly Technical archives.
 

Kakumba

Senior member
Mar 13, 2006
610
0
0
Yeah, I think there are a total of 2 members who are working/have worked for Intel, and one or 2 others who are damn knowledgeable about this stuff. As has been said, it's very hard to actually go to a smaller technology.
 

stevty2889

Diamond Member
Dec 13, 2003
7,036
8
81
Basically it's like what the others have said. It takes a long time and a lot of research and development to do the shrinks: you need some new equipment and some equipment upgrades, then you have to develop the process, then you have to make everything work, then you have to get it working well enough to actually get good enough yields to make a sellable product. If they tried to go, say, from 90nm straight down to 45nm, they would have to run 90nm a lot longer, because it's much more difficult to shrink it in half than to go from 90 to 65.
 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
Originally posted by: A5
Originally posted by: Frackal
Yeah, but why not jump to 45nm or 10nm or whatever and boost performance by a ton?

Because the technology isn't developed yet. Each step down in process technology takes years and billions of dollars to develop and perfect.

One of the members on this board is/was a process technician for Intel - I can't remember his name right off the bat, but you should be able to find one of his posts in the Highly Technical archives.

WingZ
 

ND40oz

Golden Member
Jul 31, 2004
1,264
0
86
Originally posted by: PingSpike
I'd imagine they have to change all their tools every time they do a die shrink. It's probably a costly move you don't want to make more often than you need to.

They have to change fabs every time they do a die shrink. Intel builds their fabs around the die and wafer size. When those change, it's off to a new fab - it's cheaper to build a new one than to retool an old one.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Originally posted by: A5
One of the members on this board is/was a process technician for Intel - I can't remember his name right off the bat, but you should be able to find one of his posts in the Highly Technical archives.
Of the people who post regularly, I know of two fab guys: Wingznut (Intel) and Eskimo (not disclosed). And I know there are a couple of circuit designers: pm (Intel) and dmens (Intel). And there are many others who post sporadically.

Originally posted by: Polish3d
This is not even near my area, but I notice all these companies go 110nm, 90nm, 65nm, and so on... why not just go straight to a lower number? Does it really require hands-on experience with the previous size to understand how to go smaller? Or what?

The answer to the OP's question is that fab technology is amazingly complex, takes a long time to develop, is super expensive, and always looks like it's two or three generations away from "impossible". The reason for the jumps - 180nm, 130nm, 90nm, 65nm - is that each one is a target: an achievable target that can be worked toward, because it's enough of an incremental improvement over the previous one to be realized, and it also results in a doubling of transistors per generation - fulfilling the self-fulfilling prophecy of Moore's Law. It may look like a seamless, easy jump from generation to generation, but in reality there always seem to be seemingly insurmountable roadblocks threatening to completely halt the progression of silicon CMOS process technology. I could give several examples of ones that I have seen over the years if you are interested.
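
To put rough numbers on that "doubling of transistors per generation" bit, here's a quick back-of-the-envelope sketch (my own rule-of-thumb illustration, not any company's actual numbers): each node is about 0.7x the previous one in linear dimensions, so the area per transistor roughly halves each generation.

```python
# Back-of-the-envelope sketch of process-node scaling (rule of thumb only):
# each node shrinks linear dimensions by ~0.7x, so area per transistor
# roughly halves and transistor density roughly doubles per generation.
nodes_nm = [180, 130, 90, 65, 45]

for prev, cur in zip(nodes_nm, nodes_nm[1:]):
    linear = cur / prev      # linear shrink factor (~0.7)
    area = linear ** 2       # area shrink factor (~0.5)
    print(f"{prev}nm -> {cur}nm: linear x{linear:.2f}, "
          f"area x{area:.2f}, density x{1 / area:.1f}")
```

Run it and every step comes out within a few percent of a 2x density gain - which is exactly why those particular numbers get picked as targets.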

One example: the latest big "crisis" was the emerging problems with "deep-UV" lithography a few years ago. Current lithography is being done with 193nm lasers. Think about that for a minute: we are making transistors for a 90nm process using lasers with a wavelength more than twice that size. This would seem to be impossible at first glance. Trying to do this, you'd think you'd end up with a really blurry image caused by diffraction: instead of getting a square shape, you'd get circles. But someone came up with an idea called OPC (optical proximity correction), which basically adds "Mickey Mouse ears" to the corners of the square so that instead of getting a circle, you get something that mostly resembles a square. It's a neat trick and it made 180nm and 130nm possible. Around 90nm, the trick was starting to look like it was running out of steam. It might make it to 65nm, but 45nm with just OPC was not going to happen. And each generation with OPC was getting trickier to implement, with software runs taking longer and longer to calculate and the results becoming less and less ideal. So in parallel, lots of work was happening on "deep-UV" - reducing the wavelength of the laser light used in lithography. But that ran into a seemingly endless series of obstacles with materials and quality, as well as cost. Other proposals were floated, but they all seemed unworkable. And then, about 3 years ago, someone floated the idea of "immersion" lithography - basically covering the surface of this ridiculously pure silicon wafer with water. And pretty much the initial response of everyone I knew was "well, that will never work". And then the obstacles were knocked down one by one, and now the plan is for immersion lithography at 45nm, and it seems mostly workable. There are still plenty of issues to resolve, but a fundamental obstacle to future progression was side-stepped. Again. The list of things like this is extensive. For more details on the whole OPC and laser issue, check this out: link to EETimes
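
And if you want the one-formula version of why immersion helps: the usual Rayleigh criterion says the minimum printable half-pitch is R = k1 * wavelength / NA, and putting water between the lens and the wafer raises the effective numerical aperture by roughly the refractive index of water (~1.44 at 193nm). Here's a minimal sketch - the NA and k1 values below are my own ballpark assumptions, not actual tool specs:

```python
# Rayleigh criterion sketch: minimum half-pitch R = k1 * wavelength / NA.
# Assumed values (ballpark, not actual tool specs): 193nm ArF light,
# k1 = 0.35 with aggressive OPC/phase-shifting, dry NA of 0.93 vs. an
# effective NA of 1.35 with water immersion.
WAVELENGTH_NM = 193.0
K1 = 0.35

for label, na in [("dry scanner, NA 0.93", 0.93),
                  ("water immersion, NA 1.35", 1.35)]:
    half_pitch_nm = K1 * WAVELENGTH_NM / na
    print(f"{label}: minimum half-pitch ~ {half_pitch_nm:.0f} nm")
```

With those assumptions, the same 193nm light that bottoms out around 70nm half-pitch "dry" reaches about 50nm with immersion - enough to buy another generation or two.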

Gordon Moore (a founder of Intel, famous for "Moore's Law") gave an interview a few years back (link here) where he said, "I remember thinking 1 micron (a milestone the industry blew past in 1986) was as far as we could go" (because of the wavelength of visible light used at the time). The industry switched to ultraviolet light and moved on. Then the limit was going to be 0.25um... and that fell too. And then 45nm... and that looks OK now too.


There's an organization called the ITRS (International Technology Roadmap for Semiconductors) that tracks the current status of all of this development. They have a massive document called, unimaginatively enough, the International Technology Roadmap for Semiconductors. It's here: http://public.itrs.net/ If you look at the 2005 Update, click lithography, and look at page 10, notice the amount of red in that picture at 2007 - which, to state the obvious, is next year - and red indicates "Manufacturable solutions are NOT known." If you look through the rest of the document, even if you don't understand a word of it, look at the number of red boxes: the number of things that no one knows how we are going to do. Strangely enough, if you looked at the lithography section of the 1995 ITRS edition, you'd see something similar. And yet all of those issues were resolved to get us where we are today.

Another quote from Gordon Moore on how hard it has been to get where we are, and how hard it is to predict the future (link). He's responding to a question about whether the end of Moore's Law is in the near future: "No. I figure we've got another 10 or 15 years or so to keep doing what we've done in the past. Roughly. A generation in our industry is about every three years. So 10 years is three generations. Really, that's as far ahead as I've ever been able to see. Maybe we'll come up with some ideas that will let us go beyond that." The bit that I think is interesting in this is "really, that's as far ahead as I've ever been able to see." Gordon Moore, founder and former CEO of Intel, throughout his career was only ever able to see how things would work out about 10 years at a time.

One last comment I'd like to make: I remember watching the second Terminator movie, and there's this point where they break into a company (ironically named something akin to Cyrix) to steal the chip that the researchers were using to develop future microprocessors, in order to stop progress. And I remember thinking, "Well, if only it were that easy - if, rather than having to think through every step carefully and try 10 different ideas before finding one that works, we could just look at the answers."


Patrick Mahoney
Microprocessor Circuit Design Engineer
Intel Corp.
Fort Collins, CO
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
OK, since I spent a good deal of time writing and researching the above post and no one commented at all - leaving me to wonder if anyone even read it - I've decided to take the unprecedented step (for me, anyway) of bumping it. :)
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Just as a side note, Intel's 45nm won't use immersion lithography. It's interesting that an Intel guy was talking about 45nm and immersion lithography, and yet that's not what Intel will use :p
 

slpaulson

Diamond Member
Jun 5, 2000
4,414
14
81
As far as Nvidia and ATI go, I would think that yields are too low on the latest and greatest technology. Also, Nvidia and, I believe, ATI don't have their own fabs, so they are counting on other fabs to have the technology ready. Intel has traditionally been ahead in manufacturing technology, which is another reason why video cards aren't on 65nm yet.

Plus, they have to look ahead at what technology will be available when they are doing their design, so it is probably safer to design for incrementally smaller technologies than to hope something will be ready. It seems to me Nvidia got bitten in the ass with their original TNT video card because the technology they were counting on wasn't ready yet. They probably didn't want that to happen again.

Also, each new technology brings up new issues that need to be resolved, and it's hard to make such a huge leap.
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Originally posted by: pm
OK, since I spent a good deal of time writing and researching the above post and no one commented at all - leaving me to wonder if anyone even read it - I've decided to take the unprecedented step (for me, anyway) of bumping it. :)
I read it - pretty much ends the thread though ;)
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Originally posted by: BrownTown
Just as a side note, Intel's 45nm won't use immersion lithography. It's interesting that an Intel guy was talking about 45nm and immersion lithography, and yet that's not what Intel will use :p

That "Intel guy" would be me? :) I wasn't aware that we aren't using immersion for 45nm. To be honest, I'm not entirely sure what we have planned for 45nm - I've been working in test and debug lately. The last project I did circuit design on (and thus had a "need to know" about the process technology) was 90nm, and it ended two years ago. So I admit that I'm a bit out of the loop. Where did you read the details on what Intel is planning for 45nm? Since I'm not likely to be told in-house what is going on (I don't need to know), I'm always up for reading about what we are doing on an external site.

Still, my points about immersion litho, OPC, and lasers are all fundamentally correct - even if the timeline for execution may not be exact. Do you agree?
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
Originally posted by: pm

One example: the latest big "crisis" was the emerging problems with "deep-UV" lithography a few years ago. [...] And then the obstacles were knocked down one by one, and now the plan is for immersion lithography at 45nm, and it seems mostly workable. [snipped for length - see pm's full post above]

This part in particular was fascinating - thank you very much for the insight :thumbsup:
 

Henny

Senior member
Nov 22, 2001
674
0
0
It's also incredibly expensive. You need to make the "new" technology production-worthy - as measured by cost-effective output in the hundreds of millions of units - and at the same time you need to continue using the "old" technology until it's no longer viable, since it has already undergone a large chunk of depreciation.
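
To make the depreciation point concrete, here's a toy calculation - every number in it is hypothetical, picked purely for illustration:

```python
# Toy fab-economics sketch - every number here is hypothetical,
# chosen only to illustrate why depreciated "old" fabs stay in use.
fab_cost_usd = 3e9        # assumed cost to build and equip a new fab
depreciation_years = 4    # assumed straight-line depreciation window
units_per_year = 100e6    # assumed good chips shipped per year

burden = fab_cost_usd / (depreciation_years * units_per_year)
print(f"Depreciation burden while new: ~${burden:.2f} per chip")
# Once the fab is fully written off, that per-chip burden drops toward
# zero, so running the "old" process stays profitable for years.
```

The point being that a fully depreciated "old" fab has already paid for itself, so it keeps making cheap chips while the "new" one is still carrying billions in capital cost.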

I think Gordon Moore claimed that Intel's switch to 300 mm wafer technology was the most expensive re-tooling effort in the history of worldwide manufacturing.
 

RebateMonger

Elite Member
Dec 24, 2005
11,586
0
0
Originally posted by: Kakumba
Yeah, I think there are a total of 2 members who are working/have worked for Intel, and one or 2 others who are damn knowledgeable about this stuff. As has been said, it's very hard to actually go to a smaller technology.
I was with Motorola for about 20 years.

As stated above, reducing the feature size is a step-by-step process. Each step requires improvements in wafers, imaging, developing, etching, diffusion, implant, design, and other technologies. And lots of time and money. Some decreases in size require that completely new technologies be developed.
 

JonnyBlaze

Diamond Member
May 24, 2001
3,114
1
0
Originally posted by: pm
OK, since I spent a good deal of time writing and researching the above post and no one commented at all - leaving me to wonder if anyone even read it - I've decided to take the unprecedented step (for me, anyway) of bumping it. :)

I read it and I'm re-reading it now to see if I can understand it. ;)


Good post. :thumbsup:
 

T9D

Diamond Member
Dec 1, 2001
5,320
6
0
Well, they are working on those smaller sizes; it's just that the larger size will be done first. If by some miracle they managed to finish the smaller one before they finished the larger one, I'm sure they would put it out and produce it. So it's not like they wouldn't want to.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Originally posted by: BrownTown
Just as a side note, Intel's 45nm won't use immersion lithography. It's interesting that an Intel guy was talking about 45nm and immersion lithography, and yet that's not what Intel will use :p

I found the articles about Intel's intent not to pursue immersion for 45nm but to rely on phase-shifting and OPC.
http://www.eetimes.com/news/semi/showArticle.jhtml?articleID=177103820
Instead, Intel (Santa Clara, Calif.) plans to extend its existing and conventional 193-nm wavelength "dry" scanners for use in processing the critical layers at the 45-nm node, said Mark Bohr, senior fellow and director of process architecture and integration at the microprocessor giant. "That is the plan for our standard technology," Bohr said. "Immersion is still an option that we're looking at for 32-nm," he added.

Thanks for the clarification, BrownTown. It's very interesting. As I said, I've been immersed in all things "high-volume test" for the last two years, so I'm out of the loop on process issues.