AMD can't fab at 65nm but can at 55?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
It just occurred to me that AMD has been having a ton of problems with the whole 65nm transition, resulting in lower speeds and yields... the older Windsor processors are still the fastest thing AMD has to offer, meaning their 65nm process produces SLOWER processors due to manufacturing issues...

On the other hand AMD has described their transition with video card cores to 55nm as perfect. They expected a bunch of problems and not a single one of them surfaced... it just worked the way they wanted it to right away...

Which makes me wonder: were they building a 55nm fab and a 65nm fab at the same time? Not even Intel is that aggressive, and I thought AMD was losing money...

And why don't they manufacture the CPUs on the flawless 55nm process rather than the troubled 65nm one? (Especially considering, and I am just assuming here, that a CPU has a higher margin than a GPU... since most of the money pays for the core and not for things like the RAM, board, cooler, etc.)

Now I would assume there is more to this story otherwise they would be doing it... can anyone enlighten me as to what it is?
 

DRavisher

Senior member
Aug 3, 2005
202
0
0
The GPUs are designed by AMD/ATI, but they are produced by a third party, probably TSMC. ATI didn't have any fabs of their own, and neither does Nvidia.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Aye, the 55nm GPUs are made by TSMC. Why they don't contract TSMC to fab 55nm CPUs, I am uncertain, but someone with more technical knowledge than I will likely explain it.
 

jones377

Senior member
May 2, 2004
461
64
91
Originally posted by: Bateluer
Aye, the 55nm GPUs are made by TSMC. Why they don't contract TSMC to fab 55nm CPUs, I am uncertain, but someone with more technical knowledge than I will likely explain it.

Lots of reasons, one of the fundamental ones being that TSMC doesn't use SOI.

 

Mana

Member
Jul 3, 2007
109
0
0
Originally posted by: jones377
Originally posted by: Bateluer
Aye, the 55nm GPUs are made by TSMC. Why they don't contract TSMC to fab 55nm CPUs, I am uncertain, but someone with more technical knowledge than I will likely explain it.

Lots of reasons, one of the fundamental ones being that TSMC doesn't use SOI.

Another being that one of the terms of AMD's x86 license is that they must produce most of their CPUs themselves.
 

MetaDFF

Member
Mar 2, 2007
145
0
76
Originally posted by: Mana
Another being that one of the terms of AMD's x86 license is that they must produce most of their CPUs themselves.

Yep, that is correct; the cross-license says that no more than 20% of AMD's microprocessors may be manufactured by a third party.

For a while I thought AMD was contracting some CPU manufacturing to UMC. Whatever happened to that?
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
GPUs are only running at roughly 700MHz to 1GHz. With graphics they could change the code to run on massively pipelined processors with simple ring buffers. CPUs, on the other hand, have to deal with stale x86 linear instructions because we are stuck with Intel and Microsoft. Give them credit for the last few years of moving toward multi-threaded apps, with the latest .NET apps moving closer to a parallel architecture. Give AMD a lot of credit for pushing multi-cores into the mainstream, and NVidia for the stream processors in video cards and one of the game boxes for Cell processing.

There is a lot more to do today than pointing fingers at what were until recently the most advanced consumer processor manufacturers. One final note on CPUs is the massive amount of current rushing through wires smaller than ever, yet still achieving 2GHz. AMD designed all this to run at 2.0GHz, which was just becoming common when the XP line was moving up and the Northwood hit. It took over a year to reach 3GHz on an Athlon, and longer before a commercial version was out.

Intel will use its massive number of fab lines to produce millions of chips to get enough EX chips over a short period of time. Ever wonder why the market is still flooded with old chips at $25? High bins come at higher prices, but if you can afford it, you can produce more high-dollar chips. AMD is now trying to do this on a lean scale and getting pretty good at it. Look at Brisbanes now retailing at 2.7GHz for $120, and less in euros. The pressure is so high with demand for better stuff that we forget that we just got here. It's too expensive now to shift CPUs to 55nm with so much already set up to go 45nm. If someone sees a progress report, please post a link.
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Originally posted by: AlabamaCajun
CPUs, on the other hand, have to deal with stale x86 linear instructions because we are stuck with Intel and Microsoft.

This is interesting revisionist history. Intel actually tried to move us away from x86 with the 64-bit EPIC-based Itanium chips, and MS built Windows for that instruction set, but AMD insisted on extending x86 to 64 bits and thus gave us at least another decade of "stale x86" chips. I personally don't have a problem with that, because x86 is working fine right now, but it's incorrect to "blame" Intel and MS for the continued existence of the x86 instruction set.
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
Originally posted by: SexyK
Originally posted by: AlabamaCajun
CPUs, on the other hand, have to deal with stale x86 linear instructions because we are stuck with Intel and Microsoft.

This is interesting revisionist history. Intel actually tried to move us away from x86 with the 64-bit EPIC-based Itanium chips, and MS built Windows for that instruction set, but AMD insisted on extending x86 to 64 bits and thus gave us at least another decade of "stale x86" chips. I personally don't have a problem with that, because x86 is working fine right now, but it's incorrect to "blame" Intel and MS for the continued existence of the x86 instruction set.

OK, that's fair. Add AMD to the mix for introducing 64-bit x86. I just thought the Itaniums were 32-bit extended x86, my bad.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I thought one of the points of AMD buying ATI was that they could produce the GPUs in their own fabs.... I guess I was wrong.

And SexyK is right. Itanium was a completely redesigned chip. So yeah, the only reason we DIDN'T move away from x86 is that AMD made their AMD64 platform... and then Intel had to retool and conform to it.
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: SexyK
This is interesting revisionist history. Intel actually tried to move us away from x86 with the 64-bit EPIC-based Itanium chips, and MS built Windows for that instruction set, but AMD insisted on extending x86 to 64 bits and thus gave us at least another decade of "stale x86" chips. I personally don't have a problem with that, because x86 is working fine right now, but it's incorrect to "blame" Intel and MS for the continued existence of the x86 instruction set.

Originally posted by: taltamir
And SexyK is right. Itanium was a completely redesigned chip. So yeah, the only reason we DIDN'T move away from x86 is that AMD made their AMD64 platform... and then Intel had to retool and conform to it.

AMD64 was the most viable solution to all the issues IA-64 presented outside the big-iron segment. Of course Microsoft was more than willing to pick up this new ball and play with it instead of putting all their money on a horse with a complicated and uncertain future. It's more than likely that we would still be playing the 32-bit fiddle even without AMD64 (hell, the majority still is, despite AMD64). IA-64 and its development was just not attractive enough for the consumer space to begin with.

As far as bullying the market into a completely new architecture is concerned, Intel failed. And that's probably not such a bad thing (do you really think they would have licensed IA-64 to AMD right away? We'd be stuck with just Intel).

 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
As people have pointed out, AMD isn't manufacturing the GPUs in-house.

There are some significant differences between the small-scale structures used in GPUs vs CPUs - the transistors are a little different, the gates are designed differently, and the metal layers are also different. Unfortunately, I can't write a very good post about the differences because there's not enough publicly-available information. There are also significant differences in the design methodologies used that make it much quicker to port a GPU between manufacturing processes than a CPU (which is part of the reason you don't usually see x86 CPUs at half nodes like 110nm, 80nm, and 55nm).
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Originally posted by: MetaDFF
Originally posted by: Mana
Another being that one of the terms of AMD's x86 license is that they must produce most of their CPUs themselves.

Yep, that is correct; the cross-license says that no more than 20% of AMD's microprocessors may be manufactured by a third party.

For a while I thought AMD was contracting some CPU manufacturing to UMC. Whatever happened to that?

Interesting, didn't know that. I do recall them outsourcing some production to UMC or IBM, but don't recall hearing anything else.
 

justly

Banned
Jul 25, 2003
493
0
0
As others have already mentioned GPUs are not manufactured in AMD fabs.

The only thing left to discuss is your assumption that AMD's 65nm process has manufacturing issues.

If you really want me to believe that, then you should have no problem explaining why:
1. AMD would focus 65nm on what is probably their largest-volume selling parts instead of focusing it on their highest-clocked (and probably lowest-volume) niche parts.
2. AMD would stop 90nm production.
3. so many overclockers favored the 65nm parts compared to the equally clocked 90nm parts.
4. AMD would even attempt to build a monolithic quad on a process that has issues.

What did you expect AMD to do with a fully functional 90nm fab: build low-end parts that they would end up with no profit margin on, or scrap the fab completely and cut their own production capacity for no good reason?

I will acknowledge that fab ramping and the learning curve associated with any new process can take some time, but claiming that their 65nm process has issues just because they chose to maximize production capacity and profit margins the way they did makes me wonder if you are seeing the whole picture.

BTW the outsourcing was to Chartered Semiconductor.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: taltamir
I thought one of the points of AMD buying ATI was that they could produce the GPUs in their own fabs.... I guess I was wrong.

And SexyK is right. Itanium was a completely redesigned chip. So yeah, the only reason we DIDN'T move away from x86 is that AMD made their AMD64 platform... and then Intel had to retool and conform to it.

Or how about: Itanium wasn't the success that Intel thought it would be?

To be successful, it had to emulate the multitude of x86 code out there so people could "switch over" at their leisure... however, we know that the original Itaniums sucked badly at emulating x86 code.

Successfully changing to another entirely different instruction set takes a lot more than just a new CPU and Operating System (unless you just want to sit there and play Solitaire..?)

It also takes all the applications that you use on a daily basis.. and that just wasn't going to happen..

So AMD saw an opportunity (that I guess Intel didn't?) and developed the AMD64 instruction set... and I do remember Intel saying it wasn't going to go anywhere and that their Itanium was the way to go...

And I also clearly remember that an Intel rep said at one point that the market wasn't ready for a 64bit x86 processor (as if AMD was wasting their time)... then about 3-4 months later they announced their own competing CPU.. (wow the market REALLY changed in 3-4 months didn't it!)..
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Saying AMD "can't" fab at 65nm is a misnomer. I have heard numerous reports of the more current A64 5000+ and 3600+ chips overclocking to 3.4ghz or so. That is faster than many C2Ds can reach (and I have heard of several cases where a C2D couldn't get north of 2.7ghz).

They're having problems with Phenom at 65nm, but really that has less to do with the die shrink. IMO the bulk of their problems stems from the complexity of the chip, mainly from the fact that what they are producing has never been done before (a single-die quad-core with an IMC and SSE4+).
 

AlabamaCajun

Member
Mar 11, 2005
126
0
0
Originally posted by: taltamir
I thought one of the points of AMD buying ATI was that they could produce the GPUs in their own fabs.... I guess I was wrong.

And SexyK is right. Itanium was a completely redesigned chip. So yeah, the only reason we DIDN'T move away from x86 is that AMD made their AMD64 platform... and then Intel had to retool and conform to it.

Itanium was a failure, so you are saying x86 is the best instruction set to run on :Q
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: justly
As others have already mentioned GPUs are not manufactured in AMD fabs.

The only thing left to discuss is your assumption that AMDs 65nm process has manufacturing issues.

If you really want me to believe that then you should have no problem explaining why
1. AMD would focus 65nm on what is probably their largest-volume selling parts instead of focusing it on their highest-clocked (and probably lowest-volume) niche parts.
2. AMD would stop 90nm production.
3. so many overclockers favored the 65nm parts compared to the equally clocked 90nm parts.
4. AMD would even attempt to build a monolithic quad on a process that has issues.

What did you expect AMD to do with a fully functional 90nm fab: build low-end parts that they would end up with no profit margin on, or scrap the fab completely and cut their own production capacity for no good reason?

I will acknowledge that fab ramping and the learning curve associated with any new process can take some time, but claiming that their 65nm process has issues just because they chose to maximize production capacity and profit margins the way they did makes me wonder if you are seeing the whole picture.

BTW the outsourcing was to Chartered Semiconductor.

AMD, like Intel, owns multiple plants. One plant is upgraded to the newest process while the older ones produce older-process parts, continuing to pay off the cost of their last upgrade. The upgrades are staggered between the plants to decrease costs and maximize revenue.

It is normally a 3-plant cycle. Plant A is upgraded to the newest process (v2) while B and C stay on the old process (v1). X months later, plant B is upgraded to the same process as plant A (v2). X months later, C is upgraded to a newer process (v3) than A and B. X months later, A is upgraded to the same process as C (v3). And so on...
This gives each plant a maximal run with its existing equipment, making the most money off the machines purchased for it.
Early in a process's life they manufacture on it in one plant while two plants make the older parts. Later, as prices go down and demand for the newest parts increases, two plants make the new process while one makes the old. Finally, that last old-process plant is upgraded to a newer process than all of them, allowing for new high-end parts.
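To illustrate that rotation, here is a toy sketch in Python. It is purely illustrative: the plant names, process nodes, and upgrade timing are all made up, not AMD's actual schedule.

```python
# Toy model of the staggered fab-upgrade rotation described above.
# Plant names, process nodes, and the upgrade interval are invented.

nodes = ["130nm", "90nm", "65nm", "45nm"]            # hypothetical roadmap
plants = {"A": "130nm", "B": "130nm", "C": "130nm"}
rotation = ["A", "B", "C"]

# Every period ("x months") one plant is converted to whatever the
# leading-edge node is at that time; a new node arrives every two
# periods, so the plant that waited longest leapfrogs to the newest node.
for period in range(6):
    newest = nodes[1 + period // 2]    # leading-edge node this period
    plant = rotation[period % 3]       # next plant in the rotation
    plants[plant] = newest
    print(f"period {period}: plant {plant} -> {newest}   state: {plants}")
```

The point is just that at any moment only one plant is bleeding-edge while the others keep earning back the cost of their older equipment.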

AMD has serious problems with the 65nm process. It is the first time ever that a new, smaller process cannot be used to make faster processors than the older process; instead it makes slower ones. This is why AMD is in so much trouble. They should be subsidizing the cost of the new upgrade by selling the parts at $300-1000 per CPU, with prices slowly decreasing so that by the time the second plant is upgraded to 65nm they would be selling in the $150+ range, with $100 and below reserved for the oldest 90nm parts (in about 10 months from today). The problem is that the 65nm parts are SLOWER than the 90nm parts...


So to answer the specific points:
1. AMD focuses on 65nm because it is their newest, smallest process; they NEED it to work in order to survive.
2. AMD will stop 90nm production when the last 90nm plant is upgraded... the reason they will upgrade to a problematic process is that, despite being problematic, it is still their future, and it allows for more processors per wafer, greatly decreasing their cost per CPU (rough numbers are sketched below).
3. It is cheaper, it makes less heat, and it takes less power... all very good reasons to use it for overclocking rather than taking a maxed-out 90nm part and trying to OC it. Oh yeah, and because the Black Edition 5000+ has an unlocked multiplier but is NOT maxed out (unlike the 90nm Black Edition part, which sells for much more and comes with a 3.2GHz default clock). The problematic part is that it achieves max speeds lower than the previous process, which is unheard of; new processes always achieve MUCH faster max speeds.
4. It is impossible for them to build it on anything bigger than 65nm... even at 65nm it is too big. Besides, asking why they would try making a quad core on a problematic process is backwards: they didn't KNOW the process would be problematic when they started designing it years ago. They had to release quads with a max speed of 2.4GHz instead of 3GHz+ like they expected BECAUSE the process is problematic.
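To put rough numbers on point 2, here is a back-of-the-envelope sketch. The 90nm die size is just a placeholder, not an actual AMD figure, and a real shrink is never a perfect optical scaling.

```python
import math

# Idealized dies-per-wafer estimate: ignores defects, edge loss, and the
# fact that 65nm parts are not perfect optical shrinks of 90nm parts.
wafer_area = math.pi * (300.0 / 2) ** 2             # 300mm wafer, in mm^2

die_area_90nm = 180.0                               # hypothetical dual-core die, mm^2
die_area_65nm = die_area_90nm * (65.0 / 90.0) ** 2  # ideal linear shrink, squared

for label, area in [("90nm", die_area_90nm), ("65nm", die_area_65nm)]:
    print(f"{label}: ~{area:.0f} mm^2/die, ~{wafer_area / area:.0f} dies per wafer")

# (65/90)^2 is about 0.52, so an ideal shrink roughly doubles the candidate
# dies per wafer -- that is where the cost-per-CPU advantage comes from.
```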


Originally posted by: AlabamaCajun
Originally posted by: taltamir
I thought one of the points of AMD buying ATI was that they could produce the GPUs in their own fabs.... I guess I was wrong.

And SexyK is right. Itanium was a completely redesigned chip. So yeah, the only reason we DIDN'T move away from x86 is that AMD made their AMD64 platform... and then Intel had to retool and conform to it.

Itanium was a failure, so you are saying x86 is the best instruction set to run on :Q

That's a pretty big leap from what I said, isn't it?
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: taltamir
It just occurred to me that AMD has been having a ton of problems with the whole 65nm transition, resulting in lower speeds and yields... the older Windsor processors are still the fastest thing AMD has to offer, meaning their 65nm process produces SLOWER processors due to manufacturing issues...

On the other hand AMD has described their transition with video card cores to 55nm as perfect. They expected a bunch of problems and not a single one of them surfaced... it just worked the way they wanted it to right away...

Which makes me wonder: were they building a 55nm fab and a 65nm fab at the same time? Not even Intel is that aggressive, and I thought AMD was losing money...

And why don't they manufacture the CPUs on the flawless 55nm process rather than the troubled 65nm one? (Especially considering, and I am just assuming here, that a CPU has a higher margin than a GPU... since most of the money pays for the core and not for things like the RAM, board, cooler, etc.)

Now I would assume there is more to this story otherwise they would be doing it... can anyone enlighten me as to what it is?

No, no, no......!

1. AMD doesn't have any problem with their 65nm process, the problem is with their 65nm chip design.
2. AMD's 65nm yields are excellent...better than their 90nm yields. Again, the problem is one of chip design.

As has been determined in previous threads, there is no longer any such thing as an optical shrink. This means that they can't just take a design and shrink it, they have to redesign the chip for each shrink...
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: justly

If you really want me to believe that then you should have no problem explaining why
1. AMD would focus 65nm on what is probably their largest-volume selling parts instead of focusing it on their highest-clocked (and probably lowest-volume) niche parts.
2. AMD would stop 90nm production.
3. so many overclockers favored the 65nm parts compared to the equally clocked 90nm parts.
4. AMD would even attempt to build a monolithic quad on a process that has issues.


They can obviously produce working chips at a given clockspeed with 65nm, thus it behooves them to get their value and mainstream chips on that process for cost reasons.

I got schooled on point 2 a few weeks ago in another thread. AMD's highest X2 models are still 90nm. This includes the X2 6000+, 6400+, and FX-74 models. I don't see how they can not be having problems with 65nm if they had to go back to 90nm for new high-end SKUs over a year after they introduced 65nm chips.

I think 4 is fairly self-explanatory given Phenom's current clock-scaling issues.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: aka1nas
Originally posted by: justly

If you really want me to believe that then you should have no problem explaining why
1. AMD would focus 65nm on what is probably their largest-volume selling parts instead of focusing it on their highest-clocked (and probably lowest-volume) niche parts.
2. AMD would stop 90nm production.
3. so many overclockers favored the 65nm parts compared to the equally clocked 90nm parts.
4. AMD would even attempt to build a monolithic quad on a process that has issues.


They can obviously produce working chips at a given clockspeed with 65nm, thus it behooves them to get their value and mainstream chips on that process for cost reasons.

I got schooled on point 2 a few weeks ago in another thread. AMD's highest X2 models are still 90nm. This includes the X2 6000+, 6400+, and FX-74 models. I don't see how they can not be having problems with 65nm if they had to go back to 90nm for new high-end SKUs over a year after they introduced 65nm chips.

I think 4 is fairly self-explanatory given Phenom's current clock-scaling issues.

On point 2, the 90nm chips are now inventory only. AMD stopped producing 90nm chips at the end of Sept when they shut down Fab 30 for conversion to Fab 38. Fab 36 is 65nm only...
While they still have the option to make 90nm chips out of Chartered, they only do that as overflow (if you read Chartered's most recent reports, you'll see that AMD is making very, very few chips there...).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well, the margins on 90nm chips should be lower due to the increased die size... what is interesting is that if they don't fix their problems pronto, we will run out of availability of the truly fast AMD chips, like the 6000+, 6400+, and FX-74. But then, those chips aren't really worth buying since they are so expensive compared to a similarly performing C2D (due to the fact that they are the cherry-picked top of the top of the 90nm process)...
It is a shame too because I want a CPU upgrade... I just bought a new AM2 nForce 5 based motherboard (for the 6 SATA ports) 3 weeks ago or so to consolidate my computers (I am now running a Raptor OS drive and 2 RAID arrays all in one computer, and am going to take the others apart to sell on eBay). But now I wish I had gone with a C2D upgrade instead, because I am either stuck with my 2GHz X2 3800+ or with throwing money at a tiny upgrade.

Very interesting point about the fact that optical shrinks don't work anymore, that they HAVE to redesign chips and that is what is giving them the trouble... and that they are actually getting better yields on 65nm than on 90nm.

That makes me wonder though, what are the steppings then? I always thought a higher stepping was the exact same design but with slight improvements to the manufacturing process, resulting in higher quality processors that can clock higher. But you are saying that basically steppings are differences in the actual chip design meant to fix problems?
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: taltamir


That makes me wonder though, what are the steppings then? I always thought a higher stepping was the exact same design but with slight improvements to the manufacturing process, resulting in higher quality processors that can clock higher. But you are saying that basically steppings are differences in the actual chip design meant to fix problems?

Essentially, yes...
For instance, the main difference in the B3 stepping is fixing the TLB errata...though there may be many other fixes as well.
Many people think that a new stepping can only achieve a minor improvement on a CPU, and that is proportionally correct. Most errata fixes don't represent much in the way of substantial changes because the previous release was pretty good already.
However in certain cases (such as this one, where the chip was rushed to market), the errata fix can represent a very large boost in both performance and scaling.

Another good example of this would be the AMD T-bred chips... T-bred "A" chips came out and they ran very hot and couldn't clock over 1.8GHz. A few months later, AMD released a new stepping called T-bred "B", and it ran FAR cooler and scaled to 2.2GHz.
The fix made with the new stepping ended up being the addition of an extra layer to the CPU's design.

As to changes in the manufacturing process, this goes on daily...
It's interesting that Intel and AMD use different methodologies for adjusting their manufacturing. They are both different, both excellent, and both suited to the personalities of the individual companies.

Intel uses a method called "Copy Exactly". This means that they continually research how the process is going at a small facility (or a small area of a large facility). When they discover a means to improve their manufacturing (say, increasing the doping in a particular area), they copy the new method to all of their fabs. This means that all of their lines everywhere must be set up identically so that new methodologies can be copied in.

AMD uses a more automated process called APM. AMD's fabs are far more automated than anyone else's... even moving the wafers around is done completely via an automated, sealed carrier system. The software/hardware system that controls this is called APM (Automated Precision Manufacturing), and it is both extremely flexible and granular...
It can adjust the doping on an individual transistor of a single die on a specific wafer.
When dice are tested, the data is automatically correlated to every step of that die's manufacture (which wafer, where it came from, etc.), so the system has instant feedback on how manufacturing affects end results... it also automatically updates current manufacturing instantly, including wafers that are already in progress.

Copy Exactly suits Intel to a tee because they are very large scale, and it's the most cost-effective way to proceed.
APM suits AMD because they have fewer Fabs and they have to be much more efficient and quick in order to compete. They also don't have the employee numbers that Intel does...
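Just to make the closed-loop idea concrete, here is a toy sketch of that kind of feedback system. This is NOT AMD's actual APM software or any real API; every name, number, and the single "doping" knob are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class DieRecord:
    wafer_id: str
    die_xy: tuple            # position of the die on the wafer
    doping_level: float      # process setting used for this die (arbitrary units)
    measured_fmax_ghz: float

@dataclass
class ProcessController:
    target_fmax_ghz: float = 2.6
    doping_level: float = 1.00
    gain: float = 0.05       # how aggressively the recipe is corrected
    history: list = field(default_factory=list)

    def ingest(self, record: DieRecord) -> None:
        """Correlate one test result with its process history and nudge
        the recipe for wafers still in progress (simple proportional control)."""
        self.history.append(record)
        error = self.target_fmax_ghz - record.measured_fmax_ghz
        self.doping_level += self.gain * error

ctl = ProcessController()
for fmax in (2.2, 2.3, 2.5, 2.6, 2.7):               # made-up test results
    ctl.ingest(DieRecord("W001", (3, 7), ctl.doping_level, fmax))
    print(f"measured {fmax} GHz -> next doping setting {ctl.doping_level:.3f}")
```

The real system is obviously far more granular (per transistor, per wafer, per process step), but the ingest-correlate-adjust loop is what separates it from a copy-exactly approach.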
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
So steppings are also arbitrarily decided by the companies... they could make a slight modification and call it a new chip. Or they could make huge modifications and call it a stepping...
That is EXTREMELY promising for Phenom... because the chip is obviously not ready, they could fix some problems, make it ready, and get HUGE improvements in a "stepping". Rather than having a chip that already works as well as its design allows... which would leave only room for slight optimization, so a stepping would give very little and you would have to wait for a completely new design.

Kind of like Nvidia is now selling G92-based video cards that are completely and absolutely different from G80 yet are still called 8800 (GT).

also, what is doping?

Anyways, wouldn't combining both systems make the most sense?