AMD to transition to 28nm bulk in 2013 (digitimes)

pm

Elite Member Mobile Devices
Jan 25, 2000
I have a feeling that there must already be a thread about this on here, but I haven't seen it yet and a quick search turned up nothing, so I'll post it and hope this isn't a dupe.

I saw this this morning and thought it was interesting news:
http://www.digitimes.com/news/a20120615PD210.html

AMD is set to make a major change in its manufacturing process in 2013 and will fully switch from the existing SOI manufacturing process to a 28nm bulk CMOS process, according to Mark Papermaster, senior vice president and chief technology officer of AMD.
 

Ferzerp

Diamond Member
Oct 12, 1999
I think you posted it first, which is surprising because, yes, we usually tend to get AMD press releases/news stories linked here disturbingly quickly after they are released ;)

Good info.
 

Azuma Hazuki

Golden Member
Jun 18, 2012
Will switching from SOI to bulk CMOS cause problems? I thought silicon-on-insulator was supposed to be a better process, with lower leakage and so on...
 

Cerb

Elite Member
Aug 26, 2000
Will switching from SOI to bulk CMOS cause problems?
Probably. This is AMD. The last time they didn't have major process problems going against them was when they beat Intel to the punch with copper interconnects. SOI is one more feature to work with, and nobody but AMD is using it.

PD-SOI made a lot of sense back when they could piggyback on IBM's advancements at a moderate initial cost, and use margins to make up the difference (in AMD's case, bleed less :)). That depends on having a competitive CPU that can command high demand (in the server market, FI, the CPU is a small enough share of total system cost that AMD would have trouble even giving away 45nm Opterons), though, and on general interest in IBM's manufacturing.
I thought silicon-on-insulator was supposed to be a better process, lower leakage, and so on...
It also costs more. When they started using it, getting the chips to high speeds was a problem even at fairly low total power. When TDP started becoming an issue, the performance was good enough that commanding a higher price for higher performance might still have more than offset the cost. Recall, FI, that Athlon II and Phenom II CPUs held their own fairly well until Sandy Bridge.

I'm not sure if we can figure out their cost of SOI versus not using it, but typical estimates are around 10±5%, so unless it provides at least that much benefit, they may be better off without it. It may improve power consumption, FI, but by how much, compared to taking the costs of SOI and re-investing them into improving the chip in general, or into working on the limiting aspects of GF's bulk process where the design itself can't be massaged well enough?

Consider this situation:
  • PD-SOI on 28nm will cost x million, then y more per wafer.
  • Designing the updated chip for SOI may also cost some extra.
  • AMD's CPUs easily reach very high speeds for each functional CPU, and this is not likely to become a problem in the future.
  • The designers know of many physical optimizations that can be made, but AMD can only afford the manhours to implement a small subset of them.
  • Nobody that GF is begging to use their fabs wants or needs SOI, so whether the balance sheets directly reflect it or not, AMD pays for the implementation.
  • GF would probably be open to AMD's engineers helping make the standard process better for high-performance chips.
  • Without SOI, they can more easily use all of GF's manufacturing capability for a given process.
Now, how would you split up your funding?
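
To make that split concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the fixed SOI cost "x", the per-wafer adder "y", the design extra, the wafer count) is a hypothetical placeholder, not a real AMD or GF number:

```python
# Hypothetical budget model for the scenario above: a fixed cost to get
# PD-SOI onto 28nm, an extra design cost, and a per-wafer adder, versus
# reinvesting the same money into bulk-process and physical-design work.

def soi_route_cost(process_port, soi_design_extra, wafer_adder, wafers):
    """Total extra spend for choosing PD-SOI over bulk for one product cycle."""
    return process_port + soi_design_extra + wafer_adder * wafers

extra = soi_route_cost(
    process_port=300e6,      # "x million" to bring up PD-SOI on 28nm
    soi_design_extra=50e6,   # designing the updated chip for SOI
    wafer_adder=400,         # "y" more per wafer, USD
    wafers=500_000,          # assumed lifetime wafer volume
)

print(f"SOI route: ${extra / 1e6:,.0f}M extra")  # $550M with these inputs
# That same pool of money could instead fund the backlog of known
# physical optimizations, or engineers helping GF tune its bulk
# process for high-performance chips.
```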
 

Idontcare

Elite Member
Oct 10, 1999
Homeles said:
Pretty sure FD-SOI costs the same as bulk.

Edit: why is my post up here?

Given that the process by which an FD-SOI wafer is made requires you to start with a bulk-Si wafer and then perform a bunch of additional processing steps to create the FD-SOI wafer, I'm not sure how an FD-SOI wafer could ever cost the same as a bulk wafer.

The absolute most optimistic of optimistic projections I have seen peg the cost adder at no less than 10% over that of bulk.

The benefit is that you don't need to spend say $4B developing a 14nm node if the 14nm node is going to use FD-SOI, you can get away with only spending say $3.5B in development costs.

So it is a matter of production volume versus development cost. If you have high volume, it behooves you to spend more upfront developing the node but end up with one that is less expensive in production. If your volumes are low, it makes financial sense to spend less upfront on development in exchange for paying a bit more per wafer in production.
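
As a rough illustration of that trade-off, using the $4B/$3.5B development figures above and the ~10% wafer cost adder (the baseline wafer price below is invented):

```python
# Break-even volume between a pricier-to-develop bulk node with cheaper
# wafers and a cheaper-to-develop SOI node with a ~10% wafer adder.
# Dev costs are the $4B/$3.5B figures above; the wafer price is invented.

bulk_dev = 4.0e9               # bulk-Si node development cost, USD
soi_dev = 3.5e9                # FD-SOI node development cost, USD
bulk_wafer = 5_000.0           # assumed processed-wafer price, USD
soi_wafer = bulk_wafer * 1.10  # ~10% SOI cost adder

# The $0.5B dev saving is eaten up $500 at a time, one wafer at a time.
breakeven_wafers = (bulk_dev - soi_dev) / (soi_wafer - bulk_wafer)
print(f"break-even at {breakeven_wafers:,.0f} wafers")  # 1,000,000 here

# Below that lifetime volume, the cheaper-to-develop SOI node wins;
# above it, the bigger up-front bulk investment pays for itself.
```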

It is very much the same financial analysis that goes into a company's decision to be fabless and use a foundry to produce their wafers or to be an IDM and bring manufacturing inhouse and own the fabs outright (along with the development costs).

It all comes down to the projected wafer volumes of the product line and of the company developing it. SOI is what you do when you aren't quite large enough (in wafer volumes) to slog out a bulk-Si process but aren't quite small enough to make the jump to being fabless.

AMD eventually became small enough that being fabless made financial sense, so it is no surprise that SOI no longer makes financial sense either.

The foundry is better off developing bulk-Si nodes since its wafer volumes will be higher; that is the model of why a foundry can function in the first place. So GloFo will invest that $4B, whereas AMD with its smaller wafer volumes would have been compelled to invest just $3.5B and go with SOI...now AMD doesn't need to make that choice, and it makes little sense for GloFo to stick with SOI either.

GloFo's wafers will be less expensive to their customers in bulk-Si, even with the higher node development cost amortized into them, which is great for their customers and great for us consumers.

The loser in all this is SOITEC. Their existence depends on the niche market that was AMD's reliance on SOI, and, being a middle-man, their existence was a cost-adder. That cost-adder is being squeezed out of the equation.
 

Idontcare

Elite Member
Oct 10, 1999
Odd that you say that, given this announcement: http://www.st.com/internet/com/press_release/c2680.jsp

What is odd about what I said?

GloFo does a lot of things that don't make sense. Hence:
Mubadala: Globalfoundries Makes Losses, Unlikely to Become Profitable Shortly.
http://www.xbitlabs.com/news/other/...es_Unlikely_to_Become_Profitable_Shortly.html

I didn't say GloFo was dropping SOI, I said they would be better off if they did and that it didn't make sense for them to stick with it.

Just as it doesn't make sense for them to do a lot of things they do that currently lead them to losing money.

Don't get me wrong here, I love SOI; search the forums and you'll find lots of pro-SOI threads/links from me, including reported tests in which SOI delivered 40% lower power usage than bulk at 45nm, and I was hoping that Nvidia and AMD would migrate their GPUs to SOI.

But I have also dealt with SOI personally, hands-on, as a process development engineer and I'm not speaking from a position of ignorance when I make my statements.

It has its pros and cons, and GloFo is much better off investing more money upfront in developing a bulk-Si process, the same as Intel and TSMC (two other companies that make lots of money and don't need SOI to do it), versus trying to convince its customers to pay ~10% more per wafer than at TSMC for roughly the same parametric properties at the device level.

That said, GloFo is clearly not concerned with profits or the bottom line, something that also makes little sense, but given their pedigree and heritage it is understandably a corporate-culture thing that they have yet to iterate beyond.
 

Homeles

Platinum Member
Dec 9, 2011
What is odd about what I said?
I just found it humorous that you said one thing and GloFo went and did the opposite.

Isn't it possible that FD-SOI would be worth the added cost? PD-SOI clearly wasn't worth sticking with on the 28nm node for AMD, but perhaps FD-SOI will be. Is there an added cost to FD-SOI versus PD-SOI?
 

lifeblood

Senior member
Oct 17, 2001
Vishera will be Piledriver and the last AM3+ CPU. If you want Steamroller you need to buy the successor to Trinity. 2 modules+iGPU.
Source? I'm not trying to be a jerk, I just haven't found anything that says what AMD has up their sleeves after Vishera for the "performance" market. It would be unprecedented for them to make an AM3+ CPU after PD, but I haven't seen any mention of AM4 or whatever. Will they drop CPU-only parts and try to turn an APU into their performance part?
 

pelov

Diamond Member
Dec 6, 2011
Will they drop CPU only and try to turn an APU into their performance part?

If I had to guess, yeah, that's their bet, which is why they're putting so much into HSA and the convergence of GPU and CPU into a true APU for both server and desktop/laptop. The slides have only shown Vishera as the last of the AM3+ chips, with nothing thereafter.
 

Homeles

Platinum Member
Dec 9, 2011
Source? I'm not trying to be a jerk, I just haven't found anything that says what AMD has up their sleeves after Vishera for the "performance" market. It would be unprecedented for them to make an AM3+ CPU after PD, but I haven't seen any mention of AM4 or whatever. Will they drop CPU-only parts and try to turn an APU into their performance part?
Anand Lal Shimpi said:
Piledriver you already know about, it's at the heart of Trinity, which is the 2-4 core APU due out in early 2012. Piledriver will increase CPU core performance by around 10-15% over Bulldozer, although it will initially appear in a lower performance segment. Remember that final generation of AM3+ CPU I mentioned earlier? I fully expect that to be a GPU-less Piledriver CPU due out sometime in 2012.
http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested
 

AtenRa

Lifer
Feb 2, 2009
Given that the process by which you made an FD-SOI requires you to start with a bulk-Si wafer and then do a bunch of processing steps to it to create the FD-SOI wafer, I'm not sure how an FD-SOI wafer could ever be the same cost as a bulk wafer.


FD-SOI can be cheaper than bulk.

The substrate (wafer) cost is higher than bulk-Si, but there are fewer litho and FEOL process steps, making FD-SOI cheaper overall.
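
A quick toy calculation of how that can work out; the step counts and prices below are invented purely for illustration:

```python
# Toy model of the claim above: a more expensive FD-SOI substrate can
# still yield a cheaper finished wafer if it eliminates litho/FEOL
# steps (e.g., well and channel doping). All numbers are invented.

def processed_wafer_cost(substrate, steps, cost_per_step):
    return substrate + steps * cost_per_step

bulk  = processed_wafer_cost(substrate=150, steps=45, cost_per_step=100)
fdsoi = processed_wafer_cost(substrate=500, steps=40, cost_per_step=100)

print(f"bulk:   ${bulk:,.0f}")   # $4,650
print(f"FD-SOI: ${fdsoi:,.0f}")  # $4,500, cheaper despite the substrate
```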

Because my knowledge stops here, I would like to ask you the following:

It seems to me that the difference is only in the wafer. AMD can purchase ready-made FD-SOI wafers.
Can AMD or anyone else use the same 28nm process at GloFo but with FD-SOI wafers?

I'm asking because the transistors will be created over the FD-SOI thin undoped body.
 

lifeblood

Senior member
Oct 17, 2001
If I had to guess, yeah, that's their bet, which is why they're putting so much into HSA and the convergence of GPU and CPU into a true APU for both server and desktop/laptop. The slides have only shown Vishera as the last of the AM3+ chips, with nothing thereafter.
Hmm, interesting.

They're really going to have to make HSA work in order for an APU to be a worthy "performance" part; otherwise the GPU is just dead weight. For us (gamers) I just don't see the advantage, as we already have discrete GPUs. I bet APUs with non-functional, fused-off GPUs become popular.

That's also going to be a problem for overclockers if the CPU and GPU don't clock separately. Given how uninspiring the current Llano chipsets have been so far, they are going to need to really step up their game on the chipset side.

I think you’re right but I definitely see some pain with going down that path.
 

pelov

Diamond Member
Dec 6, 2011
HSA is meant to alleviate that pain.

Also, the performance gains are quite good and can be seen with OpenCL- and CUDA-enabled software. With a shared memory address space and reduced latency, those gains will only increase.

Think of the on-die GPU as a "co-processor" rather than just an on-die GPU strictly for gaming. So it'll do gaming but it will also help with "dumb" FP-heavy workloads.

For desktop enthusiasts it's a different story, but we spend more on the GPU than on the CPU anyway, and there's a reason for that: CPUs are already good enough. You're not gaining any FPS by opting for the 2011 workstation platform and a 6/12 chip over a much cheaper 1155 4/8 alternative, or even the 4/4 2500K/3570K. So it's not like we're buying GPU-less CPUs anyway. In fact, on-die GPUs are only increasing in size from both AMD and Intel, and that trend isn't slowing down; it's accelerating at a far faster pace than CPU performance gains are.

AMD abandoning a "straight CPU" style architecture shouldn't be shocking. Intel did this with the Core architecture and nobody's been complaining. The issue AMD has isn't that they're likely planning to go all-APU but rather increasing the performance of the modules.
 

lifeblood

Senior member
Oct 17, 2001
Think of the on-die GPU as a "co-processor" rather than just an on-die GPU strictly for gaming. So it'll do gaming but it will also help with "dumb" FP-heavy workloads.
This is where I think the "pain" portion comes in. Using the GPU to do dumb FP calculations was always the promise, and is why the Bulldozer architecture comes with half as many FPUs. The trick will be actually getting software to use the GPU for FP calculations. Given how little weight or pull AMD has in the world, I just don't see the vast majority of developers writing code to support this. AMD is going to need to do this transparently in the hardware, and (I suspect) it's going to be a hit-or-miss implementation.
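
For a sense of what "getting software to use the GPU" means in practice today, here is a minimal OpenCL offload sketch using the pyopencl bindings; it assumes an OpenCL-capable GPU and driver are present, and the kernel itself is just an illustrative multiply-add. This explicit buffer-copy choreography is exactly the kind of work HSA's shared address space is meant to remove:

```python
# Offload a "dumb" FP-heavy workload to whatever OpenCL device is
# available (e.g., an APU's integrated GPU). Hypothetical example.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()     # picks an available OpenCL device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The FP-heavy kernel the GPU chews through in parallel.
prog = cl.Program(ctx, """
__kernel void madd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * b[i] + a[i];
}
""").build()

prog.madd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)  # copy the result back over the bus;
                                      # the round trip a shared address
                                      # space would eliminate
```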

On the gaming side, if we use today's hardware as an example, with hybrid CrossFire the GPU in Trinity is not going to help a 78XX or higher GPU at all (assuming they were even compatible, which they're not). So in this case the GPU becomes essentially useless except for any FP calculations it can assist the CPU with.

However it works out it should be interesting to watch.
 

pelov

Diamond Member
Dec 6, 2011
Kaveri is the first APU with a true HSA architecture and is also GCN-based, so CrossFire support isn't an issue. It comes standard with a 7750-based on-die GPU.

Had AMD tried this on their own... yeah, it would've failed before it started, but with ARM, TI, Imagination, and software companies already on board, it's far from lacking developer support.

I'd also expect AMD to keep offering FPUs with x86/x87-based ISA extensions like AVX/AVX2, since they've already covered the FMA4/FMA3 base. So there shouldn't be an issue with dropping "legacy" FPU support, and the future advancement of FPUs is covered as well.

It's definitely interesting. It's basically AMD's answer to the question of what the hell are you going to do with this GPU when it sits idle and you're not gaming? Why not use it for compute also? :p
 

ShintaiDK

Lifer
Apr 22, 2012
Source? I'm not trying to be a jerk, I just haven't found anything that says what AMD has up their sleeves after Vishera for the "performance" market. It would be unprecedented for them to make an AM3+ CPU after PD, but I haven't seen any mention of AM4 or whatever. Will they drop CPU-only parts and try to turn an APU into their performance part?

There won't be any AM4 either.

Right now I could only find the server slide.
[Image: AMD mobile/server roadmap slide]


Plus we all know this one:
[Image: AMD 2013 desktop/mobile roadmap slide]


AMD is about value, not performance, anymore. APUs will be all there is for value/mainstream. No such thing as AMD performance/high-end anymore. (There hasn't been for quite a long time either.)

It's all FM2, FM3, etc. in the future.
 

Homeles

Platinum Member
Dec 9, 2011
There won't be any AM4 either.

AMD is about value, not performance, anymore. APUs will be all there is for value/mainstream. No such thing as AMD performance/high-end anymore. (There hasn't been for quite a long time either.)

It's all FM2, FM3, etc. in the future.
You don't have any roadmaps to verify this. Don't make claims that you cannot back up with facts.