"Good luck on those price drops" - TSMC


StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Well, it's not like there are games that are worth playing AND need more than a 5850 to justify an upgrade over that card of mine.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,630
136
Can't Samsung, Globalfoundries, or IBM do it?

As far as I know, none of them are doing anything large-scale at 28nm, but it doesn't matter much anyway, since switching fabs would likely require some heavy redesigns and several months of time. That's also assuming any of those companies have the extra production capacity.

Samsung is probably using most of their cutting-edge capacity for their own products, and Apple might be taking some of the newer nodes too, so they may not have the spare capacity. IBM also mostly fabs their own parts, so I'm not even sure they'd be interested. GF would probably like the business, but who knows what things are like there at the moment.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,630
136
Do any of them make anything 500mm^2 (or 300mm^2) on 28nm? Something that large is a lot of the problem.

Unless they have the world's tiniest wafers, I don't see why GPUs would be a problem. IBM makes their POWER CPUs, which probably get quite large for some models.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Unless they have the world's tiniest wafers, I don't see why GPUs would be a problem. IBM makes their POWER CPUs, which probably get quite large for some models.

Larger chips are harder to produce. Look at the issues nVidia has been having with their big chips. Wafer size isn't the issue.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Larger chips are harder to produce. Look at the issues nVidia has been having with their big chips. Wafer size isn't the issue.

All else equal (including holding the chip design constant), larger wafers reduce wasted silicon (around the perimeter of the wafer) and improve the per-chip economics (since cost doesn't scale linearly with wafer size). This is why NV mentioned working with TSMC to encourage them to use larger wafers in the future:

http://www.extremetech.com/computin...y-with-tsmc-claims-22nm-essentially-worthless (Scroll down to their "Wafer price is hiking up" slide and read the last line: "collaborate to move to bigger (450mm) wafers.")

More:

http://www.sumcosi.com/english/products/next_generation/large_diameter.html
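
To put rough numbers on the edge-waste point, here's a quick back-of-envelope Python sketch using the standard gross-dies-per-wafer approximation (the ~550mm^2 die area is just an illustrative big-chip figure, not an official one):

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Gross dies per wafer, standard first-order approximation:
    # (wafer area / die area) minus the partial dies lost at the edge.
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

for d_mm in (300, 450):
    print(d_mm, "mm wafer:", dies_per_wafer(d_mm, 550), "gross dies")

That works out to roughly 100 gross dies on a 300mm wafer versus roughly 246 on a 450mm wafer: about 2.45x the dies for 2.25x the area, because proportionally less silicon is wasted around the edge.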

That said, it is unclear to anyone without insider information if there are massive increases in yield waiting to be had, if only NV would engineer their designs "better." Charlie at SA has been incessantly beating that drum, and there is some truth to that (see, e.g., how AMD used double vias to deal with TSMC's poor 40nm yields), but I haven't ever seen any hard evidence that NV is as incompetent at designing chips as Charlie insists.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
All else equal (including holding the chip design constant), larger wafers reduce wasted silicon (around the perimeter of the wafer) and improve the per-chip economics (since cost doesn't scale linearly with wafer size). This is why NV mentioned working with TSMC to encourage them to use larger wafers in the future:

http://www.extremetech.com/computin...y-with-tsmc-claims-22nm-essentially-worthless (Scroll down to their "Wafer price is hiking up" slide and read the last line: "collaborate to move to bigger (450mm) wafers.")

More:

http://www.sumcosi.com/english/products/next_generation/large_diameter.html

That said, it is unclear to anyone without insider information if there are massive increases in yield waiting to be had, if only NV would engineer their designs "better." Charlie at SA has been incessantly beating that drum, and there is some truth to that (see, e.g., how AMD used double vias to deal with TSMC's poor 40nm yields), but I haven't ever seen any hard evidence that NV is incompetent at designing chips.

I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,630
136
I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.

Big chips are only harder to make in the sense that, for a given number of defects per wafer, a bigger die is more likely to end up with one or more of them. For some products this really isn't a problem, as those dies are just harvested and resold with some components disabled.

The real issue is when defects land in an area that makes the chip almost impossible to salvage. To some degree these issues can be designed around (e.g. using double vias, as blastingcap mentioned), or other design considerations can be made to counteract them. nVidia is usually quite aggressive with their designs, so they may not have taken as many steps to ensure that the chips could tolerate errors in the fabrication process.
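
As a back-of-envelope illustration of why die size hurts so much, here's a simple Poisson yield model in Python (the 0.5 defects/cm^2 figure is made up, and the die areas are only approximate: ~294mm^2 for GK104, ~365mm^2 for Tahiti as reported at the time, ~550mm^2 for a hypothetical big Kepler):

import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Probability a die has zero defects under a Poisson model: Y = exp(-D * A)
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

for area_mm2 in (294, 365, 550):
    print(area_mm2, "mm^2 die:", round(100 * poisson_yield(area_mm2, 0.5)), "% defect-free")

With the same made-up defect density, the defect-free rate drops from about 23% for a GK104-size die to about 6% for a 550mm^2 die, which is why salvage parts and defect-tolerant design matter so much more for big chips.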

They could make bigger chips that aren't problematic, but they probably wouldn't be anywhere near as efficient. nVidia doesn't want to eat the extra cost, and consumers don't want to pay it either. That said, we still don't even know how much of a problem this really is. All we know is that nVidia has said that yields are low and that 680s are in short supply right now. If they launch new parts and get more 680s out in the weeks ahead, obviously it wasn't a huge problem. However, if June rolls around and we still don't have good 680 availability and nVidia hasn't launched other products, obviously there were some issues.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.

The 7970 is actually bigger than the GTX680 Kepler. As the poster above me noted, this likely has less to do with size and more to do with the architecture/complexity of the design itself causing more issues for nVidia's Kepler than AMD's GCN (it could be the clock speed-to-TDP as well: we've seen how conservatively AMD clocked the 7970 from how much headroom the cards have even in reference designs, as opposed to the GTX680, which has only a marginal OC once the turbo is factored in). I think the GTX680/GK104 > GTX670Ti rebranding is indicative of that being the case. nVidia has different models, but currently they all seem to be focused on the mobile platforms rather than the desktop. At the moment, and for the near future, the only desktop chip is the GTX680, and who knows when or even if we'll see any others.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The 7970 is actually bigger than the GTX680 Kepler. As the poster above me noted, this likely has less to do with size and more to do with the architecture/complexity of the design itself causing more issues for nVidia's Kepler than AMD's GCN (it could be the clock speed-to-TDP as well: we've seen how conservatively AMD clocked the 7970 from how much headroom the cards have even in reference designs, as opposed to the GTX680, which has only a marginal OC once the turbo is factored in). I think the GTX680/GK104 > GTX670Ti rebranding is indicative of that being the case. nVidia has different models, but currently they all seem to be focused on the mobile platforms rather than the desktop. At the moment, and for the near future, the only desktop chip is the GTX680, and who knows when or even if we'll see any others.

I know the 7970 is bigger than the 680. Those aren't the chips we're talking about. We're talking about BigGK, the 500mm^2+ chip that's nowhere in sight. I think it has a lot to do with size.

AMD does seem to be better on new processes overall. GCN is a pretty complex design. More so than the 680, from what I can tell (Disclaimer: I'm not an engineer :)). It's been reported that AMD's clock speed for the 7970 was set low to improve yields. I've never seen this reported by AMD though. So, it's just rumor, innuendo, speculation, whatever. It does seem like the 7970 was intended for higher clocks though. We'll have to wait and see if that materializes.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Thanks for the news. I am now expecting:

7970 3GB - $490
7950 3GB - $390
7950 1.5GB - $370
7870 2GB - $300
7850 2GB - $230
 

Quantos

Senior member
Dec 23, 2011
386
0
76
Thanks for the news. I am now expecting:

7970 3GB - $490
7950 3GB - $390
7950 1.5GB - $370
7870 2GB - $300
7850 2GB - $230

Well, I don't think there's been any mention of cuts on the 78xx yet. It seems it's only the 79xx and 7770 for now.

Still, at $490, I'm buying one over a 680. ^_^

This news deserves a new thread imo :p
 
Last edited:

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
http://www.hardwarecanucks.com/news/video/breaking-news-amd-hd-7970-price-drop-incoming/
It looks like AMD’s bean counters may know something we don’t.

I would imagine this means they are expecting 680s to become less supply constrained. And while it's true that:

At the moment, and for the near future, the only desktop chip is the GTX680, and who knows when or even if we'll see any others.

I imagine that Nvidia will soon have enough die-harvested GK104s to release a GTX670 of some sort. NV seem to be experts at die harvests.