AnandTech Forums > Hardware and Technology > Video Cards and Graphics

Old 04-08-2012, 09:33 PM   #26
StrangerGuy
Diamond Member
Join Date: May 2004
Posts: 7,215

Well, it's not like there are games that are both worth playing AND need more than a 5850, so nothing justifies an upgrade over that card of mine.

Old 04-08-2012, 10:53 PM   #27
Mopetar
Diamond Member
Join Date: Jan 2011
Posts: 3,022

Quote:
Originally Posted by SickBeast View Post
Can't Samsung, Globalfoundries, or IBM do it?
As far as I know, none of them is doing anything large-scale at 28 nm, but it doesn't matter much anyway, since switching fabs would likely require heavy redesigns and several months of time. That's also assuming any of those companies have the extra production capacity.

Samsung is probably reserving most of its cutting-edge capacity for its own products, and Apple may be taking some of the newer nodes as well, so there may be no spare capacity there. IBM also fabs a lot of its own parts, so I'm not even sure they'd be interested. GF would probably like the business, but who knows what things are like there at the moment.

Old 04-08-2012, 11:17 PM   #28
3DVagabond
Diamond Member
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,597

Quote:
Originally Posted by SickBeast View Post
Can't Samsung, Globalfoundries, or IBM do it?
Do any of them make anything 500mm^2 (or 300mm^2) on 28nm? Something that large is a lot of the problem.

Old 04-09-2012, 06:19 PM   #29
Mopetar
Diamond Member
Join Date: Jan 2011
Posts: 3,022

Quote:
Originally Posted by 3DVagabond View Post
Do any of them make anything 500mm^2 (or 300mm^2) on 28nm? Something that large is a lot of the problem.
Unless they have the world's tiniest wafers, I don't see why GPUs would be a problem. IBM makes their POWER CPUs, which probably get quite large for some models.

Old 04-09-2012, 08:09 PM   #30
3DVagabond
Diamond Member
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,597

Quote:
Originally Posted by Mopetar View Post
Unless they have the world's tiniest wafers, I don't see why GPUs would be a problem. IBM makes their POWER CPUs, which probably get quite large for some models.
Larger chips are harder to produce. Look at the issues nVidia has been having with their big chips. Wafer size isn't the issue.

Old 04-09-2012, 08:16 PM   #31
blastingcap
Diamond Member
Join Date: Sep 2010
Posts: 5,890

Quote:
Originally Posted by 3DVagabond View Post
Larger chips are harder to produce. Look at the issues nVidia has been having with their big chips. Wafer size isn't the issue.
All else equal (including holding the chip design constant), larger wafers reduce wasted silicon (around the perimeter of the wafer) and improve the per-chip economics (since cost doesn't scale linearly with wafer size). This is why NV mentioned working with TSMC to encourage them to use larger wafers in the future:

http://www.extremetech.com/computing...ally-worthless (Scroll down to their "Wafer price is hiking up" slide and read the last line: "collaborate to move to bigger (450mm) wafers.")

More:

http://www.sumcosi.com/english/produ..._diameter.html

That said, it is unclear to anyone without insider information if there are massive increases in yield waiting to be had, if only NV would engineer their designs "better." Charlie at SA has been incessantly beating that drum, and there is some truth to that (see, e.g., how AMD used double vias to deal with TSMC's poor 40nm yields), but I haven't ever seen any hard evidence that NV is as incompetent at designing chips as Charlie insists.
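The wafer-size economics above can be sketched in a few lines. This uses the common gross-dies-per-wafer approximation with an edge-loss correction; the die areas (~294 mm^2 for a GK104-class chip, ~500 mm^2 for a hypothetical big die) are rough public/rumored figures assumed purely for illustration:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Estimate gross dies per wafer with edge loss.

    Common approximation: DPW = pi*(d/2)^2 / A - pi*d / sqrt(2*A).
    The first term is raw area division; the second subtracts the
    partial dies wasted around the wafer's circular perimeter.
    """
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Today's 300 mm wafers vs the 450 mm wafers NV wants TSMC to move to:
for wafer in (300, 450):
    for area in (294, 500):
        print(f"{wafer} mm wafer, {area} mm^2 die: "
              f"{dies_per_wafer(wafer, area)} gross dies")
```

The edge-loss term shrinks relative to the area term as the wafer grows, which is the "wasted silicon around the perimeter" point: for a 500 mm^2 die the perimeter loss is roughly 21% of the raw area term on a 300 mm wafer but only about 14% on a 450 mm wafer, so a 450 mm wafer yields more than the pure 2.25x area scaling would suggest.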
__________________
Quote:
Originally Posted by BoFox View Post
We had to suffer polygonal boobs for a decade because of selfish corporate reasons.
Main: 3570K + R9 290 + 16GB 1866 + AsRock Extreme4 Z77 + Eyefinity 5760x1080 eIPS

Last edited by blastingcap; 04-09-2012 at 08:20 PM.

Old 04-09-2012, 08:24 PM   #32
3DVagabond
Diamond Member
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,597

Quote:
Originally Posted by blastingcap View Post
All else equal (including holding the chip design constant), larger wafers reduce wasted silicon (around the perimeter of the wafer) and improve the per-chip economics (since cost doesn't scale linearly with wafer size). This is why NV mentioned working with TSMC to encourage them to use larger wafers in the future:

http://www.extremetech.com/computing...ally-worthless (Scroll down to their "Wafer price is hiking up" slide and read the last line: "collaborate to move to bigger (450mm) wafers.")

More:

http://www.sumcosi.com/english/produ..._diameter.html

That said, it is unclear to anyone without insider information if there are massive increases in yield waiting to be had, if only NV would engineer their designs "better." Charlie at SA has been incessantly beating that drum, and there is some truth to that (see, e.g., how AMD used double vias to deal with TSMC's poor 40nm yields), but I haven't ever seen any hard evidence that NV is incompetent at designing chips.
I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.

Old 04-10-2012, 09:37 AM   #33
Mopetar
Diamond Member
Join Date: Jan 2011
Posts: 3,022

Quote:
Originally Posted by 3DVagabond View Post
I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.
Big chips are only harder to make in the sense that if there are a given number of defects per wafer, with bigger chips it's more likely that multiple defects will occur in the given area that makes up that chip. For some things this really isn't a problem, as those dies are just harvested and resold with some components disabled.

The real issue is if defects occur in an area that make it almost impossible to salvage the chip. To some degree some of these issues can be designed around (e.g. using double vias, as blastingcap mentioned) or other design considerations can be made to counteract them. nVidia is usually quite aggressive with their designs so they may not have taken as many steps to ensure that the chips could tolerate errors in the fabrication process.

They could make bigger chips that aren't problematic, but they probably wouldn't be anywhere near as efficient. nVidia doesn't want to eat the extra cost, and consumers don't want to pay it either. That said, we still don't even know how much of a problem this really is. All we know is that nVidia has said that yields are low and that 680s are in short supply right now. If they launch new parts and get more 680s out in the weeks ahead, obviously it wasn't a huge problem. However, if June rolls around, we still don't have good 680 availability, and nVidia hasn't launched other products yet, obviously there were some issues.
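The defects-per-wafer argument above is essentially the textbook Poisson yield model: the chance a die is defect-free falls off exponentially with its area. A minimal sketch with made-up numbers (real 28 nm defect densities are confidential; 0.005 defects/mm^2 is purely an assumption):

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Fraction of dies expected to be defect-free: Y = exp(-A * D0).

    Simplest yield model; real foundry models (Murphy, negative
    binomial) add clustering corrections, but the trend is the same:
    yield drops exponentially as die area grows.
    """
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

D0 = 0.005  # assumed defect density for an immature process (defects/mm^2)
for area in (294, 365, 500):  # ~GK104, ~Tahiti, rumored "BigGK" sizes
    print(f"{area} mm^2 die -> {poisson_yield(area, D0):.1%} defect-free")
```

At this assumed defect density a ~500 mm^2 die comes out defect-free only about a third as often as a ~294 mm^2 one, and as the foundry drives D0 down over time the same design goes from scarce to plentiful, which is why the availability timeline is a reasonable proxy for yield.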

Old 04-10-2012, 10:16 AM   #34
pelov
Diamond Member
Join Date: Dec 2011
Posts: 3,512

Quote:
Originally Posted by 3DVagabond View Post
I wasn't talking about costs. I wasn't talking about wafer size, either. I was just talking about the apparent difficulties with making these big monolithic chips. I'm not blaming nVidia. I'm not blaming anyone. AMD has avoided the issues, but they don't design big chips.
The 7970 is actually bigger than the GTX680 Kepler. As the poster above me noted, this likely has less to do with size than with the architecture/complexity of the design itself causing more issues for nVidia's Kepler than AMD's GCN (it could be the clock-speed-to-TDP ratio as well; we've seen how conservatively AMD clocked the 7970 from how much headroom the cards have even in reference designs, as opposed to the GTX680, which has only a marginal OC once the turbo is factored in). I think the GTX680/GK104 > GTX670Ti rebranding is indicative of that being the case. nVidia has different models, but currently they all seem to be focused on mobile platforms rather than desktop. At the moment, and for the near future, the only desktop chip is the GTX680, and who knows when or even if we'll see any others.

Old 04-13-2012, 05:30 AM   #35
Imouto
Golden Member
Join Date: Jul 2011
Posts: 1,243

http://www.hardwarecanucks.com/news/...drop-incoming/

???

Old 04-13-2012, 06:05 AM   #36
Freddy1765
Senior Member
Join Date: May 2011
Posts: 316

Quote:
Originally Posted by Imouto View Post
Please let this be true! Prices are ridiculous at the moment.

Old 04-13-2012, 06:21 AM   #37
3DVagabond
Diamond Member
Join Date: Aug 2009
Location: Christchurch, NZ
Posts: 8,597

Quote:
Originally Posted by pelov View Post
The 7970 is actually bigger than the GTX680 Kepler. As the poster above me noted, this likely has less to do with size than with the architecture/complexity of the design itself causing more issues for nVidia's Kepler than AMD's GCN (it could be the clock-speed-to-TDP ratio as well; we've seen how conservatively AMD clocked the 7970 from how much headroom the cards have even in reference designs, as opposed to the GTX680, which has only a marginal OC once the turbo is factored in). I think the GTX680/GK104 > GTX670Ti rebranding is indicative of that being the case. nVidia has different models, but currently they all seem to be focused on mobile platforms rather than desktop. At the moment, and for the near future, the only desktop chip is the GTX680, and who knows when or even if we'll see any others.
I know the 7970 is bigger than the 680, but those aren't the chips we're talking about. We're talking about BigGK, the 500mm^2+ chip that's nowhere in sight. I think it has a lot to do with size.

AMD does seem to be better on new processes overall. GCN is a pretty complex design, more so than the 680's, from what I can tell (disclaimer: I'm not an engineer). It's been reported that AMD set the 7970's clock speed low to improve yields. I've never seen this confirmed by AMD, though, so it's just rumor, innuendo, speculation, whatever. It does seem like the 7970 was intended for higher clocks, though. We'll have to wait and see if that materializes.

Old 04-13-2012, 06:35 AM   #38
Quantos
Senior Member
Join Date: Dec 2011
Location: Montreal, QC, CA
Posts: 383

Quote:
Originally Posted by Imouto View Post
Woah, what the hell? I sure didn't expect this anymore! Hopefully it's actually true.

At, say, $480 MSRP, things are quite different!
__________________
Lancool PC-K7 | Gentle Typhoons Cooling | ASUS P8Z77 | Intel 2700k | Noctua NH-D14 w/ PWM Noctua Fans | Samsung MV-3V4G3D/US | Seasonic SS-660XP2 |120GB OCZ Agility 3 | 1TB & 2TB WD Caviar Blacks | MSI GTX670 PE | 2x ASUS VH242H | Mionix Naos 5000 | Beyerdynamic MX300


Old 04-13-2012, 06:42 AM   #39
blastingcap
Diamond Member
Join Date: Sep 2010
Posts: 5,890

Thanks for the news. I am now expecting:

7970 3GB - $490
7950 3GB - $390
7950 1.5GB - $370
7870 2GB - $300
7850 2GB - $230

Old 04-13-2012, 06:45 AM   #40
Quantos
Senior Member
Join Date: Dec 2011
Location: Montreal, QC, CA
Posts: 383

Quote:
Originally Posted by blastingcap View Post
Thanks for the news. I am now expecting:

7970 3GB - $490
7950 3GB - $390
7950 1.5GB - $370
7870 2GB - $300
7850 2GB - $230
Well, I don't think there's any mention of cuts for the 78xx yet; it seems to be only the 79xx and 7770 for now.

Still, at $490, I'm buying one over a 680.

This news deserves a new thread imo

Last edited by Quantos; 04-13-2012 at 06:49 AM.

Old 04-13-2012, 07:45 AM   #41
KompuKare
Senior Member
Join Date: Jul 2009
Posts: 474

http://www.hardwarecanucks.com/news/...drop-incoming/
Quote:
It looks like AMD's bean counters may know something we don't.
I would imagine this means they are expecting 680s to become less supply constrained. And while it's true that:

Quote:
Originally Posted by pelov View Post
At the moment and for the near future the only desktop chip is the GTX680 and who knows when or even if we'll see any others.
I imagine that Nvidia will soon have enough die-harvested GK104s to release a GTX670 of some sort. NV seem to be experts at die harvests.
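The die-harvest point can be made concrete with a toy Monte-Carlo sketch of salvaging partly defective GK104-class dies by fusing off one shader cluster. Everything here is an assumption for illustration (the defect density, the eight equal clusters, uniform defect placement, the name `simulate_harvest`); it sketches the salvage logic, not nVidia's actual binning rules.

```python
import math
import random

def simulate_harvest(die_area_mm2: float, defects_per_mm2: float,
                     n_clusters: int, n_dies: int = 100_000,
                     seed: int = 1) -> tuple[float, float]:
    """Toy model of die harvesting.

    Each die receives a Poisson-distributed number of point defects.
    Zero defects -> sell as the full part. If every defect lands in
    the same shader cluster, fuse that cluster off and sell the die
    as a cut-down part. Assumes defects fall uniformly and the die is
    nothing but equal-sized clusters (it isn't, in reality).
    """
    rng = random.Random(seed)
    lam = die_area_mm2 * defects_per_mm2  # expected defects per die
    full = salvaged = 0
    for _ in range(n_dies):
        # Draw Poisson(lam) by CDF inversion (stdlib random module
        # has no Poisson sampler).
        u, k, p = rng.random(), 0, math.exp(-lam)
        cdf = p
        while u > cdf:
            k += 1
            p *= lam / k
            cdf += p
            if p == 0.0:  # guard against float underflow in the tail
                break
        if k == 0:
            full += 1
        elif len({rng.randrange(n_clusters) for _ in range(k)}) == 1:
            salvaged += 1
    return full / n_dies, salvaged / n_dies

full, cut_down = simulate_harvest(500, 0.005, n_clusters=8)
print(f"full part: {full:.1%}, salvageable as cut-down: {cut_down:.1%}")
```

With these made-up numbers the model salvages roughly three cut-down dies for every defect-free one, which is why harvested dies can plausibly add up to a lower-SKU launch fairly quickly. (In common usage "die harvesting" means selling partly defective dies as cut-down parts, not binning the best chips.)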

Old 04-13-2012, 07:55 AM   #42
blackened23
Diamond Member
Join Date: Jul 2011
Posts: 8,556

Quote:
Originally Posted by KompuKare View Post
http://www.hardwarecanucks.com/news/...drop-incoming/

I imagine that Nvidia will soon have enough die harvested GK104s to release a GTX670 of some sort. NV seem to be experts at die harvests.
Isn't a die harvest for a high-binned chip? That's the opposite of a cut-down chip.