Fudzilla: Cypress yields 60-80%, Fermi yields at 20%, Fermi 20% faster than Cypress


Kuzi

Senior member
Sep 16, 2007
572
0
0
It's not that surprising to see low yields on Fermi, as it is a large chip and possibly around 500mm^2 in size or a bit larger. TSMC only improved their yields for Cypress a few weeks ago, a chip that is much smaller. I'm sure in a few months yields for Fermi will improve too.

From estimates about Fermi's die size, memory configuration (and yields), I would say a high-end Fermi with 1.5GB GDDR5 would cost about ~$600 at release. Unless this card performs 50%+ better than a 5870 (highly unlikely), it would be a tough sell at that price. Yes, there will be some people who could find a use for the extra memory size/bandwidth, or the improved GPGPU functionality, but the majority of people buying these cards care about one thing, and that is price/performance for the games they play.
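(A rough sketch of the arithmetic behind that estimate; every number below - wafer cost, die sizes, yields - is an assumption for illustration, not a figure from TSMC or Nvidia.)

```python
import math

# Assumed inputs for illustration only, not actual TSMC/Nvidia figures.
WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 5000          # assumed price of a 40nm wafer

def gross_dies_per_wafer(die_area_mm2, d=WAFER_DIAMETER_MM):
    """Classic dies-per-wafer approximation with an edge-loss correction."""
    return math.pi * (d / 2) ** 2 / die_area_mm2 \
         - math.pi * d / math.sqrt(2 * die_area_mm2)

def cost_per_good_die(die_area_mm2, yield_rate):
    """Wafer cost spread over the dies that actually work."""
    good_dies = gross_dies_per_wafer(die_area_mm2) * yield_rate
    return WAFER_COST_USD / good_dies

# A ~500 mm^2 "Fermi-sized" die at the rumored 20% yield vs. a
# ~334 mm^2 "Cypress-sized" die at a 70% yield.
print(round(cost_per_good_die(500, 0.20)))  # roughly $220+ per good die
print(round(cost_per_good_die(334, 0.70)))  # roughly $40 per good die
```

Under those assumptions, the large die at 20% yield costs roughly five times as much per good chip as the smaller one, before memory, board, and margin are added on top.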

Also keep in mind that for all we know prices for the 5870/5850/5770 may be lowered to say $350/$250/$150 respectively once Fermi releases in a few months. I think that's exactly what ATI will do. With the current market situation, how many people are willing to pay over $500 for a video card?

Hopefully Nvidia will have a mainstream Fermi card that performs and is priced similarly to the 5870. Nvidia is in a very tough situation right now.
 
Apr 20, 2008
10,161
984
126
20% yields? I'm not interested that much, but we need a ton more competition from nVidia. I need to get my hands on a cheaper 5850.
 

biostud

Lifer
Feb 27, 2003
18,193
4,674
136
The problem for nVidia is not only Fermi itself, but also being able to release cards in the $150-$300 price bracket, where the large sales are. With the 48xx cards ATI/AMD started a new trend of offering top performance in this price bracket, which they were able to do because of smaller GPUs running really fast. And then you could buy an X2 card if money wasn't a problem; earlier you needed to spend $500-600 for a top card and then double it for SLI/CF performance.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
The problem for nVidia is not only Fermi itself, but also being able to release cards in the $150-$300 price bracket, where the large sales are. With the 48xx cards ATI/AMD started a new trend of offering top performance in this price bracket, which they were able to do because of smaller GPUs running really fast. And then you could buy an X2 card if money wasn't a problem; earlier you needed to spend $500-600 for a top card and then double it for SLI/CF performance.
That's capitalism by NV at its best. :)
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
It wouldn't surprise me if they had lower-than-desired yields at this stage of the game. The 20% faster would actually surprise me. I figure it will be about the speed of a GTX 295, which I think for the most part isn't 20% faster than the fastest single-GPU solution from ATI.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
The problem for nVidia is not only Fermi itself, but also being able to release cards in the $150-$300 price bracket, where the large sales are. With the 48xx cards ATI/AMD started a new trend of offering top performance in this price bracket, which they were able to do because of smaller GPUs running really fast. And then you could buy an X2 card if money wasn't a problem; earlier you needed to spend $500-600 for a top card and then double it for SLI/CF performance.


That's capitalism by NV at its best. :)

I think there were plenty of cheaper SLI options rather than having to resort only to a top card.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
It wouldn't surprise me if they had lower-than-desired yields at this stage of the game. The 20% faster would actually surprise me. I figure it will be about the speed of a GTX 295, which I think for the most part isn't 20% faster than the fastest single-GPU solution from ATI.

It should be faster than a GTX295.
Assuming an increase in memory bandwidth (from GDDR5, despite the narrower bus if it's 384-bit), more than double the shader cores at equal clock speeds, and no hit from SLI since it's all a single GPU, it should be faster than a GTX 295.
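(A quick check on the bandwidth point; the GTX 295 and HD 5870 numbers below are the published ones, but the Fermi data rate is purely an assumption since memory clocks weren't announced.)

```python
def bandwidth_gb_s(bus_width_bits, effective_rate_gbps):
    """Memory bandwidth = bus width in bytes * effective data rate per pin."""
    return bus_width_bits / 8 * effective_rate_gbps

# GTX 295: two GPUs, each on a 448-bit GDDR3 bus at ~2.0 Gbps effective.
print(bandwidth_gb_s(448, 2.0))   # ~112 GB/s per GPU
# Rumored Fermi: single 384-bit GDDR5 bus; 4.0 Gbps effective is an assumption.
print(bandwidth_gb_s(384, 4.0))   # ~192 GB/s on one GPU
# HD 5870 for reference: 256-bit GDDR5 at 4.8 Gbps effective.
print(bandwidth_gb_s(256, 4.8))   # ~153.6 GB/s
```

So even with a bus narrower than the GT200's 512-bit, the GDDR5 data rate more than makes up for it under that assumed clock.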
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
I think there were plenty of cheaper SLI options rather than having to resort only to a top card.
It does not change the fact that NV took advantage of a situation where there was no competition from ATI. Good for them. They are here to make money.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
ATi is doing the same right now, they're just not as extreme about it. But they don't have the brand recognition either. It's a shame that when they're on top it's a global recession; bad timing on their part.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
It's a shame that when they're on top it's a global recession; bad timing on their part.

There was a time when I would have bought cards from both companies just to play with them. Now I have to "save up" just to buy a mid-range card from one company. :(
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
ATi is doing the same right now, they're just not as extreme about it.

AMD are selling their newer, faster cards at a price lower than the competition's older, slower cards.

I don't get what you mean.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
It wouldn't surprise me if they had lower-than-desired yields at this stage of the game. The 20% faster would actually surprise me. I figure it will be about the speed of a GTX 295, which I think for the most part isn't 20% faster than the fastest single-GPU solution from ATI.

Actually the GTX 295 is on average about 5% faster than a 5870; it all depends on which games you play, of course. But I believe it's very possible for Fermi to be 20% faster than these cards, unless Nvidia really had to deactivate some shaders and/or lower the clocks like some recent rumors suggested.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
AMD are selling their newer, faster cards at a price lower than the competition's older, slower cards.

I don't get what you mean.

Well, just that they're upping their MSRP pretty much, but yeah, they don't gouge NEARLY as badly as nVidia would in this situation.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Well, just that they're upping their MSRP pretty much, but yeah, they don't gouge NEARLY as badly as nVidia would in this situation.

I don't know, they both seem committed to the highest prices they can get, it's just the way corporations work. If you think one is better than the other, you need to look up fiduciary responsibility. The *only* directive these companies have is to make as much $ as humanly possible.

ATI is no stranger to high-dollar video cards:

http://www.legitreviews.com/article/378/11/

$450 X1950XTX

http://www.neoseeker.com/Articles/Hardware/Reviews/hd5970launchreview/

$600 5970

http://www.techpowerup.com/reviews/Sapphire/Radeonx850xtpeagp/11.html

$415 X850

etc.

Either NV or ATI will charge as much as they can get away with, that's capitalism, not a particular company trying to gouge you.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well, just that they're upping their MSRP pretty much, but yeah, they don't gouge NEARLY as badly as nVidia would in this situation.

You seem to forget how much the 8800 GT's MSRP was when it was released - and how absolutely NOTHING at all in its price range came anywhere close to matching its performance. But since it was an nvidia card and goes against what you are saying, it was either A) conveniently ignored or B) still price gouging, because it had to be if it was from nvidia.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
I don't know, they both seem committed to the highest prices they can get, it's just the way corporations work. If you think one is better than the other, you need to look up fiduciary responsibility. The *only* directive these companies have is to make as much $ as humanly possible.

ATI is no stranger to high-dollar video cards:

http://www.legitreviews.com/article/378/11/

$450 X1950XTX

http://www.neoseeker.com/Articles/Hardware/Reviews/hd5970launchreview/

$600 5970

http://www.techpowerup.com/reviews/Sapphire/Radeonx850xtpeagp/11.html

$415 X850

etc.

Either NV or ATI will charge as much as they can get away with, that's capitalism, not a particular company trying to gouge you.

$600 for the 5970 isn't a gouge when you consider it's roughly 2x an HD 5870, which would be more than $300 each (although actual HD 5970 prices seem to be $650+), but the other cards are overpriced.
IMO a dual-GPU card is only gouging if it's more than double the price of the equivalent single card, so even though it's $600 ($650 real), it's not actually gouging.

Historically, in the UK at least, most prices have stayed pretty reasonable since the Geforce 4 days, once pricing has settled.
A mid-top card (GF4 Ti4400, ATI 9800, 7800GT, HD5850) has cost around £200~£220.
The HD4850 was the only one to break the mould at £150 or so at launch, but then it was slower than the NV GTX260, which dropped to about £200 once the ATI competition launched.

Sure, both guys have on occasion gouged when they had the advantage, but then you can only blame the consumer for letting themselves be taken advantage of if they don't feel like waiting.
Thankfully dual GPU cards have meant that it's not so easy to gouge. Sure ATI could have launched the HD5870 at $500, but then because NV have the GTX295, it would feel like gouging. Offering a dual GPU previous-gen card prevents next-gen gouging, which is why ATI can't do what NV did last gen.

Prices seem to end up at the same level (in the UK at least), once the playing field has settled down and both companies are on the same generation.
 

ScorcherDarkly

Senior member
Aug 7, 2009
450
0
0
$600 for the 5970 isn't a gouge when you consider it's roughly 2x an HD 5870, which would be more than $300 each (although actual HD 5970 prices seem to be $650+), but the other cards are overpriced.
IMO a dual-GPU card is only gouging if it's more than double the price of the equivalent single card, so even though it's $600 ($650 real), it's not actually gouging.

If they charged exactly double for an X2 card, then no one would ever buy two of the single cards to CF, as long as performance was equal. They should charge a bit more for an X2 card, say $25 or so, since they're saving you a PCI-E slot on top of giving you equal performance.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
^ Agreed, and I wasn't even saying the 5970 or the other cards listed were 'gouging', just that both ATI *AND* Nvidia (and basically all corporations) will charge the maximum amount the market will bear for their products. That's just business.

We see the same argument in Intel v. AMD, where people complain about Intel's prices while totally forgetting the epic pricing on the occasions where AMD was the performance leader. The 4800+ X2 was $1001 when it came out, ditto with most of the FX chips, which retailed between $800-$1000+ on release.

If you have the best product, and you charge a high price, the people who can afford the best will still buy it. If nobody is buying, you lower the price until people start buying, unless the price of production is an obstruction to making any profit at all with the lower price, which is when you EOL the line and go back to the drawing board altogether.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
If they charged exactly double for an X2 card, then no one would ever buy two of the single cards to CF, as long as performance was equal. They should charge a bit more for an X2 card, say $25 or so, since they're saving you a PCI-E slot on top of giving you equal performance.

You did say "as long as performance was equal", but no dual card so far has ever achieved that (the 5970 being one of the worst cases of underclocked GPUs)... So realistically, if the dual card weren't cheaper than 2x the single card, no one would want it, because it delivers less performance.
 

T2k

Golden Member
Feb 24, 2004
1,664
5
0
Wasn't the 4850 X2 the same clocks as the single cards?

It is and it's probably the best freakin' card out there around $200 w/ the performance of a 5850 w/ 2GB total memory... BTW I have one for sale... ():)
 

T2k

Golden Member
Feb 24, 2004
1,664
5
0
The problem for nVidia is not only Fermi itself, but also being able to release cards in the $150-$300 price bracket, where the large sales are. With the 48xx cards ATI/AMD started a new trend of offering top performance in this price bracket, which they were able to do because of smaller GPUs running really fast. And then you could buy an X2 card if money wasn't a problem; earlier you needed to spend $500-600 for a top card and then double it for SLI/CF performance.

THIS.

Ever since the GeForce 3-4 days, Nvidia has been hooked on the idea of GIANT, monolithic chips - and once again it blew up in their face.
I firmly believe the future is the ATI approach - at least until CPU-GPU Fusion arrives, but that's probably the death sentence for Nvidia anyway, at least in the discrete desktop market. Modular high-end cards and excellent clocks on mainstream ones make a lot of sense in every way (design, manufacturing, business, etc.).
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
It is and it's probably the best freakin' card out there around $200 w/ the performance of a 5850 w/ 2GB total memory... BTW I have one for sale... ():)

That's a good deal for sure, and a great card, but in the case of CF and SLI cards, the 'total' memory doesn't really make a difference, does it?

I mean a 1GB 4850 vs. a 2GB 4850 X2 - the game can only 'see' 1GB of VRAM, or am I wrong?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
THIS.

Ever since the GeForce 3-4 days, Nvidia has been hooked on the idea of GIANT, monolithic chips - and once again it blew up in their face.
I firmly believe the future is the ATI approach - at least until CPU-GPU Fusion arrives, but that's probably the death sentence for Nvidia anyway, at least in the discrete desktop market. Modular high-end cards and excellent clocks on mainstream ones make a lot of sense in every way (design, manufacturing, business, etc.).

Well, it can go either way. The GTX 295 and 5970 are good examples of how the single-card/multi-GPU approach can potentially backfire. To maintain ATX spec compliance, the multi-GPU card has to be scaled down compared to its single-GPU counterpart. A modular high-end card is an intriguing concept, but will the ATX spec allow for the power needed for such a card to operate? Not currently.

Unless the ATX spec is revamped or ATI's GPUs all of a sudden become much more efficient, ATI may not be able to pull off a 6-series dual-GPU card next gen with two of their top GPUs. Of course, this affects NVIDIA's plans as well, but it looks to me like NVIDIA is hedging their bets a little more either way by going the large-GPU route first, doing a shrink, and then going multi. Of course (as you mentioned), they have to do this at the cost/risk of designing a huge chip to begin with, which, as NV themselves have said, is "fucking hard".
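(A minimal sketch of the power ceiling being alluded to here, using the commonly cited per-connector limits - 75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin; the 188 W figure is the board power usually quoted for a single HD 5870, and everything else is a rough assumption.)

```python
# Commonly cited add-in-card power-delivery limits, in watts.
POWER_SOURCES_W = {"x16 slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(aux_connectors):
    """Total power available from the slot plus the auxiliary connectors fitted."""
    return sum(POWER_SOURCES_W[c] for c in ["x16 slot"] + aux_connectors)

# HD 5970 style board: one 6-pin + one 8-pin -> 300 W ceiling,
# versus two full-clock Cypress GPUs at roughly 188 W each.
print(board_power_budget(["6-pin", "8-pin"]))  # 300
print(2 * 188)                                 # ~376 W, well over the ceiling
```

Under those figures, two full-clock Cypress GPUs don't fit inside the 300 W budget, which lines up with the earlier point about the 5970 shipping with reduced clocks.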