gk104 in production, April release? [fud]


AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
... But Nvidia reportedly had poor yields, and paid by the wafer for 40nm, and that cost $5000 per wafer as well.
Pretty sure Nvidia did not pay for bad dice on 40nm; at worst, TSMC ate some of the costs, and the same goes for AMD. So costs are almost certainly higher on 28nm initially, and Nvidia has specifically stated that lower margins will be a direct result of the above.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You are making too much out of it anyway. That thread is about planning to buy Kepler. Those plans can and will change depending on how the card actually performs and what it costs. I am sure many people were planning to buy the 7970 and 7950 too, but changed their minds once they saw the price.

Yes, I'm well aware of that, as I said nothing is set in stone.

Re-read the conversation and you'll see that all I did was make a point and support it. Regardless of what is known (good, bad, logical, illogical, or nothing at all), there is an audience stating their intention to buy it; again, that doesn't mean they'll follow through.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I really don't see nVidia undercutting AMD, as they have never done so as far as I recall.

8800GT. NV released a card that essentially made its own high-end card (8800 GTX) and AMD's 2900XT obsolete at half the price.

Of course, not since then because AMD has been competing almost solely on price/performance. AMD now appears to be looking for higher margins with the 7-series, so there might be some more wiggle room for NV to compete on price and still make a profit.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I can't find it, sorry. It was basically an internal newsletter to their employees pulling quotes from various reviews of the HD 7970 that hit on its cons (mainly lack of value vs. current 40nm offerings, and not as big a jump over the previous best single GPU). It certainly doesn't mean that Nvidia won't price their new GPUs sky-high as well, but it at least shows they notice and read what reviewers say.

Excuse me, but this is a load of BS.

Since when do the experts outsmart the market? If a gfx card cannot sell at a price, the price will be lowered.

If a product is priced too high, it will not move off the shelves. If it moves too quickly, and there is a constant shortage, it is clearly priced too low. It's that simple.

Now then, is a 7970 priced too high?

When NV can sell cards consistently at a higher price than AMD, are they then priced too high? Well, no: they are priced as they should be. It's very seldom that we see clearly mispriced cards, with the 58xx series as an exception.
 
Feb 19, 2009
10,457
10
76
Pretty sure Nvidia did not pay for bad dice on 40nm; at worst, TSMC ate some of the costs, and the same goes for AMD. So costs are almost certainly higher on 28nm initially, and Nvidia has specifically stated that lower margins will be a direct result of the above.

Correct. Their CEO has made the point that margins are going to hurt with low yields this time around, because they are paying per wafer unlike before.

Looking at it from an outsider's point of view, neither company is in a position to start a price war.

It would be great for consumers if NV starts a price war when GK104 is out... but it's unlikely.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
When NV can sell cards consistently at a higher price than AMD, are they then priced too high? Well, no: they are priced as they should be. It's very seldom that we see clearly mispriced cards, with the 58xx series as an exception.

Actually, the 58xx are a great example of the market responding. The cards were priced appropriately relative to the competition already available (the 5870 cost more than the GTX 285, and the HD 5850 a little less).

The cards were so well received, and in such short supply, that the price went up through retailer gouging. Even after competition arrived, the HD 58xx still carried a price premium.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
I don't think anyone knows that for sure. On the 40nm node, TSMC offered their bigger customers (mainly AMD and Nvidia) a choice between per-die pricing (working units only) and per-wafer pricing. From what I understand, both vendors mixed and matched depending on the SKU, and that mix also changed during the production run. We can't know exactly how the contracts were inked, but rumors were that TSMC bent over backwards to accommodate Nvidia during the initial Fermi runs and lost their shirt because of it. Keep in mind this is speculation; we simply don't know.

But it doesn't matter now because TSMC has adjusted their pricing model to prevent another potential Fermi like beating they supposedly took.

So when JHH says they have been doing a wafer based system "for some time" it probably means somewhere in the middle of 40nm production they went to the wafer pay structure.
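The difference between the two contract structures described above can be sketched numerically. All figures below are hypothetical placeholders for illustration, not actual TSMC prices or yield data:

```python
# Sketch of the two rumored TSMC pricing models. Under a per-wafer
# contract the customer pays for the whole wafer, so bad dies inflate
# the cost of every good one; under a per-die contract the foundry
# absorbs the yield risk. All numbers are made up for illustration.

def cost_per_good_die_wafer(wafer_price, dies_per_wafer, yield_rate):
    """Per-wafer contract: total wafer price spread over good dies only."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_price / good_dies

def cost_per_good_die_perdie(price_per_good_die):
    """Per-die contract: a fixed price per working die, regardless of yield."""
    return price_per_good_die

# At a healthy 80% yield, 100 candidate dies per wafer:
print(cost_per_good_die_wafer(5000, 100, 0.80))  # 62.5
# If yield collapses to 20%, the per-wafer payer's cost quadruples:
print(cost_per_good_die_wafer(5000, 100, 0.20))  # 250.0
# A per-die payer's cost would stay wherever the contract fixed it:
print(cost_per_good_die_perdie(62.5))  # 62.5
```

This is why the posts above tie the pricing model to margins: whoever holds the per-wafer contract eats the yield shortfall.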
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
http://seekingalpha.com/article/370...2-results-earnings-call-transcript?part=qanda



That doesn't sound like they just started using wafer-based pricing. Does anyone know for sure (i.e. have a link) that Nvidia did not use wafer-based pricing with Fermi?



Charlie's Nvidia DOOM story at SA was based upon Nvidia paying by the wafer, with the yields being what was going to DOOM them. BSN also did an article back then saying that they bought by the wafer.


Even if Nvidia beats the initial production targets by ten times, its yields are still in the single digit range. At $5,000 per wafer, 10 good dies per wafer, with good being a very relative term, that puts cost at around $500 per chip, over ten times ATI’s cost. The BoM cost for a GTX480 is more than the retail price of an ATI HD5970, a card that will slap it silly in the benchmarks.
IMHO, this is just spin on how company XXXX is screwed. They are paying this way, so in this article they are in trouble. Next cycle, they do business the opposite way, and they are also screwed. You heard it here first. /sarcasm
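For what it's worth, the quoted article's arithmetic is internally consistent; here is the back-of-the-envelope calculation it is making, using the article's own (unconfirmed) figures:

```python
# Reproducing the quoted SA article's per-chip cost estimate.
# Both inputs are the article's claims, not confirmed contract numbers.
wafer_price = 5000        # claimed USD price per 40nm wafer
good_dies_per_wafer = 10  # claimed single-digit-percent yield outcome
cost_per_chip = wafer_price / good_dies_per_wafer
print(cost_per_chip)  # 500.0
```

Whether those inputs were accurate is exactly what this thread is arguing about; the division itself checks out.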
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yeah everything I've seen indicated Nvidia paid by the wafer for GF100, and that wafer cost $5000.

But who knows. I just assumed that when Nvidia said the lower yields cut into their margins, they were simply stating the obvious: lower yield = lower profit = higher cost per die than anticipated. I didn't take it to mean 28nm cost more than 40nm did, which was pretty rough for Nvidia.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Hopefully Nvidia can release a card for 300 dollars that is a better value per dollar than last generation. Maybe a reviewer at one of the tech sites will address whether the wafer pricing is affecting launch prices negatively for us.
 
Feb 19, 2009
10,457
10
76
*crickets chirping*

Newsflash: NV is obviously being super cautious and not briefing anybody/partners, lest word of how awesome Kepler is going to be gets out!!!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
*crickets chirping*

Newsflash: NV is obviously being super cautious and not briefing anybody/partners, lest word of how awesome Kepler is going to be gets out!!!

The partners were briefed. Did you miss that bit of news? One thing is for sure: Nvidia does a much better job than AMD at protecting unreleased product information. There are still plenty of GTX 580s and GTX 570s for sale. It is apparent that Nvidia is too stubborn to adjust their 500-series pricing until the very last minute. In the meantime, AIBs still want to sell as many cards as they can, so the leaks aren't going to come until press packets are out and retail packaging is ready to ship.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
http://www.maximum-tech.net/confirmed-nvidia-geforce-kepler-gk104-coming-in-april-10041/

Latest rumor bit now says release will take place sometime during the first two weeks of April, not in March as was previously indicated somewhere else. It will be a hard launch with plenty of inventory. Not much else is said, other than a hint that GK104 might shake things up a little.

I'd still like to see some reviews a few weeks before it's out, like what AMD did with the HD 7970. Regardless, the move to 28nm will have taken Nvidia 3 months longer than AMD - an improvement over the move to 40nm, but still slow on the uptake. GK104 needs to nearly match the HD 7970 on performance and/or be more price competitive to make up for the longer release time for big daddy Kepler.

EDIT: If Charlie really did see GK104 back in early/mid January, I think he saw an earlier version of the chip. Which bodes well since he was initially flabbergasted. What he probably saw was A2 (Nvidia starts with A1) silicon, and between early January and April 1st, Nvidia would have enough time to respin the chip again, run tests, and go into full production. So even though it didn't come in February, and with this rumor saying it's not coming in March, to me it seems like Nvidia wanted to get it "right" on the first try (as opposed to all initial Fermi's having parts disabled). Either way, reviews can't come soon enough.
 
Last edited:
Feb 19, 2009
10,457
10
76
The performance claims so far are that it's good in PhysX games and pretty subpar in everything else. Other than that, no other performance leaks.

Pretty stale, actually; every hinted launch so far has been pushed back. I hope it's not going to be in May.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
The performance claims so far are that it's good in PhysX games and pretty subpar in everything else.


Not really.
Charlie's performance claim so far is anything you'd like it to be.
From losing to Pitcairn to dispatching Tahiti with ease.

For some reason you picked subpar performance...
 
Feb 19, 2009
10,457
10
76
Not really.
Charlie's performance claim so far is anything you'd like it to be.
From losing to Pitcairn to dispatching Tahiti with ease.

For some reason you picked subpar performance...

We've had this discussion before... let's not kill the dead donkey again.