AMD responds to "tough" questions from nVidia


bryanW1995

Lifer
May 22, 2007
Originally posted by: DefRef
Yawn... Gee, is it "ATI fanboys squealing like a Miley Cyrus audience over how their beloved Red company will trounce the Green team" time again?

Listen, kids, ATI put out a new card first and nVidia will have one coming later. There are two ways this can be spun:

1. The ATI fanboy way - "HA-HA! nVidia is crapping themselves because the new ATI 59000000000000000000000HDROFLCOPTERZOMGBBQ is teh fastester and since they aren't rushing to announce their GTX300 cards, it can only mean one thing - they got NUTHIN!!! Bwahahahahahaha!!!"

2. The sane person way - "ATI has unveiled their new card and it's quite impressive. However, now nVidia knows exactly where the price/power bar is set and could respond with something that smokes ATI's offering and offers gamers a better value. Since ATI has nothing to retaliate with for at least a year, those early sales will be pretty much it."

IF nVidia matches and/or exceeds what ATI has - pretty likely since, other than the 9700 vs. 5800 days, nVidia's had ATI's number for a decade - then I wonder how many people whooping it up now will become whiners that they got "stuck" with a slow card? Heh.

THIS is what I was talking about. We need more nvidiots to come in here and threadcrap; at least that would mean that nvidia is putting up a fight. Honestly, I've been scared (ok, my wallet has been scared at least) by nvidia's almost complete silence. If the next 8800gtx were just around the corner, would we have been hearing about PhysX for the past week?

When was the last time that a dominating card was released 2-6 mos AFTER the other team's card? What was that? Never? Oh, right. Yes, nvidia has ati's specs, but these weren't nearly as well hidden as the 4xxx series was. Nvidia is behind right now because they are making more changes and are trying to make up the half-step advantage that ati has enjoyed for so long. However, the longer we go with NO IDEA what gt300 will be like, the better the chances are that all of us (other than the zoners, of course) are going to be pissed off, because we'll end up spending more money on video cards than we're used to.

I went from a 3870 to a 4850 to a gtx 260 core 216 for $85 total net out of pocket after selling my old cards over the past 18 mos. Something tells me that if nvidia doesn't roll something out pretty soon, I might have to stay with the 260 for a while.
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: OCguy
Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.


nV releases X2 cards every generation. There was a report months ago that they are working on a GT300 gx2 card.

Trying to discredit what could be the fastest single card for the next year or so by saying it "isn't planned" doesn't really work.


I am fairly certain the single PCB 295 was nV's test-run with a non-sandwich gx2 card. You really think they made that switch in the middle of the product's lifecycle for the hell of it?

didn't they make that switch after they went to 55nm? They probably made it as a 2 pcb solution originally b/c they couldn't make it work with high-enough clock speeds on a single pcb at 65nm, then switched to the less expensive design when the switch to 55nm allowed a die shrink.
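
Back-of-envelope on that shrink (a toy calculation assuming ideal linear scaling, which real shrinks never quite hit):

```python
# An ideal 65nm -> 55nm optical shrink scales linear dimensions by 55/65,
# so die area (and, roughly, cost per die) scales by the square of that ratio.
# Real shrinks fall short of the ideal, so treat this as an upper bound.
shrink = (55 / 65) ** 2
print(f"ideal area scaling: {shrink:.2f} (~{(1 - shrink) * 100:.0f}% smaller die)")
```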

edit: @ocguy: I don't see a massive market shift towards ati unless gt300 is VERY late and VERY slow. As competitive as 4xxx was, amd still gained little to no market share, even when competing against poor alternatives like g92 in many segments. Nvidia's marketing machine has just been too good for ati to compete with for a long time.
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: bryanW1995

didn't they make that switch after they went to 55nm? They probably made it as a 2 pcb solution originally b/c they couldn't make it work with high-enough clock speeds on a single pcb at 65nm, then switched to the less expensive design when the switch to 55nm allowed a die shrink.

I was under the impression that the original 295 was already 55nm, because it was 2x275s, but I could be wrong.
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: OCguy
Originally posted by: bryanW1995

didn't they make that switch after they went to 55nm? They probably made it as a 2 pcb solution originally b/c they couldn't make it work with high-enough clock speeds on a single pcb at 65nm, then switched to the less expensive design when the switch to 55nm allowed a die shrink.

I was under the impression that the original 295 was already 55nm, because it was 2x275s, but I could be wrong.

Sorry, you're right. I forgot that it didn't release until January this year. It would be very interesting if they're able to put two gt300 chips on one pcb right from the start; it would definitely help them keep the costs under control.
 

AzN

Banned
Nov 26, 2001
Originally posted by: OCguy
Originally posted by: Azn
Yeah ATI really dropped the ball on Nvidia last round and continuing the same saga this round.

Who would have thought making upper mid-range chips and high-performance dual-GPU cards would hurt Nvidia with their huge single-chip design?

It made Nvidia imitate ATI's business strategy last round. This round is no different.

Now who here thinks GT300 can't be made into a gx2 card because of Nvidia's power-hungry design, and will lose to 5870x2 cards once it's released?

I think it is too early to tell exactly what is happening this round. ATi is getting some early adopters, but there are a ton of people waiting to see what happens with G300.

If a single G300 is faster than the 5870, then I would bet the farm that the gx2 part is faster than the x2 part as well.

Gaining 3% marketshare while still losing money, when you have all the hype and great price/performance like the 4890, isn't exactly "dropping the ball" on nV.

Of course this could all be moot if G300 comes out and does not compete with 5XXX. At that point, ATi would have both value and top performance. You would see a massive marketshare shift in AMD's favor.

It is early but 5870 is really putting the pressure on Nvidia. Nvidia has a tough road ahead of them.

As for a gx2: that was the main question. I doubt Nvidia will make a gx2 unless they adopt the same low-power-consumption strategy as the 5870, which is highly unlikely, while still performing faster than it. Is a gx2 on GT300 even possible? Perhaps with a cut version of the chip, like how they did the GTX295. That means Nvidia will have to perform 30% faster than the 5870 to be competitive, or something of that nature.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Idontcare
thilan, that doesn't really negate my point in regards to making die-size comparisons the basis for cost estimations alone. If that isn't self-evident then I'll concede I failed to properly make my point to begin with, and I'll happily go back to the drawing board and try again if you think you would find value in my doing so. I don't mind making the effort.

Did you mean just the die selling for less than $300 or the whole card? Is it possible that it was the AIBs that were losing money and not nV?
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: Azn

It is early but 5870 is really putting the pressure on Nvidia. Nvidia has a tough road ahead of them.

As for a gx2: that was the main question. I doubt Nvidia will make a gx2 unless they adopt the same low-power-consumption strategy as the 5870, which is highly unlikely, while still performing faster than it. Is a gx2 on GT300 even possible? Perhaps with a cut version of the chip, like how they did the GTX295. That means Nvidia will have to perform 30% faster than the 5870 to be competitive, or something of that nature.

We'll see. I am betting they hold the top single-card slot with a GX2 and the top single-GPU slot again.

Remember, this isn't nV just doubling the specs of GT200; this is a whole new arch, unlike 5XXX. So it really is a wild card.

Maybe they concentrated more on GPGPU so 3D gaming suffers? Maybe it is 25% faster than 5XXX? Nobody knows.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: OCguy
Maybe they concentrated more on GPGPU so 3D gaming suffers? Maybe it is 25% faster than 5XXX? Nobody knows.

Personally I think they'll do REALLY well with GPGPU (after all the emphasis and effort they've been putting into it) but their gaming performance will only be maybe 5-10% faster than the 5870's. Either way I hope prices stay down.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: OCguy
Originally posted by: SlowSpyder
Nvidia can always release an x2 part as well, but it seems like their x2 parts are more reactionary than planned... I could just see an x2 Nvidia part taking a while to show up, assuming it's needed.


nV releases X2 cards every generation. There was a report months ago that they are working on a GT300 gx2 card.

Trying to discredit what could be the fastest single card for the next year or so by saying it "isn't planned" doesn't really work.


I am fairly certain the single PCB 295 was nV's test-run with a non-sandwich gx2 card. You really think they made that switch in the middle of the product's lifecycle for the hell of it?

Not sure if I came off as I'm trying to somehow discredit an x2 GT300, but that wasn't what I was aiming for.

I *thought* that I had seen that the 5870x2 is supposed to be out by November... no idea if that's true at this point or not. If Nvidia gets their GT300 out in the December time frame, then they may be launching it with both the 5870 and 5870x2 already out, but when will Nvidia's x2 part arrive? Will it arrive?

I guess why I said that the Nvidia x2 parts seem to me to be reactionary is that, unless I'm wrong here, it was after the 3870x2 came about that Nvidia launched the 9800GX2. When the X1950 XTX was out, Nvidia's answer was to release the 7950 GX2. After the 4870x2 was out, Nvidia found a way (it seemed this was the case) to get an x2 part out. Maybe 'reactionary' is the wrong term, but maybe the planning is done and the part is only brought out if need be, whereas x2 parts are all over AMD's roadmaps.
 

mmnno

Senior member
Jan 24, 2008
Originally posted by: Idontcare
Originally posted by: mmnno
Thinking more logically, nVidia probably will not let ATi beat them in value. They will either release a more powerful card at a higher price, which enthusiasts will be happy to pay if it is reasonable, or they will release a competitive card and force ATi to cut prices.

The only way nVidia can disappoint is if they release a competitive card at a higher price, thinking that features like PhysX and CUDA give them pricing power. That would suck but I suspect ATi might cut prices anyway, so I'm confident that waiting will bring better deals.

Also, who is to say that NV has any interest in lowering prices? They might just like the price/performance level set by AMD and choose to simply align the ASPs of their SKUs to that price/performance segmentation.

Well, ATi (or is it their partners?) has shown that they aren't willing to compete at an equal price point with nVidia. Granted, they haven't had a part that matches up exactly recently, but the 4890 was pretty close to the 275 and ATi still went down the chain in a hurry. So if nVidia releases a card that matches the 5870 at the same price, I'm betting ATi will either price drop or consistently undersell.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: thilan29
Originally posted by: Idontcare
thilan, that doesn't really negate my point in regards to making die-size comparisons the basis for cost estimations alone. If that isn't self-evident then I'll concede I failed to properly make my point to begin with, and I'll happily go back to the drawing board and try again if you think you would find value in my doing so. I don't mind making the effort.

Did you mean just the die selling for less than $300 or the whole card? Is it possible that it was the AIBs that were losing money and not nV?

I was just speaking in regards to the IC itself. Regardless of whether they actually sold it for $200 or $300 or $400, my point is that it was designed to be capable of being manufactured so as to enable a pricepoint that is basically an order of magnitude lower than that for comparably sized chips.

To do that, coming from where I came from and seeing what I've seen, is a testament to something that I am simply incapable of effectively communicating with words that aren't riddled with meaningless acronyms like DFM itself, but I try anyways ;)

If you look at chips that weigh in at >400mm^2 you'll notice that they tend to command a retail price in excess of $1000, as that is what is necessary to offset the typically low yields such products experience at the hands of a fab's D0 (defect density).
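
To put rough numbers behind that, here's a toy Poisson yield model; the wafer cost and D0 below are illustrative guesses, not actual TSMC figures:

```python
import math

WAFER_COST = 5000.0  # assumed cost of one processed 300mm wafer, USD
D0 = 0.4             # assumed defect density, defects per cm^2

def cost_per_good_die(die_area_mm2):
    wafer_area = math.pi * (300 / 2) ** 2          # area of a 300mm wafer, mm^2
    gross_dies = 0.85 * wafer_area / die_area_mm2  # ~15% lost to edge/scribe
    yield_frac = math.exp(-(die_area_mm2 / 100) * D0)  # Poisson yield, area in cm^2
    return WAFER_COST / (gross_dies * yield_frac)

for area in (100, 250, 400, 576):  # 576mm^2 is roughly GT200 at 65nm
    print(f"{area:4d} mm^2 -> ~${cost_per_good_die(area):6.0f} per good die")
```

Even holding everything else equal, cost per good die climbs far faster than area alone, which is exactly why >400mm^2 parts usually land in four-figure products.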

Designing redundancy into a chip, such as cache line redundancy and fuses, is one of the earlier methods of designing a chip for the realities of the manufacturing environment from a yield-enhancement standpoint. Harvesting is yet another rung up the ladder, as was clockspeed binning from the beginning of time.
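
Harvesting is easy to see in the same toy model; the salvage fraction here is a made-up assumption:

```python
import math

def effective_yield(die_area_mm2, d0=0.4, salvage_frac=0.5):
    full = math.exp(-(die_area_mm2 / 100) * d0)  # fully functional dies (Poisson)
    # Assume some fraction of defective dies is recoverable as a cut-down SKU
    # (think GTX 260 harvested from GT200 silicon).
    return full + salvage_frac * (1 - full)

print(f"full-die yield at 576mm^2: {math.exp(-(576 / 100) * 0.4):.2f}")  # ~0.10
print(f"sellable with harvesting:  {effective_yield(576):.2f}")          # ~0.55
```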

But if there ever was to be a prologue to the rather lengthy book on DFM for the GT200, it would simply read - you can buy them, in retail products, for less than $300...QED - and just about anyone in the industry in the business of designing and fabbing silly large chips (and the GT200 was just that) would agree with that prologue without bothering to read the rest of the book. The proof of the pudding is indeed in the eating.

If this chip had cost $3000 and was targeted strictly at niche HPC applications, no one in those markets would have batted an eye at that pricepoint, and that pricepoint would have enabled gross margins that would have allowed far riskier architecture and more aggressive reliance on corner-like conditions in the xtor metrics, etc. That they drove that thing thru the fab with the intent to sell it at a pricepoint that enabled a $300 product in retail is a ZOMFG moment to folks like myself who spent our lives working on fabbing/yielding silly large chips (for their time).
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Idontcare
I was just speaking in regards to the IC itself. Regardless of whether they actually sold it for $200 or $300 or $400, my point is that it was designed to be capable of being manufactured so as to enable a pricepoint that is basically an order of magnitude lower than that for comparably sized chips.

I can't find the link anymore, but there was a table showing the cost of each component on a card, and it put the GTX285 die at $90 (probably the price to the AIB partner, so even less for nV?). Knowing now what you've said, I'm glad they ARE able to sell the whole card for so cheap.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: thilan29
Originally posted by: Idontcare
I was just speaking in regards to the IC itself. Regardless of whether they actually sold it for $200 or $300 or $400, my point is that it was designed to be capable of being manufactured so as to enable a pricepoint that is basically an order of magnitude lower than that for comparably sized chips.

I can't find the link anymore, but there was a table showing the cost of each component on a card, and it put the GTX285 die at $90 (probably the price to the AIB partner, so even less for nV?). Knowing now what you've said, I'm glad they ARE able to sell the whole card for so cheap.

When those first GT200 chip stats and reviews came out I can't tell you how much time the engineers at IBM, TI, Intel, AMD, and others wasted at the watercooler with their minds boggling at what had just been done.

"How did they do that!?" "Can you believe they did that!? And on TSMC's node no less!?" "Why can't WE do that!?"

Seriously, in that small world it was like working on the Apollo program to get a man on the moon, and in struts China and puts a man on Mars before you even get to the moon, all for 1/10 the pricetag of your own efforts.

There were many a project budget review thereafter, many.

Let me see, how to work in the mandatory car analogy here... imagine you work at Toyota or Honda as an automotive design engineer on an SUV product line that has traditionally garnered a meager 16-19 mpg EPA rating, and General Motors releases an equivalent weight- and size-class SUV with 60 mpg, having somehow managed to design the thing so it can be produced and sold for $10k versus your $20k or $30k pricepoint.

You'd be gobsmacked at how they just did that. Oh, and they get it manufactured in an outsourced production plant you'd consider dilapidated compared to your in-house state-of-the-art production facilities, seemingly without any compromise in build quality.

Needless to say, I gained a whole other level of respect for fabless design houses as well as foundry process and yield capabilities that day. But just like AMD's (GF's) APM, what makes it happen is not that NV has the right DFM tools (everyone has access to those) but rather it's the people, the risk takers and the managers. Clearly, the proof is in the pudding: they were willing to do something few have ventured to attempt.

Not saying it was business genius and that theirs is the path that leads to gold at the end of the rainbow. Clearly the margins and price/performance situation were (give or take) on par with AMD's in the grand scheme of things.

Just saying that of ALL the IC design houses out there right now, the last one I'd place in the "chip is gonna be big, so yields will be low and the chip will be costly" folder is Nvidia when it comes to GT300 prognostications. I simply would not want to underestimate the capabilities of the people working there again.