Nvidia stockpiles 55nm parts for a massive assault on ATI

Originally posted by: nRollo
Unless you own NVIDIA, how does that matter to you, Creig?

GTX 260 Core 216s allegedly cost more to produce than 1GB HD4870s, yet sell for less, outperform them, and offer unique features. Apparently costing more to produce doesn't always matter much to the buyer.

The market will set the price of these cards, like any other. If they're the highest performing, they'll likely cost the most.

It's interesting to me how many times I've seen ATi fans post about production costs for NVIDIA products, when the only cost that matters to any of us is the one in the store.

What's next? Comparisons of the real estate taxes on facilities and wages for engineers? I don't get it.

I was explaining biostud's statement to keysplayr because he obviously didn't understand how biostud could make that determination without knowing actual pricing figures.

What exactly is YOUR problem with my statement? I don't see you refuting anything in my post. All I see is an Nvidia Focus Group member engaging in sales tactics, which I believe you are prohibited from doing:

GTX 260 Core 216s allegedly cost more to produce than 1GB HD4870s, yet sell for less, outperform them, and offer unique features. Apparently costing more to produce doesn't always matter much to the buyer.

That has ABSOLUTELY NOTHING to do with what I posted.

So if you wish to challenge something in my previous post, please do. Otherwise, shut up.

Take it down just a notch please, mmm'kay?

-ViRGE
 
Originally posted by: nRollo
Unless you own NVIDIA, how does that matter to you, Creig?
Rollo, do you even know what forum you're on? We'd argue about the color of video card coolers if given the chance; it's why this place exists in the first place. 😛

If that question were asked every single time, this forum would be reduced to 3 legitimate posts per day, alongside 2 nef threads and 4 spammers. Technology is interesting; understanding how it works, how it's made, and even what it costs are all part of what makes it interesting.
 
Originally posted by: nRollo
You've said this in at least two threads in the last 24 hours.

Could you tell us, very specifically, why you say that? Reason I ask is I have a 9800GX2 I have no issues with, and one of my 7950GX2s is still going strong for a friend.

I knew I had seen the horrible scaling somewhere:
http://www.guru3d.com/article/...s-performance-review/5

Lol it scores lower than a 3850 sometimes. 🙂

I don't know how it is with new drivers though (this article is from October 22, 2008).
 

It's interesting to me how many times I've seen ATi fans post about production costs for NVIDIA products, when the only cost that matters to any of us is the one in the store. What's next? Comparisons of the real estate taxes on facilities and wages for engineers? I don't get it.

This forum seems to have a dual purpose. One is to weigh and evaluate our options as consumers of video cards, which is obviously a chief preoccupation among us. The other is to comment on the state of the industry as a whole. This second area is intrinsically interesting, but also relevant to the first: profit margins for ATI and NV shape our expectations for competition in future product cycles. It is particularly relevant with this latest generation that AMD's graphics division seems to have returned to profitability, as many of us want to see it remain competitive, lest we once again revisit the G80 situation where the minimum buy-in for acceptable performance was $300.

- woolfe

 
Originally posted by: deerhunter716
It's interesting that Nvidia fanboys tout how their cards are still better when benchmarks prove that ATI was the new king 🙂 We shall see who is the new king after this release, and MAYBE it goes back to Nvidia; BUT competition is the real winner here.

There's more to a good gaming experience than just maximum FPS. The FPS of one dual-chip card doesn't tell the tale for the entire line-up, not to mention driver compatibility and quality.

The 9800 GX2 didn't make Nvidia better when it was the fastest card made (for a brief time). Neither did the HD 3870 X2 before it for ATI. Nor the HD 4870 X2 now. They all have their issues related to a dual GPU design. And if Nvidia comes out with a GTX 260 X2 or even GTX 280 X2 (that'll be interesting), they won't be immune to those issues either - faster or not.

Nvidia "fanboys" and ATI "fanboys" tend to both get carried away with their hyperbole about who's better and why. Both companies make excellent products and what is best for you depends on a lot of factors.

 
Originally posted by: thilan29
Originally posted by: nRollo
You've said this in at least two threads in the last 24 hours.

Could you tell us, very specifically, why you say that? Reason I ask is I have a 9800GX2 I have no issues with, and one of my 7950GX2s is still going strong for a friend.

I knew I had seen the horrible scaling somewhere:
http://www.guru3d.com/article/...s-performance-review/5

Lol it scores lower than a 3850 sometimes. 🙂

I don't know how it is with new drivers though (this article is from October 22, 2008).

It's just one game too. As I noted, all multi-GPU setups have their pitfalls and games that don't scale; I could link to several for 4870X2s, but there's no point. The 9800GX2 has good support for multi-GPU, and that's the market it competes in.
 
Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: keysplayr2003
Originally posted by: biostud
It will still be more expensive to produce than the RV770 core, but more competition is always good.

How much does each cost to manufacture?

It doesn't matter.

Assuming each design gets a comparable number of defects per wafer on the same process, the GT200 will still be more expensive to produce than the RV770 simply due to transistor count (1.4 billion vs 965 million). This means that Nvidia gets fewer cores per wafer than AMD. In addition, a larger die means a correspondingly higher chance of a defect occurring on each core. So Nvidia's yields will still be lower even if the number of defects per wafer is equal to AMD's. Thus, Nvidia's GT200 will still be more expensive to produce than the RV770, just as biostud stated.

Unless you own NVIDIA, how does that matter to you, Creig?

GTX 260 Core 216s allegedly cost more to produce than 1GB HD4870s, yet sell for less, outperform them, and offer unique features. Apparently costing more to produce doesn't always matter much to the buyer.

The market will set the price of these cards, like any other. If they're the highest performing, they'll likely cost the most.

It's interesting to me how very many times I've seen ATi fans post about production costs for NVIDIA products when they only cost that matters to any of us is the one in the store.

What's next? Comparisons of the real estate taxes on facilities and wages for engineers? I don't get it.

It matters when AMD are able to lower their selling price even more to stay competitive.

Also it's good for the environment 😉
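
The yield half of that argument can be sketched numerically with the classic Poisson defect model, where yield = e^(-die area × defect density). A minimal Python sketch, with one big assumption: the 0.13 defects/cm² below is not a real TSMC figure, just an illustrative value chosen so the smaller die lands near the ~71% RV770 yield quoted later in the thread.

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: probability that a die has zero fatal defects."""
    die_area_cm2 = die_area_mm2 / 100.0
    return math.exp(-die_area_cm2 * defects_per_cm2)

D0 = 0.13  # illustrative defect density (defects/cm^2), NOT a real TSMC figure

# RV770 ~256 mm^2 vs GT200 ~576 mm^2
for name, area in [("RV770", 256), ("GT200", 576)]:
    print(f"{name} ({area} mm^2): ~{poisson_yield(area, D0):.0%} estimated yield")
```

Under that assumption the 256 mm² die yields about 72% while the 576 mm² die drops to about 47%: equal defects per wafer, much worse odds per die, exactly as argued above.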
 
Originally posted by: nRollo
It's just one game too. As I noted, all multi-GPU setups have their pitfalls and games that don't scale; I could link to several for 4870X2s, but there's no point. The 9800GX2 has good support for multi-GPU, and that's the market it competes in.

It is one game, but a popular one, isn't it? It was used for all those 4870 vs 216 reviews when the new 180 driver came out, so it's obviously important.
 
Originally posted by: thilan29
Originally posted by: nRollo
It's just one game too. As I noted, all multi-GPU setups have their pitfalls and games that don't scale; I could link to several for 4870X2s, but there's no point. The 9800GX2 has good support for multi-GPU, and that's the market it competes in.

It is one game, but a popular one, isn't it? It was used for all those 4870 vs 216 reviews when the new 180 driver came out, so it's obviously important.

Didn't the 4870 sandwich have issues at the FC2 launch?
 
Originally posted by: Creig
Originally posted by: keysplayr2003
Originally posted by: biostud
It will still be more expensive to produce than the RV770 core, but more competition is always good.

How much does each cost to manufacture?

It doesn't matter.

Assuming each design gets a comparable number of defects per wafer on the same process, the GT200 will still be more expensive to produce than the RV770 simply due to transistor count (1.4 billion vs 965 million). This means that Nvidia gets fewer cores per wafer than AMD. In addition, a larger die means a correspondingly higher chance of a defect occurring on each core. So Nvidia's yields will still be lower even if the number of defects per wafer is equal to AMD's. Thus, Nvidia's GT200 will still be more expensive to produce than the RV770, just as biostud stated.

Sure, as long as you overlook things like R&D costs, overhead, marketing, reality, etc.
 
This sounds like bad news for Nvidia. They have stockpiled a 3-month supply of 55nm parts. I guess they can't get rid of all the old 65nm parts, and they don't want to release the new parts since they can't even sell the old ones. So they are sitting on mounds of GPUs they can't sell, and we are in a recession. Very bad news for Nvidia.
 
Originally posted by: SolMiester
Originally posted by: thilan29
Originally posted by: nRollo
It's just one game too. As I noted, all multi-GPU setups have their pitfalls and games that don't scale; I could link to several for 4870X2s, but there's no point. The 9800GX2 has good support for multi-GPU, and that's the market it competes in.

It is one game, but a popular one, isn't it? It was used for all those 4870 vs 216 reviews when the new 180 driver came out, so it's obviously important.

Didn't the 4870 sandwich have issues at the FC2 launch?

I'm not sure...I think they did. I wasn't trying to compare issues between video cards...if you think I was, then please reread my original post linking to the article I was talking about. I was posting about a 9800GX2 issue, which nRollo asked for.
 
Originally posted by: BladeVenom
This sounds like bad news for Nvidia. They have stockpiled a 3-month supply of 55nm parts. I guess they can't get rid of all the old 65nm parts, and they don't want to release the new parts since they can't even sell the old ones. So they are sitting on mounds of GPUs they can't sell, and we are in a recession. Very bad news for Nvidia.

Rethink this, and maybe you'll come up with the correct scenario.
 
Originally posted by: Wreckage
Sure, as long as you overlook things like R&D costs, overhead, marketing, reality, etc.

Both AMD and Nvidia have R&D costs; they both have overhead; they both pay for marketing. In fact, I would imagine a 1.4 billion transistor GPU would cost more in R&D than a 965 million one. Plus, I'm pretty sure Nvidia spends more on advertising than AMD. This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!

I'm not sure what your wisecrack "reality" comment is supposed to mean, however.
 
Originally posted by: Creig
Originally posted by: Wreckage
Sure, as long as you overlook things like R&D costs, overhead, marketing, reality, etc.

Both AMD and Nvidia have R&D costs; they both have overhead; they both pay for marketing. In fact, I would imagine a 1.4 billion transistor GPU would cost more in R&D than a 965 million one. Plus, I'm pretty sure Nvidia spends more on advertising than AMD. This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!

I'm not sure what your wisecrack "reality" comment is supposed to mean, however.

While you strive very thoroughly to make your point, you still do not make clear the "purpose" of your point. And what it means to the end user. You. Me.
 
Originally posted by: keysplayr2003
How much does each cost to manufacture?

It is generally quoted that a single 300mm 65nm wafer from TSMC costs around $5000. TSMC has lots of customers, so I guess over time the info has leaked out somehow. 55nm is more expensive, but hopefully not by too much.

So, for a back-of-the-napkin calculation:
Total wafer area = π × (300/2)² ≈ 70,685 mm²
RV770 die size = 256 mm² => max 276 dies per wafer
GT200 die size = 576 mm² => max 122 dies per wafer

A more accurate count of dies per wafer could be obtained by finding a wafer shot of the relevant chip and counting the number of dies present.

Now yield - you never get 100% of the chips present on the wafer. Each wafer has imperfections, but the defect rate is roughly predictable. You can download spreadsheets that calculate yields once you type in the chip size and a few other relevant things about the foundry process. I can't be bothered doing that now, so I will just quote the yield at 71% for the RV770, though it is likely higher than that now.
i.e. 71% × 276 ≈ 196 working dies per wafer => average die cost ~$25

For the GT200, all sorts of figures have been thrown around, from as low as 40% up to around 70%. If we are generous and use the highest of these:
70% × 122 ≈ 85 dies per wafer => average die cost ~$59

The above is the variable cost only - it does not include the huge fixed up-front cost of design and testing, or other production costs like masks. Board costs must be added as well, but they're a separate post.

The above is rough; feel free to spend time making the numbers better 😉
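
That napkin math is easy to check in a few lines of Python. This is just a sketch of the calculation above, assuming the same unofficial numbers - a $5000 wafer, the quoted die sizes, and the rumored yields - with die counts that ignore edge loss and scribe lines, as noted:

```python
import math

WAFER_COST = 5000.0     # rumored 300mm 65nm TSMC wafer cost, USD
WAFER_DIAMETER = 300.0  # mm

def max_dies(die_area_mm2):
    """Optimistic dies per wafer: total wafer area / die area
    (ignores edge loss and scribe lines, so real counts are lower)."""
    wafer_area = math.pi * (WAFER_DIAMETER / 2.0) ** 2  # ~70,685 mm^2
    return int(wafer_area // die_area_mm2)

def cost_per_good_die(die_area_mm2, yield_rate):
    """Average wafer cost spread over the working dies."""
    good_dies = max_dies(die_area_mm2) * yield_rate
    return WAFER_COST / good_dies

# RV770: 256 mm^2 at the quoted ~71% yield
print(f"RV770: {max_dies(256)} dies/wafer, ~${cost_per_good_die(256, 0.71):.0f} per good die")
# GT200: 576 mm^2 at the generous 70% end of the rumored 40-70% range
print(f"GT200: {max_dies(576)} dies/wafer, ~${cost_per_good_die(576, 0.70):.0f} per good die")
```

Within a dollar of rounding, that reproduces the ~$25 vs ~$59 ballpark above; plug in the 40% end of the GT200 yield rumors instead and its die cost jumps past $100.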
 
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: Wreckage
Sure, as long as you overlook things like R&D costs, overhead, marketing, reality, etc.

Both AMD and Nvidia have R&D costs; they both have overhead; they both pay for marketing. In fact, I would imagine a 1.4 billion transistor GPU would cost more in R&D than a 965 million one. Plus, I'm pretty sure Nvidia spends more on advertising than AMD. This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!

I'm not sure what your wisecrack "reality" comment is supposed to mean, however.

While you strive very thoroughly to make your point, you still do not make clear the "purpose" of your point. And what it means to the end user. You. Me.

I like it when Creig posts just because I get a chance to laugh at one of my old smart ass comments in his sig. 🙂

I don't agree with his assumption that a bigger chip would necessarily cost more than a smaller one to design in R&D, though - the speed and accuracy of the engineers probably plays into that a lot. For example, NVIDIA could design 1.4B transistors faster than I could design one. 😉

 
Originally posted by: nRollo
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: Wreckage
Sure, as long as you overlook things like R&D costs, overhead, marketing, reality, etc.

Both AMD and Nvidia have R&D costs; they both have overhead; they both pay for marketing. In fact, I would imagine a 1.4 billion transistor GPU would cost more in R&D than a 965 million one. Plus, I'm pretty sure Nvidia spends more on advertising than AMD. This would make Nvidia GPUs even MORE expensive compared to their AMD counterparts. So in your attempt to prove me wrong, you're actually driving my point home. Thanks!

I'm not sure what your wisecrack "reality" comment is supposed to mean, however.

While you strive very thoroughly to make your point, you still do not make clear the "purpose" of your point. And what it means to the end user. You. Me.

I like it when Creig posts just because I get a chance to laugh at one of my old smart ass comments in his sig. 🙂

I don't agree with his assumption that a bigger chip would necessarily cost more than a smaller one to design in R&D, though - the speed and accuracy of the engineers probably plays into that a lot. For example, NVIDIA could design 1.4B transistors faster than I could design one. 😉

Hell, I'll give 'em a run! Now, where did I put that soldering iron and that leftover Silly Putty?
 
Originally posted by: keysplayr2003
While you strive very thoroughly to make your point, you still do not make clear the "purpose" of your point. And what it means to the end user. You. Me.

My "purpose" was to prove to you that biostud was correct in his assertion that the GT200 would be more expensive to produce than the RV770. You attempted to get him to come up with some dollar figure, so I explained to you why it would be so. You can read rjc's calculations to see my explanation in dollar form (thanks rjc!).

As to "what it means to the end user" is that AMD is able to bring their cards to market at a lower MSRP since their dies are less expensive to produce and they get more viable cores per wafer. They also have more ability to lower prices while maintaining a profit than Nvidia.

I would have thought this would be obvious given my explanation.
 
Around here, GM has a massive stockpile which they are ready to release at reduced prices to gain market share!!

Nvidia PR around here is quite funny when not trolling. Making inventory a strength sounds good...

Anyway, it looks like AMD will die soon, so much of this seems to be moot.
 
Originally posted by: Creig
Originally posted by: keysplayr2003
While you strive very thoroughly to make your point, you still do not make clear the "purpose" of your point. And what it means to the end user. You. Me.

My "purpose" was to prove to you that biostud was correct in his assertion that the GT200 would be more expensive to produce than the RV770. You attempted to get him to come up with some dollar figure, so I explained to you why it would be so. You can read rjc's calculations to see my explanation in dollar form (thanks rjc!).

As to "what it means to the end user" is that AMD is able to bring their cards to market at a lower MSRP since their dies are less expensive to produce and they get more viable cores per wafer.and They also have more ability to lower prices while maintaining a profit than Nvidia.

I would have thought this would be obvious given my explanation.

Bold 1: See where your head is at? I attempted to "get" him to come up with a figure.
What really happened: I asked him because he sounded like he knew, and I didn't.

Bold 2: What are you talking about? The GTX 260 Core 216 is even cheaper than a 4870 1GB card right now, and some offers include a bundled Far Cry 2! If what you say is true, then AMD should be able to offer their 4870 at a cheaper price with a bundled game too, right?

Bold 3: How does this matter to you? Me? Any end user?
What in the world is this infatuation with company costs? As long as YOU, the end user, pay a reasonable price for a certain performance segment, that is where your infatuation should end. Because if it doesn't end, that means you care more about the company than you do the hardware. Which does not make sense.
 