Nvidia stockpiles 55nm parts for a massive assault on ATI

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
Maybe we'll all have something to fill (or want to fill at least) our stockings with this Christmas? :p
 

WelshBloke

Lifer
Jan 12, 2005
31,826
9,785
136
I'd have thought they'd be in the shops now if they were ready and wanted the xmas sales.

Or at least the NV hype machine would have started up.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
I hope we have an all-out war. It's good for consumers when they go at it (witness the last six months).
 

thilanliyan

Lifer
Jun 21, 2005
11,958
2,184
126
The article says it's gonna cost LESS than the current GTX 200 parts? Somehow I doubt that. It'll probably just drive down the prices of the current ones.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Originally posted by: thilan29
The article says it's gonna cost LESS than the current GTX 200 parts? Somehow I doubt that. It'll probably just drive down the prices of the current ones.

Depending on the yield it could be cheaper.
 

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
Originally posted by: Acanthus
I'll take a 55nm GTX 290.

Or a 55nm GTX280GX2.


Rumors are that there will be a 55nm GTX260 GX2, but that a GX2 isn't possible with the 280, even on 55nm.
 

badnewcastle

Golden Member
Jun 30, 2004
1,016
0
0
Originally posted by: KingstonU
Originally posted by: Acanthus
I'll take a 55nm GTX 290.

Or a 55nm GTX280GX2.


Rumors are that there will be a 55nm GTX260 GX2, but that a GX2 isn't possible with the 280, even on 55nm.

Why is it that people keep saying it's not possible to have a GTX280 GX2?

If the dies are the same size, is it simply a heat and power issue? Surely NV has ways of fixing that too, it seems.

But I have seen well-respected posters here say it isn't possible (appopin, for example - not the only one).
 

thilanliyan

Lifer
Jun 21, 2005
11,958
2,184
126
Originally posted by: zerocool84
Originally posted by: thilan29
The article says it's gonna cost LESS than the current GTX 200 parts? Somehow I doubt that. It'll probably just drive down the prices of the current ones.

Depending on the yield it could be cheaper.

It COULD be cheaper but there's no way they'd do it. Since it'll most likely perform better than the older parts, why would they price it lower?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,196
126
The problem is, what if the 55nm cards come out at a cheaper price? Then what are Nvidia's board partners going to do with their older, more expensive stock?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: VirtualLarry
The problem is, what if the 55nm cards come out at a cheaper price? Then what are Nvidia's board partners going to do with their older, more expensive stock?

Well, that's why no 55nm part has been released yet. All the old 65nm inventory is probably being cleared as we speak. It makes sense, seeing as the 55nm launch has been pushed back to January instead of the rumored October, then November, then December timeframes. It gives the board partners time to clear old stock, and I wouldn't be surprised if the GT200 has reached EOL.

nVIDIA is waiting for the right time to fully expose its 55nm transition to the world. Depending on what kind of benefits 55nm brings, waiting for the 40nm refresh might be a better idea for some, before DX11 cards from both IHVs hit late next year.

 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Originally posted by: VirtualLarry
The problem is, what if the 55nm cards come out at a cheaper price? Then what are Nvidia's board partners going to do with their older, more expensive stock?


Well, didn't they do that with the 8800GT and 9800GT? Both are exactly the same card, just one runs on 55nm and the other on 65nm. It didn't seem too problematic making the switch.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Cookie Monster
Originally posted by: VirtualLarry
The problem is, what if the 55nm cards come out at a cheaper price? Then what are Nvidia's board partners going to do with their older, more expensive stock?

Well, that's why no 55nm part has been released yet. All the old 65nm inventory is probably being cleared as we speak. It makes sense, seeing as the 55nm launch has been pushed back to January instead of the rumored October, then November, then December timeframes. It gives the board partners time to clear old stock, and I wouldn't be surprised if the GT200 has reached EOL.

nVIDIA is waiting for the right time to fully expose its 55nm transition to the world. Depending on what kind of benefits 55nm brings, waiting for the 40nm refresh might be a better idea for some, before DX11 cards from both IHVs hit late next year.

For people buying ATI or nVIDIA, that is.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: badnewcastle
Originally posted by: KingstonU
Originally posted by: Acanthus
I'll take a 55nm GTX 290.

Or a 55nm GTX280GX2.


Rumors are that there will be a 55nm GTX260 GX2, but that a GX2 isn't possible with the 280, even on 55nm.

Why is it that people keep saying it's not possible to have a GTX280 GX2?

If the dies are the same size, is it simply a heat and power issue? Surely NV has ways of fixing that too, it seems.

But I have seen well-respected posters here say it isn't possible (appopin, for example - not the only one).

By now, the 55nm process is probably very mature, with high yields. Who knows where the power envelope will be for the GT200b? The GT200 was already impressive for its power efficiency relative to its size. Who knows, a 280GX2 could be possible at 55nm.
Also, after reading that RV770 article Anand wrote, it makes me think about the option of a 256-bit bus with GDDR5. This would contribute to a smaller die size as well, beyond the actual process shrink. Isn't this what they did with G92? Only they still used GDDR3 for the G92, including the 9800GX2.

The possibilities are endless when it comes to human ingenuity, imagination and engineering.
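On the 256-bit GDDR5 point, here's a quick sketch of the bandwidth arithmetic. The bus widths and effective data rates below are the shipping GTX 280 and HD 4870 figures; treat the numbers as illustrative:

```python
def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
    """Peak bandwidth = bus width in bytes x effective data rate (million transfers/sec)."""
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

# GTX 280: 512-bit GDDR3 at 2214 MT/s effective
print(f"512-bit GDDR3: {bandwidth_gb_s(512, 2214):.1f} GB/s")   # ~141.7
# HD 4870: 256-bit GDDR5 at 3600 MT/s effective
print(f"256-bit GDDR5: {bandwidth_gb_s(256, 3600):.1f} GB/s")   # ~115.2
# GDDR5's doubled data rate lets a half-width bus get within striking distance,
# and a narrower bus means fewer memory pads, which helps shrink the die.
```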
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Originally posted by: KingstonU
Originally posted by: Acanthus
Ill take a 55nm GTX 290.

Or a 55nm GTX280GX2.


Rumors are that there will be 55nm GTX260GX2 but not possible with 280 on 55nm

Aren't the current 260, 260 Core 216, and 280 all the exact same chip with different numbers of stream processors enabled? If they just die-shrink the chip, they should be able to make all three SKUs on 55nm, plus a few more for the low-end and mainstream markets.
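For reference, the current GT200 lineup really is one die binned into three SKUs; a minimal sketch with the published unit counts:

```python
# One GT200 die, three SKUs: parts with defective clusters disabled are sold
# as the cut-down models. Unit counts are the published 65nm specs.
GT200_SKUS = {
    "GTX 260":          {"stream_processors": 192, "memory_bus_bits": 448},
    "GTX 260 Core 216": {"stream_processors": 216, "memory_bus_bits": 448},
    "GTX 280":          {"stream_processors": 240, "memory_bus_bits": 512},
}

for name, cfg in GT200_SKUS.items():
    print(f"{name}: {cfg['stream_processors']} SPs, {cfg['memory_bus_bits']}-bit bus")
```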
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
I don't think a 260/280 GX2 is the way for nVidia to go if the 9800 GX2 is used as an example. Two 9800 GTXs in SLI consistently beat a single 9800 GX2 in the majority of benchmarks, and nVidia has a poor record driver-wise in supporting their GX2 products. :)
 

biostud

Lifer
Feb 27, 2003
18,864
5,744
136
It will still be more expensive to produce than the RV770 core, but more competition is always good.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: biostud
It will still be more expensive to produce than the RV770 core, but more competition is always good.

How much does each cost to manufacture?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Originally posted by: biostud
It will still be more expensive to produce than the RV770 core, but more competition is always good.

How much does each cost to manufacture?

It doesn't matter.

Assuming each design gets a comparable number of defects per wafer on the same process, the GT200 will still be more expensive to produce than the RV770 simply due to transistor count (roughly 1.4 billion vs. 956 million). This means that Nvidia gets fewer dies per wafer than AMD. In addition, a larger die means a correspondingly higher chance of a defect landing on each die. So Nvidia's yields will still be lower even if the number of defects per wafer is equal to AMD's. Thus, Nvidia's GT200 will still be more expensive to produce than the RV770, just as biostud stated.
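To put rough numbers on that, here's a minimal back-of-the-envelope sketch using the standard gross-dies-per-wafer approximation and a Poisson yield model. The die areas are the published 65nm figures (GT200 ~576 mm^2, RV770 ~256 mm^2); the defect density is an assumed illustrative value, not real fab data:

```python
import math

WAFER_DIAMETER_MM = 300.0   # standard 300 mm wafer
DEFECT_DENSITY = 0.001      # defects per mm^2 -- assumed, same for both designs

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Standard approximation: wafer area / die area, minus an edge-loss term."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float) -> float:
    """Probability a die catches zero defects: exp(-defect_density * area)."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2)

for name, area in [("GT200 (~576 mm^2)", 576.0), ("RV770 (~256 mm^2)", 256.0)]:
    gross = gross_dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: {gross} gross dies/wafer, ~{good:.0f} good")
# With these assumptions: ~94 gross / ~53 good GT200 dies vs. ~234 gross / ~181
# good RV770 dies -- the bigger die loses on both dies-per-wafer and yield.
```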

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: FalseChristian
I don't think a 260/280 GX2 is the way for nVidia to go if the 9800 GX2 is used as an example. Two 9800 GTXs in SLI consistently beat a single 9800 GX2 in the majority of benchmarks, and nVidia has a poor record driver-wise in supporting their GX2 products. :)

You've said this in at least two threads in the last 24 hours.

Could you tell us, very specifically, why you say that? Reason I ask is I have a 9800GX2 I have no issues with, and one of my 7950GX2s is still going strong for a friend.

Here are the issues I know about for GX2s:
1. Original Quad SLI support took about 3 months, while they got around the render-ahead limitations of DX9 with SFR of AFR. Single cards worked fine for me before those drivers launched.

2. It took them a while to get Vista drivers for the 7950GX2 together, because they had to re-write their drivers from the ground up for a unified architecture and the new OS.

3. Every now and then you can find a game that doesn't work well with a GX2, which they then fix. These are the exception, not the rule, and the same could be said of any multi-GPU configuration.

You paint a much bleaker picture of GX2s.

Is this based on your own experiences? Please specify.

If you're basing it on what you've read, please link.

Thanks!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: keysplayr2003
Originally posted by: biostud
It will still be more expensive to produce than the RV770 core, but more competition is always good.

How much does each cost to manufacture?

It doesn't matter.

Assuming each design gets a comparable number of defects per wafer on the same process, the GT200 will still be more expensive to produce than the RV770 simply due to transistor count (roughly 1.4 billion vs. 956 million). This means that Nvidia gets fewer dies per wafer than AMD. In addition, a larger die means a correspondingly higher chance of a defect landing on each die. So Nvidia's yields will still be lower even if the number of defects per wafer is equal to AMD's. Thus, Nvidia's GT200 will still be more expensive to produce than the RV770, just as biostud stated.

Unless you own NVIDIA stock, how does that matter to you, Creig?

GTX260Core216s allegedly cost more to produce than 1GB HD4870s, yet sell for less, outperform them, and offer unique features. Apparently costing more to produce doesn't always matter much to the buyer.

The market will set the price of these cards, like any other. If they're the highest performing, they'll likely cost the most.

It's interesting to me how many times I've seen ATi fans post about production costs for NVIDIA products, when the only cost that matters to any of us is the one in the store.

What's next? Comparisons of the real estate taxes on facilities and wages for engineers? I don't get it.
 

deerhunter716

Member
Jul 17, 2007
163
0
0
It's interesting that Nvidia fanboys tout how their cards are still better when benchmarks proved ATI was the new king. :) We shall see who is the new king after this release, and MAYBE it goes back to Nvidia, BUT competition is the real winner here.