Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market


nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: happy medium
Is it possible to shrink a gtx285 to 40nm and give it directx 11?
It would easily compete with the 5850.

I don't think so. IIRC, NVIDIA wants to have a DX10.1 chip on the mid and low end because Win 7 Aero can 'take advantage of DX10.1': http://blogs.zdnet.com/hardware/?p=2896

Of course, we all know DX9, 10, or 10.1 won't make a real-world difference when using Aero (my 7900GS OC runs Win 7 Aero just fine), but NVIDIA really can't have a card that doesn't 'take full advantage of Windows 7's capabilities over Vista'.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Originally posted by: nitromullet

edit: either way, I call shens on "Nvidia abandons the high and mid range graphics card market." Even if they create a temporary shortage to raise prices on GT200 based parts, I don't think it signifies a withdrawal from the mid and high end market. If this were true, this would be horrible news for the future of graphics chips.

That is just the dude being overly dramatic - especially cause he doesn't say permanently.

But really, does he need to tell us that the nvidia cards are currently overpriced compared with the competition?

Didn't we already know the 4xxx series was quite competitive with the GT200b and the G92? And with the 5870 sitting between the GTX285 and the GTX295 in both performance and price while supporting DX11, the 5850 being both faster and cheaper than the GTX285, and the GTX260/GTX275 fighting the cheaper 4870/4890 (and maybe the upcoming 57xx series), wouldn't nvidia have to drop prices?

I guess we were expecting price drops from nvidia, but according to this information they might not happen, which means it will take more time for the 5xxx series to drop in price.

I hope nvidia can get a GT300 that is a lot smaller than that Fermi they presented, because that thing seems huge and it will be expensive. And I'm not sure how many of those transistors will be for gaming; when I'm buying cards for gaming, that is what I want, not scientific cards, especially if I have to pay for a portion I have no use for, even if I understand why nvidia is going that route.

Or maybe they can surprise me and get a card that isn't only for the $400+ market.
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
Can we just ban anyone who wastes our time with Charlie sans disclaimer?
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
It is always the same 2 or 3 people that post Charlie rants as fact.

*sigh*
 

DrBombcrater

Member
Nov 16, 2007
38
0
61
Originally posted by: GaiaHunter
That is just the dude being overly dramatic - especially cause he doesn't say permanently.
NVidia may find a temporary withdrawal turns into a permanent one very quickly. Security of supply is a big deal in the industry, you can't just say "sorry guys, no more cards, maybe next year, eh?". Board partners and OEM customers will go nuts. Remember when AMD sold most of their production to Dell and starved the channel of parts? That did them huge financial and reputational damage that persists to this day - nobody trusts a supplier that leaves them high and dry, voluntarily or otherwise.

Partners that have NV cards as their sole line probably aren't going to even be in business next year if this story is true.

NV management has to be aware of this, which suggests if they are indeed closing down GT200 production it's because they simply can't afford to keep bleeding cash by selling below cost and still have enough in the bank to finance the initial production runs of Fermi boards and keep the company going until those boards produce significant revenue. Even if they're just throttling GT200 supplies rather than EOLing them, that's still a tactic born from desperation.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Originally posted by: DrBombcrater

NVidia may find a temporary withdrawal turns into a permanent one very quickly. Security of supply is a big deal in the industry, you can't just say "sorry guys, no more cards, maybe next year, eh?". Board partners and OEM customers will go nuts. Remember when AMD sold most of their production to Dell and starved the channel of parts? That did them huge financial and reputational damage that persists to this day - nobody trusts a supplier that leaves them high and dry, voluntarily or otherwise.

Partners that have NV cards as their sole line probably aren't going to even be in business next year if this story is true.

NV management has to be aware of this, which suggests if they are indeed closing down GT200 production it's because they simply can't afford to keep bleeding cash by selling below cost and still have enough in the bank to finance the initial production runs of Fermi boards and keep the company going until those boards produce significant revenue. Even if they're just throttling GT200 supplies rather than EOLing them, that's still a tactic born from desperation.

My comment was meant more in an "ignore the drama bits, which are there to get readers, and focus on the few bits that have a likely chance of being true" way.

If you ignore the drama, nvidia is either starting to get rid of stock so they can launch the GT300 soon - which would imply nvidia is going to surprise many people and launch earlier than expected (it would surprise me); or nvidia is EOLing the GT200 while the GT300 is still far away, and by not dropping prices nvidia believes current stock will be enough until the GT300 is ready to launch.

Simply by not dropping prices, nvidia is reducing the demand for their cards.

And how many OEMs will be equipping their machines with GTX260 or higher?

 

kreacher

Member
May 15, 2007
64
0
0
Charlie is probably exaggerating, but the fact is that Nvidia is in trouble because ATI has a head start with DX11 GPUs. I have an uneasy feeling that Nvidia will probably slip to Q1 2010 (with a paper launch at the end of this year).
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,267
136
I have the feeling that Nvidia is far too arrogant to surrender the enthusiast market to ATI, even if they need to take a hit to keep it. Far more likely is that they are getting ready for the GT300 ramp, and if that's the case I can't imagine how much money and effort is going into turning what was a fresh-from-the-fab chip connected by a mess of wires into a shipping product in two months.

On the other hand, maybe Nvidia did really mess up. We still haven't seen those pictures of the development board Fudzilla promised us...
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Originally posted by: HurleyBird
We still haven't seen those pictures of the development board Fudzilla promised us...
You must have missed this.

Well that's of course a joke. :) On topic, I don't believe NVIDIA will 'abandon' the mid-to-high end GPU market, even if the situation is dire. For reasons mentioned in this thread or elsewhere.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Stoneburner
Originally posted by: ronnn
Nice of you guys to post stock market stuff, but really doesn't address the point of the op.

Is nVidia really going to stop competing for a couple of months? Charlie certainly exaggerates, but it looks like he may have been right in the sense that nVidia is effing up big time.

For Nvidia to stop "competing" they would have to determine that selling the GT2XX's at their current MSRP is financially harmful to them. I don't see how that could happen. Plenty of people buy nvidia products even though ATI is often priced better. Moreover, I don't see why they would want to lose overall marketshare.

In a certain sense, though, nvidia is not currently competitive because ATI's equivalent products are cheaper to produce. Nvidia is in a difficult situation right now, but they should be able to get out okay. On the other hand, unlike the FX5800 days, they have to compete with ATI AND INTEL with the same damn product.

Demand could be down enough that they think the current stock of cards will last until Fermi is in production. Sounds about right: video cards take about 3 months from assembly to market, so by the end of the year we'll see Fermi cards hitting and they'll just let the current stock of high-end cards run dry.
If that's what's happening, though, it seems like the GT200 series may have been an even larger failure for nvidia than the FX series. It would mean the near future still uses G9x for mobile and low/mid-range parts, Fermi takes the high end, and GT200 is simply retired.
 

T2k

Golden Member
Feb 24, 2004
1,664
5
0
Well, how is it better than Wreckage's posts? At least the Armenian is funny; Wreckage is boring. :D
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Just when you think Charlie can't sink any lower. Tomorrow he'll say they quit producing graphics cards altogether because ATI is just too awesome to compete against.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,957
126
I've no doubt these parts will be EOL'd soon, but that's because they'll be replaced with new stuff.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Meh, I said a long time ago that NV would struggle to compete on price with AMD, and now it's happening.
This was the obvious result of NV's big die, big memory bus strategy vs ATI's small die, less complex bus, more flexible RAM strategy (i.e. GDDR3 prices weren't going to change much, but GDDR5 obviously was).

NV can't compete on price without hurting themselves, and that's a fact whatever way you want to look at it. It's been a fact since day 1 when the HD4870 was released with a 256-bit bus and GDDR5.
OK so it's probably being sensationalised, but that doesn't mean that NV doesn't have problems at the moment.
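
As a rough sketch of that bus/RAM trade-off in bandwidth terms (the effective data rates below are approximate reference clocks I'm assuming, so treat the results as ballpark figures):

```python
# Peak theoretical bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s)
def bandwidth_gbs(bus_width_bits, effective_gtps):
    return bus_width_bits / 8 * effective_gtps

# Assumed, approximate reference memory clocks
cards = {
    "HD 4870 (256-bit GDDR5 @ ~3.6 GT/s)": (256, 3.6),
    "GTX 260 (448-bit GDDR3 @ ~2.0 GT/s)": (448, 2.0),
    "GTX 285 (512-bit GDDR3 @ ~2.5 GT/s)": (512, 2.5),
}

for name, (width, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(width, rate):.0f} GB/s")
# HD 4870 ~115 GB/s, GTX 260 ~112 GB/s, GTX 285 ~159 GB/s
```

The point being that a 256-bit GDDR5 bus buys roughly the same bandwidth as a 448-bit GDDR3 bus on a cheaper PCB with fewer memory chips to route, and GDDR5 data rates keep climbing while GDDR3's don't.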
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Lonyo
Meh, I said a long time ago that NV would struggle to compete on price with AMD, and now it's happening.
This was the obvious result of NV's big die, big memory bus strategy vs ATI's small die, less complex bus, more flexible RAM strategy (i.e. GDDR3 prices weren't going to change much, but GDDR5 obviously was).

NV can't compete on price without hurting themselves, and that's a fact whatever way you want to look at it. It's been a fact since day 1 when the HD4870 was released with a 256-bit bus and GDDR5.
OK so it's probably being sensationalised, but that doesn't mean that NV doesn't have problems at the moment.

My thoughts exactly.

Also, once Fermi does hit, will it really matter? By then AMD will have its 5700s available and maybe even the rest of the lineup (i.e. 5600s and 5300s). Meanwhile nVidia is scrambling to modify old designs in order to put out a 10.1 part...

What good is the performance crown when there's nothing else to back it up?

Cutting down Fermi won't do much for profits...
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Lonyo
Meh, I said a long time ago that NV would struggle to compete on price with AMD, and now it's happening.
This was the obvious result of NV's big die, big memory bus strategy vs ATI's small die, less complex bus, more flexible RAM strategy (i.e. GDDR3 prices weren't going to change much, but GDDR5 obviously was).

NV can't compete on price without hurting themselves, and that's a fact whatever way you want to look at it. It's been a fact since day 1 when the HD4870 was released with a 256-bit bus and GDDR5.
OK so it's probably being sensationalised, but that doesn't mean that NV doesn't have problems at the moment.

:thumbsup:

As you said competing on price will be a problem (costly) for NVidia. If we look at the GTX285 costing around $320-$350, NVidia would have to lower the price $80-$100 in order to compete well against the HD5850, which costs $270. The GTX285 has a larger die size, 470mm^2 vs. 334mm^2, and the PCB is more complex having a 512-bit MC. So more cost overall.

But I read Charlie's article and I think it's B.S. Why would NVidia cancel their top of the range cards? All they could do is lower the prices a bit, many people would still buy them just because they are NVidia. I can understand if they want to clear the channel in preparation for Fermi, especially if the release is soonish, which seems unlikely though.

The only thing in the article that could be true is the size of Fermi, ~530mm^2 or a bit larger seems probable.
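
Just to put those die sizes in wafer terms, here is a rough gross-dies-per-wafer estimate (the standard approximation for a 300mm wafer, assuming square dies and ignoring yield and scribe lines, so the counts are only illustrative):

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300mm wafer

def gross_dies_per_wafer(die_area_mm2):
    # wafer area / die area, minus a correction for partial dies lost at the edge
    radius = WAFER_DIAMETER_MM / 2
    die_side = math.sqrt(die_area_mm2)  # assume a square die
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / die_side)

print("GT200b  (~470 mm^2):", gross_dies_per_wafer(470))  # ~106 dies
print("Cypress (~334 mm^2):", gross_dies_per_wafer(334))  # ~160 dies
```

So before yield even enters the picture, AMD gets roughly 50% more candidate dies per wafer, and random defects hurt the bigger die more.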
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Kuzi
Originally posted by: Lonyo
Meh, I said a long time ago that NV would struggle to compete on price with AMD, and now it's happening.
This was the obvious result of NV's big die, big memory bus strategy vs ATI's small die, less complex bus, more flexible RAM strategy (i.e. GDDR3 prices weren't going to change much, but GDDR5 obviously was).

NV can't compete on price without hurting themselves, and that's a fact whatever way you want to look at it. It's been a fact since day 1 when the HD4870 was released with a 256-bit bus and GDDR5.
OK so it's probably being sensationalised, but that doesn't mean that NV doesn't have problems at the moment.

:thumbsup:

As you said competing on price will be a problem (costly) for NVidia. If we look at the GTX285 costing around $320-$350, NVidia would have to lower the price $80-$100 in order to compete well against the HD5850, which costs $270. The GTX285 has a larger die size, 470mm^2 vs. 334mm^2, and the PCB is more complex having a 512-bit MC. So more cost overall.

But I read Charlie's article and I think it's B.S. Why would NVidia cancel their top of the range cards? All they could do is lower the prices a bit, many people would still buy them just because they are NVidia. I can understand if they want to clear the channel in preparation for Fermi, especially if the release is soonish, which seems unlikely though.

The only thing in the article that could be true is the size of Fermi, ~530mm^2 or a bit larger seems probable.

But you just said it yourself - the GT200s simply aren't cost effective: they're larger, more complex on the board, and slower. Lowering the price in order to keep production up might simply not be an option, hence the cancellation.

The crazy part is that nVidia had a similar situation with the G80, although back then they didn't really have much competition. But still, they ended up simplifying and streamlining the G80 into the incredibly successful G92, a part they're still clinging to today. Where is the spiritual successor to the G92?
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
I guess it may not be possible to lower the price of the GTX295 much, considering it has two GT200bs. But I'd say at $400 it wouldn't be a bad buy for some, since it's still the fastest available card by a hair.

A spiritual successor to the G92 would be a GT200 (GT200c?) on a 40nm process :) And while such a chip would allow nV to compete well against ATI on price/performance, it would still not be as future-proof because it would only support DirectX 10.
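
Purely as a back-of-the-envelope (assuming ideal optical scaling, which real shrinks never quite achieve, so treat it as a lower bound):

```python
gt200b_area_mm2 = 470        # GT200b at 55nm
scale = (40 / 55) ** 2       # ideal area scaling from 55nm to 40nm, ~0.53

print(f"Ideal 40nm GT200 shrink: ~{gt200b_area_mm2 * scale:.0f} mm^2")  # ~249 mm^2
```

Even under that optimistic assumption it would still be a mid-sized DX10-only die up against Cypress's 334mm^2 with DX11, so it would help nV on cost but not on the feature checklist.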
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: BFG10K
I've no doubt these parts will be EOL'd soon, but that's because they'll be replaced with new stuff.

I hope as well that the new stuff is coming.
Seriously though, what is it? These are honest questions.
What is the 40nm GT200 replacement? When are we going to see a cut-down Fermi? 6-9 months from now?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: Janooo
Originally posted by: BFG10K
I've no doubt these parts will be EOL'd soon, but that's because they'll be replaced with new stuff.

I hope as well that the new stuff is coming.
Seriously though, what is it? These are honest questions.
What is the 40nm GT200 replacement? When are we going to see a cut-down Fermi? 6-9 months from now?

That's something worth thinking about. Fermi will probably at least paper launch by the end of the year; we'll get the flagship single Nvidia GPU and probably some sort of die-harvest part (like what the GTX260 was to the GTX280). But by the time it launches, AMD will most likely have an X2 part based on the 5870 out, and it looks like they'll have most of the 5xxx series launched. I haven't seen much regarding when we'll see the entire next-gen series from Nvidia. Even on 40nm it'll probably be hard for Nvidia to compete with AMD in the mid range. Even if they take a GTX260-level card, shrink it, and give it GDDR5 over a 256-bit bus, AMD will have a smaller chip that most likely performs very similarly. Has anyone seen a roadmap for the other market-segment products from Nvidia?
 

Jacen

Member
Feb 21, 2009
177
0
0
Come on, Nvidia is in a pretty tough spot with Fermi not hitting until next year, but this kind of FUD is just ridiculous.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,957
126
You guys need to remember that nVidia is still in far, far better shape than the red team currently is. Also the red team were much worse off three years ago with the botched 2900XT and the Phenom, yet they still clawed back. A single 'Charlie article' doesn't spell the end of nVidia.