nVidia's Fermi to be called GTX 470 and 480

cbn

Lifer
Mar 27, 2009
12,968
221
106
If the GTX 380 were a souped-up 40nm shrink of GT200b, I don't think a lot of people would be disappointed.

Pretty much everyone has come to expect this type of naming scheme from Nvidia (i.e., a new naming scheme doesn't necessarily mean a new generation).
 

Udgnim

Diamond Member
Apr 16, 2008
3,683
124
106
I guess that is how Nvidia will have cards in the < $300 segment.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Sweet, are we going to see 8800 GTS -> 9800 GTX+ -> GTS 250 -> GTS 340 -> GTS 430? Once they get to GTS 610, there's nowhere left for it to go though.


Great way to take advantage of the anticipation for a GTX 300 series without delivering a GTX 300 series, though.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Sweet, are we going to see 8800 GTS -> 9800 GTX+ -> GTS 250 -> GTS 340 -> GTS 430? Once they get to GTS 610, there's nowhere left for it to go though.


Great way to take advantage of the anticipation for a GTX 300 series without delivering a GTX 300 series, though.

No, I believe it's the GTX 260 and GTX 280 they're talking about for mainstream cards.

A die-shrunk super GTX 285 should beat a 5850.

Edit: never mind
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Keep in mind, Theo's only speculating on the skip in numbers; though leaving room for a die shrink/lower-binned product does make sense.

In any case, the 5850 sits nicely and can be adjusted downwards; the middle ground is gonna be tough for nV. Hell, even the 5870 could be adjusted downwards. Drop in a 5890, adjust the 5870 and 5850 downwards... the middle ground is gonna be messy, methinks.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I wonder if these will sport DX10.1 or DX11 support. If DX11, would that mean a tessellation unit would have to be added? Or can this be accomplished in DX10.1?
What would the die size be? The current GT200b is 470mm2 on 55nm.
If they went with a smaller bus, like 384-bit or 256-bit using GDDR5, die-shrunk to 40nm, but added DX10.1 or DX11 support, what do you think we'd end up with?
Perhaps something similar to the move from G80 to G92, but instead of sticking with GDDR3, a move to GDDR5 should be standard these days. GDDR3 is pretty long in the tooth now.

I have no details about these products yet myself. So, it's all my speculation.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
If this news is true, then I don't think Nvidia would add DX11 support to the GTX 3xx; probably just DX10.1, like they did already with the 310.

But you guys are right, a GT200b on a 40nm process with DX10.1 and a 256-bit bus using GDDR5 would be a nice card. It should be able to beat a 5850 by a bit while also costing NV about the same amount to manufacture, as its size should end up at around 345mm^2, almost the same as Cypress (338mm^2).
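[Editor's note] The ~345mm^2 figure can be sanity-checked with simple node-ratio scaling. A back-of-the-envelope sketch in Python, using only the die sizes quoted in this thread; the ideal (new_node/old_node)^2 rule is an assumption, and real shrinks scale worse since pads, analog, and I/O don't shrink linearly:

```python
# Back-of-the-envelope die-area scaling for a 55nm -> 40nm shrink.
# Inputs from the thread: GT200b is ~470mm^2 at 55nm; Cypress is 338mm^2.
# The (new_node / old_node)^2 rule is the ideal case -- real shrinks do
# worse because pads, analog, and I/O don't shrink linearly.

def ideal_shrink(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Ideal die area after an optical shrink: scales with the node ratio squared."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

gt200b_55nm = 470.0
ideal_40nm = ideal_shrink(gt200b_55nm, 55.0, 40.0)
print(f"Ideal 40nm GT200b: {ideal_40nm:.0f} mm^2")              # ~249 mm^2

# The ~345mm^2 estimate above implies a far-from-ideal shrink:
print(f"345mm^2 is {345.0 / gt200b_55nm:.0%} of the 55nm die")  # ~73%
```

Ideal scaling would put a 40nm GT200b closer to ~249mm^2, so the ~345mm^2 estimate already assumes a fairly pessimistic (and probably realistic) shrink.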
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
If the GTX 380 were a souped-up 40nm shrink of GT200b, I don't think a lot of people would be disappointed.

Pretty much everyone has come to expect this type of naming scheme from Nvidia (i.e., a new naming scheme doesn't necessarily mean a new generation).

Oh yeah, a GTX 285 shrunk to 40nm with a bump in clock speed, and hopefully DX11 support, would be a very capable card, I'd think. It wouldn't be the most exciting next-gen part for Nvidia, since most of us are a little sour on the renaming they do, but with a shrink and some tweaking I'd think it'd be a fine part.

On the other hand, if they just take the GTX 285 and rename it, that'd be pretty disappointing. And even with a shrink, if it's just a DX10 part, I'd think that would hurt it quite a bit as well. I guess we'll see in a few months...
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Oh yeah, a GTX 285 shrunk to 40nm with a bump in clock speed, and hopefully DX11 support, would be a very capable card, I'd think. It wouldn't be the most exciting next-gen part for Nvidia, since most of us are a little sour on the renaming they do, but with a shrink and some tweaking I'd think it'd be a fine part.

The GT200 architecture doesn't support DX11 extensions, so that will never happen.
In order to support DX11 they need a completely new architecture, which is Fermi.

Shrinking the 2xx series and calling it the 3xx will only work if they cut its prices and make it compete in the mainstream segments and below, but it will still lack DX11 support.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
The GT200 architecture doesn't support DX11 extensions, so that will never happen.
In order to support DX11 they need a completely new architecture, which is Fermi.

Shrinking the 2xx series and calling it the 3xx will only work if they cut its prices and make it compete in the mainstream segments and below, but it will still lack DX11 support.

The HD 4xxx didn't support DX11 either, until it was added. I'm not saying this is what will happen with the GTX 3xx, but saying it will "never" happen is probably a bit extreme.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
So they've decided to rebadge the 300 series as the 400 series? And it hasn't even been released yet?!? That has got to be some sort of rebadging speed record for Nvidia! ;)
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Shame if this is true.

The sad thing is that in the four months that will have passed by the time Fermi is released, most people will have moved on and bought ATI. I just got my 5770; very nice purchase, thank you very much. :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
So they've decided to rebadge the 300 series as the 400 series? And it hasn't even been released yet?!? That has got to be some sort of rebadging speed record for Nvidia! ;)

Excuse me, but the GTX 3xx moniker was purely speculation by reviewers, given that there was no official branding yet for GF100. It was only known as GF100 or Fermi until now.
And there still isn't any confirmation of final branding names.

GT200 looks to be getting a die shrink from 55nm to 40nm. And who knows what other changes "could" be made.

So, as Theo states, "giving precious fuel to ATI fans". Don't let this be you. ;)

You can't expect NV, or ATI for that matter, to just take an excellent architecture and throw it in the trash.
If it can be made into something better and cover all the needs of mainstream users at a good, competitive price/performance ratio, then more power to them.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I'm taking this to mean the GTX 300 series will not support DX11. Otherwise they might as well put them in the same naming scheme as Fermi. In a way this isn't bad. I remember back in 2002-2003 people were complaining that the Radeon 9000/9100 series didn't support DX9 (ATi's first number always meant the DX version). Plus it further distinguishes the new Fermi GPGPU/whatever features from the GT200b features.

The GTX 285 is 512-bit GDDR3, so they could make the GTX 380 256-bit GDDR5. Then just clock it a bit higher and bam, you have a 5850 competitor.

I hope the GTS 250 gets re-badged. G92 must forever live!
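[Editor's note] The 512-bit GDDR3 vs. 256-bit GDDR5 trade-off is easy to put numbers on. A quick Python sketch; the GTX 285's 2484 MT/s effective GDDR3 rate is its real spec, while the 4.8 GT/s GDDR5 rate is an assumed value typical of 2009-era parts:

```python
# Why a 256-bit GDDR5 bus can stand in for the GTX 285's 512-bit GDDR3 bus:
# peak bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s).

def bandwidth_gbps(bus_bits: int, data_rate_gts: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * data_rate_gts

gtx285 = bandwidth_gbps(512, 2.484)   # 512-bit GDDR3 @ 2484 MT/s effective (real spec)
gddr5 = bandwidth_gbps(256, 4.8)      # hypothetical 256-bit GDDR5 @ 4.8 GT/s

print(f"GTX 285 (512-bit GDDR3):   {gtx285:.1f} GB/s")   # ~159.0 GB/s
print(f"256-bit GDDR5 @ 4.8 GT/s:  {gddr5:.1f} GB/s")    # ~153.6 GB/s
```

So a 256-bit GDDR5 card at plausible clocks lands within a few GB/s of the GTX 285's 512-bit bus, which is why the swap reads as a cost saving rather than a downgrade.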
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I'm taking this to mean the GTX 300 series will not support DX11. Otherwise they might as well put them in the same naming scheme as Fermi. In a way this isn't bad. I remember back in 2002-2003 people were complaining that the Radeon 9000/9100 series didn't support DX9 (ATi's first number always meant the DX version). Plus it further distinguishes the new Fermi GPGPU/whatever features from the GT200b features.

The GTX 285 is 512-bit GDDR3, so they could make the GTX 380 256-bit GDDR5. Then just clock it a bit higher and bam, you have a 5850 competitor.

I hope the GTS 250 gets re-badged. G92 must forever live!

I don't know yet what level of DX it will support.
G92 has lived so long because it was/is such a good architecture. Who knows, they might shrink it again to 40nm and add some features. It wouldn't surprise me.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
The HD 4xxx didn't support DX11 either, until it was added. I'm not saying this is what will happen with the GTX 3xx, but saying it will "never" happen is probably a bit extreme.

Yeah, it looks like AMD took their previous architecture and were able to make it a DX11 part (and doubled up shaders, ROPs, etc.). But they also went from DX10.1 to DX11, which, as I understand it, isn't quite as big of a jump; DX11 is somewhat similar to DX10.1, I guess.

I think a shrunk GTX 285 would be a very good part for Nvidia. They already did the R&D on it, and the 5850 is only a bit faster than it. So with a shrink and a clock speed bump, I'd think it could compete very well.

But as others said, if these are indeed the GT3xx series, I wouldn't count on DX11 support. Given that DX11 games are already out and many more are in development, I would think that'd hurt this part.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Yep. Plus it was easier for ATI to add DX11 support because they had a tessellator in their hardware for many years already, and they moved from DX10.1, not DX10. Nvidia would have to change too much to add DX11 to the GT200b, which I think will not happen.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
IF the GT200 renames have DX11, I wouldn't see a problem with that... but seriously, at least for the sake of possible confusion of the uninformed (maybe that's what they're trying to achieve? :( ), they should keep DX11 cards as a separate series.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Even if they are nothing more than renames, I don't see a problem as long as they are priced appropriately. It would be cool if ATI did it too. Wouldn't you guys like to be able to buy a 4870, GTX 260, 4890, or GTX 285 for $100-200?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
A sub-$100 GTS 250 that doesn't require a power connector and will obliterate an HD 5670? Who wouldn't want one? Who cares about DX11 in that segment anyway... And all the HTPC stuff like transcoding or encoding most likely has, or will have, CUDA-based applications.

However, a GTX 285-level card with no DX11? No thanks! There's nothing out now that can utilize it (a game or two), but in what, 4 months, when the cards hit the stores? I'm sure at least a few more games will be available by then. And this performance-level card really would suck if it doesn't offer everything a current-generation card from the competition has. You can't be late to the game and offer less.

Not to mention all those people "upgrading" from a GTX280 to a GTX380 ;)
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Considering that previous-gen high-end cards are usually around the same speed as, or a bit faster than, next-gen midrange cards, I don't think many people were planning to upgrade from their 285 to Nvidia's midrange next-gen GPU (which may end up being a 285 in this case). People with a 285 would upgrade to a 470 or 480 if they wanted to see real gains.