nVidia's Fermi to be called GTX 470 and 480


v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Not really surprising, but not great news either. This implies Fermi is going to be *very* expensive if it's not cost-effective to neuter/harvest the cores for a midrange solution.

DX11 is not mandatory at the mid/low end. Think of the X800 series competing against the 6800: only Pixel Shader 2.0, but they still sold just fine as mid/low-end cards.

So I suspect there won't be DX11 support in the nv mainstream product lineup for at least six months if the mid-range 300 cards are just 200-series parts (or dare I say G92?) with new stickers.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
All signs so far have pointed towards Fermi being incredibly expensive to produce.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
HD4xxx didn't support DX11 either. Until it was added. Not saying this is what will happen with GTX3xx, but saying it will "never" happen is probably a bit extreme.

Different cases - lots of what was required for the "old DX10"/"almost all DX11" was already in the R600.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Excuse me, but the GTX3xx moniker was purely speculation by reviewers given that there was no official branding yet for GF100. It was only known as GF100 or Fermi until now.
And there still isn't any confirmation for final branding names.
Yes, I know. It was a joke playing on Nvidia's fondness for rebadging cards, hence the ";)" at the end. I'm sure everybody else realized it was simply a small joke.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I don't know what people are complaining about. If the DX11 parts are going to be part of the 400 series, then that's great. It means there won't be any confusion over what is DX11-capable, as the 300 series is already populated with DX10.1 cards.
 

Hyperlite

Diamond Member
May 25, 2004
5,664
2
76
I find it a little difficult to swallow that they think they could get away with re-launching a DX10 part. That would mean their "new" mid-range cards would have fewer features than any part in ATI's last gen range. This will be interesting to see, indeed.

That said, the 4xx naming scheme makes sense. With nV's newfound fondness for the "shrink and rebadge" process, it plays well and is hard to argue with, IMO. I'd love a $100 GTX260, but only if it was packing at least DX10.1. Which is kinda where this process falls short, because it means the low and midrange cards don't get any new features that are independent of performance. Tessellation has the potential to add a lot to games, it seems, though I'm more interested in the multimedia features. As someone else suggested, maybe a lot of that will be passed off to CUDA apps, which would be great.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Since the 210/220/240 are all DX10.1 40nm parts based on G200, I'd think 10.1 is practical for a mainstream shrink of higher-end parts as well.

But I think your hopes for a $100 40nm GTX260 are misplaced. The 240 is intended for that price bracket, with a 250 targeted for the low $100s as well. It'll be priced about the same as the 5770, with marketing positioning PhysX as the feature to prefer over DX11.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
If this news is true, then I don't think Nvidia would add DX11 support to the GTX 3xx, probably just DX10.1 like they did already with the 310.

But you guys are right, a GT200b on a 40nm process, with DX10.1 and a 256-bit bus using GDDR5, would be a nice card. It should be able to beat a 5850 by a bit while also costing NV about the same amount to manufacture, as its size should end up at around ~345mm^2, almost the same as Cypress (338mm^2).

That's an interesting point; manufacturing costs will certainly play a role. ATI launched the 5850 at $249, so we know it can easily go there again, maybe even cheaper if necessary. Man, seems like a 10.1 part would be futile. Let's see what happens...
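For what it's worth, here's a back-of-the-envelope sketch of that die-shrink math in Python. The ~470mm^2 GT200b figure is the commonly reported 55nm die size; treat it as approximate, and note that an ideal optical shrink is an optimistic lower bound, since I/O and analog blocks scale much worse than logic.

Code:
def ideal_shrink(area_mm2, old_node_nm, new_node_nm):
    # Die area under a perfect optical shrink: area scales with feature size squared.
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

gt200b = 470.0  # mm^2 at 55nm (commonly reported figure)
print(f"Ideal 55nm -> 40nm shrink: {ideal_shrink(gt200b, 55, 40):.0f} mm^2")  # ~249

# The ~345mm^2 estimate above implies roughly 40% overhead on top of the ideal
# shrink, plausibly for GDDR5 PHYs, DX10.1 additions, and poorly scaling I/O.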
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Yay, let's rebadge the G92 one more time into a GTS350, and rely on consumer stupidity to sell our products!

Looks like Nvidia was so focused on the high-end Fermi that they didn't really have a plan for scaling it down to mainstream parts anytime soon.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
I think nvidia is in deep trouble.

They first said the Fermi has 512 shaders. So we'll stick to that until otherwise notified.

The ATI 5890 has 1500 shaders. That is 3 times more shaders. What does nvidia have to say about this? I think nvidia blew it!
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
tweakboy said:
I think nvidia is in deep trouble.

They first said the Fermi has 512 shaders. So we'll stick to that until otherwise notified.

The ATI 5890 has 1500 shaders. That is 3 times more shaders. What does nvidia have to say about this? I think nvidia blew it!

No such thing as 5890.

If it ever exists, it will have more than 1500 shaders.

ATI shaders and nvidia ones aren't comparable.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
tweakboy said:
I think nvidia is in deep trouble.

They first said the Fermi has 512 shaders. So we'll stick to that until otherwise notified.

The ATI 5890 has 1500 shaders. That is 3 times more shaders. What does nvidia have to say about this? I think nvidia blew it!
You say the most inaccurate things in your replies; it's as if you don't even know what you're talking about. There is no 5890, and it would have at least 1600 SPs anyway, since even the 5870 has that many. ATI couldn't make a 1500-SP card in the first place given the way they group their SPs. ATI and Nvidia have completely different architectures, and the number of SPs cannot be directly compared.
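To put rough numbers on that, here's a quick peak-throughput sketch in Python. It assumes the rumored 512-core Fermi at a ~1.4GHz shader clock (unconfirmed at this point) against the 5870's published 850MHz core clock; these are theoretical peaks only.

Code:
def peak_gflops(alus, clock_ghz, flops_per_clock=2):
    # 2 FLOPs per ALU per clock, counting a multiply-add as two operations.
    return alus * clock_ghz * flops_per_clock

hd5870 = peak_gflops(1600, 0.850)  # 1600 scalar lanes grouped as 320 VLIW5 units
fermi = peak_gflops(512, 1.4)      # 512 scalar CUDA cores at the rumored hot clock

print(f"HD 5870: {hd5870:.0f} GFLOPS peak")  # ~2720
print(f"Fermi:   {fermi:.0f} GFLOPS peak")   # ~1434

And even the 5870's peak only materializes when the compiler can fill all five slots of each VLIW5 unit, which is exactly why a 3x raw shader count doesn't translate into 3x real performance.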
 

Hyperlite

Diamond Member
May 25, 2004
5,664
2
76
tweakboy said:
I think nvidia is in deep trouble.

They first said the Fermi has 512 shaders. So we'll stick to that until otherwise notified.

The ATI 5890 has 1500 shaders. That is 3 times more shaders. What does nvidia have to say about this? I think nvidia blew it!

Erm. First of all, there is no such thing as a 5890. Secondly, you cannot compare ATi and nV shaders.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Hyperlite said:
Erm. First of all, there is no such thing as a 5890. Secondly, you cannot compare ATi and nV shaders.

And even if the two shader designs performed roughly the same per unit, there are still clock rate, fill rate, memory bandwidth, and latency to consider...

So yeah. Declare them dead based on targeting the HTPC crowd and designing a non-competitive gaming GPU if you must. But a postmortem based on shader count is very premature.
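To illustrate those other numbers, a small sketch using the GTX 285's published specs as the example (32 ROPs, 648MHz core clock, 512-bit bus, GDDR3 at an effective 2484MT/s):

Code:
def pixel_fill_gpix(rops, core_mhz):
    # Pixel fill rate: one pixel per ROP per clock.
    return rops * core_mhz / 1000.0  # GPix/s

def bandwidth_gbs(bus_bits, effective_mts):
    # Bytes moved per transfer across the full bus, times the transfer rate.
    return bus_bits / 8 * effective_mts / 1000.0  # GB/s

print(f"GTX 285 fill rate: {pixel_fill_gpix(32, 648):.1f} GPix/s")  # ~20.7
print(f"GTX 285 bandwidth: {bandwidth_gbs(512, 2484):.1f} GB/s")    # ~159.0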
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Even if they are nothing more than renames, I don't see a problem as long as they are priced appropriately. It would be cool if ATI did it too. Wouldn't you guys like to be able to buy a 4870 or GTX 260 or 4890 or GTX 285 for $100-200?

4870s, 4890s, and GTX 260s have already been selling in the $100-$200 range. The GTX 285 has not been, but the 4890 is so close in performance and already sells for much less.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
tweakboy said:
I think nvidia is in deep trouble.

They first said the Fermi has 512 shaders. So we'll stick to that until otherwise notified.

The ATI 5890 has 1500 shaders. That is 3 times more shaders. What does nvidia have to say about this? I think nvidia blew it!

They are different architectures. You can't compare them solely by "shaders"; it's like comparing megahertz ratings between AMD CPUs and Intel CPUs. Hell, look at Intel's Pentium 4s and their own Core 2 Duos. A C2D will outperform a P4 at the same MHz rating.


As far as the renaming goes, it's still from a rumor site. If the rumor is true, I think nVidia is having trouble keeping costs down for their new architecture. While this is not a problem for the high end, it means that at the low end and lower mid-range you'll be stuck with a rebadge of last year's GPUs. Now, this is not as bad as it seems, because if you're buying sub-$150 video cards, you're usually not worried about the latest technologies.