Mainstream Fermi in June, dual Fermi in April?

Jd007

Senior member
Jan 1, 2010
207
0
0
I don't think GTX 360 is considered mainstream, happy medium.

More like:
GTX 360 and GTX 380 in small quantities in March
GTX 395 in small quantities in April (to take back the crown?)
GTX 350?, GT 340?, GT 320?, and all the other Fermi cards widely available in June
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I don't think GTX 360 is considered mainstream, happy medium.

More like:
GTX 360 and GTX 380 in March
GTX 395 in small quantities in April (to take back the crown, if the GTX 380 doesn't?)
GTX 350?, GT 340?, GT 320?, and all the other "entry-level" Fermi cards widely available in June

Fixed. :D
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126

Are people honestly expecting the single Fermi to be faster than a 5970?

Not that Nvidia has shown any benchmarks, so who knows, but I'd be shocked if a single 512 CUDA core Fermi could best the 2x1600SP 5970. That would be quite the accomplishment. My guess is that the GTX 380 will fall somewhere between the 5870 and 5970... and probably be priced in between as well.

And if the 448 CUDA core version doesn't show up until May or June, I see the 5870 and 5850 selling very well for some time yet.

I wonder what Nvidia will position against the 5850. Maybe they'll shrink the GTX 285? Maybe a further cut-down Fermi chip? If the GT200 cards are EOL and Fermi is only going to have two versions (the 512 CUDA core part and the 448 core part), there will be a lot of room between those chips and the GTS 250... I wonder what Nvidia may have up its sleeve. And of course we have to wonder about the 5870 refresh we keep hearing rumors about...
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Are people honestly expecting the single Fermi to be faster than a 5970?...

The 5970 is a beast, but with games that scale poorly, or don't scale at all...

I'm most excited about more memory. I'd have bought two 5870s or a 5970 and been done for a year or so if they had delivered on that 2GB deal. Both the 5870 and 5970 were rumored to have 2GB variants; the 5970 was rumored to launch with 2GB per GPU. That's my only bitch about the 58xx...
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
The 5970 is a beast, but with games that scale poorly, or don't scale at all...

Said games wouldn't scale well on a dual Fermi either. I'm not expecting a dual Fermi to beat a 5970. A dual Fermi will probably be two GPUs that are each slower than a GTX 360.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Said games wouldn't scale well on a dual Fermi either. I'm not expecting a dual Fermi to beat a 5970. A dual Fermi will probably be two GPUs that are each slower than a GTX 360.

Do you really think Nvidia will allow ATI to best them?
I would definitely put all my money on dual Fermi to best a 5970.
Every cent. ;)
 

razor2025

Diamond Member
May 24, 2002
3,010
0
71
Wait... that doesn't make sense... a single-chip Fermi is already pushing the 300-watt envelope. They want to stick two chips on there?! I guess they can, if they dramatically cut down on the clocks and processing units for each chip.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Do you really think Nvidia will allow ATI to best them?
I would definitely put all my money on dual Fermi to best a 5970.
Every cent. ;)

Allow? ATI has bested Nvidia in the flagship department quite a few times in history; there's no reason to believe they couldn't do it again.

As for the scaling down, Nvidia doesn't really have a choice if they want to remain in spec. Their single GPU Fermi supposedly uses 280W of power, and 300W is the maximum the dual Fermi will be permitted to use. How do you think they are going to accomplish that? The die shrink already happened, so they have nothing left to alleviate power consumption aside from seriously cutting things down.
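A rough sketch of that arithmetic, taking the rumored 280W single-GPU figure at face value (actual TDPs were unconfirmed at the time):

$$2 \times 280\,\mathrm{W} = 560\,\mathrm{W}, \qquad \frac{300\,\mathrm{W}}{2} = 150\,\mathrm{W\ per\ GPU}, \qquad 1 - \tfrac{150}{280} \approx 46\%$$

In other words, each chip would need something like a 46% power cut, via lower clocks, disabled units, or reduced voltage, just to fit under the ceiling.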
 

nosfe

Senior member
Aug 8, 2007
424
0
0
Do you really think Nvidia will allow ATI to best them?
I would definitely put all my money on dual Fermi to best a 5970.
Every cent. ;)

Under normal circumstances, sure, I too would say that dual Fermi would beat the 5970. The problem is that 300W barrier: will Fermi have enough performance/watt to beat the 5970, considering all that GPGPU stuff on it?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
NV may not have a choice. Do you think they like the current situation of pitting G92 and the EOL GT200 against ATI's top-to-bottom DX11 product lineup? They wouldn't allow that if they could help it.

Simply doubling up a Fermi would mean 3x 6-pin (remember that the PCIe slot provides 75 watts; they'd need another 6-pin otherwise) and 2x 8-pin connectors for the nearly 600 watts of heat being generated in a very, very small space. It would take a miracle to get a dual-380 Fermi from 600 down to 300 watts, which means much slower clocks or much less silicon. We'll see what we get when we get it, but it's nowhere near a sure bet.
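For reference, that 600-watt figure follows from the standard PCI Express power limits (75W from the slot, 75W per 6-pin, 150W per 8-pin):

$$75 + 3 \times 75 + 2 \times 150 = 600\,\mathrm{W}$$

The same limits are where the 300W single-card ceiling comes from: one slot plus one 6-pin plus one 8-pin gives $75 + 75 + 150 = 300\,\mathrm{W}$.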
 

Rezist

Senior member
Jun 20, 2009
726
0
71
I'm still shocked that nVidia doesn't make G200 chips at 40nm. It must take too much work to make DX11 possible on the G200.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
According to AMD's recent numbers, DX11 cards represented around 10% of the market. (That's not even including IGPs.)

The more mainstream parts so far struggle with DX11 (as seen with the performance drop in Dirt 2).

I think many people are exaggerating the importance of DX11, which at this point is almost nil. Until several really good, full DX11 games hit the market, it's just not that important and offers no compelling reason to upgrade.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Under normal circumstances, sure, I too would say that dual Fermi would beat the 5970. The problem is that 300W barrier: will Fermi have enough performance/watt to beat the 5970, considering all that GPGPU stuff on it?

I would not be surprised if a single GTX 380 can beat a 5970.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
According to AMD's recent numbers, DX11 cards represented around 10% of the market. (That's not even including IGPs.)

The more mainstream parts so far struggle with DX11 (as seen with the performance drop in Dirt 2).

I think many people are exaggerating the importance of DX11, which at this point is almost nil. Until several really good, full DX11 games hit the market, it's just not that important and offers no compelling reason to upgrade.

Mainstream parts, the ones that are cheap and low-powered, struggle with a brand new game??? WOAH, that's some breaking news right there!!!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I think many people are exaggerating the importance of DX11, which at this point is almost nil. Until several really good, full DX11 games hit the market, it's just not that important and offers no compelling reason to upgrade.

Crysis 2 is on the way, AvP is on the way, and there are a few games out already (BattleForge, Dirt 2, Stalker), so I think adoption thus far has been better than DX10's was at this point in its life. Since DX11 works with both Vista and Win 7, and Win 7 is finally getting people to upgrade from XP, I think DX11 will be full steam ahead by this fall. Games later this year will be released with DX9/DX11 support and will hopefully skip over DX10.

I would not be surprised if a single GTX380 can beat a 5970.

Due to the new architecture, I think there will be a few games where a GTX 380 will beat or tie the 5970. I also bet there will be a few games where a GTX 380 will only beat a 5870 by 10-15%.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I think many people are over exaggerating the importance of DX11, which at this point is almost nil. Until several really good full DX11 games hit the market, it's just not that important and offers no compelling reason to upgrade.

There are good titles NOW. It's not a reason to sidegrade, sure. But if someone is buying a midrange card today, the DX11 version will at least let them see what the fuss is about, and the DX10 version will not.

Sure, at the entry level (5650) or slower, DX11 may be a no-op. But anyone buying a GTX 260 or 5770 or faster would be a fool to choose the DX10-only (not even DX10.1) card.

Win7 has not been out for that long. To have 10% market penetration already is stunning. DX11 is here, and it's being adopted faster than anyone could have expected, especially given poor pricing and availability.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I'm still shocked that nVidia doesn't make G200 chips at 40nm. It must take too much work to make DX11 possible on the G200.

According to Bright Side of News, the G200 mid-range parts were cancelled because of the R&D costs of Fermi. Also, I've read a few times that since G200 was made for and designed on a 65nm process, it just wasn't able to shrink past the 55nm node. Beyond this, the difference between a GTX 260 and a GTS 250 (which is G92) isn't exactly huge; putting a product between those two cards just wouldn't make sense. At this point, instead of adding features to already existing architectures (which have already been tweaked and shrunk), it is probably best to go ahead and work on stripping down and tweaking the new architecture - especially if that really is the direction Nvidia wants to take its GPUs.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I don't think GTX 360 is considered mainstream, happy medium.

More like:
GTX 360 and GTX 380 in small quantities in March, $699 and $799 USD.
GTX 395 in small quantities in April (to take back the crown?), $1299 USD
GTX 350?, GT 340?, GT 320?, and all the other Fermi cards widely available in June; $599, or a GTX 285 + DX11 @ $499
My guesses.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
There are good titles NOW. It's not a reason to sidegrade, sure. But if someone is buying a midrange card today, the DX11 version will at least let them see what the fuss is about, and the DX10 version will not.

Sure, at the entry level (5650) or slower, DX11 may be a no-op. But anyone buying a GTX 260 or 5770 or faster would be a fool to choose the DX10-only (not even DX10.1) card.

Win7 has not been out for that long. To have 10% market penetration already is stunning. DX11 is here, and it's being adopted faster than anyone could have expected, especially given poor pricing and availability.

I think 10% (~2 million cards) is a pretty impressive feat, not only for being available in volume for a relatively short time, but also when you consider the average price of those 2 million cards... that 10% spans the 5750 - 5970. AMD probably did great on the average selling price of those parts.

Since Nvidia's offerings at the present time are pretty slim, I only see AMD selling more and more Radeon 5xxx cards. This reminds me of the GeForce 8800 launch: the competition was behind and DX10 was in its infancy, but people bought those cards en masse.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
NV may not have a choice. Do you think they like the current situation of pitting G92 and the EOL GT200 against ATI's top-to-bottom DX11 product lineup? They wouldn't allow that if they could help it.

Simply doubling up a Fermi would mean 3x 6-pin (remember that the PCIe slot provides 75 watts; they'd need another 6-pin otherwise) and 2x 8-pin connectors for the nearly 600 watts of heat being generated in a very, very small space. It would take a miracle to get a dual-380 Fermi from 600 down to 300 watts, which means much slower clocks or much less silicon. We'll see what we get when we get it, but it's nowhere near a sure bet.

See post #21
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
There are good titles NOW.

Really? BattleForge was launched a long time ago without DX11 (it was patched in later, but that did not add much, and it's not even a good game really), Dirt 2 was launched a long time ago on the consoles without DX11 and only recently on the PC, and Stalker is not even in wide release yet.

:rolleyes:
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
I would almost bet that if things were turned around, you would be majorly glorifying Nvidia and their having the only DX11 card for months before anyone else! DX11 support is the be-all end-all! Blah blah blah.

Why you would act like a fanboy of a corporation is beyond me. nVidia doesn't even like you. (They love and only care about our money, like basically every other greedy gut, and the investors they have to answer to.)


Jason

So true. Not sure why Wreckage's blatant PR isn't policed a little bit more. I love my Nvidia cards, but this dude is really out of hand and never responds when he's called on his propagandist tendencies.

On topic - if this time frame pans out (which I very seriously doubt), then I would probably do the EVGA step-up from the GF100 to the dual variant in April.

Now to be flamed by T2K....