GF100 Previews from Digital Experience


lsv

Golden Member
Dec 18, 2009
1,610
0
71
He talked more about Unigine than the card lol.

Probably some marketing exec. Or he was instructed not to divulge information. But we do know it's a huge ass card (what I assume is flagship) and runs 6+8 power.

Any speculations as to what's in that beast?
 

Allio

Golden Member
Jul 9, 2002
1,904
28
91
You don't feel that, seeing as AMD is slaughtering them right now in the high end with no competition, if they had a part that could put out those sorts of numbers they would have found SOME way to let the market know by now? Plenty of people are getting sick of waiting and buying AMD. If they knew there was a 40% better part a couple of months away, I'm sure they'd be happier to wait. Instead we've had absolute silence from NV.

We'll see I guess, hopefully during CES, but I really think that in this market, no news = bad news.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Not bad at all for being late and problematic through its creation, if it turns out to be true. This will also put it in line with a $499-$549 initial price tag.

Late, problematic, and 50% bigger.
36% more performance for a 50% bigger die and 50% more memory bits (not necessarily bandwidth, that depends on clocks) isn't a terrible return, but said 36% is a rumour number, so it will either be typical (not so bad) or inflated (pretty bad).
Now if 36% was worst-case, then it would be nice, but that's not going to be the case.
Performance will be solid, but NV won't be able to put any pricing pressure on AMD, and when an HD5850 can play most things at 1920x1200 without any real sweat, value for money starts being more important, and personally I don't see NV being able to offer it.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Probably some marketing exec. Or he was instructed not to divulge information. But we do know it's a huge ass card (what I assume is flagship) and runs 6+8 power.

Any speculations as to what's in that beast?

It's 10.5 inches. Same as the GT200 series.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Good news for NV for a change. The screen looks like it's at least 1920x1200 and the playback is smooth, so I think the performance is there. It may consume a lot of power under load, but let's face it - no one really sits and measures power consumption while playing games. If NV gets the idle power right (including when to idle) and the card doesn't catch fire, I'll give a pass to load power.



The motherboard is a Rampage II Extreme (12" x 10.6"), so the card has to be about 10.5" long, just like the GTX 280/285.

 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I don't understand the concern with the 8- & 6-pin power requirements; AMD was there what, 2 yrs ago?
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
36% more performance for a 50% bigger die and 50% more memory bits (not necessarily bandwidth, that depends on clocks) isn't a terrible return, but said 36% is a rumour number, so it will either be typical (not so bad) or inflated (pretty bad).

With full exception handling and ECC, 14% off isn't so bad.
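
Spelling out where that 14% reading comes from (both inputs are the rumoured figures quoted above, so this is a sanity check only, not a claim about the actual chip):

```python
# Back-of-the-envelope check on the rumoured GF100 numbers quoted above.
perf_gain = 0.36    # rumoured performance advantage (unconfirmed)
die_growth = 0.50   # rumoured die-size increase (unconfirmed)

# Reading the gap as a straight percentage-point difference:
print(die_growth - perf_gain)                # 0.14 -> the "14% off"

# Reading it as performance per unit of die area instead:
print((1 + perf_gain) / (1 + die_growth))    # ~0.91 -> roughly 9% less perf per mm^2
```

Either way you slice it, the rumoured part would trail the smaller die on perf-per-area by something like 9-14%, which is the gap I'm calling acceptable given ECC and exception handling.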
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
With full exception handling and ECC, 14% off isn't so bad.

I'm not sure they have ECC on the consumer version, since the ECC relies on using part of the onboard RAM and setting it aside for the ECC functionality, which would mean your 1GB consumer card becomes more like 800MB, and I doubt NV would feel a need for that to be the case, and I doubt their partners would think it ideal either.
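
For a rough feel of that carve-out, here's a sketch assuming the controller reserves one eighth of DRAM for check bits (8 ECC bits per 64 data bits - an assumption on my part, since NV hasn't published the actual ratio):

```python
# Rough ECC carve-out estimate, assuming 1/8 of DRAM is set aside for check
# bits (8 ECC bits per 64 data bits). The real GF100 ratio is not public,
# so treat this as illustrative only.
total_mb = 1024                     # hypothetical 1 GB consumer card
reserved_fraction = 1 / 8           # assumed ECC overhead
usable_mb = total_mb * (1 - reserved_fraction)
print(usable_mb)                    # 896.0 MB usable - same ballpark as the
                                    # "more like 800MB" guess above
```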

I wouldn't be surprised to see ECC being Tesla-only (also "forcing" people to buy the Tesla versions instead of reappropriating the consumer cards), but of course, no one can be sure yet.

ATI also have a form of ECC, but they cheat slightly by just resending the data, which has a greater performance hit when operating, but ideally is only a worst case fallback and shouldn't be required.

And no idea about exception handling.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I don't understand the concern with the 8- & 6-pin power requirements; AMD was there what, 2 yrs ago?

What?
No single GPU card has had an 8-pin connector up to this point. The GTX285 doesn't, the HD4890 doesn't, the HD5870 doesn't.
Assuming the Fermi based card does have a 6+8, it will be the first single GPU card, AFAIK, to have such a configuration.
 

Canbacon

Senior member
Dec 24, 2007
794
4
91
What?
No single GPU card has had an 8-pin connector up to this point. The GTX285 doesn't, the HD4890 doesn't, the HD5870 doesn't.
Assuming the Fermi based card does have a 6+8, it will be the first single GPU card, AFAIK, to have such a configuration.

I know for a fact that the following cards have a 6pin and 8pin connector:

ATI:
HD 2900 XT

NVIDIA:
GeForce 8800 GTX
GeForce GTX 280

If you want I can take a picture of them, since they are all sitting next to me.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
What?
No single GPU card has had an 8-pin connector up to this point. The GTX285 doesn't, the HD4890 doesn't, the HD5870 doesn't.
Assuming the Fermi based card does have a 6+8, it will be the first single GPU card, AFAIK, to have such a configuration.

You may want to check out this review of the 2900XT from 2007.
http://www.pcstats.com/articleview.cfm?articleID=2159

It seems you are very much wrong.
http://www.pcstats.com/articleimages/200708/RadeonHD2900XT_top.jpg
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
You may want to check out this review of the 2900XT from 2007.
http://www.pcstats.com/articleview.cfm?articleID=2159

It seems you are very much wrong.
http://www.pcstats.com/articleimages/200708/RadeonHD2900XT_top.jpg

My 2900 Pro did have a 6- and 8-pin, but it only required two 6-pin connectors to function. A 6 and 8 just opened up CCC overclocking functionality. Even with two 6-pin connectors you could still overclock with other programs, like RivaTuner. We'll have to wait and see how Fermi turns out with its power...
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, at any rate it does look better for Fermi right now. I tell you, I am ready to spank Bob. He came over and got the PC with the Clarkdale, which isn't mine, but it also had my 5870 in it. I have never benched the 5870, and now that I see Fermi on the way I want benchies so I can do a release-driver comparison between them.

I have to say the news from the show is really good. I really liked that AnandTech report on Tegra II. I so want to see it go up against the Imagination 540.

2009 was boring as all hell. 2010 looks like it's going to rock.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
I'm not sure they have ECC on the consumer version, since the ECC relies on using part of the onboard RAM and setting it aside for the ECC functionality, which would mean your 1GB consumer card becomes more like 800MB, and I doubt NV would feel a need for that to be the case, and I doubt their partners would think it ideal either.

You still need circuitry to do whatever correction they use (Hamming code, presumably). I really doubt it's done on the RAM chips, so it's probably part of the memory controller. If it's bypassed on the consumer card, it still exists on the chip.

ATI also have a form of ECC, but they cheat slightly by just resending the data, which has a greater performance hit when operating, but ideally is only a worst case fallback and shouldn't be required.

They only detect errors, for both exceptions and RAM. Whether either way is better is a push.

And no idea about exception handling.

I really have no specs either. Supposedly it's full per-element exception handling, which in my opinion is no simple task. ATI will detect the errors but not provide internal routines for handling them - almost the same relationship as with the RAM. We'll have to see what the cost/benefit will be. Many GPGPU contracts couldn't care less about cost if it meets the specs.
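
To make the detection-versus-correction distinction concrete, here's a toy SECDED example - a Hamming(7,4) code plus an overall parity bit, sketched in Python purely for illustration; nobody outside NV knows how the GF100 controller actually implements this:

```python
def secded_encode(nibble):
    """Encode 4 data bits as an 8-bit SECDED word: Hamming(7,4) + overall parity."""
    d0, d1, d2, d3 = [(nibble >> i) & 1 for i in range(4)]
    code = [0] * 8                            # positions 1..7 used; parity at 1, 2, 4
    code[3], code[5], code[6], code[7] = d0, d1, d2, d3
    code[1] = code[3] ^ code[5] ^ code[7]     # covers positions with bit 0 set
    code[2] = code[3] ^ code[6] ^ code[7]     # covers positions with bit 1 set
    code[4] = code[5] ^ code[6] ^ code[7]     # covers positions with bit 2 set
    overall = 0
    for i in range(1, 8):
        overall ^= code[i]                    # extra parity bit enables double-error detection
    return code[1:8] + [overall]              # 8 stored bits for 4 data bits

def secded_decode(bits):
    """Return (nibble, status): corrects single-bit flips, flags double-bit flips."""
    code = [0] + bits[:7]
    overall = 0
    for b in bits:
        overall ^= b                          # 0 if total parity is still even
    syndrome = 0
    for i in range(1, 8):
        if code[i]:
            syndrome ^= i                     # XOR of set positions = error location
    if syndrome and overall == 0:
        return None, "double-bit error detected (uncorrectable)"
    if syndrome:
        code[syndrome] ^= 1                   # single-bit error: flip it back
    nibble = code[3] | (code[5] << 1) | (code[6] << 2) | (code[7] << 3)
    return nibble, "corrected" if syndrome else "ok"

word = secded_encode(0b1011)
word[5] ^= 1                                  # simulate one flipped bit in DRAM
print(secded_decode(word))                    # (11, 'corrected')
```

The point for this thread is just that correcting in place needs extra check bits plus correction logic in the controller, while a detect-and-resend scheme like the one described above only needs enough redundancy to notice that a transfer went bad.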
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
I'm not sure they have ECC on the consumer version, since the ECC relies on using part of the onboard RAM and setting it aside for the ECC functionality, which would mean your 1GB consumer card becomes more like 800MB, and I doubt NV would feel a need for that to be the case, and I doubt their partners would think it ideal either.

I wouldn't be surprised to see ECC being Tesla-only (also "forcing" people to buy the Tesla versions instead of reappropriating the consumer cards), but of course, no one can be sure yet.

ATI also have a form of ECC, but they cheat slightly by just resending the data, which has a greater performance hit when operating, but ideally is only a worst case fallback and shouldn't be required.

And no idea about exception handling.

Not sure if the consumer product has ECC or not. But I do believe it is supposed to have 1.5GB GDDR5. Not confirmed by any stretch.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
It doesn't have to be near 300 watts to use a 6 and 8 pin connector. It must be very close to 225 watts or slightly above. They might have used an 8 pin instead of a 6 to provide OC headroom.
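
The arithmetic behind that is just the PCIe power budget - 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin connector:

```python
# PCIe power ceilings behind the 6+8-pin reasoning above (spec limits, in watts).
slot_w, six_pin_w, eight_pin_w = 75, 75, 150

dual_six = slot_w + 2 * six_pin_w                    # 225 W ceiling with 6+6
six_plus_eight = slot_w + six_pin_w + eight_pin_w    # 300 W ceiling with 6+8

print(dual_six, six_plus_eight)                      # 225 300
# A 6+8 layout only tells you the board can draw more than 225 W (or that
# NV wanted headroom); it doesn't mean it actually sits anywhere near 300 W.
```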
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Let's just wait for it. It seems closer than I thought. NV got it on 3 steppings; not bad, not bad at all. Nice going, NV.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Late, problematic, and 50% bigger.
36% more performance for a 50% bigger die and 50% more memory bits (not necessarily bandwidth, that depends on clocks) isn't a terrible return, but said 36% is a rumour number, so it will either be typical (not so bad) or inflated (pretty bad).
Now if 36% was worst-case, then it would be nice, but that's not going to be the case.
Performance will be solid, but NV won't be able to put any pricing pressure on AMD, and when an HD5850 can play most things at 1920x1200 without any real sweat, value for money starts being more important, and personally I don't see NV being able to offer it.

Well, the card itself isn't too bad. Still on the large side, but nothing we haven't seen before. If your case supports a previous high-end card, it'll support these suckers.

The 36% performance edge is indeed a rumor. The biggest question, even if it's true, is whether that's a maximum or the average. If it's an average, then Fermi will kick much rear end. If it's only a maximum, meaning only in certain situations and certain games, then nVidia has a problem. Keep in mind this is still conjecture and that the 36% number is still a rumor.

The issue with Fermi is that, with how late it is, it might be competing with the 5970 when it does get released. And even if it beats the 5970, AMD might still be able to stay price/performance competitive with Fermi.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
And even if performance is proportionally higher, a $500 video card is a much harder sell today than it was 3 years ago. Especially for those who are content with what they've got now and those who already got an HD 5000 series card.

On the positive side, I expect NV will stop pulling stunts like Batman once a performant part is out.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
And even if performance is proportionally higher, a $500 video card is a much harder sell today than it was 3 years ago. Especially for those who are content with what they've got now and those who already got an HD 5000 series card.

On the positive side, I expect NV will stop pulling stunts like Batman once a performant part is out.

Quite the contrary. If the card has extra pixel-pushing power to spare, you can be sure they're going to try to leverage PhysX and other taxing, perhaps even exclusive, features MORE to sell this beast. All of which is fine with me as long as it helps improve visual quality.
 

gorobei

Diamond Member
Jan 7, 2007
4,023
1,522
136
3D Vision Surround (3 monitors in 3D) requires 2 GF100 cards.

http://www.tomshardware.com/news/Nvidia-3D-Surround,9394.html

so total cost of ownership:
2 x GF100 cards
1 x SLI enabled motherboard
3 x 120 hz TN monitors
1 x 3d glasses
----------------
= lol


Mind you, I'm happy to see that they recognize the significance of 3+ monitor output (even sans 3D Vision). No word on whether you need 2 identical cards.

Side note: the projector setup is an interesting way to avoid bezels.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't understand what is so funny, actually. The way I see it, it's more flexibility for GT200 or potentially Fermi owners to have surround gaming with their 2D or 3D displays via SLI. I think it's great, and I was hoping for surround gaming when nVidia introduced SLI multi-monitor support a short time ago with one of their Big Bang driver releases.