The New and Improved "G80 Stuff"


Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: Elfear
Some interesting info I found on B3D.

"Here are some fairly reliable numbers from the same source who was spot on with previous numbers (eg RV570, RV560, G71) before they got launched:

            3DM05 / 3DM06
8800GTX     16909 / 11843
8800GTS     15233 / 10071
7950GX2     12931 /  8781
1950XTX     11866 /  7007
1950PRO     10044 /  5600

No specs of the system were given, but all the cards ran on the same system."

Source post #1487

So, are 1700 more points in 3DM05/06 worth an extra $150+? The GTS should overclock better.


 

Elfear

Diamond Member
May 30, 2004
7,167
824
126
Originally posted by: Dethfrumbelo

So, are 1700 more points in 3DM05/06 worth an extra $150+? The GTS should overclock better.

I guess we'll have to see what difference those extra shaders make in games. If it's the same % difference as 3DMark, then I'd say probably not, unless you need the extra juice for Crysis, Oblivion, or some other resource hog.
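
For a rough sense of scale, here is a minimal sketch (using only the rumoured scores quoted above, which are themselves unverified) that turns the GTX-vs-GTS gap into percentages:

```python
# Rumoured 3DMark scores quoted earlier in the thread (unverified).
scores = {
    "8800GTX": (16909, 11843),  # (3DMark05, 3DMark06)
    "8800GTS": (15233, 10071),
}

def pct_faster(a, b):
    """Percentage by which score a exceeds score b."""
    return (a - b) / b * 100

gtx05, gtx06 = scores["8800GTX"]
gts05, gts06 = scores["8800GTS"]

print(f"GTX over GTS, 3DMark05: {pct_faster(gtx05, gts05):.1f}%")  # ~11.0%
print(f"GTX over GTS, 3DMark06: {pct_faster(gtx06, gts06):.1f}%")  # ~17.6%
```

So the rumoured gap works out to roughly 11% in 3DMark05 and 18% in 3DMark06; whether that is worth the extra $150+ is exactly the question being asked here.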
 

impemonk

Senior member
Oct 13, 2004
453
0
0
I've been out of it here at this forum, but man... seeing a prototype graphics card like that makes me scared... imagine the loudness of the fan, or better yet, imagine all the strain a video card that big will put on your PCIe slot and mobo in general... geez... I hope the 8600GT or whatever is sized reasonably, like the 7600GTs. Those are decent sizes for a vid card.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I'm still not at all convinced G80 will be 700 million transistors.

I'd expect that figure to be either the 80nm figure or the final size the G8x architecture reaches.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
From B3D quoting CJ:

Here's some additional info:

NVIDIA unified architecture with GigaThread Technology
Full DirectX 10 Shader Model 4.0 support
NVIDIA SLI Ready
16x full screen anti-aliasing (Lumenex Engine) [will work with HDR according to some sources]
True 128-bit floating point high dynamic-range (HDR) (Lumenex Engine)
NVIDIA Quantum Effects physics processing technology (advanced shader processors architected for physics computation)
Two dual-link DVI outputs (2560 x 1600 resolution displays)
NVIDIA PureVideo support
PCI Express support
OpenGL 2.0 support
NVIDIA ForceWare Unified Driver Architecture
Built for Microsoft Windows Vista

Note that the GTX is rumoured to have 2 SLI connectors, similar to CrossFire. The GTS is rumoured to have 1. (The images of the leaked G80 boards are early samples, so they will not have these modifications.)

What is the Lumenex engine?

All the rumours are pointing to 700 million transistors on a 90nm process. No other figures have been heard, and no mention of 80nm either.

Anyhow... HDR + 16xAA, anyone? :D
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Well, I don't particularly care what the rumors have to say. I can well remember the NV3x rumors and the R4xx 'extreme pipeline' rumors. If the 700M figure pans out, I'll be pleasantly surprised.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Well, I don't particularly care what the rumors have to say. I can well remember the NV3x rumors and the R4xx 'extreme pipeline' rumors. If the 700M figure pans out, I'll be pleasantly surprised.

it's nice to see an nvidia fan bring up NV30
:Q

:D

Nov 18 . . . we'll know . . . for sure

did anyone notice theInq's latest prediction . . . that g80 will be in short supply after the initial launch?
:shocked:

EDIT:

the VERY latest: :p

More G80 numbers outed, on more CPUs
WANT HARD NUMBERS on G80? Want specs? Well, you can find the specs at PC Welt (http://www.pcwelt.de/news/hardware/video/61081/index.html), but they are in German. As for numbers, we told you that they would be at about 12,000, give or take a driver revision.

That brings up the question of 'on what?'. Those numbers are on a Kentsfield; how will it do on your CPU?

Well, a Conroe 2.93 will score about 10,500, and an FX-62 will hit almost 10,000 on the nose. It is interesting that the quad will hit notably higher with a lower clock but a higher bus. I wonder if it is the threading or the FSB giving the kick?

nvidia is leaking more info on g80 than for any other launch in years!!

and now they are threatening to cut fingers off:
NVidia clamps down on leaks . . . Psst... Slides next week
In case you haven't noticed, the clampdown failed pretty miserably, but hey, they had to try. That said, there are no slides being given out until next Thursday, but we aren't supposed to know that either. The interesting thing is what they are planning for next year.

Since having no slides still allowed the information to leak out by text, email and phone, that will be clamped down on in the next go-around. The cunning plan here is to stop it at the source - fingers. Each attendee of editors' day will be issued a pair of wire cutters and be kindly asked to remove all fingers and finger-like extremities in the vain hope of keeping information in check. We are told there are no plans to change any slides, though.

funny

:roll:
 
blurredvision

Oct 19, 2000
17,860
4
81
Originally posted by: apoppin
it's nice to see an nvidia fan bring up NV30
:Q

:D

Nov 18 . . . we'll know . . . for sure

did anyone notice theInq's latest prediction . . . that g80 will be in short supply after the initial launch?
:shocked:
I thought it was November 8th?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: blurredvision
Originally posted by: apoppin
it's nice to see an nvidia fan bring up NV30
:Q

:D

Nov 18 . . . we'll know . . . for sure

did anyone notice theInq's latest prediction . . . that g80 will be in short supply after the initial launch?
:shocked:
I thought it was November 8th?

did a '1' get left in there?
:Q

and according to theInq it is the 7th

:D

i woulda thought you'da commented on the leaked hard-numbers preview instead of my typo.

nov 18 is the last day you can buy a g80 . . . 'till next February. :p

:D

[just kidding] :p

Indications are that G80s will disappear as they launch and that they will be in tight supply for at least the first six weeks.

or am i?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://www.pcwelt.de/news/hardware/video/61081/index3.html

my German is bad . . . but even i can figure out the preview:
excerpts
700 million transistors, unified shader architecture, and GigaThread technology

In addition, the GTX flagship has two 6-pin power connectors. This is necessary because the card burns up to 200 watts under full load. For stable operation Nvidia therefore specifies at least a 450-watt power supply that delivers no less than 30 amps on the 12-volt rail (GeForce 8800 GTS: 400 watts / 26 amps).

Quantum Effects physics processor and higher HDR rendering precision, including 16x AA

So that the graphics memory doesn't hold back the computing power of the 128 streaming processors, Nvidia has given the GeForce 8800 GTX a 384-bit-wide memory interface. On top of that, the flagship model gets access to a full 768 MB of GDDR3 memory clocked at 900 MHz, which pushes the theoretical maximum memory bandwidth to a hefty 86.4 GB/s.

i don't think the nvidia fans will be too disappointed
:)
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
What I'll be most interested to see is how it does with high AA & HDR enabled, and whether there's less of a performance hit than with R580 & G70.

As to lack of availability, I wouldn't doubt it, but I'm not going to get gouged. If there's a local store that carries them, you've probably got a better shot there than at ZZF, who'll probably put them up for $800 and sell out in 15 minutes.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Gonna have to sit on the EVGA website throughout all of the 7th and the 8th so that I can jump on the step-up... :(
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Nightmare225
Gonna have to sit on the EVGA website throughout all of the 7th and the 8th so that I can jump on the step-up... :(

My guess is that eVGA step-up is going to be a sign-up-and-wait situation, as they limit the number of cards they release for step-up. A friend of mine stepped up from a 7800GT to a 7900GT KO last spring, and that's pretty much how it went. He waited about six weeks before he actually got to do the step-up. As long as you get your request in before the 90 days are up, you're good to go. No big deal IMO; as long as I can step up, I'll be happy. I think I'll be fine with my GTXes for an extra few weeks, and so will you with your GX2.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: keysplayr2003
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.

It's not a single chip.
 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Originally posted by: apoppin
anybody have a good translation?



Yeah, I don't like what I'm seeing. I want a good translation too.


"This is necessary because the card burns up to 200 watts under full load. For stable operation Nvidia therefore specifies at least a 450-watt power supply that delivers no less than 30 amps on the 12-volt rail (GeForce 8800 GTS: 400 watts / 26 amps)."


I don't know about others, but those numbers worry me. I can't tell, though, whether they are quoting rumors or what. 400 watts, 450 watts, 26 amps, 30 amps? WTF??
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.

It's not a single chip.

It WILL be a single chip, no doubt about that. I say this because NV40 had 222 million transistors on 130nm; 700 million on 90nm doesn't sound impossible.


 

CP5670

Diamond Member
Jun 24, 2004
5,676
776
126
Yeah, I don't like what I'm seeing. I want a good translation too.


"This is necessary because the card burns up to 200 watts under full load. For stable operation Nvidia therefore specifies at least a 450-watt power supply that delivers no less than 30 amps on the 12-volt rail (GeForce 8800 GTS: 400 watts / 26 amps)."


I don't know about others, but those numbers worry me. I can't tell, though, whether they are quoting rumors or what. 400 watts, 450 watts, 26 amps, 30 amps? WTF??

I wouldn't worry too much about Nvidia's suggested numbers, as they need to take low-quality power supplies into account. IIRC, ATI recommended 30A for the X1900s, which was pretty exaggerated. That being said, it does look like these cards will be power hogs.
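
To put those amp figures in perspective, here is a minimal sketch of the 12V-rail arithmetic behind the quoted requirements; the 200W peak draw is the rumoured PC Welt number for the GTX, and the "headroom" is simply whatever is left over for the rest of the system:

```python
# Rumoured PSU requirements from the PC Welt preview quoted above (unverified).
requirements = {
    "8800GTX": {"psu_watts": 450, "rail_amps": 30, "card_peak_watts": 200},
    "8800GTS": {"psu_watts": 400, "rail_amps": 26},  # card peak draw not quoted
}

for card, spec in requirements.items():
    rail_watts = spec["rail_amps"] * 12  # power actually available on the 12V rail
    line = f"{card}: {spec['rail_amps']}A x 12V = {rail_watts}W on the 12V rail"
    if "card_peak_watts" in spec:
        headroom = rail_watts - spec["card_peak_watts"]
        line += f", leaving ~{headroom}W for the rest of the system after a {spec['card_peak_watts']}W card"
    print(line)
```

Seen that way, the 30A spec is less about the card alone and more about leaving enough 12V headroom for a worst-case system on a mediocre supply, which fits the point about low-quality PSUs.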
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
OK, a more detailed translation taken from B3D:

For those of us who haven't mastered the tongue of Germany yet:

- A GFX card weighing 800 grams and 26 centimeters long
- Two PCIe power connectors; one is a must to satisfy the 200 W peak load, two are recommended for stable performance
- Nvidia recommends a 450 W PSU, with at least 30 A on the 12 V line.
- 700 million transistors. I think it's a ridiculous amount, compared to 375 million for the 80 nm X1950XTX from ATI. Can they actually produce this? Okay... if TSMC yields are THAT good (and they are good...).
- 575 MHz core clock, 1350 MHz for the streaming processors. What the heck are these? Pixel shaders in pre-DX10 speak? This sounds like custom design, because with a traditional ASIC approach (though GPU people usually stay away from custom logic, unlike AMD and Intel) this is hard to achieve.
- Special logic for physics included (Quantum Physics Processor). I just love this Marchitecture nonsense from marketing people.
- 128-bit HDR. Can be used in combination with AA (16x).
- 128 streaming processors, 384-bit memory bus running at 900 MHz, giving 86.4 GB/s. Max. 768 MB GDDR3. (The bandwidth arithmetic is checked in the sketch right after this list.)
- 8800 GTS will have a 320-bit memory bus, giving 64 GB/s. Max. 640 MB GDDR3.
- Two dual-link DVI outputs, one analog video in/out. HDCP supported.
- OpenGL 2 support + DX9 + DX10
- 8800 GTX will cost 650 euros, 8800 GTS will cost 500 euros. With the usual rip-off prices in Europe, I guess US prices will be 20%-30% lower.
- GTX has a 575 MHz core clock, 1350 MHz for the streaming processors, and a 900 MHz memory clock
- GTS has a 500 MHz core clock, 1200 MHz streaming processor clock, and an 800 MHz memory clock
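
As a quick sanity check on those bandwidth figures, here is a minimal sketch of the standard GDDR3 peak-bandwidth arithmetic (bus width in bytes x memory clock x 2 for the double data rate), using only the bus widths and clocks quoted above; the helper name is just for illustration:

```python
# Rumoured memory configs from the translation above (unverified).
def gddr3_bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Theoretical peak bandwidth in GB/s: bytes per transfer x DDR transfers per second."""
    return (bus_bits / 8) * (mem_clock_mhz * 1e6 * 2) / 1e9

print(f"8800GTX: {gddr3_bandwidth_gbs(384, 900):.1f} GB/s")  # 86.4, matches the quoted figure
print(f"8800GTS: {gddr3_bandwidth_gbs(320, 800):.1f} GB/s")  # 64.0, matches the quoted figure
```

Both results land exactly on the quoted 86.4 GB/s and 64 GB/s, so the bus-width and memory-clock rumours are at least internally consistent.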
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Cookie Monster
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.

It's not a single chip.

It WILL be a single chip, no doubt about that. I say this because NV40 had 222 million transistors on 130nm; 700 million on 90nm doesn't sound impossible.


Do you know how many transistors reside in the current 7900s? Under 300 million, on 90nm. Now, to me, 700 million on 90nm sounds impossibly high, especially with a base core clock of 575MHz. The 7900GTX has a core only 75MHz faster, with fewer than 300 million transistors on 90nm, and look at the huge cooler it needs. They must have done something really interesting with the process to allow something this large, clocked that high. Let's not even talk about the shaders clocked somewhere in the stratosphere.
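
Purely as a back-of-the-envelope check on that reaction, here is a minimal sketch comparing the rumoured G80 figure against G71 (the 7900GTX); the ~278 million transistor count for G71 is my own assumption of the commonly cited figure, not something quoted in this thread:

```python
# Rumoured G80 versus the shipping G71 (7900GTX), both on 90nm.
g71_transistors = 278e6  # ASSUMPTION: commonly cited G71 figure, not quoted in this thread
g80_transistors = 700e6  # rumoured figure under discussion
g71_core_mhz = 650       # 7900GTX core clock ("only 75MHz faster" than the rumoured G80)
g80_core_mhz = 575       # rumoured G80 core clock

print(f"Transistor ratio: {g80_transistors / g71_transistors:.2f}x on the same process")  # ~2.52x
print(f"Core clock change: {(g80_core_mhz - g71_core_mhz) / g71_core_mhz * 100:.0f}%")    # ~-12%
```

So the rumour amounts to roughly 2.5x the transistors of G71 on the same node, traded against a core clock about 12% lower.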
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Cookie Monster
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.

It's not a single chip.

It WILL be a single chip, no doubt about that. I say this because NV40 had 222 million transistors on 130nm; 700 million on 90nm doesn't sound impossible.

So you're saying they have different clock speeds for different parts of the same ginormous chip?

k.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Gstanfor
Well, I don't particularly care what the rumors have to say. I can well remember the NV3x rumors and the R4xx 'extreme pipeline' rumors. If the 700M figure pans out, I'll be pleasantly surprised.

:thumbsup:

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Acanthus
Originally posted by: Cookie Monster
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: nitromullet
Originally posted by: Gstanfor
That's only vendor overclocking being banned, if I'm reading that correctly (is there a correct way to read an INQ article?). I imagine end users will still be able to overclock.

Oh yeah, I'm quite sure that NVIDIA will continue to allow end-user OCing via Coolbits. I'm also not really all that concerned about vendor OCs, but it will be interesting to see how the vendors try to distinguish themselves when they are all bound to stock clocks. I also think it is probably somewhat indicative of how hard NV is pushing these cards at stock, though. I'm guessing they won't be the best OCers, but we'll see.


I'm just amazed they're getting 575MHz with 700 million transistors at 90nm. That is probably pushing the envelope right there, hence NVIDIA's apprehension about OEM factory OCs.

It's not a single chip.

It WILL be a single chip, no doubt about that. I say this because NV40 had 222 million transistors on 130nm; 700 million on 90nm doesn't sound impossible.

So you're saying they have different clock speeds for different parts of the same ginormous chip?

k.


It's called clock domains: different parts of the GPU run at different speeds. NVIDIA has been doing this since NV40.
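
For what it's worth, here is a minimal sketch of why separate clock domains matter for the rumoured numbers; the 2 FLOPs per streaming processor per clock is purely an illustrative assumption (the real per-clock issue rate is not in any of the leaks), so treat the output as a shape rather than a spec:

```python
# Rumoured clocks from earlier in the thread (unverified).
core_clock_hz = 575e6     # ROPs / texture hardware run in the core domain
shader_clock_hz = 1350e6  # the 128 streaming processors run in a separate, faster domain
num_sps = 128
flops_per_sp_per_clock = 2  # ASSUMPTION for illustration only (e.g. one multiply-add per clock)

shader_domain_gflops = num_sps * shader_clock_hz * flops_per_sp_per_clock / 1e9
same_units_at_core_clock = num_sps * core_clock_hz * flops_per_sp_per_clock / 1e9

print(f"Shader domain at 1350 MHz: ~{shader_domain_gflops:.0f} GFLOPS")                 # ~346
print(f"Same units at the 575 MHz core clock: ~{same_units_at_core_clock:.0f} GFLOPS")  # ~147
```

Running the shader array in its own faster domain more than doubles the theoretical shader throughput compared with clocking everything at the 575 MHz core clock, which is presumably the point of the split.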