G80 De-Mystified; GeForce 8800GTX/GT


Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: Schadenfroh
Originally posted by: inspire
Originally posted by: Ulfhednar
Originally posted by: SpeedZealot369
If this is true, time for external power bricks :(
That rumour is already in full circulation and the flames are burning ever hotter due to this article from June 5th.

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2770

ATI and Nvidia can bite me if this comes to pass; I will buy a Wii.

:thumbsup: Wii FTW.

That is right! Defy ATI by buying a Wii! That will show them!

Yes because ATI don't make the GPU for the Wii...

... Oh what?... they do.... oops
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: Genx87
Like me? I am not buying any of these cards at these prices, so nice try. Nvidia doesn't set rip-off prices, the market does; this is basic economics.

Looks like the video forum is full of liberals too. They don't understand Capitalism. :D :p

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
I can't wait to find out about the new AA mode. Water cooling sucks though. :(
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Holy crap! Some weird, wacky architecture if any of this info is true. I'd love to see it in person. 700M transistors and a 1.5 GHz clock speed would be absolutely shocking if true. It looks set to crush current-generation cards.

But if there is some truth in this, and the card only has 2x the pixel power and 12x the vertex power, I have to question the internal architecture. Is that really wise? Isn't the future heading toward heavier and heavier pixel shader loads (or have I just drunk too much of ATI's Kool-Aid)?
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
They took it down. Someone big complained.

The specs look really bizarre, almost too bizarre to make up.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Well, remembering all the 32-pipe, 700MHz G71 monster specs floating around half a year ago, I wouldn't bet too much money on these rumored specs. Some of them seem probable, but 700M transistors and 250W power consumption are just absurd.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The stuff has been taken down from both hardspell and vr-zone.

What does this all mean? :D

Even if the real G80 only meets 60-80% of those specs... it's going to be a monster nonetheless.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
The stuff has been taken down from both hardspell and vr-zone.

What does this all mean? :D

...that Shamino messed up again :p

Who knows. I say just sit back and relax. Even when it hits, I'm going to wait and see just how good it does against the R600. If it beats it in what I want it to, then I'll gladly buy a G80. I think Nvidia has what it takes to be an industry leader, but I think ATI has that too. We'll just have to see what happens and not pre-ejaculate before the veil even falls.
 

stelleg151

Senior member
Sep 2, 2004
822
0
0
700M transistors correlates well with the speculated huge power consumption.

The fact that they quickly took these specs down but never took those G71 specs down is also suspicious.

Really, I just hope the mid-November part is right, because I bought a 7300GT last month to tide me over until G80 and want to be able to use the step-up program.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: josh6079
Originally posted by: Cookie Monster
The stuff has been taken down from both hardspell and vr-zone.

What does this all mean? :D

...that Shamino messed up again :p

Who knows. I say just sit back and relax. Even when it hits, I'm going to wait and see just how good it does against the R600. If it beats it in what I want it to, then I'll gladly buy a G80. I think Nvidia has what it takes to be an industry leader, but I think ATI has that too. We'll just have to see what happens and not pre-ejaculate before the veil even falls.

:laugh:

I wonder if the whole "...scaling up to 1.5GHz..." means GDDR4 and not the core clock speed.

Edit - I wonder if this 700M chip means 350M + 350M (dual GPU), similar to the 7950GX2 except on one PCB.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Cookie Monster
I think G80 is a "unified" shader architecture.
But it's a bit different from your normal definition of a USA (unified shader architecture).

Now, the rumours from various sources point to the G80 being split into 2 sections:
dedicated pixel shaders and unified VS/GS.

For DX10, we know that a shader can be addressed as a PS/VS/GS, hence the move to a USA for future DX10 GPUs.

Now, vr-zone said 2x pixel shader performance and 12x VS performance.
That gives 48 dedicated pixel shaders (2 x 24 = 48). The VS/GS side is unified with a total of 96 shaders, which could run as 48 VS / 48 GS if split evenly; and since it's unified, all 96 could act as VS, hence 12 x 8 = 96.

So, basically, G80 might have 48 PS plus 96 unified VS/GS (think of it as 48 VS and 48 GS): 144 shaders in total, 48 of them dedicated and 96 unified.

For the 384-bit bus, I think this might help some people understand the reasoning behind it.
From B3D, quoting Jawed:

My earlier idea that the 256-bit bus is for normal work and the "odd" bus is for Constant Buffers, Streamout Buffers and post-GS cache:

http://www.beyond3d.com/forum/showpo...&postcount=693
Maybe it's a 256-bit bus to conventional local RAM plus a 128-bit bus to a pool of memory dedicated to:
constant buffers
post geometry shader cache
stream-out


is making more sense now... In GTX the odd bus is 128 bits wide, total 768MB. In GT the odd bus is 64 bits wide (with half the memory, only 64MB), total 640MB. Also makes the board a bit cheaper to make.

In many ways, it's arguable that the access patterns to these buffers are quite unlike the normal access patterns for local RAM. Well, that's certainly the case for Streamout and the post-GS cache (which are tiled-write, serial read). Constant buffers (and the associated Texture buffers?) are more like textures in one sense, so maybe they'll live in regular memory (which is optimised for tiled-write, tiled-read).

If the two-chip rumour has any foundation then I could see something like:

VS/GS die <-> VB,CB,TB,SO,PGSC odd-RAM -> PS die <-> VRAM

Note that the VS/GS pipes don't generally need access to textures; TBs and CBs serve those functions (as well as VBs, vertex buffers, for input data). The CPU would send vertices directly to the odd RAM as a serial stream, and it would directly update the CBs in odd RAM too.

The PS die then has read-only access to the odd RAM (need a better name!!!) and all framebuffer operations work against VRAM. Obviously the PS die can access any TBs and CBs in the odd RAM, whilst fetching normal textures from VRAM.

I'm also led to believe that G80 will have quite impressive DX10 performance, not to mention DX9 performance.

Edit - finally, 700M is a possibility. That's because Nvidia themselves said their next-gen part was "...over half a billion transistors...", which means >500M. So it's all possible.


Well, I'm backing Cookie Monster here; what I've just read seems entirely reasonable,

and it would tie in with the rumours that Nvidia weren't going fully unified with their shaders.

48 dedicated PS with 96 further shaders that can be split between vertex shading and geometry work seems a reasonable argument/idea to me (the quick arithmetic check below runs through those numbers).
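
For anyone who wants to sanity-check the numbers in the quoted post, here is a minimal arithmetic sketch. It assumes the rumoured 2x pixel / 12x vertex multipliers are measured against G71's 24 pixel shaders and 8 vertex shaders, and that every 32-bit slice of the memory bus carries a single 64MB chip; none of those baselines are confirmed by the leak itself.

```python
# Minimal sketch of the rumoured G80 arithmetic discussed above.
# Assumptions (not confirmed by the leak): the 2x/12x multipliers are
# relative to G71's 24 pixel shaders and 8 vertex shaders, and each
# 32-bit memory channel carries one 64 MB chip.

G71_PIXEL_SHADERS = 24
G71_VERTEX_SHADERS = 8

pixel_units = 2 * G71_PIXEL_SHADERS    # 2x pixel shader performance -> 48 dedicated PS
vs_gs_units = 12 * G71_VERTEX_SHADERS  # 12x VS performance -> 96 unified VS/GS
total_units = pixel_units + vs_gs_units
print(pixel_units, vs_gs_units, total_units)  # 48 96 144

MB_PER_32BIT_CHANNEL = 64  # one 64 MB chip per 32-bit channel (assumption)

def framebuffer_mb(bus_width_bits: int) -> int:
    """Total memory if each 32-bit slice of the bus gets one 64 MB chip."""
    return (bus_width_bits // 32) * MB_PER_32BIT_CHANNEL

print(framebuffer_mb(256 + 128))  # GTX: 256-bit + 128-bit "odd" bus -> 768 MB
print(framebuffer_mb(256 + 64))   # GT:  256-bit +  64-bit "odd" bus -> 640 MB
```

Both bus totals match the 768MB and 640MB figures quoted above.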
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
It's about time the vertex and geometry pipelines received some attention. In the past all the love has gone to the pixel shaders (because it's easier to provide visible proof of your improvements that way).
 

biostud

Lifer
Feb 27, 2003
20,134
7,252
136
I'd definitely wait for 65nm re-spins of these chips. The software taking advantage of DX10 should also be more available by then...

And only playing HoMM V ATM does not really increase my interest in buying any of these cards.
 

robkas

Member
Aug 7, 2006
152
0
0
I have a question... (it may be a dumb one).

Will you have to have Windows Vista in order to use DX10 with DX10 cards, or does it just depend on whether the GPU itself is DX10-capable and can then be used with Windows XP?
 

biostud

Lifer
Feb 27, 2003
20,134
7,252
136
Originally posted by: robkas
I have a question... (it may be a dumb one).

Will you have to have Windows Vista in order to use DX10 with DX10 cards, or does it just depend on whether the GPU itself is DX10-capable and can then be used with Windows XP?

DX10 is Vista-only.
 

robkas

Member
Aug 7, 2006
152
0
0
Originally posted by: biostud
Originally posted by: robkas
I have a question... (it may be a dumb one).

Will you have to have Windows Vista in order to use DX10 with DX10 cards, or does it just depend on whether the GPU itself is DX10-capable and can then be used with Windows XP?

DX10 is Vista-only.
OMGWTFBBQ!!!!!

If a DX10 card is released before January, it is going to be SO TOUGH deciding whether to do the EVGA step-up plan now (I just bought a 7950GT), given that I don't plan on building a new rig with Vista until the '07 holiday season.
 

smthmlk

Senior member
Apr 19, 2003
493
0
0
Originally posted by: TheLiberalTruth
Originally posted by: Genx87
Like I said, the market will dictate prices. Apparently, however, you are in the minority, and thus Nvidia prices stay higher than you'd like.

So, why do you think that? You're probably some 17-year-old nub who has taken one economics class in high school and now thinks he understands all aspects of the economy.
Haven't you ever noticed how other countries don't always have the same prices for products as we do? Not to mention the fact that the market doesn't always have a whole heck of a lot to do with prices anyway. Ever heard of price fixing? Seems you're not quite the omnipotent being you purport yourself to be.
It's really too bad that the people with the lowest IQs and least significant things to say are so often the loudest.

You've completely lost the plot, man. Most asinine post I've read in quite some time.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Does anyone think it's possible that the reason for the secondary (if it is in fact a secondary) narrower memory bus, 128-bit on the GTX and 64-bit on the GT, is for onboard cache memory for geometry processing, or part of it? It could explain the ENORMO jump in transistor count if the core(s) have, let's say, onboard dedicated physics processing capabilities. Or even a ring-bus-type memory controller akin to ATI's? So many possibilities arise with this weird new setup.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
I'll have to agree that this is going to be a dual-core GPU on a single PCB from nVidia. Seems like it from the odd specs. It would also explain the high cost and power requirements, as others have stated. Heck, I should have just quoted everyone and left my mouth and fat fingers alone :p
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: BFG10K
All water cooling sucks.

When you spring a leak...well, you already know what happens then.

OH!!!!!!!!!

That was below the belt!! :laugh:

Water cooling is awesome as long as you're more careful than I initially was. I've never had a leak in any of my water cooling setups. When I lost my X1900XTX, it was because I was repositioning the reservoir and refilling it at the same time, and during that process a few drops of water got somewhere I'd rather they hadn't. You never have to worry about temps, and my water cooling loop is next to silent.