Nvidia's G80 has 32 pixel pipes

Sable

Golden Member
Jan 7, 2006
1,130
105
106
http://www.theinquirer.net/default.aspx?article=32856

IT TURNS OUT that the fancy Nvidia G80 chip has taped out, and in working silicon it will have 32 pixel shaders and, as predicted, 16 vertex and geometry shaders.
Nvidia wants to stick with a two-to-one ratio and assumes that the games of tomorrow will need twice as much pixel work as vertex and geometry work.

We don't know the clock speed of the upcoming performer but we don't believe Nvidia can get more than 700MHz out of it - we could be wrong about that.

------------------------------------------------------------------------------------------------------

nice and committed on the clockspeed there Fu(a)d. ;)

 

rmed64

Senior member
Feb 4, 2005
237
0
0
Sounds good to me. I think my 12-piper might get upgraded early next year.
 

TroubleM

Member
Nov 28, 2005
97
0
0
From what I understood:
Ati R600: 64 shaders (vertex, pixel or geometry)
nVidia G80: 32 pixel shaders, 16 vertex shaders, 16 geometry shaders

If this is true, and Ati will be able to change "on the fly" the way its shaders behave and NVidia won't, it seems logical that Ati might have an ace up its sleeve.

Of course, this is just speculation, there's a lot more to a video card than just shaders.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: TroubleM
From what I understood:
Ati R600: 64 shaders (vertex, pixel or geometry)
nVidia G80: 32 pixel shaders, 16 vertex shaders, 16 geometry shaders

If this is true, and Ati will be able to change "on the fly" the way its shaders behave and NVidia won't, it seems logical that Ati might have an ace up its sleeve.

Of course, this is just speculation, there's a lot more to a video card than just shaders.

It all comes out equal. The R600 will be able to run 32 pixel shaders, 16 vertex shaders, and 16 geometry shaders if it needs to. I just wonder how ATI's hardware will decide which shader types need more resources. If ATI can get that working right, I see a winner in ATI's corner. IMO Nvidia needs to bring more than 32 pixel shaders to the table if they hope to come out on top. But what do I know? ATI did pretty well with the X850 and X1800, and those only had 16 pixel shaders. Time will tell.
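To make that concrete, here is a minimal sketch (my own illustration using the rumored unit counts, which are not confirmed specs) of how a unified pool could repartition itself per frame while a fixed split cannot:

```python
# Hypothetical sketch: a unified pool (rumored R600) repartitions its 64
# shader units to match each frame's workload, while a fixed split (rumored
# G80) cannot. The unit counts are rumors, not confirmed specs.

FIXED_SPLIT = {"pixel": 32, "vertex": 16, "geometry": 16}
UNIFIED_POOL = 64

def unified_allocation(workload):
    """Split the unified pool in proportion to per-frame demand."""
    total = sum(workload.values())
    return {kind: round(UNIFIED_POOL * share / total)
            for kind, share in workload.items()}

# A pixel-heavy frame: the unified pool shifts nearly everything to pixel
# work, while the fixed split leaves vertex/geometry units idle.
frame_demand = {"pixel": 90, "vertex": 5, "geometry": 5}
print(unified_allocation(frame_demand))  # {'pixel': 58, 'vertex': 3, 'geometry': 3}
print(FIXED_SPLIT)                       # always 32/16/16, whatever the frame needs
```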
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Can somebody tell me: is there a difference between a pixel shader and a pixel pipe? I thought the pipe count was the number of pixels that could be processed per clock. Are they the same thing?

Does the x1900xt have 16 pipes and 48 pixel shaders, or are the 48 actually vertex shaders? Somebody please explain.
Thanks.
 

Golgatha

Lifer
Jul 18, 2003
12,424
1,110
126
Anyone else think this is underwhelming? The X1900 series has 48 pixel shaders now. Will 8 more vertex shaders really make that much difference in actual game performance?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: TroubleM
From what I understood:
Ati R600: 64 shaders (vertex, pixel or geometry)
nVidia G80: 32 pixel shaders, 16 vertex shaders, 16 geometry shaders

If this is true, and Ati will be able to change "on the fly" the way its shaders behave and NVidia won't, it seems logical that Ati might have an ace up its sleeve.

Of course, this is just speculation, there's a lot more to a video card than just shaders.

Possible, but AFAIK DX9 games won't be able to take advantage of such a setup. Maybe ATI can do it in the driver?

I am curious what is/was holding back the x1900 series, as it supposedly has about twice the pixel resources but can only beat its competition by a few percentage points.

The games that are pixel limited should see higher performance on ATI's arch.

It is interesting to note Nvidia doubling the vertex shader capability. My understanding is that there are no games that are vertex limited, even on 8 pipes. However, I have read somewhere that you need much more vertex shader power if you want to do physics-type calculations on a GPU.

 

Sc4freak

Guest
Oct 22, 2004
953
0
0
ATI is using a unified design, and it will work with any 3D app. The switching of processing units is completely invisible to the software; it should be handled by the hardware.
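A toy dispatcher illustrates why that can be invisible to software (a sketch of the concept only, not ATI's actual scheduler): the driver just submits vertex and pixel work, and the hardware hands whatever is pending to any free ALU.

```python
# Toy model of a unified scheduler: work of any type goes to any free ALU,
# so a DX9 game never needs to know the units are shared. Leftover jobs
# simply wait for the next batch of free ALUs.
from collections import deque

work = {"vertex": deque(["v0", "v1"]), "pixel": deque(["p0", "p1", "p2"])}
free_alus = ["alu3", "alu2", "alu1", "alu0"]

while free_alus and any(work.values()):
    # Simple policy: drain vertex work first so pixels have geometry to shade.
    kind = "vertex" if work["vertex"] else "pixel"
    job = work[kind].popleft()
    print(f"{free_alus.pop()} runs {kind} job {job}")
```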
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
"We don?t know the clock speed of the upcoming performer but we don?t believe Nvidia can get more than 700MHz out of it - we could be wrong about that."

Way to have confidence in, or seem even partially excited about, any next-gen card.

Either way, it's not like I could afford anything anyways.
 

CP5670

Diamond Member
Jun 24, 2004
5,684
785
126
They don't say anything about the number of texture mapping units. I guess Nvidia will stick with the 1:1 ratio on those and have 32?
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
lol, I laughed when I read Fuad's story.

People, there is absolutely no merit anymore to anything this guy posts. First off, I think he's oblivious to the fact that this isn't NV4x. Seeing as he barely understood that architecture, it's very unlikely that he has any idea what he's talking about, other than what his "trusted sources" (who are usually some random forum members) are telling him.

And G80/R600 will be using GDDR4. 700MHz as a limit seems unlikely.


 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Apparently the current rumors have ATI at 64 shaders with 16 TMUs for R600.

With Nvidia, they will have 32 pixel shaders and 16 shaders split between geometry and vertex. They will also have an undisclosed number of TMUs... I would guess 32, to match the number of pixel shaders.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Genx87
Originally posted by: TroubleM
From what I understood:
Ati R600: 64 shaders (vertex, pixel or geometry)
nVidia G80: 32 pixel shaders, 16 vertex shaders, 16 geometry shaders

If this is true, and Ati will be able to change "on the fly" the way its shaders behave and NVidia won't, it seems logical that Ati might have an ace up its sleeve.

Of course, this is just speculation, there's a lot more to a video card than just shaders.

Possible, but AFAIK DX9 games won't be able to take advantage of such a setup. Maybe ATI can do it in the driver?

I am curious what is/was holding back the x1900 series, as it supposedly has about twice the pixel resources but can only beat its competition by a few percentage points.

The games that are pixel limited should see higher performance on ATI's arch.

It is interesting to note Nvidia doubling the vertex shader capability. My understanding is that there are no games that are vertex limited, even on 8 pipes. However, I have read somewhere that you need much more vertex shader power if you want to do physics-type calculations on a GPU.

The texture mapping units... Nvidia has more texturing power this generation. Lots of pixel shaders alone don't work; you still need the texturing to back them up.

Previously we used the "pixel pipe" to gauge performance, and a pipe consisted of 1 TMU, 1 pixel shader, and 0.66-1.0 ROP units. If you deviate from that design you won't get the linear scaling we've had over the past few generations, when those ratios were fixed.

An ATI R580 pixel pipe consists of 3 pixel shaders, 1 TMU, and 1 ROP. This gives more performance than one legacy pipeline, but not 3x, or even 2x for that matter.
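Some back-of-envelope math on that point (a simplified scaling model I'm assuming for illustration, not published figures): extra pixel shaders behind a single TMU only speed up the shader-bound portion of a workload.

```python
# Simplified model (an assumption, not benchmark data): texture-bound work
# is capped by the pipe's single TMU (rate 1.0), while shader-bound work
# scales with the number of pixel shaders behind that TMU.

def relative_rate(ps_per_pipe, shader_bound_fraction):
    """Per-pipe throughput relative to a 1:1 legacy pipe."""
    return shader_bound_fraction * ps_per_pipe + (1 - shader_bound_fraction)

# R580-style pipe (3 PS : 1 TMU : 1 ROP):
print(relative_rate(3, 0.50))  # 2.0x a legacy pipe on a 50% shader-bound load
print(relative_rate(3, 0.25))  # only 1.5x when texturing dominates
```

That lines up with the "more than one legacy pipe, but not 2x or 3x" behaviour described above.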

 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
I like it, but I still wish Nvidia had pushed the G80 as a unified-shader architecture like ATi is doing with the R600, because frankly I feel like a change. I like to switch between the two every once in a while, and my last two cards have been ATi.

Hopefully the first or second refresh of DirectX 10 cards will all use a unified shader architecture, because I don't plan to hop on that train right away. Then at least I'll have more reason to look closely at both brands of card.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
When will people stop posting "articles" (and I use that in the loosest sense of the word) from the INQ?
 

eastvillager

Senior member
Mar 27, 2003
519
0
0
Originally posted by: fierydemise
When will people stop posting "articles" (and I use that in the loosest sense of the word) from the INQ?

It's no different than anybody else posting their own opinion. Want us to stop doing that too? There wouldn't be many posts left on the boards. :)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: coldpower27
Originally posted by: Genx87
Originally posted by: TroubleM
From what I understood:
Ati R600: 64 shaders (vertex, pixel or geometry)
nVidia G80: 32 pixel shaders, 16 vertex shaders, 16 geometry shaders

If this is true, and Ati will be able to change "on the fly" the way its shaders behave and NVidia won't, it seems logical that Ati might have an ace up its sleeve.

Of course, this is just speculation, there's a lot more to a video card than just shaders.

Possible, but AFAIK DX9 games won't be able to take advantage of such a setup. Maybe ATI can do it in the driver?

I am curious what is/was holding back the x1900 series, as it supposedly has about twice the pixel resources but can only beat its competition by a few percentage points.

The games that are pixel limited should see higher performance on ATI's arch.

It is interesting to note Nvidia doubling the vertex shader capability. My understanding is that there are no games that are vertex limited, even on 8 pipes. However, I have read somewhere that you need much more vertex shader power if you want to do physics-type calculations on a GPU.

The texture mapping units... Nvidia has more texturing power this generation. Lots of pixel shaders alone don't work; you still need the texturing to back them up.


Depends on the game. If I'm reading the Beyond3D R580 technology interview right, ATI's engineers believe that until faster memory comes along there won't be a huge benefit from additional texturing units... Look at Oblivion, for instance: an XTX with 48 shaders running at 650MHz outperforms a 7800GTX 512 SLI setup with 48 shaders and 48 TMUs running at 550MHz.
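The raw numbers behind that comparison (rough arithmetic only; it ignores SLI scaling losses, memory bandwidth, and architectural differences):

```python
# Rough shader-throughput arithmetic for the Oblivion example above:
# raw shader rate ~ shader_count * clock.
xtx = 48 * 650   # X1900 XTX: 48 pixel shaders at 650 MHz
sli = 48 * 550   # 7800 GTX 512 SLI: 2 x 24 shaders at 550 MHz
print(xtx, sli, round(xtx / sli, 2))  # 31200 26400 1.18 -> ~18% more raw rate
```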
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: eastvillager
Originally posted by: fierydemise
When will people stop posting "articles" (and I use that in the loosest sense of the word) from the INQ?

It's no different than anybody else posting their own opinion. Want us to stop doing that too? There wouldn't be many posts left on the boards. :)

Yes, it's very different.

It's like saying "Oh well, Kristopher Kubicki reported a big lie, but at least it was his opinion, which we're all entitled to."

Or saying, "Oops, Anandtech misleadingly added 20 FPS to all the xy GPUs because he thinks they're better. I'll just run my own benchmarks and post them up so people can see the truth."

You're telling me that you'd excuse Kristopher and Anand's misuse of their capabilities on their public site just because it's their opinion?
(Not that they ever have; I'm just using them as an example.)

The average joe can post whatever he wants, but usually has no credibility. Just like you wouldn't go to HardForums and say "XY from Anand Boards says G80 will be quad-core!"

So why do we do it with Fuad? :confused:
 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Originally posted by: Nelsieus
Originally posted by: eastvillager
Originally posted by: fierydemise
When will people stop posting "articles" (and I use that in the loosest sense of the word) from the INQ?

It's no different than anybody else posting their own opinion. Want us to stop doing that too? There wouldn't be many posts left on the boards. :)

Yes, it's very different.

It's like saying "Oh well, Kristopher Kubicki reported a big lie, but at least it was his opinion, which we're all entitled to."

Or saying, "Oops, Anandtech misleadingly added 20 FPS to all the xy GPUs because he thinks they're better. I'll just run my own benchmarks and post them up so people can see the truth."

You're telling me that you'd excuse Kristopher and Anand's misuse of their capabilities on their public site just because it's their opinion?
(Not that they ever have; I'm just using them as an example.)

The average joe can post whatever he wants, but usually has no credibility. Just like you wouldn't go to HardForums and say "XY from Anand Boards says G80 will be quad-core!"

So why do we do it with Fuad? :confused:
Go whine in Forum Issues; people here are discussing graphics cards. The fact of the matter is that journalists have access to sources who can hint or "guess" at things but can say no more to anyone. That could be the case here, or he could have his own reasons. Either way, his information is more based on fact than anything else out there. Move on.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Nelsieus
Originally posted by: eastvillager
Originally posted by: fierydemise
When will people stop posting "articles" (and I use that in the loosest sense of the word) from the INQ?

It's no different than anybody else posting their own opinion. Want us to stop doing that too? There wouldn't be many posts left on the boards. :)

Yes, it's very different.

It's like saying "Oh well, Kristopher Kubicki reported a big lie, but at least it was his opinion, which we're all entitled to."

Or saying, "Oops, Anandtech misleadingly added 20 FPS to all the xy GPUs because he thinks they're better. I'll just run my own benchmarks and post them up so people can see the truth."

You're telling me that you'd excuse Kristopher and Anand's misuse of their capabilities on their public site just because it's their opinion?
(Not that they ever have; I'm just using them as an example.)

The average joe can post whatever he wants, but usually has no credibility. Just like you wouldn't go to HardForums and say "XY from Anand Boards says G80 will be quad-core!"

So why do we do it with Fuad? :confused:
You're correct in being critical of such articles, but I think you're missing the point here. Tech gossip is just something people make up as an excuse to discuss topics for which they have no valid information. It doesn't really matter if Fuad is really a psychic toad bent on world domination in part through an IT misinformation campaign. As long as it gives people an excuse to post about G80, it's fine. Why worry about idle speculation?
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: sandorski
The Inquirer Equation 2005/2006: X FutureCard = 32 Pipes!
Yeah, let's not forget that the 7900 was also supposed to have 32 pipes, according to the Inq. Time has passed and manufacturing has advanced, so it's no surprise that G80 will have 32 pipes. If it only had 24 pipes, I would have been surprised.
 

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
Originally posted by: Golgatha
Anyone else think this is underwhelming? The X1900 series has 48 pixel shaders now. Will 8 more vertex shaders really make that much difference in actual game performance?

You're missing the point: DX10 allows the mix of vertex, pixel, and geometry shaders to change every frame, so you could go from 64 pixel shaders to 64 vertex shaders if the frame calls for it.
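For example (my own illustration of the idea, not actual DX10 API code), the same unified pool of 64 units could look completely different from one frame to the next:

```python
# Illustration of per-frame reallocation in a unified pool of 64 units.
POOL = 64
frames = [
    {"pixel": 0.95, "geometry": 0.05},  # post-processing-heavy frame
    {"pixel": 0.40, "geometry": 0.60},  # geometry-heavy frame
]
for i, mix in enumerate(frames):
    alloc = {kind: round(POOL * share) for kind, share in mix.items()}
    print(f"frame {i}: {alloc}")
# frame 0: {'pixel': 61, 'geometry': 3}
# frame 1: {'pixel': 26, 'geometry': 38}
```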