More Rumours of G71 & G80


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Cookie Monster

However, they aren't full ALUs: one full-featured ALU and one mini ALU (I think the 6 series used this concept). Speaking in terms of full-featured ALUs, the R580 has 48.

Also, the R580 has a "fixed" number of TMUs (16 TMUs), while it also has 48 pixel shaders. But NV's method shows a lot more flexibility, as I explained.

:confused: ???
Unless NV reinvents the pixel pipe, there's no way the G71, assuming it's a 32-pipe card, will have more than 32 TMUs (the FX series had 2 TMUs per pipe, but IMO it's safe to bet they're not going that route again...). And pixel shaders cannot function as texture units, and vice versa.
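
To put the ratios in rough numbers (a quick Python sketch; the G71 line is pure rumor and both clocks are placeholder guesses, so treat it as illustration, not spec):

chips = {
    # name: full-speed pixel ALUs, TMUs, core clock in MHz
    "R580 (known)":  dict(alus=48, tmus=16, clock=650),
    "G71 (rumored)": dict(alus=32, tmus=32, clock=700),
}

for name, c in chips.items():
    # peak texture fetches and full-ALU issues per second, in billions
    gtexels = c["tmus"] * c["clock"] / 1000.0
    gops = c["alus"] * c["clock"] / 1000.0
    print(f"{name}: {gtexels:.1f} GTexel/s, {gops:.1f} G ALU ops/s")

R580 comes out math-heavy (3 ALUs per TMU), while a 32-pipe G71 would be 1:1; which layout wins depends on whether a game's shaders are dominated by math or by texture fetches.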
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: Cookie Monster
Originally posted by: munky
I'm actually more interested in the G80 vs R600 face-off. Will NV throw out unified shaders to get the card out before ATI? And SM4, lol... think of all the flaming coming up from that debate :laugh:

:laugh:

Unified shaders... tile based rendering... S.M 4.0.... 80/65nm... Quad SLi2/Crossfire2.. NV/ATi... :D


Do ATI have plans for Quad Crossfire, do you know?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
Originally posted by: Cookie Monster

However, they aren't full ALUs: one full-featured ALU and one mini ALU (I think the 6 series used this concept). Speaking in terms of full-featured ALUs, the R580 has 48.

Also, the R580 has a "fixed" number of TMUs (16 TMUs), while it also has 48 pixel shaders. But NV's method shows a lot more flexibility, as I explained.

:confused: ???
Unless NV reinvents the pixel pipe, there's no way the G71, assuming it's a 32-pipe card, will have more than 32 TMUs (the FX series had 2 TMUs per pipe, but IMO it's safe to bet they're not going that route again...). And pixel shaders cannot function as texture units, and vice versa.


Ah... I see. I think I got mixed up.

Pixel shaders, TMUs, ALUs... pipelines... sheesh.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: nib95
I'm not sure what you mean. The link above is not a review.
It's a preview, as you mentioned.

You asked:

"Would that be the new RD580 chipset that was used in the benches linked to above?"

To my knowledge there hasn't been a review of the RD580 yet. If there has, please link me to it.

I meant a link to something that would validate your ridiculous claim that ATI graphics cards run much faster on Crossfire motherboards, or that NVIDIA cards run much faster on NForce4 motherboards.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: Matthias99
Originally posted by: nib95
I'm not sure what you mean. The link above is not a review.
It's a preview, as you mentioned.

You asked:

"Would that be the new RD580 chipset that was used in the benches linked to above?"

To my knowledge there hasn't been a review of the RD580 yet. If there has, please link me to it.

I meant a link to something that would validate your ridiculous claim that ATI graphics cards run much faster on Crossfire motherboards, or that NVIDIA cards run much faster on NForce4 motherboards.


I never said 'much' faster.

I just said faster, so I assumed the difference would be more than 1-2 fps. But 1-2 fps is still a difference, isn't it?
I assume the difference may increase with newer drivers.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Originally posted by: supafly
It will run Quake III at over 1000fps!


I've been able to do that for a while now. However, at that kind of frame rate (it seems to kick in around 600+ fps) the game is unplayable, stuttering and jumping because the engine can't deal with timing properly at such a high frame rate! I have to play with vsync enabled for it to run smoothly.
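
My guess at what's going on (a toy sketch in Python, not actual id Tech 3 code): the engine clock counts whole milliseconds, so at several hundred fps the per-frame delta the game logic sees bounces between 1 ms and 2 ms, a 2:1 swing even when the real frame times are perfectly even.

def ms_steps(fps, frames=10):
    # Integer-millisecond frame deltas an engine would measure at a given fps.
    t, prev, deltas = 0.0, 0, []
    for _ in range(frames):
        t += 1000.0 / fps    # true elapsed time in ms
        now = int(t)         # engine clock truncates to whole milliseconds
        deltas.append(now - prev)
        prev = now
    return deltas

print(ms_steps(60))   # 16s and 17s -- a few percent jitter, looks smooth
print(ms_steps(700))  # a mix of 1s and 2s -- 2:1 jitter, visible stutter

Vsync caps the frame rate at a steady fraction of the refresh, which would explain why it smooths things out.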
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: nib95
I never said 'much' faster.

I just said faster, so I assumed the difference would be more than 1-2 fps. But 1-2 fps is still a difference, isn't it?

From your first post that started this whole mess:

All reviews running the X1900 on SLI mobos show it losing to the 7800 GTX 512mb in numerous games, but if you look at the reviews of the X1900 being run on RD480 chipset mobo's, the X1900 pretty much beats the GTX 512mb in every game, even OpenGL ones.

So its more then just a 'minor' difference.
Hense the reason why I said we can no longer test GPU's on a single test platform.

You said it was "more then (sic) just a 'minor' difference", and the X1900 on RD480 "pretty much beats the GTX 512mb in every game".

Basically, provide some benchmarks that prove this, or just drop it.

I assume the difference may increase with newer drivers.

I doubt there is a whole lot for them to tweak through driver updates. It's not like changing the motherboard makes the GPU core run faster, so anything GPU-limited is going to remain GPU-limited.
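
Put another way, with a toy model (illustrative only): frame time is roughly the max of the CPU-side and GPU-side work per frame, so shaving platform overhead only helps while the CPU side is the bottleneck.

def fps(cpu_ms, gpu_ms):
    # The slower side sets the frame time when CPU and GPU work overlap.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=8.0, gpu_ms=20.0))  # GPU-limited: 50.0 fps
print(fps(cpu_ms=7.0, gpu_ms=20.0))  # faster platform, same GPU: still 50.0 fps
print(fps(cpu_ms=8.0, gpu_ms=16.0))  # faster GPU: 62.5 fps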
 

nib95

Senior member
Jan 31, 2006
997
0
0
Matthias, thanks for explaining. I thought the difference was more than minor; I guess that's not the case. I prefer this kind of conversation, no need for threats! :| *cough* Ronin *cough*
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: exdeath
Originally posted by: supafly
It will run Quake III at over 1000fps!


I've been able to do that for a while now. However, at that kind of frame rate (it seems to kick in around 600+ fps) the game is unplayable, stuttering and jumping because the engine can't deal with timing properly at such a high frame rate! I have to play with vsync enabled for it to run smoothly.

...which card do you have?
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
lol @ arguments over the internet about cards that aren't even released yet.

All you guys need to know is that the X1900 series is the current winner; I've been saying this for a while.
 

dadach

Senior member
Nov 27, 2005
204
0
76
exactly...it is nvidia that is getting owned currently, but some people just have a hard time dealing with it...sad :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: dadach
exactly...it is nvidia that is getting owned currently, but some people just have a hard time dealing with it...sad :)

Yeah, it's ATI's turn to shine for a solid six weeks. Well, at least it's something for those poor bastards. ;)
In case anyone missed it, that was a joke. See? -------> :D

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dadach
exactly...it is nvidia that is getting owned currently, but some people just have a hard time dealing with it...sad :)

:roll:

The people who "have a hard time dealing" with their favorite company not having the overall fastest card at the moment have larger issues.

So do you think that in March, when the 7900s come out, ATI fans will be "sad" if they're faster than whatever ATI has at the time? LOL

If what you think were true, we would have had rashes of ATI fan suicides in 2005, as nVidia had the fastest graphics all year long.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nib95
Originally posted by: Cookie Monster
Originally posted by: munky
I'm actually more interested in the G80 vs R600 face-off. Will NV throw out unified shaders to get the card out before ATI? And SM4, lol... think of all the flaming coming up from that debate :laugh:

:laugh:

Unified shaders... tile based rendering... S.M 4.0.... 80/65nm... Quad SLi2/Crossfire2.. NV/ATi... :D


Do ATI have plans for Quad Crossfire, do you know?

There is a picture of it linked in my sig
|
|
v
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: dadach
exactly...it is nvidia that is getting owned currently, but some people just have a hard time dealing with it...sad :)

Getting owned? I wouldn't say so. The thing is, even though the X1900XTX may be the fastest single card, that doesn't mean ATi is selling them well.

Money comes from the low/midrange. The 6800GS/7800GT is dominating the $200-300 market. Not sure about the low end, but the 6600 DDR2 obviously looks more favorable than the X1300 Pro, and the 6600GT is still a really good budget card.

Just look at the consumer reviews on Newegg. A lot of people seem to favor NV right now. The EVGA 7800GT - Retail alone has 277 reviews, while the X1800XL with the most reviews has 29.

This tells you something, doesn't it?
 

dadach

Senior member
Nov 27, 2005
204
0
76
All I was saying is that ATI currently has the fastest card (this of course goes for the benchmarks with the highest quality settings, as 800x600 no-AA/no-AF tests are not relevant)... I was not talking about sales, just the quality of the product... Naturally I expected the hardcore nvidia fans/trolls to jump up like I'd cursed their families :) ... this was too easy... I would recommend waiting for the ACTUAL product and then maybe talking trash (like you could/did before the X1800 launch); until then, the best thing is to be quiet and not hold on to rumours, because it can come back to haunt you :)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: dadach
All I was saying is that ATI currently has the fastest card (this of course goes for the benchmarks with the highest quality settings, as 800x600 no-AA/no-AF tests are not relevant)... I was not talking about sales, just the quality of the product... Naturally I expected the hardcore nvidia fans/trolls to jump up like I'd cursed their families :) ... this was too easy... I would recommend waiting for the ACTUAL product and then maybe talking trash (like you could/did before the X1800 launch); until then, the best thing is to be quiet and not hold on to rumours, because it can come back to haunt you :)

Well... the term "owned" you used did imply sales, though. Because what's the use of a quality product if it's not even being sold? Now that's getting owned. However, ATi's new architecture really is impressive, not denying that.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: munky
Unless NV reinvents the pixel pipe, there's no way the G71, assuming it's a 32-pipe card, will have more than 32 TMUs (the FX series had 2 TMUs per pipe, but IMO it's safe to bet they're not going that route again...). And pixel shaders cannot function as texture units, and vice versa.
Right, Cookie Monster got overenthusiastic with the 48 TMUs bit, but don't NV4x and G7x share some transistors between the first pixel shader ALU and the texturing unit in each "pipe"? That would explain why you "lose" the primary ALU in a pipe when you texture.

32 TMUs is still twice what R580's packing. And if G71 goes to 24 ROPs, that's 48 stencil ops per clock, or three times as many as R580 without AA (only 1.5x as many with AA). So a 32-"pipe" G71 could conceivably kick R580's ass in some titles--bandwidth permitting.

The question is, is bandwidth a hard-ass or a softy? :)
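
Spelling that arithmetic out (back-of-the-envelope only; the 24-ROP figure is rumor, and 2 stencil-only ops per ROP per clock is the assumption behind it):

g71_rops, r580_rops = 24, 16

g71_stencil = g71_rops * 2       # stencil-only runs at double rate: 48/clock
r580_no_aa  = r580_rops * 1      # 16/clock
r580_aa     = r580_rops * 2      # 32/clock with AA

print(g71_stencil / r580_no_aa)  # 3.0 -- three times as many without AA
print(g71_stencil / r580_aa)     # 1.5 -- only 1.5x as many with AA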
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Pete
Originally posted by: munky
Unless NV reinvents the pixel pipe, there's no way the G71, assuming it's a 32-pipe card, will have more than 32 TMUs (the FX series had 2 TMUs per pipe, but IMO it's safe to bet they're not going that route again...). And pixel shaders cannot function as texture units, and vice versa.
Right, Cookie Monster got overenthusiastic with the 48 TMUs bit, but don't NV4x and G7x share some transistors between the first pixel shader ALU and the texturing unit in each "pipe"? That would explain why you "lose" the primary ALU in a pipe when you texture.

32 TMUs is still twice what R580's packing. And if G71 goes to 24 ROPs, that's 48 stencil ops per clock, or three times as many as R580 without AA (only 1.5x as many with AA). So a 32-"pipe" G71 could conceivably kick R580's ass in some titles--bandwidth permitting.

The question is, is bandwidth a hard-ass or a softy? :)

If NV can get GDDR4 on the G71, bandwidth might become less of a problem, but I would not bet on that happening. As for the ALUs, yes, one of the ALUs in each shader is also responsible for texture addressing (ATi has a separate unit), so in a real game one ALU may not always be doing math ops.
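
For scale (peak bandwidth = effective data rate x bus width / 8; the first two figures are shipping cards, the GDDR4 rate is purely a hypothetical):

def gb_per_s(effective_mhz, bus_bits):
    # Peak memory bandwidth in GB/s.
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

print(gb_per_s(1700, 256))  # 7800 GTX 512 (GDDR3): 54.4 GB/s
print(gb_per_s(1550, 256))  # X1900XTX (GDDR3):     49.6 GB/s
print(gb_per_s(2000, 256))  # hypothetical GDDR4:   64.0 GB/s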
 

imported_Aelius

Golden Member
Apr 25, 2004
1,988
0
0
Well, this back-and-forth is actually good.

While most people are sick of seeing cards go from 110 fps to 130 fps every few months in their favorite games, those same people do not run everything maxed, and they most certainly do not run games in the 1600-range resolutions with maxed graphics.

With LCDs being the most popular displays, people should realize that running any LCD at a resolution other than its native one has an impact on image quality. Period.

So yeah, I want to see new cards get better and better, and not just by a bit but by leaps and bounds, because today I simply cannot run most games at decent FPS at native resolution with maxed graphics. Then again, I run a 20" widescreen LCD, so that's 1680 x 1050. I'm just glad more games support the 16:10 aspect ratio than ever before.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
We need G71 to beat the X1900XTX heavily. Then ATI won't be able to say they have serious advantages in shader-heavy games with AA enabled, and they'll lose even more in OpenGL. This will force ATI to finally rewrite their OpenGL drivers :) and prices on the X1900XT will fall. ATI will get nervous about G80 and will 'unlock' 24 more pipes on R600, because you know they always had extra pipes in reserve even though the card was in development for 2 years hehe....
 

Compuzen

Member
Nov 25, 2005
161
0
0
Hmm, so should I get my second card for SLI now, or wait to see what happens? If I wait, nothing will happen; if I buy a second GTX KO, the new cards will hit the street next month. I'm getting the 24" Dell as soon as my check comes in. Will my one GTX handle 1920x1200 well?