R600 nice read


BlizzardOne

Member
Nov 4, 2006
88
0
0
Originally posted by: Cookie Monster
The thing is that nVIDIA is preparing a total of nine G80 SKUs. 5 of them are found in the drivers i think.

Sure it was nine G80 SKU's, and not G8x's? Being spoiled for choice is one thing, but nine derivatives of essentially the same thing is just nonsense.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BlizzardOne
Originally posted by: Cookie Monster
The thing is that nVIDIA is preparing a total of nine G80 SKUs. 5 of them are found in the drivers i think.

Sure it was nine G80 SKU's, and not G8x's? Being spoiled for choice is one thing, but nine derivatives of essentially the same thing is just nonsense.

Well that could include mobile and integrated chips. G8950GX2 FTW!
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: tuteja1986
Originally posted by: Giffen
By the time ATI releases that card Nvidia will be well on their way to releasing their next gen card, and I doubt the R600 will hold a candle to the G80 anyway.

lol newbie ;( ... it takes atleast few months to counter attack ... i say G81 will come out in April earliest. Also Nvidia wouldn't need to worry about R600 when they have all noobs believed Nvidia is the best and ATI knows crap all. A noob like you Giffen ;)

If you ask me, Nvidia haven't just been sat on their arses.

I personally think they haven't played all their cards; to me the G80 doesn't seem like the full thing. Why show your absolute best hand when the competition can be beaten easily, and for some months, with a (relatively) worse one?

It's awesome, no doubt, but I think there's a little more to come. Remember that G80 is manufactured on a tried and tested 90nm process. The move to 65nm surely isn't far off, and when it happens I think we're gonna see some nice clock increases. (If ATI is aiming for 700-800MHz, there's no reason why Nvidia can't aim for the same, and they will have had time to tweak and slim an already impressive design.)

I heard a rumour that G80 was in fact designed with a 512-bit memory bus in mind, but that it wasn't fully used because of cost (cost to implement, cost of it not working reliably). 384-bit does look odd, as these things generally tend to jump to the next power of two.

Personally I don't think we'll see a refreshed G80 with a 512-bit memory bus (I don't think they would shrink to 65nm and widen the bus at the same time), though I'd like to be wrong. I'd envisage a die shrink and a copious clock boost in an effort to combat R600.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: munky
First of all, it's 16 ROPs, not pipes. Those are completely different things, and until the G80, NV was also using 16 ROPs. But it doesn't make sense: if the R600 will have a 512-bit external memory bus and a 1024-bit mem controller, then why stick with only 16 ROPs? With that much bandwidth available, you could surely benefit from more. Then again, it's the INQ, so this info has as much probability of being true as it does of being false.

And for all those prophesying doom for the R600 from the G80 refresh, well, that sure worked out well for the G71 refresh against the R580, didn't it? :roll:

Well, the way I understand it, ROPs aren't fully utilized most of the time.

I remember the 6800s had 16 ROPs to 16 pipes, but for the 6600 Nvidia used 4 ROPs to 8 pipes and it still mopped the floor with ATI's offerings.

If pixels are gonna spend a long time in the pixel pipeline, then your ROPs are just sat there doing nothing.

If you cook up some good scheduling/allocation scheme, then you can get away with fewer.

Something like that. It's not worth having a 1:1 ratio of ROPs to shaders; most of the ROPs just won't get used.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: apoppin
there is already a Part Two:

in part

http://www.theinquirer.net/default.aspx?article=35708

This PCB (Printed Circuit Board) is the most expensive one that DAAMIT has ever ordered. It's a complex 12-layer monster with certain manufacturing novelties used in order to support the requirements of the R600 chip, most notably the 512-bit memory controller and the distribution of power to the components.

The memory chips are arranged in a similar manner as on the G80, but each memory chip has its own 32-bit wide physical connection to the chip's RingBus memory interface. Memory bandwidth will therefore range anywhere between 115GB/s (GDDR3 at 8800GTX-style 900MHz in DDR mode - 1.8GHz) and 140.1GB/s (GDDR4 at 1.1GHz DDR, or 2.2GHz in marketingspeak).

This will pretty much leave the Geforce 8800 series in the dust, at least as far as marketing is concerned. Of course, 86GB/s sounds pretty much like nothing when compared to 140GB/s - at least, expect to see that writ large on the retail boxes.

The R600 board is HUGE...

The PCB will be shorter than the 8800GTX's in every variant, and you can compare it to the X1950XT and 7900GTX. The huge thing is the cooler. It is a monstrous, longer-than-the-PCB, quad-heat-pipe, Arctic-Cooling-style-fan-on-steroids-looking beast, built from a lot of copper. Did we say that it also weighs half a ton?

This is the heaviest board that will hit the market and you will want to install the board while holding it with both hands. The cooler actually enhances the structural integrity of the PCB, so you should be aware that R600 will bring some interesting things to the table.

...

There will be two versions of the board: Pele comes with GDDR4 memory, and UFO has GDDR3 memory, as Charlie already wrote here. DAAMIT is currently contemplating one and two gigabyte variants, offering a major marketing advantage over Graphzilla's "uncomputerly" 640 and 768MB.

Did we mention two gigabytes of video memory? Yup, insane - though perhaps not in the professional world, where this 2GB board will compete against upcoming G80GL and its 0.77/1.5GB of video memory. We do not expect that R600 with 2GB will exist in any other form than in FireGL series, but the final call hasn't been made yet.

The original Rage Theatre chip is gone for good. After relying on that chip for Vivo functions for almost a decade, the company decided to replace the chip with the newer digital Rage Theatre 200. ...

...

For starters, the rumour about this 80nm chip eating around 300W is far from the truth. The thermal budget is around 200-220 Watts and the board should not consume more power than a Geforce 8800GTX. Our own Fudo was right on one detail - the R600 cooler is designed to dissipate 250 Watts. This was necessary to have a cooling headroom of at least 15 per cent. You can expect the R680 to use the same cooler as well and still be able to work at over 1GHz. This PCB is also the base for R700, but from what we are hearing, R700 will be a monster of a different kind.

...

Just like the RV570 (the X1900GT board), the R600 features a new dual-bridge connector for Crossfire capability. This also ends the nightmares of reviewers and partners, because reviewing Crossfire used to be such a pain, caused by the rarity of the Crossfire Edition cards.

Expect this baby to be in stores during Q1'07, or around January 30th. . . .


January 30th!
:Q

edit:

HIGHly doubtful

let me guess .. Feb or March ... DAAMit will lay low till then ;)

I don't believe that lol.

Still, I wanna see the cooler... I love a good meaty air cooler, especially well-engineered ones. Hopefully it's lots of copper and heat pipes; it'll look like a work of art!
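
For what it's worth, the bandwidth figures in the quoted article fall out of the usual back-of-envelope formula (bus width in bytes times effective DDR data rate). Here's a quick sketch using the clocks quoted above; the small difference from the INQ's 140.1GB/s figure is presumably just rounding:

```python
# Back-of-envelope peak memory bandwidth: bus width (bytes) * effective data rate.
# Clock figures are the ones quoted in the article above.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak theoretical bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(512, 1800))  # ~115.2 GB/s: R600, GDDR3 at 900MHz DDR
print(bandwidth_gb_s(512, 2200))  # ~140.8 GB/s: R600, GDDR4 at 1.1GHz DDR
print(bandwidth_gb_s(384, 1800))  # ~86.4 GB/s:  8800GTX, for comparison
```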

 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: Genx87
Originally posted by: apoppin
Originally posted by: Genx87
Isn't the 8800GTX's thermal envelope 189 watts?

If they are budgeting for 250 watts, I highly doubt their claim that it won't eat more power than the 8800GTX.

The thermal budget is around 200-220 Watts and the board should not consume more power than a Geforce 8800GTX. Our own Fudo was right on one detail - the R600 cooler is designed to dissipate 250 Watts. This was necessary to have a cooling headroom of at least 15 per cent.

Like I said, I doubt their claims.

Why are they speccing their cooler for 250 watts if they expect it to remain on par with the 8800GTX? Their claimed thermal budget of 200-220W is still higher than the envelope of the 8800GTX, which AFAIK is 189 watts.

Time will tell, but knowing ATI's history, I am not expecting it to match the Nvidia part on heat and power consumption.

Which is something I just don't get.

ATI's mobile parts often have very good heat/power figures... yet none of this seems to filter up to the desktop cards. Though again, ATI's high-end mobile parts are so scarce that no one's actually tested one (X1800 Mobility Radeon?).

And another thing:

Why design the cooler to be bigger than the PCB?

If the cooler is gonna dwarf the PCB, why limit yourself? Why not just make the PCB bigger (match the HSF size) and give yourself more breathing room?

It's a bit daft to be saying "hey look, we crammed all this ace stuff onto a PCB the same size as our old PCBs"... but oh yeah, it still won't fit in a lot of cases because the HSF is just mahoosive.

Seems like a pointless exercise to me lol
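
As an aside on the cooler spec Genx87 doubts in the quote above: a 250W cooler over the article's claimed 200-220W board budget works out to roughly 14-25 per cent of margin, so the numbers are at least internally consistent with the "at least 15 per cent" headroom line. A quick check (all figures taken from the quoted article, not measurements):

```python
# Sanity check of the quoted cooler headroom (article figures, not measurements).
cooler_watts = 250
for board_watts in (200, 220):
    headroom_pct = (cooler_watts - board_watts) / board_watts * 100
    print(f"{board_watts} W board -> {headroom_pct:.0f}% cooler headroom")  # 25% and ~14%
```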
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Read the bloody article, man.

64 shaders, yes, but they can do 4x the work each! (If the INQ is correct, of course, which remains to be seen.)

You haven't been able to compare pipes to pipes for ages now... it's just not that simple anymore.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Hmm... I want to get a DX10 card for my rig. I have a 7900GTO that is eligible for step-up for 50-some more days. I think I'm gonna get an 8800GTX at the end of my step-up, then sell it to my roommate and get an R600 :) He's an Nvidia fanboy so I shouldn't have that much of a problem :p
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Sure it will. One of the advantages of being fashionably late to a party. You can peek in through the windows and see what everyone is wearing, then run home and outdress everyone. (Ok bad example, but point made).

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Hmm, R600 monster + Vista/DX10 official + Crysis = happy 5150 Joker. If the rumors the INQ is floating around about its PCB being shorter than the 8800 GTX's and it only consuming 200-220W turn out to be true, then this thing will be a winner. No point in owning a DX10 card right now though :\
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: otispunkmeyer
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Read the bloody article, man.

64 shaders, yes, but they can do 4x the work each! (If the INQ is correct, of course, which remains to be seen.)

You haven't been able to compare pipes to pipes for ages now... it's just not that simple anymore.

Well, theoretically they can do 4 times the work. In practice, what does it really work out to? Twice the work at best is my guess. Maybe it'll hit 2.5-3 times the work done each cycle after a year of optimizations.

Edit: it will be very close, as always. Both these companies know exactly what the other is coming out with for each wave of cards. When they can actually get it to market is a little less certain...
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: 5150Joker If the rumors INQ is floating around about its PCB being shorter than the 8800 GTX

The PCB will be shorter than 8800GTX's.......The huge thing is the cooler. It is a monstrous, longer-than-the-PCB quad-heat pipe

The board might be shorter, but the cooler might be bigger?
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: keysplayr2003
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Sure it will. One of the advantages of being fashionably late to a party. You can peek in through the windows and see what everyone is wearing, then run home and outdress everyone. (Ok bad example, but point made).

OMG, people actually do that? LOL... Pathetic. Something only a female would do...

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: the Chase
Originally posted by: otispunkmeyer
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Read the bloody article, man.

64 shaders, yes, but they can do 4x the work each! (If the INQ is correct, of course, which remains to be seen.)

You haven't been able to compare pipes to pipes for ages now... it's just not that simple anymore.

Well, theoretically they can do 4 times the work. In practice, what does it really work out to? Twice the work at best is my guess. Maybe it'll hit 2.5-3 times the work done each cycle after a year of optimizations.

Edit: it will be very close, as always. Both these companies know exactly what the other is coming out with for each wave of cards. When they can actually get it to market is a little less certain...

It all depends on the shader code. If a relatively simple fragment program writes to all 4 components of the vector (RGBA) then I see no reason why the vec4 shader would not approach the theoretical 4x performance. However, if some color components have a dependency on the other color components or if not all components are being used, then you could see a drop in efficiency. A lot of this also depends on the shader compiler and how well it optimizes the code.
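
A rough sketch of that point (the per-op component counts below are invented examples, not measurements of any real shader or of R600 hardware): a 4-wide vector ALU only approaches its theoretical 4x speed-up over a scalar unit when the shader actually has four independent components to issue per operation.

```python
# Toy model of vec4 ALU utilization. Illustrative only -- not a model
# of the real R600 scheduler or shader compiler.

def vec4_speedup(components_per_op):
    """Speed-up of a 4-wide vector ALU over a scalar ALU, where each list
    entry is how many independent components one operation actually fills.
    The vector ALU takes one cycle per op; a scalar ALU needs one cycle
    per component."""
    scalar_cycles = sum(components_per_op)
    vector_cycles = len(components_per_op)
    return scalar_cycles / vector_cycles

# Simple fragment program writing all four RGBA components every op: full 4x.
print(vec4_speedup([4, 4, 4, 4]))        # 4.0
# Program with scalar and dependent ops mixed in: efficiency drops.
print(vec4_speedup([4, 1, 2, 1, 4, 2]))  # ~2.33
```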
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Sure it will. One of the advantages of being fashionably late to a party. You can peek in through the windows and see what everyone is wearing, then run home and outdress everyone. (Ok bad example, but point made).

OMG, people actually do that? LOL... Pathetic. Something only a female would do...
that is pretty sexist

say that to your GF :p
:Q

:D
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: BlizzardOne
Originally posted by: Cookie Monster
The thing is that nVIDIA is preparing a total of nine G80 SKUs. 5 of them are found in the drivers i think.

Sure it was nine G80 SKU's, and not G8x's? Being spoiled for choice is one thing, but nine derivatives of essentially the same thing is just nonsense.

Yeah, G8x derivatives. My bad. :p

Edit - Here's the link. It seems to say G80 derivatives...

Link

And here's the one with the 5 G80 derivatives.

Link 2

G80-400 (This could be the 8800 Ultra)
G80-200 (Possibly the 8800GT)
G80-600 (mid-range, e.g. the 8600?)
G80-875 and G80-850 (low-end, e.g. the 8300?)
8800GTX (G80-300)
8800GTS (G80-100)

Should post this in the G80 thread.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: 5150Joker
Hmm, R600 monster + Vista/DX10 official + Crysis = happy 5150 Joker. If the rumors the INQ is floating around about its PCB being shorter than the 8800 GTX's and it only consuming 200-220W turn out to be true, then this thing will be a winner. No point in owning a DX10 card right now though :\

Lol, the shorter-PCB point is maybe a little moot, considering the INQ also said that the cooler is bigger than the card. Though it may only be like 10mm longer or something; we don't know for sure.

 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Sure it will. One of the advantages of being fashionably late to a party. You can peek in through the windows and see what everyone is wearing, then run home and outdress everyone. (Ok bad example, but point made).

OMG, people actually do that? LOL... Pathetic. Something only a female would do...

Maybe ATI is run by the wiminz!
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
My only question is: how many arms and legs will the R600 cost? I only have two of each.
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
Originally posted by: Zenoth
My only question is: how many arms and legs will the R600 cost? I only have two of each.

Can I buy one off you? :D

Pray that there are two cards for different budgets!
 

sam0t

Member
Mar 20, 2006
26
0
0
Interesting times. My wallet is not the fattest around, so I have plenty of time to ponder which one of these Chernobyls to get :)

My only concern is the drivers. The trend seems to be to release hardware as fast as possible while the drivers lag behind badly. I've seen plenty of people complain about G80 drivers on other forums, so it isn't an ATI-only thing anymore.
 

sbuckler

Senior member
Aug 11, 2004
224
0
0
I expect the R600 to have the performance edge over Nvidia, even if they have to run a monster cooler and have it suck some silly amount of watts (/me still remembers the massive uproar over the 6800's power requirements, and now look at us). That said, ATI releasing a card and you actually being able to buy it are two different things.

I expect Nvidia will be more interested in selling lower-end 8-series cards, as that is where a lot more of the money is, and where ATI won't be able to compete yet.

Personally I'll wait for the smaller, cooler second-gen cards to come out. By then we'll have some apps that really need the performance on offer (for something more than just ever-higher levels of AA and AF), and hopefully the prices will have settled down to more reasonable levels.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Originally posted by: Wreckage
Less shaders and less ROPs than the G80? They are relying entirely on clock rate, which is a fools bet as Intel found out.

Hopefully Inq is wrong (as usual), as this will not compete well with the current G80 let alone a refresh. Not to mention NVIDIA will have several months to tweak the drivers by then.

Sure it will. One of the advantages of being fashionably late to a party. You can peek in through the windows and see what everyone is wearing, then run home and outdress everyone. (Ok bad example, but point made).

OMG, people actually do that? LOL... Pathetic. Something only a female would do...
that is pretty sexist

say that to your GF :p
:Q

:D


Why, yes... You are right :D

I decided to tell my wife that... She agrees!
 

Makaveli

Diamond Member
Feb 8, 2002
4,966
1,561
136
She agreed that it's sexist, or that you are correct?

If the latter, apoppin, u just got pwned by that guy's wife, and she wasn't even on the computer LOL.