New Details on R600 Emerging - Rescheduled to May Launch

Originally posted by: Cookie Monster
From CJ on Beyond3D:

It seems that today some more info has surfaced.

R600 will be launched at CeBIT. The press will get to see the R600 at the end of February, and one week after CeBIT the R600 will be available for everyone to buy. Claims are that the R600 is 5 to 10% faster than the G80 in DX10 applications. Note the DX10 claim, and no mention of DX9 performance compared to G80.

Price will be $599 and only the 9" PCB version will be out in retail channels. The 12" PCB version will be OEM only.

Scheduled for launch at the end of April is RV630, as a replacement for the X1950 Pro. nV will launch the 8600GTS (G84) one month earlier, in March, as a replacement for the 7900GS and 7950GT. A few weeks later, in mid-April, nV will release the 8600GT (with or without HDMI, with HDCP) as a replacement for the 7600GT.

At the lower end, AMD will introduce the RV610. First samples have arrived from the factory and they are looking good. RV610 will be the replacement for the X1650 series. Around that time nV will also release the 8500GT HDMI (G86 GDDR3) in versions with and without HDCP (G86 DDR2). At the end of April they will also release the 8300GS (G86 DDR2) as a replacement for the 7600GS/7300GT.

There you have it.

OK, I should've bolded it all.



12-inch cards for OEM only? I don't get it.

Why not just make them all 9-inchers, if that's what they can make? (Must have been a good design... a 512-bit bus and some serious power circuitry on a card no bigger than the 8800s.)

Is it cheaper for them to make the 12-inch cards? So they just fob those off on the OEM and system-builder crowd, leaving the more expensive smaller ones for us?
 
Originally posted by: sodcha0s
What I am waiting for is to see how they squeezed such *incredible bandwidth* out of a single "512MB ring bus" -- compared with the 1900 series ...

Well, *how* they "did" it was to actually *make* the ring bus *512-bit*, with 1 gig of *memory*.

:p

?

I thought it was supposed to be a 512-bit physical bus to the memory; then internally (inside the chip) the data can be shunted around on a 1024-bit ring bus.

The X1900 had a 256-bit memory interface feeding a 512-bit internal ring bus. That didn't give it any more memory bandwidth, but it helped shunt data around the chip nicely.
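
For anyone who wants rough numbers on the difference, peak memory bandwidth is just bus width x effective memory clock / 8. A quick sketch in Python (the clock figures are illustrative guesses, not confirmed R600 specs):

# Peak memory bandwidth = bus width (bits) x effective clock (MHz) / 8.
# The clock figures below are illustrative guesses, not confirmed specs.

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * effective_clock_mhz * 1e6 / 8 / 1e9

# X1900-style: 256-bit external bus, ~1550 MHz effective GDDR3
print(bandwidth_gb_s(256, 1550))   # ~49.6 GB/s

# Rumoured R600: 512-bit external bus, ~2000 MHz effective GDDR4
print(bandwidth_gb_s(512, 2000))   # ~128 GB/s

Note that the internal ring bus width never enters this calculation; like the X1900's 512-bit ring, it only affects how data moves around inside the chip.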
 

apoppin

Lifer
http://www.theinquirer.net/default.aspx?article=37484
. . . R600 . . . can and will support Quad setups. It will work with four cards. ATI has been working to solve the four-card performance problem for a while. Nvidia tried it, and when we tested Quad SLI with two 7950 GX2 cards we got super lame performance out of it.

The power supply manufacturers are thrilled about four cards in a high-end machine as such a system will demand a lot of power, probably even more than 1000W.

ATI can possibly push a frame or two with its Alternate Frame Rendering marchitecture and make it more efficient than split-screen rendering. We will have to wait and see, but it sounds like in March some records are going to get broken.
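
Four cards at the rumoured 270W apiece is already 1080W before you feed the CPU, so the 1000W+ figure checks out. And AFR itself is conceptually just round-robin frame assignment; a toy sketch (pure illustration, nothing to do with ATI's actual driver):

# Toy illustration of Alternate Frame Rendering (AFR): whole frames are
# dealt out round-robin, one per GPU. Purely conceptual -- this has
# nothing to do with ATI's actual driver internals.

NUM_GPUS = 4

def afr_gpu_for_frame(frame_index):
    """GPU that renders a given frame under simple round-robin AFR."""
    return frame_index % NUM_GPUS

for frame in range(8):
    print(f"frame {frame} -> GPU {afr_gpu_for_frame(frame)}")

In practice, scaling depends on the driver keeping all four GPUs fed and on inter-frame dependencies, which is presumably where Quad SLI fell over.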
 

ronnn

Diamond Member
Looks like ATI is opening a can of whoop-ass. Just wish it was a little greener...
 

Creig

Diamond Member
Quad Crossfire? Honestly, how many people (or companies) will have a use for this? While impressive from a technological viewpoint, I just don't see too many practical uses.
 

Nightmare225

Golden Member
Originally posted by: Creig
Quad Crossfire? Honestly, how many people (or companies) will have a use for this? While impressive from a technological viewpoint, I just don't see too many practical uses.

Not to mention, a motherboard holding four dual-slot R600s is not going to fit in any normal computer case. :Q
 

apoppin

Lifer
more tidbits ... this time RV610 and RV630

http://www.theinquirer.net/default.aspx?article=37508
The chips are in the test phase and this bit of driver text proves it. The driver lists three different RV610 cards: the straight vanilla, the RV610-DT LE and the RV610-DT Pro. We guess there will be at least three different flavours of this card.

RV630 also has three variations, but we don't have many details about it. We can assume XT, PRO and LE versions, as the driver recognises three different versions. The good thing about these cards is that the add-in board partners will be able to make their own designs.

"RV610" = ati2mtag_RV610, PCI\VEN_1002&DEV_94C0
"RV610-DT (LE)" = ati2mtag_RV610, PCI\VEN_1002&DEV_94C3
"RV610-DT (Pro)" = ati2mtag_RV610, PCI\VEN_1002&DEV_94C1
"RV630" = ati2mtag_RV630, PCI\VEN_1002&DEV_9589
"RV630 " = ati2mtag_RV630, PCI\VEN_1002&DEV_9588
"RV630 " = ati2mtag_RV630, PCI\VEN_1002&DEV_9580


The launch should be in April, with mass availability around May.
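
If you want to pull the device IDs out of INF lines like those yourself, a quick throwaway parser does it (the regex here is mine and only handles this exact shape):

import re

# Throwaway parser for the INF-style lines quoted above. The regex is
# mine and only matches this exact '"name" = tag, PCI\VEN_xxxx&DEV_xxxx' shape.
INF_LINE = re.compile(
    r'"(?P<name>[^"]+)"\s*=\s*(?P<tag>\w+),\s*'
    r'PCI\\VEN_(?P<vendor>[0-9A-Fa-f]{4})&DEV_(?P<device>[0-9A-Fa-f]{4})'
)

lines = [
    r'"RV610" = ati2mtag_RV610, PCI\VEN_1002&DEV_94C0',
    r'"RV630" = ati2mtag_RV630, PCI\VEN_1002&DEV_9589',
]

for line in lines:
    match = INF_LINE.match(line)
    if match:
        # Vendor 1002 is ATI; the device ID distinguishes the SKUs.
        print(match.group("name"), match.group("vendor"), match.group("device"))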
 

SickBeast

Lifer
The new info sucks.

In fact, the general lack of info on these cards sucks (no offence, OP).

To top it off, I don't think R600 will beat G80 by much, and even if it does, what games need this type of power? Only Crysis will need it, and it's not out until "sometime in 2007". :thumbsdown:
 

teflonbilly

Junior Member
You get a DX10 video card to play Crysis in DX10 glory. The Xbox 360 and PS3 will never match DX10 for graphical sweetness.
 

Kromis

Diamond Member
Originally posted by: teflonbilly
You get a DX10 video card to play Crysis in DX10 glory. The Xbox 360 and PS3 will never match DX10 for graphical sweetness.

And that's why Cevat Yerli said that consoles can't handle it. :D
 

Cookie Monster

Diamond Member
Pic of the R600

The 12-inch OEM version in all its glory. The R600 GPU must run pretty hot; look at the cooler used: two copper heat pipes plus a copper heatsink.

VR-Zone has learned some new details on the 80nm R600 today: there will be 2 SKUs at launch, XTX and XT. There will be 2 versions of the R600XTX; one is for OEM/SI and the other for retail. Both feature 1GB of GDDR4 memory on board, but the OEM version is 12.4" long to be exact and the retail version is 9.5" long. The above picture shows the 12.4" OEM version. The power consumption of the card is huge: 270W for the 12" version and 240W for the 9.5" version. As for the R600XT, it will have 512MB of GDDR3 memory onboard, is 9.5" long and consumes 240W of power. Lastly, there is a cheaper R600XL SKU to be launched at a later date.

The XTX comes with 1GB GDDR4 and the PCB is only 9.5 inches? Sounds like ATI did a great job fitting 16 memory chips and a 512-bit interface, among everything else, on such a short PCB. However, the XT comes with 512MB GDDR3? That sure sounds like a huge gap.

I'm getting excited for once.
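
For what it's worth, the XTX board math does check out: GDDR4 chips of that era use 32-bit interfaces, so a 512-bit bus takes 16 of them, and 16 x 512Mbit parts is exactly 1GB. A quick sanity check (the chip figures are standard GDDR parts, not anything confirmed for R600):

# Sanity check on the rumoured XTX memory config. GDDR3/GDDR4 chips of
# the era use 32-bit interfaces; the density is a standard part figure,
# not anything confirmed for R600.

BUS_WIDTH_BITS = 512
CHIP_INTERFACE_BITS = 32
CHIP_DENSITY_MBIT = 512

chips_needed = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS   # 16 chips
total_mb = chips_needed * CHIP_DENSITY_MBIT // 8       # 1024 MB = 1GB

print(f"{chips_needed} chips -> {total_mb} MB total")

By the same math, a 512MB XT on the same 512-bit bus would just use 16 x 256Mbit chips.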
 

apoppin

Lifer
Originally posted by: Kromis
Originally posted by: teflonbilly
You get a DX10 video card to play Crysis in DX10 glory. The Xbox 360 and PS3 will never match DX10 for graphical sweetness.

And that's why Cevat Yerli said that consoles can't handle it. :D

Total nonsense.

The Crysis devs *clarified* that Crysis CAN be ported to the next-gen consoles, no problem.

almost forgot:

http://www.theinquirer.net/default.aspx?article=37543
R600 Uber edition confirmed

R600 Uber edition will end up as the fastest R600-based card of the first batch on the shelves. The key secret is a water cooler. We heard that two companies are competing to make the cooler. One is Aavid, which we have already seen on many graphics cards, and the second is Denmark-based Asetek.

We hear that the water cooler has both the water pump and the reservoir on the radiator. So the card will look similar to other cards, just with a bigger fan-cooled radiator and a reservoir.

You should not end up with a lot of wires and tubes running to and from your card. It should be the fastest card around, but we will have to hold our breath for a while yet.

240W?
:Q

1000W seems a little *underpowered* for Quad CrossFire :p
 

Cookie Monster

Diamond Member
Not sure how the XTX version can suck in 270W when you can clearly see from the pic that 225W is the maximum (75W from the PCI Express slot plus 75W from each of the dual 6-pin connectors).

The power consumption figures sound bloated. We will have to wait for real benchmarks to see whether these rumours are actually true.
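
The arithmetic behind that 225W ceiling, using the standard PCIe power delivery limits (75W from the slot, 75W per 6-pin connector):

# Standard PCI Express power delivery limits of the era.
SLOT_W = 75      # PCIe x16 slot
SIX_PIN_W = 75   # per 6-pin auxiliary connector

max_board_power = SLOT_W + 2 * SIX_PIN_W
print(max_board_power)  # 225 W -- short of the rumoured 270W XTX figure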
 

apoppin

Lifer
agreed ... the power consumption seems suspiciously high ...

still looking for a reasonably priced CrossFire PSU ...

[oxymoron, i know ... maybe "best bang-for-buck"]
 

Creig

Diamond Member
Look at the pictures of the R600 on VR-Zone and Overclockers. Notice any difference? The VR-Zone picture shows two six-pin PCIe power connectors, while the Overclockers picture shows a single six-pin PCIe and an eight-pin PCIe2 power connector. I'd bet the VR-Zone picture is of an older revision card.

Interestingly, the eight-pin PCIe2 connector's rightmost six pins are identical to those on the PCIe connector. I wonder if it might not also be possible to power this card from two six-pin PCIe connectors.

Something else of note is that there are TWO fan headers on the card, but only the top one is being utilized on the OEM card. Possibly the retail card uses two separate fans for cooling?
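
If that eight-pin connector in the Overclockers shot is the 150W PCIe 2.0 type, the revised board's power ceiling would actually cover the rumoured 270W; a quick tally under that assumption:

# Same budget math with the newer connector mix Creig describes,
# assuming the 8-pin is the 150W PCIe 2.0 auxiliary connector.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin PCIe 2.0 auxiliary connector

print(SLOT_W + SIX_PIN_W + EIGHT_PIN_W)  # 300 W -- headroom for a 270W card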
 

Cookie Monster

Diamond Member
The fan on that card is rated at a whopping 24W, so noise could be a problem. The R600 could potentially be as big as G80 in terms of GPU size. Look at the back!