R600 may have external 512-bit memory controller.

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
R600 512-bit from the INQ.

"If the 512 memory ring turns to be the real thing, we are talking about 128 GB/s of memory bandwidth with GDDR4 clocked at 2000MHz. We also learned that the R600 may use memory faster than 2000MHz as it will be available by Q1. If ATI keeps pushing the chip we might get even faster GDDR4 chips at production time.

Even the PCB of the R600 will be super complicated, as you need a lot of wires to make 512-bit memory work. Overall it has the potential to beat Nvidia's G80, but yet again it will come at least three months after Nvidia. The G80's memory works at 384 bits, as Nvidia pretty much dis-unified everything in G80, from shaders to memory controllers. Nvidia likes to make rules and probably could not fit more than a 384-bit-wide controller in the chip, as the G80 is still a 90 nanometre chip."

If true, all that would remain to be seen is whether the 64-unified-shader GPU can dish out the pain.
It will be an extremely complicated and expensive PCB, which will probably be reflected in the price tag. I don't know who came up with the wider-bus idea first (Nvidia or ATI), but it could go either way.
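
For what it's worth, the 128 GB/s figure is just bus width times data rate. A minimal sketch of the arithmetic in Python (the 2000MHz is taken as the effective GDDR4 data rate the article quotes):

# Peak memory bandwidth = bus width (in bytes) * effective data rate.
# The 512-bit bus and 2000 MHz effective GDDR4 rate are the INQ's figures.
bus_bits = 512
effective_mhz = 2000  # effective transfers per second, in MHz

bandwidth_gb_s = (bus_bits / 8) * effective_mhz / 1000
print(bandwidth_gb_s)  # 128.0 GB/s, matching the article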
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Schadenfroh
The Inquirer: By Fuad Abazovic

It is total bullsh!t.


I would tend to agree, but even though Fuad wrote the article, it makes for juicy conversation. Keep your eyes open for any other sources on this.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Might be true. Might not.

/thread :laugh: ;)

I did notice he used "will have a real 512 bit memory controller" rather than "could well" or "might" or "we hear".


Not like Fraud to commit himself so openly.:D
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Well, it is possible that when ATI got the first whiff of Nvidia's wider bus, they scrambled their team to try to get to 512-bit to outdo them. Could be why we won't see any R600s until around February.

OR

R600 was meant to be 512-bit all along, hence Nvidia's bolt-on extra bus? Could go either way.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
All I know and care about is that both of these chips are sounding like they're going to be monsters! w00t, just hope the price tag drops quickly!
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Assuming the naming stays the same, the XTX will have the 512-bit bus, right?

So what is the XT going to have? Because if it's 256-bit, then the 320-bit G80 GTS is gonna spank it. Or the memory configurations are going to be pretty different.

Someone whose brain works, try and figure this out; it's Friday afternoon and mine has shut down.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
The architectures are different, so you can't really make any accurate comparisons based on numbers alone. Just look at how an x1800xl spanked the x850xt across the board, even though both had 16 pipes (or "pipes"), similar clock speeds and a 256-bit mem bus. Or how the x1k cards with 16 TUs compete with and beat the 7800/7900 cards with 24 TUs. Or even how a 12-pipe, 128-bit 7600gt runs even with a 16-pipe, 256-bit x850xt. Even if the r600 stays on a 256-bit bus, with the 512-bit mem controller and GDDR4 it won't necessarily be held back by bandwidth compared to a g80.
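
To put rough numbers on that last point, a quick sketch comparing peak bandwidth (the 8800 GTX/GTS figures are the shipping specs; the 256-bit r600 with 2GHz effective GDDR4 is just the hypothetical floated in this thread, not a leaked spec):

# Peak bandwidth (GB/s) = bus width in bytes * effective data rate in GHz.
def gb_per_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

print(gb_per_s(384, 1800))  # 86.4 -- 8800 GTX: 384-bit, 900MHz GDDR3 (1800 effective)
print(gb_per_s(320, 1600))  # 64.0 -- 8800 GTS: 320-bit, 800MHz GDDR3 (1600 effective)
print(gb_per_s(256, 2000))  # 64.0 -- hypothetical 256-bit r600 with 2GHz effective GDDR4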
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: munky
The architectures are different, so you can't really make any accurate comparisons based on numbers alone. Just look at how an x1800xl spanked the x850xt across the board, even though both had 16 pipes (or "pipes"), similar clock speeds and a 256-bit mem bus. Or how the x1k cards with 16 TUs compete with and beat the 7800/7900 cards with 24 TUs. Or even how a 12-pipe, 128-bit 7600gt runs even with a 16-pipe, 256-bit x850xt. Even if the r600 stays on a 256-bit bus, with the 512-bit mem controller and GDDR4 it won't necessarily be held back by bandwidth compared to a g80.


You might be right.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: keysplayr2003
Originally posted by: munky
The architectures are different, so you can't really make any accurate comparisons based on numbers alone. Just look at how an x1800xl spanked the x850xt across the board, even though both had 16 pipes (or "pipes"), similar clock speeds and a 256-bit mem bus. Or how the x1k cards with 16 TUs compete with and beat the 7800/7900 cards with 24 TUs. Or even how a 12-pipe, 128-bit 7600gt runs even with a 16-pipe, 256-bit x850xt. Even if the r600 stays on a 256-bit bus, with the 512-bit mem controller and GDDR4 it won't necessarily be held back by bandwidth compared to a g80.


You might be right.
He might be wrong.

I love speculation threads. :laugh:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sable
Originally posted by: keysplayr2003
Originally posted by: munky
The architectures are different, so you can't really make any accurate comparisons based on numbers alone. Just look at how an x1800xl spanked the x850xt across the board, even though both had 16 pipes (or "pipes"), similar clock speeds and a 256-bit mem bus. Or how the x1k cards with 16 TUs compete with and beat the 7800/7900 cards with 24 TUs. Or even how a 12-pipe, 128-bit 7600gt runs even with a 16-pipe, 256-bit x850xt. Even if the r600 stays on a 256-bit bus, with the 512-bit mem controller and GDDR4 it won't necessarily be held back by bandwidth compared to a g80.


You might be right.
He might be wrong.

I love speculation threads. :laugh:

but he could be right. :p
:Q

theInq gets ATi's rumours much better than it gets nvidia's . . . ;)

i'd say - since ATi knows it is gonna be months 'late' - they are "leaking" details of their product - 'damage control' to counter g80.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: apoppin
He might be wrong.



but he could be right. :p
:Q

theInq gets ATi's rumours much better than it gets nvidia's . . . ;)

i'd say - since ATi knows it is gonna be months 'late' - they are "leaking" details of their product - 'damage control' to counter g80.
I wish they'd leak em to a source I actually trusted.;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sable
Originally posted by: apoppin
He might be wrong.



but he could be right. :p
:Q

theInq gets ATi's rumours much better than it gets nvidia's . . . ;)

i'd say - since ATi knows it is gonna be months 'late' - they are "leaking" details of their product - 'damage control' to counter g80.
I wish they'd leak em to a source I actually trusted.;)

they can't . . . it makes sense to leak far-off production plans - 3 or 4 months away - to an UNreliable source

if it turns out to be way OVERestimated . . . well, it was "just the inq" :p

expect reliable leaks in the coming months ;)
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: lopri
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.

16 chips is no big deal besides costing more; 8 chips per side has been done many times already.

 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Originally posted by: keysplayr2003
Originally posted by: lopri
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.

16 chips is no big deal besides costing more; 8 chips per side has been done many times already.

Where?

I thought 512-bit would mean fewer chips

 

Rock Hydra

Diamond Member
Dec 13, 2004
6,466
1
0
Originally posted by: BassBomb
Originally posted by: keysplayr2003
Originally posted by: lopri
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.

16 chips is no big deal besides costing more; 8 chips per side has been done many times already.

Where?

I thought 512-bit would mean fewer chips

My GeForce 5900 Ultra had 8 chips on each side.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Rock Hydra
Originally posted by: BassBomb
Originally posted by: keysplayr2003
Originally posted by: lopri
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.

16 chips is no big deal besides costing more; 8 chips per side has been done many times already.

Where?

I thought 512-bit would mean fewer chips

My GeForce 5900 Ultra had 8 chips on each side.

Exactly, my 5900U also had 8 chips per side.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: BassBomb
Originally posted by: keysplayr2003
Originally posted by: lopri
A 512-bit memory interface requires 16 memory chips on top of a much, much more costly PCB. Anything is possible these days, so I wouldn't say it won't happen, but unless ATI thinks they've been making too much money off R520/R580 and need to somehow reduce their margin, :laugh: it is pretty much a pipe dream. AIB partners would not be too thrilled, either. As much as I'd want to see it happen, the reality says otherwise.

16 chips is no big deal besides costing more; 8 chips per side has been done many times already.

Where?

I thought 512-bit would mean fewer chips

GDDR3/4 uses 32-bit-wide chips, so divide the bus width (256/320/384/512/whatever) by 32 to get the number of required chips: 512/32 = 16.

The DDR memory on the 5900s must have been only 16-bit-wide chips: 256/16 = 16.

I'm sure there are variances from vendor to vendor in what memory is used on any given card. Memory density plays a part as well, I would guess. Dunno.
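
A minimal sketch of that chip-count arithmetic (32-bit GDDR3/4 chips are standard; the 16-bit width for the 5900-era DDR is the guess above):

# Number of memory chips = bus width / per-chip interface width.
def chips_needed(bus_bits, chip_bits):
    return bus_bits // chip_bits

print(chips_needed(512, 32))  # 16 GDDR4 chips for a 512-bit R600
print(chips_needed(384, 32))  # 12 chips on G80's 384-bit bus
print(chips_needed(256, 16))  # 16 DDR chips on a GeForce 5900 Ultra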

 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
Originally posted by: SpeedZealot369
Is it true that it's coming out 3 months later? That's a really stupid move by ATi :(

If their card is not as good as the 8800GTX, then it's not a smart move. If it's the same or better, it's OK.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Cooler
Originally posted by: SpeedZealot369
Is it true that it's coming out 3 months later? That's a really stupid move by ATi :(

If their card is not as good as the 8800GTX, then it's not a smart move. If it's the same or better, it's OK.

No, it had better be better... IMO anyway...

Assuming Nvidia fixes their IQ with G80 like everyone is hoping, and both Nvidia and ATI are truly neck and neck, releasing a product that is merely on par three months later would be disastrous as far as profits are concerned.