R600 may have external 512-bit memory controller.


jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Cookie Monster
Originally posted by: Zenoth
Are we even sure that R600 will have 64 Unified Shaders ? Or is it just yet another rumor ?

That's another rumour. Now the latest rumour is that the R600 has 96 shaders.

That sounds a bit more sensible. If G80 consumes 200W and R600 consumes 250-300W, something has to be eating all that extra power!
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: jiffylube1024
Originally posted by: Cookie Monster
Originally posted by: Zenoth
Are we even sure that R600 will have 64 Unified Shaders ? Or is it just yet another rumor ?

That's another rumour. Now the latest rumour is that the R600 has 96 shaders.

That sounds a bit more sensible. If G80 consumes 200W and R600 consumes 250-300W, something has to be eating all that extra power!

Not to mention that R600 uses a smaller die process and lower-power memory.

If those power figures are accurate, R600 is a monster.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: jiffylube1024
Originally posted by: Cookie Monster
Originally posted by: Zenoth
Are we even sure that R600 will have 64 Unified Shaders ? Or is it just yet another rumor ?

That's another rumour. Now the latest rumour is that the R600 has 96 shaders.

That sounds a bit more sensible. If G80 consumes 200W and R600 consumes 250-300W, something has to be eating all that extra power!

The 512-bit, 1024MB GDDR4 setup could be eating half the power. The more VRAM, the higher the power consumption. A bigger die could also result in higher power consumption.
 

Rock Hydra

Diamond Member
Dec 13, 2004
6,466
1
0
Originally posted by: Cookie Monster
512-bit means a larger die. The PCB will require more layers, and it will also be much more complex due to having 16 memory chips (chip-count arithmetic sketched after this post). This, plus the cost of 16 GDDR4 memory chips, will give the R600 a higher price tag. Not to mention that if the R600 has 1024MB it will be very expensive.

From a cost perspective it's very unlikely (but ATi isn't as conservative as nVIDIA, which has both its good and bad sides). I'm not sure if GDDR4 is available in quantity, but with nVIDIA taking the tried-and-true route with GDDR3, it suggests that GDDR4 is expensive and not available in the same quantity as GDDR3.

It will be very expensive, and the power draw of 1024MB of VRAM will be pretty high even with GDDR4.


edit - not to mention the loads of power-related components.

I thought one of the features of GDDR4 was reduced power usage.
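For context on the 16-chip figure quoted above: it falls directly out of the bus-width arithmetic. A minimal sketch, assuming the 32-bit per-chip interface that was typical of GDDR3/GDDR4 parts of this era:

```python
# Memory chips needed to populate a given bus width, assuming each
# chip exposes a 32-bit interface (typical for GDDR3/GDDR4 parts).
def chips_for_bus(bus_width_bits: int, chip_width_bits: int = 32) -> int:
    return bus_width_bits // chip_width_bits

print(chips_for_bus(512))  # rumoured R600: 512-bit bus -> 16 chips
print(chips_for_bus(384))  # G80 GTX: 384-bit bus -> 12 chips
```

At 1024MB spread over 16 chips, that also implies 64MB (512Mbit) per chip.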
 

hardwareking

Senior member
May 19, 2006
618
0
0
GDDR4 consumes less power than GDDR3. But if you compare nVIDIA's G80 with 768MB of GDDR3 against 1GB of GDDR4 on the ATI card, the ATI card might/will consume more.
Still, seeing the G80's specs, I think ATI will need to come up with something special to beat it. And hopefully it'll be priced lower.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: hardwareking
GDDR4 consumes less power than GDDR3. But if you compare nVIDIA's G80 with 768MB of GDDR3 against 1GB of GDDR4 on the ATI card, the ATI card might/will consume more.
Still, seeing the G80's specs, I think ATI will need to come up with something special to beat it. And hopefully it'll be priced lower.

OK, but let's say GDDR4 only consumes a modestly lower amount of power, ~10%, and ATI has 33% more memory than Nvidia (1GB vs. 768MB) - that still doesn't account for the rumoured power consumption of ATI's beast being 50-100W higher than Nvidia's (rough arithmetic sketched after this post).

If that latest Inquirer tidbit is true (and honestly, it's a crapshoot with the INQ), and Nvidia's G80 die is larger than ATI's, then ATI might not be able to fit 128 unified shaders in their core, because we know that the internal 512-bit ring bus is a transistor hog (from R520/R580). So ATI having 96 shaders would make a lot more sense.

But remember, ATI is on a smaller manufacturing process - 80nm, I believe, vs. 90nm for Nvidia. So 128 shaders is an outside possibility, though 96 seems to make more sense to me.


What it looks like it's going to come down to is this: Nvidia releases G80 with 128 shaders and 768MB of RAM on a 384-bit bus; ATI counters with R600 with 96 shaders running at a 10-25% higher clock speed and 512MB of GDDR4; and then Nvidia promptly refreshes with G81.
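Putting rough numbers on the paragraph above: a minimal sketch of the memory-power arithmetic, where the 30W baseline for G80's 768MB of GDDR3 is a purely illustrative assumption (only the ~10% GDDR4 saving and the 1GB-vs-768MB capacity ratio come from the post itself):

```python
# Back-of-envelope check: can VRAM alone explain a 50-100W gap?
G80_MEM_WATTS = 30.0              # ASSUMED baseline, illustrative only
capacity_ratio = 1024 / 768       # 1GB on R600 vs 768MB on G80 (~1.33x)
gddr4_scaling = 0.90              # GDDR4 assumed ~10% lower power per MB

r600_mem_watts = G80_MEM_WATTS * capacity_ratio * gddr4_scaling
print(f"R600 memory: ~{r600_mem_watts:.0f}W "
      f"(+{r600_mem_watts - G80_MEM_WATTS:.0f}W over G80)")
# ~36W, i.e. only ~6W extra -- nowhere near 50-100W, so any real gap
# would have to come from the core, not the memory.
```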
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
Originally posted by: apoppin
http://theinquirer.net/default.aspx?article=35096
The funny part is, while [nvidia] may win the size race, they are not going to win the speed crown. µ

a prediction from theinq
:Q

I think you may be one of the only contributors to the Inquirer's lingering existence. Stop giving them hits and let that BSing site die!! :p

and yet you comment on it :p
:Q

:roll:

i believe theinq got the ATi 'takeover' right - first. :p
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
On a side note about the Inquirer's trustworthiness, check this thread out, posts #88 and #90. Kinda funny, actually.

Back on topic- I don't want to get my hopes up for either G80 or R600 but dang, the rumored specs look sweet.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: apoppin
and yet you comment on it :p
:Q

:roll:
I didn't give their site another hit or link their URL so that others could make the same mistake.

i believe theinq got the ATi 'takeover' right - first. :p
True, after 100+ lies they do sometimes get something right. After their supposed 32-pipe G71 monster and fake Nvidia 16-bit pictures...they are just trash IMO. It's obvious that you don't mind reading (or linking) their "news" and that's fine. I was just saying that because of people like you, they're never going to become more informative as long as their BS keeps getting their site hits, that's all. If you really wanted them to get better at providing accurate information, you would just stop giving them hits until they figure out that people don't like reading trash from Fraud's orifice.

====================

Originally posted by: Elfear
On a side note about the Inquirer's trustworthiness, check this thread out, post #88 and #90. Kinda funny actually.
That link goes to a thread with only 6 posts.
Back on topic- I don't want to get my hopes up for either G80 or R600 but dang, the rumored specs look sweet.
Which means the refreshes will be even sweeter!
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
That thread doesn't let me get to post 88 or 90. Maybe I need to register.

Holy poodle! wins rep points from me.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
Originally posted by: apoppin
and yet you comment on it :p
:Q

:roll:
I didn't give their site another hit or link their URL so that others could make the same mistake.

i believe theinq got the ATi 'takeover' right - first. :p
True, after 100+ lies they do sometimes get something right. After their supposed 32-pipe G71 monster and fake Nvidia 16-bit pictures...they are just trash IMO. It's obvious that you don't mind reading (or linking) their "news" and that's fine. I was just saying that because of people like you, they're never going to become more informative as long as their BS keeps getting their site hits, that's all. If you really wanted them to get better at providing accurate information, you would just stop giving them hits until they figure out that people don't like reading trash from Fraud's orifice.

i didn't give them any hits unless someone wanted to. :p

i simply quoted theInq - their FIRST prediction about r600 vs g80 - r600 being FASTER
:Q

and your opinion of theinq will not affect my reading nor my posting style . . . :p

but thanks for mentioning it again

:roll:

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Again, all that bandwidth is useless unless ATi does something useful with it, like finally adding full-screen SSAA on single cards. If they don't, I won't even consider an R600.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Hmm.

So we have these tidbits to go on, but not a clue how each will perform. If the tidbits are close to factual, it will be a tough call.

R600: 96 unified shaders, 512-bit (not just ring bus but external to memory), 1GB GDDR4.

G80: (GTX) 128 unified shaders, 384-bit, 768MB GDDR3.

Well, bandwidth would for sure go to the R600. Pixel-pushing power looks like it will go to G80. (This is based on the number of shaders alone; it's unknown how powerful each shader is for either R600 or G80.)

And that's about it for now.
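The bandwidth edge is the one line item above that can actually be quantified: peak theoretical bandwidth is just bus width in bytes times the effective data rate. A minimal sketch, where G80 GTX's 1800MHz-effective GDDR3 matches the specs being reported at the time and the R600 GDDR4 clock is pure guesswork for illustration:

```python
# Peak theoretical memory bandwidth in GB/s:
# bus width (bytes) x effective data rate (transfers/s).
def bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

print(f"G80 GTX: {bandwidth_gb_s(384, 1800):.1f} GB/s")  # ~86.4 GB/s
print(f"R600:    {bandwidth_gb_s(512, 2000):.1f} GB/s")  # ASSUMED 2GHz GDDR4: ~128 GB/s
```

Even at the same memory clock as G80, the 512-bit bus alone would give R600 a 33% bandwidth advantage.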
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: apoppin
i didn't give them any hits unless someone wanted to. :p
So you never visited the page you linked?
i simply quoted theInq - their FIRST prediction about r600 vs g80 - r600 being FASTER
:Q
Exactly. That remains to be seen, and as of now it is just FUD.
and your opinion of theinq will not affect my reading nor my posting style . . . :p
Where did I say your posting style should change? I just made the comment that the Inquirer isn't going to stop posting its usual uneducated guesses until they see that no one reads their ramblings. If you think I'm attacking your posting style then I'm sorry, you're wrong. Your posts are pretty enjoyable most of the time, as well as different. I just think that people using Inquirer quotes as thread starters gets really old, since all it does is provoke more trash to be spewed from their mouths.

Think about it: if you got paid to make up something that was just barely believable, and got paid even more when that barely believable claim sold more magazines, got more site hits, sold more products, etc., would you stop making such claims?
but thanks for mentioning it again

:roll:
Again, I simply mentioned that it seems you're trying to defend the Inquirer or, if not them, then your thread's existence.

Why did you make this thread?
Because the Inquirer said something about it.
Why do you give the Inquirer hits?
I don't, I "quote" them.
And how do you get those quotes?
...uh....

Just chill. I just gave my $.02 and if you think that it's not even worth that, then that's fine.
Again, all that bandwidth is useless unless ATi does something useful with it, like finally adding full-screen SSAA on single cards. If they don't, I won't even consider an R600.
QFT. If G80 has angle-independent AF and HDR+AA support, ATI would have to answer with some sort of AA improvement. I just hope that they don't sacrifice AF quality to do it.
R600: 96 unified shaders, 512-bit (not just ring bus but external to memory), 1GB GDDR4.

G80: (GTX) 128 unified shaders, 384-bit, 768MB GDDR3.

Well, bandwidth would for sure go to the R600. Pixel-pushing power looks like it will go to G80. (This is based on the number of shaders alone; it's unknown how powerful each shader is for either R600 or G80.)
Eh, whichever one does better, I couldn't care less. As long as the prices fall due to the competition, I'll be happy. I don't expect the G80 to really hit its "true" price tag until R600.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
Originally posted by: apoppin
i didn't give them any hits unless someone wanted to. :p
So you never visited the page you linked?
i simply quoted theInq - their FIRST prediction about r600 vs g80 - r600 being FASTER
:Q
Exactly. That remains to be seen, and as of now it is just FUD.
and your opinion of theinq will not affect my reading nor my posting style . . . :p
Where did I say your posting style should change? I just made the comment that the Inquirer isn't going to stop posting its usual uneducated guesses until they see that no one reads their ramblings. If you think I'm attacking your posting style then I'm sorry, you're wrong. Your posts are pretty enjoyable most of the time, as well as different. I just think that people using Inquirer quotes as thread starters gets really old, since all it does is provoke more trash to be spewed from their mouths.

Think about it: if you got paid to make up something that was just barely believable, and got paid even more when that barely believable claim sold more magazines, got more site hits, sold more products, etc., would you stop making such claims?
but thanks for mentioning it again

:roll:
Again, I simply mentioned that it seems you're trying to defend the Inquirer or, if not them, then your thread's existence.

Why did you make this thread?
Because the Inquirer said something about it.
Why do you give the Inquirer hits?
I don't, I "quote" them.
And how do you get those quotes?
...uh....

Just chill. I just gave my $.02 and if you think that it's not even worth that, then that's fine.
Again, all that bandwidth is useless unless ATi does something useful with it, like finally adding full-screen SSAA on single cards. If they don't, I won't even consider an R600.
QFT. If G80 has angle-independent AF and HDR+AA support, ATI would have to answer with some sort of AA improvement. I just hope that they don't sacrifice AF quality to do it.
R600: 96 unified shaders, 512-bit (not just ring bus but external to memory), 1GB GDDR4.

G80: (GTX) 128 unified shaders, 384-bit, 768MB GDDR3.

Well, bandwidth would for sure go to the R600. Pixel-pushing power looks like it will go to G80. (This is based on the number of shaders alone; it's unknown how powerful each shader is for either R600 or G80.)
Eh, whichever one does better, I couldn't care less. As long as the prices fall due to the competition, I'll be happy. I don't expect the G80 to really hit its "true" price tag until R600.

are we absolutely certain that G80 shaders are actually "unified"?
:confused:

i mean really unified, like ATi's? this will be 2nd-gen 'unified shaders' for ati and the first for nvidia.

and yes, i DO read theInq and take it for what it is.... there are some things that are 'scooped' here first and not all of it belongs with the used kitty litter. :p

i don't link to theinq 'hidden' . . . and i even took your advice to note the source in the Topic's summary and provide an alternate source whenever possible. What more could you ask?
:Q

just don't ask me to stop posting what i find interesting . . . whatever the source . . . please ignore what you are not interested in . . . and if it turns out to be crap as many of my threads do . . . it will fall to the bottom quickly.

there is nothing worse than bringing more attention to crap by calling it crap

and i created 'that' thread because theInq brought up something i was interested in . . . it is a valid reason and certainly better than any flamethread.

edited