All AMD R6xx chips are 65 nanometre chips, now


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
that is my analysis

BUT

what *else* can AMD do?

they very stupidly backed themselves into a corner with that BS marketing release about "delaying r600 for a more complete platform" ... which i soundly condemned as "obvious"

this is really gonna *cost* them

they *look* stupid

duke nukem of Video HW

WORST of ALL - they DIDN'T look stupid UNTIL they started "explaining"
:roll:

fire the entire marketing team
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
it's dustbuster vs. r300 all over again
:confused:

with the respective companies in reversed position ..
:Q

heck, nvidia survived DustBuster ... i think AMD can survive the fiery Draggin Ass - Two :p

:laugh:
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I wonder how the OEM vendors feel at this point since I know that they already know how R600 performs.

If there are mass defections by ATI's board partners this could be a one and done.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
I wonder how the OEM vendors feel at this point since I know that they already know how R600 performs.

If there are mass defections by ATI's board partners this could be a one and done.

AMD should have the midrange ... and some great price incentives

i think they may lose a few ... but Sapphire and the bigger ones will stick this round out

^my prediction^
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I know it's unlikely, that's why i said "if".

Another disadvantage for AMD is the price war. I already predicted that since yields are low, this is going to be an expensive part.

Nvidia has been selling G80 at close to MSRP for 6 months. What's going to keep them from slashing the prices the day R600 releases?

Who besides the fanboys is going to pay $100 (or more) extra for a performance advantage of 10% or less?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Matt2
What I am interested in is how the architecture functions. Since (according to this report) R600 has very similar performance to G80 despite a 39% higher core clock, it must have some kind of architectural deficiency compared to G80.

After everyone said that ATI had such a huge head start on nvidia because of R500, Nvidia must have known something ATI didn't. Maybe those Vec4 shaders are harder to keep fed. After all, Nvidia did claim near 100% efficiency with simple scalar shaders.
I was curious about this as well. Maybe I missed it, but how many shaders is R600 supposed to have? I was under the impression it would be similar to G80, but even the latest R600 leaks/internal documents haven't mentioned the number of shaders. In that one AMD document it even mentioned Xenos' 48 shaders but no hint of R600's shaders. Either it's still a moving target or just well-known to everyone but me?

Originally posted by: apoppin
AMD should have the midrange ... and some great price incentives

i think they may lose a few ... but Sapphire and the bigger ones will stick this round out

^my prediction^
I'm not so sure about that now. AMD's "strategic" DX10 product launch was clearly a smokescreen as they're still scrambling to get an acceptable version of R600 to market. They're playing all of their trump cards on R600 and everything else is getting pushed to the back of the deck. Usually flagships are released on a mature process and then budget parts are taped out and tested on a new process, but that's not the case here.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
I know it's unlikely, that's why i said "if".

Another disadvantage for AMD is the price war. I already predicted that since yields are low, this is going to be an expensive part.

Nvidia has been selling G80 at close to MSRP for 6 months. What's going to keep them from slashing the prices the day R600 releases?

Who besides the fanboys is going to pay $100 (or more) extra for a performance advantage of 10% or less?
that's *why* r600 cannot release in April ...

it'll be beyond expensive to produce --unless they can use them in lower-end boards [probably]

nvidia doesn't need to drop prices ;)
just sell off their current HW, get a cheaper and faster process and wait for r660 with matured drivers

right now, they are congratulating each other and thinking about their bonuses.
:roll:

:D
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: chizow
Originally posted by: Matt2
What I am interested in is how the architecture functions. Since (according to this report) R600 has very similar performance to G80 despite a 39% higher core clock, it must have some kind of architectural deficiency compared to G80.

After everyone said that ATI had such a huge head start on nvidia because of R500, Nvidia must have known something ATI didn't. Maybe those Vec4 shaders are harder to keep fed. After all, Nvidia did claim near 100% efficiency with simple scalar shaders.
I was curious about this as well. Maybe I missed it, but how many shaders is R600 supposed to have? I was under the impression it would be similar to G80, but even the latest R600 leaks/internal documents haven't mentioned the number of shaders. In that one AMD document it even mentioned Xenos' 48 shaders but no hint of R600's shaders. Either it's still a moving target or just well-known to everyone but me?

There is no "official" number, but this is the most plausible number I could find:

The article mentioned that researchers at Stanford were already writing supercomputing applications for R600 and that R600 uses 320 multiply-accumulate (MAC) units, which could imply 40 vec4 units per GPU and an ~800MHz clock.

Linky

It should be the equivalent of 160 simple scalar shaders. But as I said above, Nvidia claims 100% efficiency on their simple scalar shaders.

Everything I have read regarding Vec4 shaders points to an efficiency well below 100%, more like 70%.
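
Quick back-of-the-envelope in Python, for what it's worth. This is a sketch only: the R600 figures are the rumors above, the G80 figures are its published specs (128 scalar ALUs at 1.35GHz), and the utilization values are the claims being argued over in this thread, not measurements:

```python
# Rough peak-shader-throughput math, counting one MAC as 2 FLOPs.
# R600 numbers are rumors (320 MACs at ~800MHz); G80 numbers are its
# published specs (128 scalar ALUs at 1.35GHz). Utilization values are
# the claims debated in this thread, not measured figures.

def gflops(alus, clock_ghz, utilization=1.0):
    """Shader GFLOPS: ALUs x clock x 2 FLOPs/MAC x assumed utilization."""
    return alus * clock_ghz * 2 * utilization

r600_peak = gflops(320, 0.8)    # ~512 GFLOPS on paper
g80_peak = gflops(128, 1.35)    # ~346 GFLOPS on paper

# Apply the efficiency claims: ~70% for vec4, ~100% for simple scalar.
r600_sustained = gflops(320, 0.8, utilization=0.70)   # ~358 GFLOPS
g80_sustained = gflops(128, 1.35, utilization=1.00)   # ~346 GFLOPS

print(f"R600: {r600_peak:.0f} peak / {r600_sustained:.0f} sustained GFLOPS")
print(f"G80:  {g80_peak:.0f} peak / {g80_sustained:.0f} sustained GFLOPS")
```

On those assumptions the paper advantage nearly evaporates, which is one way the efficiency argument could explain G80-like performance. (If you instead read the article as 160 scalar-equivalent lanes, halve the R600 figures.)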

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
Originally posted by: Matt2
What I am interested in is how the architecture functions. Since (according to this report) R600 has very similar performance to G80 despite a 39% higher core clock, it must have some kind of architectural deficiency compared to G80.

After everyone said that ATI had such a huge head start on nvidia because of R500, Nvidia must have known something ATI didn't. Maybe those Vec4 shaders are harder to keep fed. After all, Nvidia did claim near 100% efficiency with simple scalar shaders.
I was curious about this as well. Maybe I missed it, but how many shaders is R600 supposed to have? I was under the impression it would be similar to G80, but even the latest R600 leaks/internal documents haven't mentioned the number of shaders. In that one AMD document it even mentioned Xenos' 48 shaders but no hint of R600's shaders. Either it's still a moving target or just well-known to everyone but me?

Originally posted by: apoppin
AMD should have the midrange ... and some great price incentives

i think they may lose a few ... but Sapphire and the bigger ones will stick this round out

^my prediction^
I'm not so sure about that now. AMD's "strategic" DX10 product launch was clearly a smokescreen as they're still scrambling to get an acceptable version of R600 to market. They're playing all of their trump cards on R600 and everything else is getting pushed to the back of the deck. Usually flagships are released on a mature process and then budget parts are taped out and tested on a new process, but that's not the case here.

r600 is a *FLOP*
[think *dragon* Huge, Long, Hot ... AMD's 'dustBuster']
:Q

OK

AMD is NOT releasing it
[ATi would have]


instead they are releasing its REFRESH ... r660


*as* r600


so

everything *else* IS ready


[reading between AMD's own lines]

;)
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
They may as well call it R666 and try to bring about Armageddon with its teraflop power!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
IMO the efficiency issue was blown way out of proportion ever since Nvidia introduced its scalar shaders. The efficiency of vec4 shaders not only depends on the code but also on the compiler. If a fragment program writes to all 4 values (RGBA) in a fragment, that's about 100% efficient in a vec4 shader. If it only uses RGB values, that's 75% efficient. For only 2 of the color components, it depends on whether you can combine 2 ops in a single vec4 unit, so things get complicated. But I call BS on the 70% average efficiency claim for vec4 units.
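
To illustrate the point that efficiency depends on both the shader code and the compiler's ability to pack narrow ops, here's a toy Python model. It's purely illustrative (not any real compiler), and the op mixes are made up:

```python
# Toy model of vec4 ALU lane utilization -- purely illustrative, not any
# real compiler. Each op uses 1-4 of the 4 lanes; with co-issue, the
# "compiler" may pack two consecutive narrow ops into one vec4 slot
# (e.g. 3+1 or 2+2), similar in spirit to ATI's vec3+scalar co-issue.

def vec4_utilization(op_widths, co_issue=True):
    """Average fraction of the 4 lanes kept busy across all issue slots."""
    slots = []          # lanes used in each issue slot
    pending = None      # width of the last op, if it might still be packed
    for w in op_widths:
        if co_issue and pending is not None and pending + w <= 4:
            slots[-1] += w      # pack this op into the previous slot
            pending = None      # a slot holds at most two ops in this model
        else:
            slots.append(w)
            pending = w
    return sum(slots) / (4 * len(slots))

rgba_heavy = [4, 4, 3, 4]          # mostly full RGBA writes
mixed = [3, 1, 2, 2, 3, 4]         # RGB ops, a scalar, some 2-wide ops

print(vec4_utilization(rgba_heavy))             # ~0.94
print(vec4_utilization(mixed, co_issue=True))   # ~0.94: packing recovers lanes
print(vec4_utilization(mixed, co_issue=False))  # ~0.63: naive issue wastes lanes
```

Whether real workloads land near 70% or near 100% then comes down to the op mix and how good the compiler's packing is, which is exactly the disagreement here.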
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2
I'm gonna start calling R600 vaporware pretty soon.


I am quite surprised that you would be negative about ati products. Actually the r600 is unreleased, so if someone tries to sell you one, it will be vapourware. Surprises me that the 2 most interested in fomenting a silly diatribe both say they use ati cards. Sort of kinky really.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: munky
IMO the efficiency issue was blown way out of proportion ever since Nvidia introduced its scalar shaders. The efficiency of vec4 shaders not only depends on the code but also on the compiler. If a fragment program writes to all 4 values (RGBA) in a fragment, that's about 100% efficient in a vec4 shader. If it only uses RGB values, that's 75% efficient. For only 2 of the color components, it depends on whether you can combine 2 ops in a single vec4 unit, so things get complicated. But I call BS on the 70% average efficiency claim for vec4 units.

So then, assuming that VR-Zone is correct, what would you attribute R600's performance deficiency to? Like I said before, with a 225MHz core clock advantage, a 512-bit bus, and GDDR4, R600 should be smoking G80, immature drivers or not.
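
The on-paper bandwidth gap is easy to put numbers on. A minimal sketch: the 8800GTX figures are its published specs, but R600's memory clock wasn't public at this point, so the 2.2Gbps GDDR4 rate below is an illustrative assumption:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# 8800GTX: published 384-bit bus, 900MHz (1.8Gbps) GDDR3.
# R600: rumored 512-bit bus; the 2.2Gbps GDDR4 rate is an assumption,
# since the actual memory clock was not public at this point.

def bandwidth_gb_s(bus_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: bus width (bits) x data rate (Gbps) / 8."""
    return bus_bits * data_rate_gbps / 8

print(bandwidth_gb_s(384, 1.8))   # 8800GTX: 86.4 GB/s
print(bandwidth_gb_s(512, 2.2))   # rumored R600: 140.8 GB/s
```

That's roughly 60% more raw bandwidth on those assumptions, which is why near-parity results would point at a bottleneck somewhere other than memory.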

Originally posted by: ronnn
I am quite surprised that you would be negative about ati products. Actually the r600 is unreleased, so if someone tries to sell you one, it will be vapourware. Surprises me that the 2 most interested in fomenting a silly diatribe both say they use ati cards. Sort of kinky really.

I love my X1900XTX; when I got it, nothing could touch it. But the tables have turned and AMD is falling flat on its face.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Matt2
Originally posted by: munky
IMO the efficiency issue was blown way out of proportion ever since Nvidia introduced its scalar shaders. The efficiency of vec4 shaders not only depends on the code but also on the compiler. If a fragment program writes to all 4 values (RGBA) in a fragment, that's about 100% efficient in a vec4 shader. If it only uses RGB values, that's 75% efficient. For only 2 of the color components, it depends on whether you can combine 2 ops in a single vec4 unit, so things get complicated. But I call BS on the 70% average efficiency claim for vec4 units.

So then, assuming that VR-Zone is correct, what would you attribute R600's performance deficiency to? Like I said before, with a 225MHz core clock advantage, a 512-bit bus, and GDDR4, R600 should be smoking G80, immature drivers or not.

It should be smoking the G80 if it were clocked high enough and the core were not bottlenecked by something else. Given the amount of conflicting rumors popping up all over the place, I'm not willing to assume VR-Zone is correct. But even if it were so, I wouldn't put it past Ati to design some x1600 on steroids with a massive advantage in shader power that's neutralized by a deficiency in some other area, whether it be TMUs, ROPs, or something else.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
Originally posted by: the Chase
Breaking "news"- R600 is not 65nm after all!! http://www.fudzilla.com/index.php?option=com_content&task=view&id=221&Itemid=1

Lol @ fudzilla :laugh:

Sorry to disappoint you but R600 will be rebranded and probably cheaper than everyone expects, but it's very unlikely that we are waiting for a G80 killer.
that's what i said :p

BUT, r660 is 65nm and we can expect that pretty soon --introduced as "r600"

AMD'll have to scramble on the 'refresh' ... leave it to 'then'
;)

--should have been fuadzilla
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I honestly believe that there never really was an R600 (at least not in the form we are all expecting).

IMO about 2 years ago ATi got wind of what nv50 (g80) was all about through their spy channels, and, as The Chase put it, "soiled their pantaloons". Frantic merger/buyout talks with AMD ensued and the R600 we are expecting was hastily slapped together on the drawing board. Getting the design off the drawing board and into reality has proven a little troublesome, however...

Only the midrange will be released in any sort of quantity because that's all that ATi's been seriously working on for a reasonable length of time since Xenos.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
that is my analysis

BUT

what *else* can AMD do?

they very stupidly backed themselves into a corner with that BS marketing release about "delaying r600 for a more complete platform" ... which i soundly condemned as "obvious"

this is really gonna *cost* them

they *look* stupid

duke nukem of Video HW

WORST of ALL - they DIDN'T look stupid UNTIL they started "explaining"
:roll:

fire the entire marketing team

The marketing team have no real fear of being fired - NASA's desperately trying to headhunt them - put all that "spin power" to use in the space station's gyroscopes...
 

nrb

Member
Feb 22, 2006
75
0
0
Coming to this thread rather late, here, but a few brief points....


1) People really, seriously need to stop spouting the "15 respins" bullsh*t. In the first place the A15 revision was only ever a rumour, and it's a rumour that has since been firmly squashed. The released version is A13.

More importantly, the two digits in "A13" refer to two different things. One digit refers to the silicon revision, the other to the metal revision. I can't actually remember which is which :eek: but if we assume, for the sake of example, that the first digit is silicon and the second metal, A13 would be the first silicon revision combined with the third metal revision. Whichever way round it is, "A13" is the third revision of the chip, not the 13th. PLEASE GET IT RIGHT!


2) It is often stated that DX9 vs DX10 game screenshots must be bullsh*t, because there is no visual effect doable in DX10 that you can't do in DX9. This statement is both true and misleading. It is true to say that any visual effect doable in DX10 is also doable in DX9, but the level of performance that you get will not necessarily be the same. What this means in practice is that there are some effects that can be done in DX10 which cannot practically be done in DX9, because, if you tried to do them in DX9, the performance level would drop to the point where the game would be unplayable. Thus, for practical purposes there are effects available in DX10 that are not in DX9. DX10 allows you to do them in a game rather than only in a still picture.


3) If R600 is really only scoring 200 3DMark points higher than 8800GTX, this is disappointing.


4) It is not impossible that R600 is 65nm, but I have to say it's unlikely.

If it does turn out to be 65nm then clearly what has happened is an exaggerated version of what happened around the release of R520 and R580. R520 and R580 were worked on by two entirely independent teams. R520 suffered delay after delay, but R580 had almost no problems. The result was that R580 was actually ready to launch only about two months after R520. If we imagine that R520 had been delayed by yet another two months, it would have been quite possible for ATI to say "sod it, let's just release R580 and forget about R520 completely".

We might be seeing the same thing here. Maybe ATI originally intended to do exactly what Nvidia did, which was to launch the high-end R600 part and then follow up at least 6 months later with mid-range parts on a smaller process, and then, once the smaller process was stable, release a die-shrink version of R600. R600 is now at least 6 months late (and coinciding with the release of the mid-range chips). If the shrunk version is being worked on by a different team that has not experienced delays, it's possible that R600 will be scrapped and they'll go straight to the die-shrink.

However, I still regard this as highly unlikely. Almost all of these rumours can be traced back to a single news story, and it's perfectly possible that the 65nm figure was quoted in an AMD presentation, but applied only to mid-range chips (or only to a laptop version of R600 due out later), and that one website misunderstood THAT, and everyone else got the idea from them.

My prediction remains 80nm.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
what EVER it is ... r600 is becoming AMD's DustBuster
:p

:roll:

:thumbsdown:

and Gstanfor, NASA doesn't want AMD's marketing team ... they all spin in different directions and have completely *unbalanced* AMD ... imagine what they would do to that old space station
:Q

send them into space ... OK!

:laugh:

they can give the moon an "atmosphere" with their collective Hot Air
--weird shade of green cheese though
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
You have a point about the erratic spin, but NASA seems to like dangerous, unstable things anyway (they are still flying the shuttle, still with solid boosters, and I think they are considering manned spacecraft that rely solely on SRB engines!)

Wouldn't want to pollute the moon with them, but if we could imprison them and politicians (another fine source of hot air) in the space station we might go a fair way to reversing global warming...