ATi R520 details/specs?


imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Originally posted by: hans030390
...that's not even cool. The x900 only has 16 pipelines???

so basically, it's just a group of overclocked x800's that add stuff like SM3, and sometimes more pipelines???

well, the high end one is good :) i like that, but you'd think the x900 is pointless....


UHH, according to the OP's source the X900 Pro and up have 24 and 32 pipes, SM3.0, and 10 vertex shader units. Those are definitely not "overclocked X800 parts." What do you think the G70 is? A faster card with more pipes and core enhancements. When the X900 is 200 dollars and is ATI's best seller, I don't think they will consider it pointless.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Radeon X900 XT-PE
• 32 pixel pipelines
• 10 vertex shader units
• 90nm low-k at TSMC
• 500 MHz core clock
• 700 MHz (1.4 GHz effective) memory clock
• 512 MB GDDR3

Radeon X900 XT
• 32 pixel pipelines
• 10 vertex shader units
• 90nm low-k at TSMC
• 450 MHz core clock
• 600 MHz (1.2 GHz effective) memory clock
• 512 MB GDDR3

Radeon X900 Pro
• 24 pixel pipelines
• 10 vertex shader units
• 90nm low-k at TSMC
• 450 MHz core clock
• 600 MHz (1.2 GHz effective) memory clock
• 256/512 MB GDDR3

Radeon X900
• 16 pixel pipelines
• 8 vertex shader units
• 110nm low-k at TSMC
• 500 MHz core clock
• 500 MHz (1.0 GHz effective) memory clock
• 256/512 MB GDDR3

Low-k on the 110nm process??? Uh huh, I'll believe you... NOT.
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
The 32-pipe card will be extremely rare and probably won't make it to consumers until mid next year. By that time we would see Nvidia release a similarly performing card.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: ddogg
The 32-pipe card will be extremely rare and probably won't make it to consumers until mid next year. By that time we would see Nvidia release a similarly performing card.

How do you know that? Personally I will be very surprised if ATI has a full 32 pipe card, am expecting 16 extreme pipes myself.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: hans030390
...that's not even cool. The x900 only has 16 pipelines???

so basically, it's just a group of overclocked x800's that add stuff like SM3, and sometimes more pipelines???

well, the high end one is good :) i like that, but you'd think the x900 is pointless....


But...but... It has SM3!! It MUST be worthy!

Whether you like it or not Creig, having SM3 is better than not having it. I have a bunch of SM3 games already: Painkiller, Far Cry, Splinter Cell, Lego Star Wars -- and Lego Star Wars and Splinter Cell have SM3-only effects.

The fact of the matter is that SM3 has been the Microsoft standard for a year now, most of the big developers have been coding on it longer than that, and not all of them are going to retrocode for the intermediary step of SM2 just because some people only want to upgrade their video cards every 10 years.

Just the way it is ol' buddy- technology keeps moving forward, if you don't keep up you miss out.

 

paadness

Member
May 24, 2005
178
0
0
All rumors and nothing else. There are people saying the G70 will kick ATI's butt, and ATI claiming the Xbox will be faster than the PS3. WTF is going on.

Haven't you noticed AT and Tom's aren't publishing new articles? They are busy benchmarking the G70, I'm sure. Just 8 days left..........
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: hans030390
...that's not even cool. The x900 only has 16 pipelines???

so basically, it's just a group of overclocked x800's that add stuff like SM3, and sometimes more pipelines???

well, the high end one is good :) i like that, but you'd think the x900 is pointless....

WHAAAAT? The X900 is the low-end offering of all these cards. It seems it'll be priced near the $249 MSRP level, and its specs are faster than the $549 X850 XT PE. What's wrong with that? It's like the 6600GT smoking the 5950 Ultra for half the price when it came out.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: Rollo
I have a bunch of SM3 games already: Painkiller, Far Cry, Splinter Cell, Lego Star Wars -- and Lego Star Wars and Splinter Cell have SM3-only effects.
SM3.0 in Painkiller and Farcry????? I don't think so. HDR is not SM3.0.

The fact of the matter is that SM3 has been the Microsoft standard for a year now, most of the big developers have been coding on it longer than that,
So, it's been what, over a year since the 6800 came out, and we have maybe 3 games that use SM3.0?

Pitfall Harry -- water shader
Pacific Fighters - water shader
Splinter Cell CT -- 8 shaders which exceed SM2.0 -- AFAIK.

A single water shader in 2 games, and a few shaders in Splinter Cell that likely don't exceed the SM2.0 limit (96 instructions) in length but only use more registers than SM2.0 is capable of handling. Ya, SM3.0 is just catching on like wildfire. :p The few games that have anything to do with SM3.0 so far are likely a result of payola from NV anyway.

The 6800 also chokes on the SM3.0 water shader in Pacific Fighters:

simhq
"I believe you have PF now, so tell us all how great the GF6800 Ultra can run those water settings? Bear in mind that only recently you were proclaiming the 6800 Ultra as the dog's nads and that PF would prove it. Face it mate, you fell for the same line as the rest of us; I too expected a 6800 Ultra to run water=3 and shaders 3.0 like number twos off a shovel.


It's a rhetorical question, I don't need an answer, as water=3 runs like crap on my system. Go anywhere near a coastline and FPS drops to 15-17; with action near a coastline it drops to single figures. A case of Nvidia promising lots but delivering a lot less. There is no doubt that water=3 will need at least an SLI system or the next-gen GPUs. CPU power won't make a big difference; I get the same FPS as Takasaki with a less capable CPU.

AMD64 3200
1GB DDR400 CL2 RAM
6800 Ultra (65.73)

SM3.0 is basically about longer shaders, and the X800 can handle shaders many times longer than SM2.0 (its SM2.0b = extended SM2.0). As shaders get over 100 instructions long (over the SM2.0 spec) there are some real question marks over how well the current cards will handle these longer shaders anyway, whether it's the 6800 or X800. If we get anywhere near the SM2.0b spec, the current cards will likely be choking on the shader load anyway. The X800 can render 3 lights in a single pass (the SM2.0 spec is 1) compared to SM3.0 on the 6800, which can do 4 (the SM3.0 spec is 4). This will rarely make any difference, so the X800 is close to the SM3.0 spec in several ways except for dynamic branching, which probably won't be that useful on the 6800 because of its large performance impact.

I think it's a good possibility that when developers start using shaders longer than SM2.0, they will be "SM2.0b" shaders anyway. Well, except where marketing $ may come into play. If a developer needs a shader longer than the SM2.0 spec -- say a pixel shader of 150-200 instructions -- then you might as well make it look like a long PS2.0 shader (no branching), so it's an SM2.0b-spec shader. There is no point in adding branching to the shader to make it SM3.0 (instead of SM2.0b) when that will probably kill performance anyway. Since both the 6800s and X800s run SM2.0b (SM3.0 is backward compatible with SM2.0b), the current generation of both NV and ATI cards is supported with the longer shaders if you use SM2.0b.
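The "no branching" workaround described above can be sketched in a toy Python model (illustrative only, not real shader code): SM2.0-class hardware typically replaces an if/else with predication, computing both sides of the branch and then selecting one with a mask, which is exactly why adding real branching often buys nothing on this generation of cards.

```python
# Toy model of per-pixel "branching" without hardware branch support.
# A dynamic branch (SM3.0-style) executes only one path; predication
# (SM2.0-style) evaluates BOTH paths and blends with a 0/1 mask.

def shade_with_branch(in_shadow: bool) -> float:
    # SM3.0-style dynamic branch: only one side is executed.
    if in_shadow:
        return 0.2          # cheap ambient-only path
    return 0.2 + 0.8        # ambient plus the expensive lighting path

def shade_predicated(in_shadow: bool) -> float:
    # SM2.0-style predication: compute both paths, then select.
    ambient = 0.2
    lit = 0.2 + 0.8
    mask = 1.0 if in_shadow else 0.0
    return mask * ambient + (1.0 - mask) * lit

# Both formulations produce identical results; the predicated version
# just spends instructions on work the branch would have skipped.
for s in (True, False):
    assert shade_with_branch(s) == shade_predicated(s)
```

The values and function names here are hypothetical; the point is only that a select costs a few extra instructions per pixel, while a real dynamic branch carries the per-batch performance penalty mentioned above.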

Even an SM2.0 card can effectively run shaders longer than 96 instructions (the SM2.0 spec) by multipassing them. My understanding is that Splinter Cell does this in Chaos Theory with SM1.1, using 6-10 passes to effectively render shaders longer than the base SM1.1 spec. The interesting thing about this is that you get the same IQ on all graphics cards that support SM1.1 and up. The downside is that older cards like the 4200 will choke on these longer multipassed SM1.1 shaders because they weren't designed to handle long shaders well.
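The multipass idea above amounts to chunking: a long effect is cut into pieces that each fit the per-pass instruction limit, at the cost of one extra render pass per piece. A minimal sketch, with hypothetical numbers (a 300-instruction effect against SM2.0's 96-instruction limit):

```python
# Split a long "shader" (modelled as a list of instructions) into
# passes that each fit a hardware per-pass instruction limit.

def split_into_passes(instructions, limit):
    """Greedily chunk an instruction list into per-pass groups."""
    return [instructions[i:i + limit]
            for i in range(0, len(instructions), limit)]

long_shader = [f"op{i}" for i in range(300)]   # a 300-instruction effect
passes = split_into_passes(long_shader, 96)    # SM2.0-style limit

print(len(passes))                  # 4 passes needed
print([len(p) for p in passes])     # [96, 96, 96, 12]
```

This also shows why older cards choke on the technique: every extra pass re-reads and re-writes the framebuffer, so four passes cost far more bandwidth than one, even though the instruction count is the same.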


 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
Makes me feel bad about buying an X850 XT PE a few weeks ago, but then again the R520 cards might not be out to the public till November.
 
Jun 14, 2003
10,442
0
0
Originally posted by: hans030390
...that's not even cool. The x900 only has 16 pipelines???

so basically, it's just a group of overclocked x800's that add stuff like SM3, and sometimes more pipelines???

well, the high end one is good :) i like that, but you'd think the x900 is pointless....


how is it pointless? it has SM3 :p
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Blastman:
You're wrong about Far Cry and Painkiller; it's pretty common knowledge they both have SM3 patches.

Your list, besides missing those two, is also missing Lego Star Wars.

You also don't mention that Splinter Cell is an SM3-only game, so ATI cards have to run it at reduced IQ in SM1.1.

So there are at least six SM3 titles -- and more on the way. Take a look at nVidia's developer testimonial page if you don't believe me.

"Payola" or not, SM3 has been the MS standard for the last year, it is easier to code for and more efficient, and some developers won't be retro-coding SM2b workarounds just to accommodate those with dated hardware. (e.g. SC:CT)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Rollo
Blastman:
You're wrong about Far Cry and Painkiller; it's pretty common knowledge they both have SM3 patches.

Your list, besides missing those two, is also missing Lego Star Wars.

You also don't mention that Splinter Cell is an SM3-only game, so ATI cards have to run it at reduced IQ in SM1.1.

So there are at least six SM3 titles -- and more on the way. Take a look at nVidia's developer testimonial page if you don't believe me.

"Payola" or not, SM3 has been the MS standard for the last year, it is easier to code for and more efficient, and some developers won't be retro-coding SM2b workarounds just to accommodate those with dated hardware. (e.g. SC:CT)


I think most people did not say sm3 would never be used. Just lots said that by the time it is a major concern the 6800gt would be a mid range card at best. :beer:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
Whether you like it or not Creig, having SM3 is better than not having it. I have a bunch of SM3 games already: Painkiller, Far Cry, Splinter Cell, Lego Star Wars -- and Lego Star Wars and Splinter Cell have SM3-only effects.

The fact of the matter is that SM3 has been the Microsoft standard for a year now, most of the big developers have been coding on it longer than that, and not all of them are going to retrocode for the intermediary step of SM2 just because some people only want to upgrade their video cards every 10 years.

Just the way it is ol' buddy- technology keeps moving forward, if you don't keep up you miss out.

You entirely missed the point of my post. 90+% of hans030390's posts were telling us how much he loves SM3.0 and how any card you purchased simply HAD to have it. Even to the point that Nvidia fans were getting tired of hearing it.


Originally posted by: keysplayr2003
Hans, I'm an nvidia fan also, but enough with the 3.0 sales pitch already.
I agree that SM3.0 will have a larger role later on, but not right now or in the immediate future. Even if it did, you would do fine with either ATI or Nvidia for all your gaming needs and not miss much if you went with ATI 2.0b. HDR and Stencil Shadowing are another matter.


Now that ATI has SM3.0 capable cards his tune suddenly changed to:


Originally posted by: hans030390
so basically, it's just a group of overclocked x800's that add stuff like SM3, and sometimes more pipelines???

well, the high end one is good i like that, but you'd think the x900 is pointless....


Funny how suddenly SM3.0 takes a back seat to pipelines now that ATI has it.


I've never said SM3.0 wasn't the way of the future. I only said that it wasn't a "must have" feature in the X800 line. Even today there are only a handful of games that take advantage of it, and only 1 or 2 that it makes a large difference in.

It's all become rather a moot point, however. Now that both manufacturers are producing SM3.0 cards, you'll end up with it either way you go. Unless you purchase a previous-gen card the next time you upgrade.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
If the R520 lineup specifications are accurate, I find it incredible that a 16 pipeline card will be considered their "low end" version. There aren't any games out today that require more than a 16 pipe card to be playable (unless you're running HDR, high AA/AF and high resolution).

A single 24 or 32 pipe card seems like extreme overkill and I don't even have words to describe what a pair of 32 pipe cards would be (besides EXTREMELY CPU limited).
 

biostud

Lifer
Feb 27, 2003
19,928
7,037
136
Originally posted by: Creig
If the R520 lineup specifications are accurate, I find it incredible that a 16 pipeline card will be considered their "low end" version. There aren't any games out today that require more than a 16 pipe card to be playable (unless you're running HDR, high AA/AF and high resolution).

A single 24 or 32 pipe card seems like extreme overkill and I don't even have words to describe what a pair of 32 pipe cards would be (besides EXTREMELY CPU limited).

I wouldn't expect to see the 16-pipe cards until X-mas, or maybe later, when software will actually need 16 pipes to run decently.
 

g3pro

Senior member
Jan 15, 2004
404
0
0
Originally posted by: Blastman
Originally posted by: Rollo
I have a bunch of SM3 games already: Painkiller, Far Cry, Splinter Cell, Lego Star Wars -- and Lego Star Wars and Splinter Cell have SM3-only effects.
SM3.0 in Painkiller and Farcry????? I don't think so. HDR is not SM3.0.

The fact of the matter is that SM3 has been the Microsoft standard for a year now, most of the big developers have been coding on it longer than that,
So, it's been what, over a year since the 6800 came out, and we have maybe 3 games that use SM3.0?

Pitfall Harry -- water shader
Pacific Fighters - water shader
Splinter Cell CT -- 8 shaders which exceed SM2.0 -- AFAIK.

A single water shader in 2 games, and a few shaders in Splinter Cell that likely don't exceed the SM2.0 limit (96 instructions) in length but only use more registers than SM2.0 is capable of handling. Ya, SM3.0 is just catching on like wildfire. :p The few games that have anything to do with SM3.0 so far are likely a result of payola from NV anyway.

The 6800 also chokes on the SM3.0 water shader in Pacific Fighters:

simhq
"I believe you have PF now, so tell us all how great the GF6800 Ultra can run those water settings? Bear in mind that only recently you were proclaiming the 6800 Ultra as the dog's nads and that PF would prove it. Face it mate, you fell for the same line as the rest of us; I too expected a 6800 Ultra to run water=3 and shaders 3.0 like number twos off a shovel.


It's a rhetorical question, I don't need an answer, as water=3 runs like crap on my system. Go anywhere near a coastline and FPS drops to 15-17; with action near a coastline it drops to single figures. A case of Nvidia promising lots but delivering a lot less. There is no doubt that water=3 will need at least an SLI system or the next-gen GPUs. CPU power won't make a big difference; I get the same FPS as Takasaki with a less capable CPU.

AMD64 3200
1GB DDR400 CL2 RAM
6800 Ultra (65.73)

SM3.0 is basically about longer shaders, and the X800 can handle shaders many times longer than SM2.0 (its SM2.0b = extended SM2.0). As shaders get over 100 instructions long (over the SM2.0 spec) there are some real question marks over how well the current cards will handle these longer shaders anyway, whether it's the 6800 or X800. If we get anywhere near the SM2.0b spec, the current cards will likely be choking on the shader load anyway. The X800 can render 3 lights in a single pass (the SM2.0 spec is 1) compared to SM3.0 on the 6800, which can do 4 (the SM3.0 spec is 4). This will rarely make any difference, so the X800 is close to the SM3.0 spec in several ways except for dynamic branching, which probably won't be that useful on the 6800 because of its large performance impact.

I think it's a good possibility that when developers start using shaders longer than SM2.0, they will be "SM2.0b" shaders anyway. Well, except where marketing $ may come into play. If a developer needs a shader longer than the SM2.0 spec -- say a pixel shader of 150-200 instructions -- then you might as well make it look like a long PS2.0 shader (no branching), so it's an SM2.0b-spec shader. There is no point in adding branching to the shader to make it SM3.0 (instead of SM2.0b) when that will probably kill performance anyway. Since both the 6800s and X800s run SM2.0b (SM3.0 is backward compatible with SM2.0b), the current generation of both NV and ATI cards is supported with the longer shaders if you use SM2.0b.

Even an SM2.0 card can effectively run shaders longer than 96 instructions (the SM2.0 spec) by multipassing them. My understanding is that Splinter Cell does this in Chaos Theory with SM1.1, using 6-10 passes to effectively render shaders longer than the base SM1.1 spec. The interesting thing about this is that you get the same IQ on all graphics cards that support SM1.1 and up. The downside is that older cards like the 4200 will choke on these longer multipassed SM1.1 shaders because they weren't designed to handle long shaders well.


Nothing but excuses. You were bitching and bitching when the NV3x could only run SM1.1 very well and not 2.0, yet there were so few games which actually used 2.0. Get your story straight. Do you like having good cards or do you want to buy the s*** sold by ATi? You apparently like the latter.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: g3pro

Nothing but excuses. You were bitching and bitching when the NV3x could only run SM1.1 very well and not 2.0, yet there were so few games which actually used 2.0. Get your story straight.
And what were you saying last year? That few games support SM2.0 so it doesn't really matter for the NV30? And now few games support SM3.0, but it's the best thing since sliced bread? Better get your story straight too. The NV30 had a bigger feature set than the R300, but it couldn't really use it. The 9800 also had much faster AF and better AA, and just more shader power period, even when running heavy SM1.x, at a time when a lot of games were switching to the shader model of programming.

I'm not knocking SM3.0, I just don't think it's "so great" compared to what ATI is offering in the X800s. ATI went for SM2.0b and spent the extra money on low-k fab instead of a full SM3.0 feature set. They thought the X800's clock rate advantage, giving it more raw FP shader performance, would be a better feature. So it's a trade-off: better DX performance on the X800s versus less performance but a full SM3.0 feature set on the 6800s. The fact is both ATI's and NV's current generations of cards are good in their respective ways, and a consumer isn't really going to "miss" anything going either way this time around, even though I prefer ATI's solution.

Originally posted by: Rollo
You're wrong about about Far Cry and Painkiller, it's pretty common knowledge they both have SM3 patches.
Farcry supports GI (forgot about that), so I'll give you that one. Just barely ;) :). The effective use of sprites in Farcry pretty well negates the use of GI. The patch updated the Painkiller graphics engine to support SM3.0, but I don't know what (if any) SM3.0 features are in that patch.

So there are at least six SM3 titles- and more on the way. Take a look at nVidias developer testimonial page if you don't believe me.
Well, it's one thing to "support" SM3.0 in that the game engine is SM3.0 compliant. It's another thing to actually use any significant SM3.0 features. I think one of the big things hampering SM3.0 is that developers can achieve such good IQ with SM2.0 -- Farcry and HL2 being examples.

You also don't mention Splinter Cell is a SM3 only game, that ATI cards have to run at reduced IQ SM1.1.
The 1.1 path looks basically as good as the SM3.0 path. Only HDR adds some nice effects, but then you give up AA on NV cards because that method of doing HDR stinks.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I thought ATi supports GI with the X800/X700/X300(?) cards.
That means the Farcry thing is kind of negated.

Only NOW have people started FORCING the use of cards that support certain shader models.
Battlefield 2 won't run on the GeForce 4 series because they don't support 1.4 shaders. SM3 is quite a long way away from becoming any kind of standard.