R520 Definitely 16 Pipes, Confirmed at AnandTech

Page 9

Turtle 1

Banned
Sep 14, 2005
314
0
0
Hey, nvidia or ATI doesn't matter to me. I am just trying to bring to light what I see as what's been going on with ATI.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: malG
Originally posted by: Cookie Monster

Isn't the price of the GTXs like $470ish now? A $480 7800GTX vs a $599 X1800 XT.

QFT

No, that isn't a QFT. One is comparing street price to MSRP. Not even fair! I am pretty confident that NewEgg will not be charging $599 for the X1800 XT.

I believe the 7800 GTX still has a MSRP of $599... So they are priced the same.

Edit ** oops, I replied to someone's post from page 2... Dammit, they need to fix this forum... When I am logged in and select the last page, it brings me to page 2?!? It does it all the time... Anyway, back on topic...
 

Novercalis

Senior member
Aug 5, 2005
453
0
0
Nonetheless, I'm going crazy. Right now I'm just paper-building my new rig and will be buying the parts soon...
Don't think I can force myself to wait for R520 benchies and a price reduction...
ARRRGGHH
 

Turtle 1

Banned
Sep 14, 2005
314
0
0
When Conroe is released I am doing a complete rebuild. The thing that bothers me is that I want CrossFire. The problem is that ATI's current card will then be the R580, so I will buy only one of those, because the R600 will be released within 6 months of that, around the time of Microsoft Vista. This bothers me, as I like to keep my video cards for 1 1/2 years before I upgrade. If ATI introduces the R520 in AGP, then I may wait till Vista to do a major rebuild.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: BenSkywalker
Kristopher

Thanks for the information, one thing-

600MHz core clock, with a 256-bit memory bus connected to 512MB of GDDR3 memory clocked at 700MHz. The 600MHz core clock will give it a lower fill rate than the GeForce 7800 GTX (24-pipes at 430MHz),

The R520 will have a higher pixel and texel fillrate than the 7800GTX. The fill is limited by the ROPs, not the ALUs.

Higher pixel, lower texel fillrate.

Pixel fill = # ROPs (16) x clock
Texel fill = # texture units (24) x clock
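To put numbers on those two formulas, here is a quick sketch using the unit counts and clocks discussed in this thread (the R520 figures are the rumored specs, not official ones):

```python
# Theoretical fill rates: pixel fill = ROPs x clock, texel fill = texture units x clock.
def fill_rates(rops, texture_units, core_mhz):
    """Return (pixel fill, texel fill) in Mpixels/s and Mtexels/s."""
    return rops * core_mhz, texture_units * core_mhz

# Rumored R520: 16 ROPs, 16 texture units, 600MHz core
r520_pixel, r520_texel = fill_rates(16, 16, 600)
# 7800 GTX: 16 ROPs, 24 texture units, 430MHz core
gtx_pixel, gtx_texel = fill_rates(16, 24, 430)

print(r520_pixel, gtx_pixel)  # 9600 6880  -> higher pixel fill for the R520
print(r520_texel, gtx_texel)  # 9600 10320 -> lower texel fill for the R520
```

Which matches the claim above: higher pixel fill, lower texel fill.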
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: Novercalis
why do you want to go Vista?
High requirements... and too much DRM control for me.
ANY OS that supports future HD will have those DRM controls. It is part of the material. So Mac OS will have it too.
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.

Trolling at its best.

You joined just for this? You should go back where you belong like Rage3d etc.

As for the rest of you, vote with your wallets...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Novercalis
Nonetheless, I'm going crazy. Right now I'm just paper-building my new rig and will be buying the parts soon...
Don't think I can force myself to wait for R520 benchies and a price reduction...
ARRRGGHH

I would buy the absolute cheapest PCI-e card to tide you over until these price drops happen, if they happen. You might be able to sell it later in FS/FT.

 

Turtle 1

Banned
Sep 14, 2005
314
0
0
Originally posted by: solofly
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.

Trolling at its best.

You joined just for this? You should go back where you belong like Rage3d etc.

As for the rest of you, vote with your wallets...

No flame intended. I'm sorry if that's how it came across. I have just read a lot of bad things about the nvidia IQ; if it's not true, then I am truly sorry for that statement.
You know, with the over-optimizations and glittering in gameplay. If it is in fact true, then it should be talked about openly in the forums so people are aware of it.
You don't want people to buy inferior products just because of fanboyism, do you?

 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: Turtle 1
Originally posted by: solofly
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.

Trolling at its best.

You joined just for this? You should go back where you belong like Rage3d etc.

As for the rest of you, vote with your wallets...

No flame intended. I'm sorry if that's how it came across. I have just read a lot of bad things about the nvidia IQ; if it's not true, then I am truly sorry for that statement.
You know, with the over-optimizations and glittering in gameplay. If it is in fact true, then it should be talked about openly in the forums so people are aware of it.
You don't want people to buy inferior products just because of fanboyism, do you?


There are many many threads in this forum concerning this issue and how to remedy it.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: ArchAngel777
Originally posted by: malG
Originally posted by: Cookie Monster

Isn't the price of the GTXs like $470ish now? A $480 7800GTX vs a $599 X1800 XT.

QFT

No, that isn't a QFT. One is comparing street price to MSRP. Not even fair! I am pretty confident that NewEgg will not be charging $599 for the X1800 XT.

I believe the 7800 GTX still has a MSRP of $599... So they are priced the same.

Yeah, you are correct; the XTs probably won't be $599. If they are indeed in super short supply, I bet they will go for even more.

 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
*Sigh* there are a few people (Turtle) that have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims that the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says that it is a 512bit controller; it is internally a 512bit ring bus, but that does not mean it is a 512bit architecture with 256bit "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) that shows the R520's performance, much less one that shows it beating the G70. All that is just scratching the surface of your useless posts.

Get a clue before you post.

-Kevin
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Turtle 1
Originally posted by: solofly
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.

Trolling at its best.

You joined just for this? You should go back where you belong like Rage3d etc.

As for the rest of you, vote with your wallets...

No flame intended. I'm sorry if that's how it came across. I have just read a lot of bad things about the nvidia IQ; if it's not true, then I am truly sorry for that statement.
You know, with the over-optimizations and glittering in gameplay. If it is in fact true, then it should be talked about openly in the forums so people are aware of it.
You don't want people to buy inferior products just because of fanboyism, do you?

Intelia? Who were you before you were banned?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Gamingphreek
*Sigh* there are a few people (Turtle) that have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims that the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says that it is a 512bit controller; it is internally a 512bit ring bus, but that does not mean it is a 512bit architecture with 256bit "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) that shows the R520's performance, much less one that shows it beating the G70. All that is just scratching the surface of your useless posts.

Get a clue before you post.

-Kevin

If the specs are correct, I don't think the card has "quads".

It's 16 pipes and 16 vertex shaders, not the typical 4:1 ratio.

Which would make the card a shading monstrosity, but fill-rate limited at the extreme resolutions the GTX can handle.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Gamingphreek
*Sigh* there are a few people (Turtle) that have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims that the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says that it is a 512bit controller; it is internally a 512bit ring bus, but that does not mean it is a 512bit architecture with 256bit "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) that shows the R520's performance, much less one that shows it beating the G70. All that is just scratching the surface of your useless posts.

Get a clue before you post.

-Kevin


Getting a little excited? Actually, his stuff sounds about as real as the 32 pipes, heat issues, low yields, and no availability until Christmas posted around here. Quite a few could get a clue before they post. For all of us, time will tell. Anyway, I did see one bench on 3DMark05, but of course that came from a site promoting the 32-pipe stuff, and it wasn't flattering for the R520.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Acanthus
Originally posted by: Gamingphreek
*Sigh* there are a few people (Turtle) that have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims that the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says that it is a 512bit controller; it is internally a 512bit ring bus, but that does not mean it is a 512bit architecture with 256bit "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) that shows the R520's performance, much less one that shows it beating the G70. All that is just scratching the surface of your useless posts.

Get a clue before you post.

-Kevin

If the specs are correct, I don't think the card has "quads".

It's 16 pipes and 16 vertex shaders, not the typical 4:1 ratio.

Which would make the card a shading monstrosity, but fill-rate limited at the extreme resolutions the GTX can handle.

16 VS? That's news to me...

Although you might be right about the shading monstrosity part: in modern games shader performance is more important than fillrate, and ATI knows it. So my theory is that they designed the R520 in such a way as to maximize shader performance. Given a long enough shader, all cards would be shader bound and not fillrate bound. So if the 24-pipe 7800 takes 100 cycles to run a shader, but the 16-pipe R520 takes only 60-70 cycles, then both will have similar shader performance, and fillrate becomes a moot point.
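That reasoning can be put into rough arithmetic. The cycle counts below are the hypothetical figures from the post above, and the clocks are the ones discussed in this thread, so treat this as illustration only:

```python
# Per-clock shader throughput = pipes / cycles the shader occupies each pipe.
def pixels_per_clock(pipes, shader_cycles):
    return pipes / shader_cycles

# Absolute throughput in Mpixels/s, assuming the shader is the sole bottleneck.
def shader_throughput(pipes, core_mhz, shader_cycles):
    return pipes * core_mhz / shader_cycles

# 7800 GTX: 24 pipes @ 430MHz, hypothetical 100-cycle shader
# R520:     16 pipes @ 600MHz, hypothetical 60-70-cycle shader
print(pixels_per_clock(24, 100))        # 0.24 pixels/clock
print(pixels_per_clock(16, 70))         # ~0.229 pixels/clock, similar per clock
print(shader_throughput(24, 430, 100))  # ~103 Mpix/s
print(shader_throughput(16, 600, 70))   # ~137 Mpix/s
```

With those hypothetical numbers the two chips shade about the same number of pixels per clock, and the R520's higher core clock would then tip the absolute throughput in its favor.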
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Getting a little excited? Actually, his stuff sounds about as real as the 32 pipes, heat issues, low yields, and no availability until Christmas posted around here. Quite a few could get a clue before they post. For all of us, time will tell. Anyway, I did see one bench on 3DMark05, but of course that came from a site promoting the 32-pipe stuff, and it wasn't flattering for the R520.

Ronnnn, f*** off. You know full well that when I said perhaps 2 months I was merely speculating based on the earlier delays, so stop bringing that up again. As for heat issues and low yields... sorry, but that is fact.

Seriously, stop trying to cause problems everywhere I post; there was absolutely nothing wrong with my post. Do I need to praise ATI in every sentence to get you to stop b*tching at me? Sorry, it isn't going to happen. Do I praise Nvidia in every sentence that I bash them in? NO. Stop with the fanboy junk; it is just plain annoying.

It's 16 pipes and 16 vertex shaders, not the typical 4:1 ratio.

16 pipes, yes. However, 16 VS... no way on this green earth. Do you know the amount of power that would use, the additional logic, and the enormous increase in transistor count? 8 at most; 16 would be almost 3 times what they had before.

Very true, munky. However, as for your example, I doubt (it is possible, however) that ATI managed to cut the cycle count by 30-40% relative to Nvidia's. Odds are the gain is smaller; that speculation just seems a little too far (again, it is very possible though).

-Kevin
 

Turtle 1

Banned
Sep 14, 2005
314
0
0
Well, gamingphreek, I will try to find as many links as possible.

Here's one on both ATI & nvidia I found interesting a long time back.

http://www.xbitlabs.com/news/video/display/20041104013009.html

Well, it seems you were correct; I misread.

http://theinquirer.net/?article=21982 Here's a doozy.

http://endian.net/details.asp?ItemNo=4041 THIS IS A MUST VIEW. What if it's true: GDDR4? Is that Rambus?

Just for the record, no one yet knows what the specs are. No one.
How close was everyone at this forum on the R420? I can tell you everyone was way off.

I have to admit GDDR4 2400 did make me laugh pretty hard.

It's looking like putting out a bunch of money before Longhorn arrives is really foolish, because none of the current cards support WGF 1.0 other than the R520 and later.

The R580 looks like an even worse deal, since it won't support DirectX 10; whether you prefer to call it DirectX Next or WGF 2.0, it won't be supported. The R520 and R580 will support Vista, but DirectX 10 is not backwards compatible with the earlier DirectX versions, other than through the patch Microsoft made for it, which is called DirectX 9.L, or if you prefer, WGF 1.0. But it seems there is a speed penalty for this patch. Oh my, this is not looking good unless you're a banker. So what we have is this: none of the present nvidia cards support WGF 1.0, so they won't run on Vista, while the R520 & R580 will, but at a speed penalty. Damn, the Xbox is looking really good right now. Maybe I shouldn't say Vista but WGF 2.0; Vista's graphics foundation is based on DirectX 10. Oh my lord, this looks really bad.
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Originally posted by: KristopherKubicki
So, R520, here we go. I've got it confirmed twice over now: definitely 16 pipes, definitely 600MHz core, and definitely 700MHz GDDR3. Kind of in line with what we've said before, but now there is no doubt about it from anyone.

The R520 LE (or vanilla to some) will only have 12 pipes... but all will have a 256bit mem bus.

Update:

You got a link to any of this? You're blowing smoke until you give us some proof, because no one can go on your "word."

http://www.anandtech.com/video/showdoc.aspx?i=2532

Good now?

Kristopher



I like the link to your own article. :p
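Those confirmed numbers also pin down the card's theoretical memory bandwidth. A back-of-envelope check, assuming the usual double data rate for GDDR3 (this is my arithmetic, not a figure from the article):

```python
# Peak memory bandwidth = bus width in bytes x effective transfer rate.
def bandwidth_gb_s(bus_bits, mem_mhz, transfers_per_clock=2):
    """GB/s (1 GB = 1e9 bytes); GDDR3 transfers data twice per clock."""
    return (bus_bits / 8) * mem_mhz * transfers_per_clock * 1e6 / 1e9

print(bandwidth_gb_s(256, 700))  # 44.8 GB/s for a 256-bit bus at 700MHz GDDR3
```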
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Gamingphreek
Getting a little excited? Actually, his stuff sounds about as real as the 32 pipes, heat issues, low yields, and no availability until Christmas posted around here. Quite a few could get a clue before they post. For all of us, time will tell. Anyway, I did see one bench on 3DMark05, but of course that came from a site promoting the 32-pipe stuff, and it wasn't flattering for the R520.

Ronnnn, f*** off. You know full well that when I said perhaps 2 months I was merely speculating based on the earlier delays, so stop bringing that up again. As for heat issues and low yields... sorry, but that is fact.

Seriously, stop trying to cause problems everywhere I post; there was absolutely nothing wrong with my post. Do I need to praise ATI in every sentence to get you to stop b*tching at me? Sorry, it isn't going to happen. Do I praise Nvidia in every sentence that I bash them in? NO. Stop with the fanboy junk; it is just plain annoying.

It's 16 pipes and 16 vertex shaders, not the typical 4:1 ratio.

16 pipes, yes. However, 16 VS... no way on this green earth. Do you know the amount of power that would use, the additional logic, and the enormous increase in transistor count? 8 at most; 16 would be almost 3 times what they had before.

Very true, munky. However, as for your example, I doubt (it is possible, however) that ATI managed to cut the cycle count by 30-40% relative to Nvidia's. Odds are the gain is smaller; that speculation just seems a little too far (again, it is very possible though).

-Kevin

the 90nm R520 will be a 16-pipe, 16-shader processor design with a number of different SKUs based on the GPU.

I agree, it would be an enormous increase in transistor count. Don't forget, this is on a die shrink, has 8 fewer pipes than the GTX, and required 4 tape-outs. It very well could be a huge chip, even for 90nm.