The Inquirer reports clocks for R520XT

DRavisher

Senior member
Aug 3, 2005
202
0
0
http://www.theinquirer.net/?article=25898

Let the flamewars begin :p

If the limited availability rumour is true, then I guess ATi fanbois will have to wait a little while even after the late release to get their hands on the new shiny R520. But the first week of October isn't that far away. Guess ATi is finally getting the R520 done.

Edit: And I am sure that many will say "oh no, it's the Inquirer!! w00t!", but I seem to be missing all the times the Inquirer reported something as definite and it wasn't true. They seem to be right just about every time...
 
Jun 14, 2003
10,442
0
0
600MHz isn't too shabby lol, though there was lots of talk of a 700MHz part. 500MHz is a little low if it's a top-end part with 16 pipes (even if they are extreme), but at least ATi is trying to make sure they'll have enough stock to start selling right away

so either this isn't the big daddy card, or it has more than 16 normal pipes (i.e. in comparison to nVidia's)

and w00t, it's the Inquirer... yes, sometimes they are right... but I'll still take the info they provide with a pinch of salt
 

DRavisher

Senior member
Aug 3, 2005
202
0
0
What I'm saying is that the Inq is right most of the time when they report things such as these (specs a short time before release). When they are wrong, it is usually pretty obvious that what they are presenting is a rumour. But that isn't what we're discussing here :)

600MHz certainly isn't shabby, and since they have most likely made their pipes a tad more effective per MHz, it'll probably be a nice performer. But if they release a new Phantom Edition, or even a PPE (Phantom Propeller Edition), then that would kind of suck. That remains to be seen.
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
GeForce 7800 GTX:

430MHz x 24 pixel shader units = 10,320 megapixels/sec
600MHz GDDR3, 256MB/256-bit

R520XT:

600MHz x 16 pixel shader units = 9,600 megapixels/sec
700MHz GDDR3, 256MB/256-bit


!!! Is that enough to beat the GTX?
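
For reference, here's a quick sketch in Python of how those theoretical numbers fall out (assuming the quoted memory clocks are physical clocks and that GDDR3 transfers twice per clock; the specs themselves are still rumours):

# Peak numbers from the rumoured specs -- theoretical maximums,
# not a prediction of real game performance.

def shader_rate_mpix(core_mhz, pixel_pipes):
    # Peak pixel shader throughput in megapixels/sec (clock x pipes).
    return core_mhz * pixel_pipes

def mem_bandwidth_gbs(mem_mhz, bus_bits):
    # Peak bandwidth in GB/s; GDDR3 is double data rate.
    return mem_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

cards = {
    "GeForce 7800 GTX": (430, 24, 600, 256),
    "R520XT (rumoured)": (600, 16, 700, 256),
}

for name, (core, pipes, mem, bus) in cards.items():
    print(f"{name}: {shader_rate_mpix(core, pipes)} Mpix/s, "
          f"{mem_bandwidth_gbs(mem, bus):.1f} GB/s")

On paper that gives the GTX more shader throughput (10,320 vs 9,600 Mpix/s) but the rumoured R520XT more memory bandwidth (44.8 vs 38.4 GB/s).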
 

DRavisher

Senior member
Aug 3, 2005
202
0
0
I doubt you'll see a 7800GTX actually running that many pixels through the shading pipeline anyway, so that is just a theoretical maximum. If the R520 were to actually run 9.6 billion simple shader ops every second, then it would be tha bomb :-O, but that isn't likely. As people say, the architectures are too different for such a comparison to make sense.

Edit: And it isn't likely that we will have to wait long for a higher-clocked G70 part, since it seems the G70 still has some headroom for higher clocks. So the R520 may be up against a 500MHz+ G70.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
I think the XLs will be made available the day of the launch, since they should just be lower-clocked R520 XTs, so yields won't be as bad. The XLs are slated for a mid-September launch date.
 
Jun 14, 2003
10,442
0
0
Originally posted by: MegaWorks
GeForce 7800 GTX:

430MHz x 24 pixel shader units = 10,320 megapixels/sec
600MHz GDDR3, 256MB/256-bit

R520XT:

600MHz x 16 pixel shader units = 9,600 megapixels/sec
700MHz GDDR3, 256MB/256-bit


!!! Is that enough to beat the GTX?


theoretical calcs count for nothing

the 7800GTX can still only output 16 pixels per clock anyway (it has 16 ROPs)... but nVidia knows that as games get more complex, pixels are gonna spend more time in the pipeline, so the ratio of pipes to ROPs can increase

these aren't just 16 pipelines, remember... they're extreme! lol, it'll be more powerful than you've calculated. plus... ATi's AA and AF have been stellar since the R300, so I'm banking on that being very good, probably better than the 7800GTX's

I hope to see an even match... like the last round: ATi wins some, nVidia wins some. If the prices are right, it probably won't matter which company you choose... you'll be getting a top-class product either way
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Guaranteed the R520 will be the top AGP choice - at least for a while ;)

I am sure many gamers are in my position and won't be artificially "forced" into PCIe prematurely...
 

DRavisher

Senior member
Aug 3, 2005
202
0
0
Yes, ATi likes to support old technology :p ATi: the Power of Tradition; SM2.0 sooper-dooper-ATi-edition, AGP, LDR (low dynamic range) <-- that was a joke, just so you know. I perfectly understand that those with nice shiny A64 systems with AGP don't want to upgrade to PCI-E yet, so no need to flame me here...
 

g3pro

Senior member
Jan 15, 2004
404
0
0
And then nVidia releases their AGP part which beats ATi's AGP part. And all will be laid to rest.
 

Malladine

Diamond Member
Mar 31, 2003
4,618
0
71
Originally posted by: g3pro
And then nVidia releases their AGP part which beats ATi's AGP part. And all will be laid to rest.
Shouldn't you be busy testing your G80 and R600 parts?
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Hmmm.... maybe a 16x2 architecture? That would be pretty damn cool for some massive multitexturing :p

All speculation, just give us a launch and details already! :|

Nat
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Well, the #x2 architecture hasn't been used since, umm... the FX cards, or maybe even earlier. I'm sure there are good reasons why all the modern cards are using the x1 architecture. Anyway, modern cards and games seem to stress the shader units, not fillrate or texturing.

But look at what NV has done with their pipes on the GF7 - they tweaked the ALUs, so even with 24 pipes it performs better than a 24-pipe GF6 card would. Similarly, ATi can design their pipes with even more IPC, so if the rumors of the R520 getting ~9000 in 3DMark05 are true, then I don't mind it having 16 x-treme pipes as opposed to 24 or 32 regular ones.
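
To put rough numbers on that reasoning (the ops-per-pipe-per-clock values below are invented purely for illustration; they aren't real specs for either chip):

# Effective shader throughput = pipes x clock x ops per pipe per clock.
# The IPC values here are made up to illustrate the point.

def effective_gops(pipes, clock_mhz, ops_per_pipe_per_clock):
    return pipes * clock_mhz * 1e6 * ops_per_pipe_per_clock / 1e9

print(effective_gops(24, 430, 1.0))  # 24 "regular" pipes: ~10.3 GOps/s
print(effective_gops(16, 600, 1.2))  # 16 higher-IPC pipes: ~11.5 GOps/s

So 16 pipes with a higher clock and a ~20% IPC edge could, in principle, outrun 24 plain ones.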
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: g3pro
And then nVidia releases their AGP part which beats ATi's AGP part. And all will be laid to rest.

Because we all know how, for the last two generations, nVidia has been the best (sarcasm). The FX was better than the 9700/9800, right? The GeForce 6 series annihilated the X8xx series with all that SM3 wholesome goodness and PureVideo that still isn't really all that useful. Don't get me wrong, I like the 6 series (I own a 6800 GT), but I just don't agree with your statement, unless of course you were being sarcastic too.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Lol at #x2 pipelines... multitexturing was why the FX series sucked so badly... they tried to lead the industry in a direction it wasn't going, and paid for it in spades. I still worry that the R520 is another NV30... any time the word "extreme" is thrown around, consumers get shafted. Unless they seriously improved the architecture on top of what they did with clockspeed, I think it's not going to beat the G70, but likely tie it. And once again, please stop using max fillrate as if that matters.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Originally posted by: munky
Well, the #x2 architecture hasn't been used since, umm... the FX cards, or maybe even earlier. I'm sure there are good reasons why all the modern cards are using the x1 architecture. Anyway, modern cards and games seem to stress the shader units, not fillrate or texturing.

But look at what NV has done with their pipes on the GF7 - they tweaked the ALUs, so even with 24 pipes it performs better than a 24-pipe GF6 card would. Similarly, ATi can design their pipes with even more IPC, so if the rumors of the R520 getting ~9000 in 3DMark05 are true, then I don't mind it having 16 x-treme pipes as opposed to 24 or 32 regular ones.



True, but the FX used that method, and I think that is why UT2k3 felt speedy on either that or a 9800 Pro, while stepping down to a 5700, which was just 4x1, was quite a step down. My 5900U really rocked HL2, though in DX8.1 mode because of the weak shaders... it seems to me that a card with good shaders and the extra texture unit could be quite a strong contender. Just my opinion though; single-texture fillrate is obviously very important as well :)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
if the rumors of the r520 getting ~9000 in 3dmark05 are true, then I dont mind it having 16 x-treme pipes as opposed to 24 or 32 regular ones

The thing is... the default 3DMark05 score @ 1024x768 isn't all that relevant anymore, IMO. These cards are designed for higher resolutions. The 7800GTX doesn't show that much of a jump over the 6-series until you move to resolutions higher than 1600x1200. Case in point: I don't even hit 10k in 3DMark05 @ 1024x768 with SLI (most likely due to my CPU), but I can play the FEAR demo at 1680x1050 w/ 4x TR SSAA, 8x AF, full details in all its beautiful splendor without a stutter... Sadly, Futuremark is behind the curve on this, since they have defined 1024x768 as the standard resolution (in the free version). Of course, those with the Pro version can make changes, but then 3DMark05 is no longer a standard. Hopefully they will make their benchmark more relevant with the next version.

Edit: Don't get me wrong, any card that hits 9k in 3DMark05 is badass, but I just don't put as much stock in 3DMark05 with this generation of cards, since the standard test resolution is, quite frankly, low. The only thing I run at 1024x768 is my 12" laptop.
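
To put some numbers behind the resolution point (just pixel counts per frame, nothing card-specific):

# Pixels per frame at the resolutions mentioned above -- the higher ones
# push well over twice as many pixels through the card every frame.

base = 1024 * 768
for w, h in [(1024, 768), (1280, 1024), (1600, 1200), (1680, 1050)]:
    px = w * h
    print(f"{w}x{h}: {px / 1e6:.2f} Mpix/frame ({px / base:.2f}x the 3DMark05 default)")

At 1600x1200 a card shades almost 2.5x as many pixels per frame as at 1024x768, which is exactly where the newest cards start to pull away from the old ones.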
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Which is why I can't wait to get back to my CRT... this 1280x1024 LCD I have been using for the summer, while its low response time and good color make it decent for gaming, is starting to frustrate me :)
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: munky
Well, the #x2 architecture hasn't been used since, umm... the FX cards, or maybe even earlier. I'm sure there are good reasons why all the modern cards are using the x1 architecture. Anyway, modern cards and games seem to stress the shader units, not fillrate or texturing.

But look at what NV has done with their pipes on the GF7 - they tweaked the ALUs, so even with 24 pipes it performs better than a 24-pipe GF6 card would. Similarly, ATi can design their pipes with even more IPC, so if the rumors of the R520 getting ~9000 in 3DMark05 are true, then I don't mind it having 16 x-treme pipes as opposed to 24 or 32 regular ones.

Game devs are using shaders to do the same effects that used to be done with multitexturing.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: MegaWorks
GeForce 7800 GTX:

430MHz x 24 pixel shader units = 10,320 megapixels/sec
600MHz GDDR3, 256MB/256-bit

R520XT:

600MHz x 16 pixel shader units = 9,600 megapixels/sec
700MHz GDDR3, 256MB/256-bit


!!! Is that enough to beat the GTX?

Besides that, there are several versions of the 7800GTX clocked significantly higher than the stock GTX spec from the factory (e.g. EVGA, XFX, BFG).

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: johnnqq
500MHz-600MHz? It can't have 16 pipes...

Why not?

It always amazes me how some people seem to think that just because ATI is releasing a part, it will automatically be faster than nVidia's counterpart.

ATI never bested nVidia prior to the R300, and they bought that design/company. If they release a part now that is slightly slower than nVidia's and has some interesting features, we'll basically be back to where things were every year prior to the 9700 Pro.
 

johnnqq

Golden Member
May 30, 2005
1,659
0
0
Generally speaking, unless there's something I'm missing, 500MHz-600MHz was about the speed of an X850 XT PE... how will it be able to keep up with the new nVidia cards? I'm not a fanboy saying ATi is the best, but I just don't think ATi will perform that much WORSE than the 7xxx cards.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Rollo
Originally posted by: MegaWorks
GeForce 7800 GTX:

430MHz x 24 pixel shader units = 10,320 megapixels/sec
600MHz GDDR3, 256MB/256-bit

R520XT:

600MHz x 16 pixel shader units = 9,600 megapixels/sec
700MHz GDDR3, 256MB/256-bit


!!! Is that enough to beat the GTX?

Besides that, there are several versions of the 7800GTX clocked significantly higher than the stock GTX spec from the factory (e.g. EVGA, XFX, BFG).

And? The 5800 Ultra had a higher theoretical peak fillrate than the R300 or R350, yet it managed to get slaughtered in almost every benchmark. Theoretical peak fillrate doesn't mean much for modern GPUs running modern games.
 

g3pro

Senior member
Jan 15, 2004
404
0
0
Originally posted by: Malladine
Originally posted by: g3pro
And then nVidia releases their AGP part which beats ATi's AGP part. And all will be laid to rest.
Shouldn't you be busy testing your G80 and R600 parts?

Yeah, my testing has been going pretty smoothly, but I'm going to hold off publishing the results until I get the data from Creig regarding the G90 and R700 parts. I hear he'll be finished with that shortly. :D