Originally posted by: malG
Originally posted by: Cookie Monster
Isn't the price of the GTXs like $470ish now? A $480 7800 GTX vs a $599 X1800 XT.
QFT
Originally posted by: BenSkywalker
Kristopher
Thanks for the information, one thing-
600MHz core clock, with a 256-bit memory bus connected to 512MB of GDDR3 memory clocked at 700MHz. The 600MHz core clock will give it a lower fill rate than the GeForce 7800 GTX (24-pipes at 430MHz),
The R520 will have a higher pixel and texel fill rate than the 7800 GTX. Fill rate is limited by the ROPs, not the ALUs.
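The point above is just arithmetic: pixel fill rate is bounded by ROPs × core clock, and the G70 has only 16 ROPs behind its 24 pixel pipes. A minimal sketch, using the pipe counts and clocks quoted in this thread:

```python
# Theoretical pixel fill rate: ROPs x core clock.
# Counts/clocks are the ones quoted in the thread; the 7800 GTX's
# 16-ROP limit (despite 24 pixel pipes) is exactly the point being made.

def fill_rate_mpixels(rops: int, clock_mhz: int) -> int:
    """Peak pixel fill rate in megapixels/s: pixels written per clock x clock."""
    return rops * clock_mhz

# GeForce 7800 GTX: 24 pixel pipes, but only 16 ROPs, 430 MHz core.
gtx = fill_rate_mpixels(rops=16, clock_mhz=430)   # 6880 Mpixels/s

# R520 (X1800 XT): 16 pipes / 16 ROPs at a 600 MHz core.
r520 = fill_rate_mpixels(rops=16, clock_mhz=600)  # 9600 Mpixels/s

print(gtx, r520)
```

So on ROP-bound fill the R520's higher clock wins even with fewer pipes.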
Originally posted by: Novercalis
Why do you want to go Vista?
High requirements... and too much DRM control for me.
ANY OS that supports future HD content will have those DRM controls; it is part of the material itself. So Mac OS will have it too.
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.
Originally posted by: Novercalis
Nonetheless, I'm going crazy paper-building my new rig right now and will be buying the parts soon.
Don't think I can force myself to wait for R520 benchies and a price reduction...
ARRRGGHH
Originally posted by: solofly
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.
Trolling at its best.
You joined just for this? You should go back where you belong, like Rage3D etc.
As for the rest of you, vote with your wallets...
Originally posted by: Turtle 1
Originally posted by: solofly
Originally posted by: Turtle 1
Those that bought the G70s, well, what can you say about them? They already have to deal with really bad IQ.
Trolling at its best.
You joined just for this? You should go back where you belong, like Rage3D etc.
As for the rest of you, vote with your wallets...
No flame intended; I'm sorry if that's how it came across. I have just read a lot of bad things about the Nvidia IQ; if it's not true, then I am truly sorry for that statement.
You know, with the over-optimizations and shimmering in gameplay. If it is in fact true, then it should be talked about openly in the forums so people are aware of it.
You don't want people to buy inferior products just because of fanboyism, do you?
Originally posted by: ArchAngel777
Originally posted by: malG
Originally posted by: Cookie Monster
Isn't the price of the GTXs like $470ish now? A $480 7800 GTX vs a $599 X1800 XT.
QFT
No, that isn't a QFT. One is comparing street price to MSRP, which isn't even fair! I am pretty confident that Newegg will not be charging $599 for the X1800 XT.
I believe the 7800 GTX still has an MSRP of $599... so they are priced the same.
4x AA for free
Originally posted by: Gamingphreek
*Sigh* There are a few people (Turtle) who have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says it is a 512-bit controller; it is internally a 512-bit ring bus, which does not mean it is a 512-bit architecture with 256 bits "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) of the R520's performance, much less one showing it beating the G70. All that is just scratching the surface of your useless posts.
Get a clue before you post.
-Kevin
Originally posted by: Acanthus
Originally posted by: Gamingphreek
*Sigh* There are a few people (Turtle) who have ABSOLUTELY no clue what they are talking about. They are just spewing out biased information. Link me to one place that says 4 quads are disabled. Link me to one place that says free 4x AA. Link me to one place that says SM4 support. Link me to one place that claims the R520 has WGF 1.0 (there is no DX10, douchebag). Link me to one place that says it is a 512-bit controller; it is internally a 512-bit ring bus, which does not mean it is a 512-bit architecture with 256 bits "disabled". Show me ONE SINGLE benchmark (hint: there are no benchmarks out yet) of the R520's performance, much less one showing it beating the G70. All that is just scratching the surface of your useless posts.
Get a clue before you post.
-Kevin
If the specs are correct, I don't think the card has "quads".
It's 16 pipes and 16 vertex shaders, not the typical 4:1 ratio.
Which would make the card a shading monstrosity, but fill-rate limited at the extreme resolutions the GTX can handle.
Originally posted by: ronnn
Getting a little excited? Actually, his stuff sounds about as real as the 32 pipes, heat issues, low yields, and no availability until Christmas posted around here. Quite a few could get a clue before they post. For all of us, time will tell. Anyway, I did see one bench on 3DMark05, but of course that came from a site promoting the 32-pipe stuff, and it wasn't flattering for the R520.
Originally posted by: KristopherKubicki
So, R520, here we go. I've got it confirmed twice over now: definitely 16 pipes, definitely a 600MHz core, and definitely 700MHz GDDR3. Kind of in line with what we've said before, but now there is no doubt about it from anyone.
The R520 LE (or vanilla to some) will only have 12 pipes, but all will have a 256-bit memory bus.
Update:
You got a link to any of this? You're blowing smoke until you give us some proof, because no one can go on your "word."
http://www.anandtech.com/video/showdoc.aspx?i=2532
Good now?
Kristopher
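The confirmed numbers also pin down peak memory bandwidth. A rough sketch, assuming standard double-data-rate signalling for the 700MHz GDDR3 (i.e. two transfers per clock, which is how GDDR3 clocks are usually quoted):

```python
# Peak memory bandwidth: bytes per transfer x effective transfer rate.
# Assumes DDR signalling (2 transfers/clock) for the quoted 700 MHz GDDR3.

def bandwidth_gbs(bus_bits: int, mem_clock_mhz: int, transfers_per_clock: int = 2) -> float:
    """Peak bandwidth in GB/s for a given bus width and memory clock."""
    bytes_per_transfer = bus_bits / 8                     # 256-bit bus -> 32 bytes
    effective_mts = mem_clock_mhz * transfers_per_clock   # mega-transfers per second
    return bytes_per_transfer * effective_mts / 1000      # MB/s -> GB/s

print(bandwidth_gbs(256, 700))  # 44.8 GB/s for a 256-bit bus at 700 MHz GDDR3
```

Same ballpark as the 7800 GTX's 256-bit/600MHz setup (38.4 GB/s), so the bus width alone isn't the differentiator here.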
Originally posted by: Gamingphreek
Getting a little excited? Actually, his stuff sounds about as real as the 32 pipes, heat issues, low yields, and no availability until Christmas posted around here. Quite a few could get a clue before they post. For all of us, time will tell. Anyway, I did see one bench on 3DMark05, but of course that came from a site promoting the 32-pipe stuff, and it wasn't flattering for the R520.
Ronnnn, f*** off. You know full well that when I said "perhaps 2 months" I was merely speculating based on the earlier delays, so stop bringing that crap up again. As for heat issues and low yields... sorry, but that is fact.
Seriously, stop trying to cause problems everywhere I post; there was absolutely nothing wrong with my post. Do I need to praise ATI in every sentence to get you to stop b*tching at me? Sorry, it isn't going to happen. Do I praise Nvidia in every sentence where I bash them? NO. Stop with the fanboy junk; it is just plain annoying.
Its 16 pipes, and 16 vertex shaders, not the typical 4:1 ratio.
16 pipes, yes. However, 16 VS... no way on this green earth. Do you know the amount of power that would use, the additional logic, and the enormous increase in transistor count? 8 at most. 16 would be almost 3 times what they had before.
Very true, munky. However, as for your example, I doubt (it is possible, however) that ATI managed to cut the number of cycles by 30-40% relative to Nvidia's. Odds are it is lower; that speculation just seems a little too far (again, it is very possible though).
-Kevin
