http://www.hardwarezone.com/articles/view.php?cid=3&id=1808
Sourced from General Hardware; thought it should be in here also.
What a crappy selection of games to bench with.
i would rather buy a x1800xt for $469

From the article:

Given our experience with the Radeon X1800 XT graphics cards in the overclocking department, and since the new R580 core is based on the same manufacturing process, we tempered our hopes for overclocking to avoid undue disappointment. And true enough, the maximum stable overclock we could attain was a mild one at 675/1480MHz, which is not a whole lot more than the default 625/1450MHz. With a few button presses of the calculator, you can tell that the 3% performance gain is certainly not worth risking the warranty of one of the most expensive graphics cards on the market. Such tiny gains are not at all tangible, and you could better spend your time and effort on fragging your friends for some satisfaction.
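A quick back-of-the-envelope check of that headroom (the clock figures are from the excerpt above; the rest is just arithmetic):

```python
# Sanity-check the overclocking headroom quoted in the review excerpt.
stock_core, stock_mem = 625, 1450  # MHz, stock clocks per the excerpt
oc_core, oc_mem = 675, 1480        # MHz, maximum stable overclock reached

core_gain = (oc_core / stock_core - 1) * 100  # ~8.0%
mem_gain = (oc_mem / stock_mem - 1) * 100     # ~2.1%

print(f"Core: +{core_gain:.1f}%  Memory: +{mem_gain:.1f}%")
# Actual performance scales well below the raw core bump (memory is the
# tighter limit here), which is consistent with the ~3% gain they measured.
```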
Originally posted by: Killrose
Catalyst 5.13 drivers still being used!
Originally posted by: Frostwake
What a crappy selection of games to bench with.
Not only that, but they use an Athlon 64 3500+ in their test... next one please!
Originally posted by: videopho
Originally posted by: Frostwake
What a crappy selection of games to bench with.
Not only that, but they use an Athlon 64 3500+ in their test... next one please!
What's wrong with using an AMD Athlon 64 3500+ in their test here? Not a mainstream CPU? Am I missing something?
Originally posted by: Kalessian
What a load of hooey.
You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?
Originally posted by: Genx87
Reading it now, I am surprised that the R580 has a 30% increase in transistor count compared to the 7800GTX (quick math below). I am curious what the G71's transistor count will be.
I agree with Todd33, the game selection sucked: three synthetics and only two games.
AFAIK Splinter Cell has never been good to ATI, and Q4 is of course Nvidia friendly.
I'll await more reviews to pass judgment; if these benchmarks hold typical in other games, however, ATI failed to deliver. 10% higher than the 512 most likely won't put it in a very good position against the G71.
And if my hunch is correct and the G71 sees a minimal transistor increase due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower cost of production.
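Rough math on that 30% figure, using the commonly cited counts of ~384M transistors for R580 and ~302M for G70 (those exact numbers are my addition, not from the preview):

```python
# Rough check of the quoted transistor-count increase.
# The counts below are commonly cited approximations, not from the preview.
r580 = 384e6  # ATI R580 (Radeon X1900 series), ~384 million transistors
g70 = 302e6   # NVIDIA G70 (GeForce 7800 GTX), ~302 million transistors

increase = (r580 / g70 - 1) * 100
print(f"R580 vs G70: +{increase:.0f}% transistors")  # ~27%, i.e. roughly 30%
```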
Originally posted by: RichUK
Originally posted by: Kalessian
What a load of hooey.
You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?
I think you've hit the nail on the head, since the proper release isn't supposed to be until tomorrow (or at least that's what I heard).
In reality those extra shader units are not getting utilised properly (I think), because of the need for correct driver support.
I'll take this as a taster, and await reviews with the proper driver being used. Everyone knows that drivers can make a huge difference with GFX cards.
Originally posted by: keysplayr2003
Originally posted by: RichUK
Originally posted by: Kalessian
What a load of hooey.
You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?
I think you've hit the nail on the head, since the proper release isn't supposed to be until tomorrow (or at least that's what I heard).
In reality those extra shader units are not getting utilised properly (I think), because of the need for correct driver support.
I'll take this as a taster, and await reviews with the proper driver being used. Everyone knows that drivers can make a huge difference with GFX cards.
Sounds like that is what ATI is doing. This card damn well better beat the stuffing out of anything that exists today.
Originally posted by: JAG87
Originally posted by: Genx87
And if my hunch is correct and the G71 sees a minimal transistor increase due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower cost of production.
LINK ME