X1900XT Benchmarks @ Hardware Zone **Complete benchies**

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Reading it now, I am surprised that the R580 has a 30% increase in transistor count compared to the 7800GTX. I am curious what the G71's transistor count will be.

I agree with Todd33, the game selection sucked: three synthetics and only two actual games.
AFAIK Splinter Cell has never been good to ATI, and Q4 is of course Nvidia-friendly.

I'll await more reviews to pass judgement; if, however, these benchmarks hold typical in other games, ATI failed to deliver. 10% higher than the GTX 512 most likely won't put it in a very good position against the G71.

And if my hunch is correct and the G71 sees a minimal transistor increase, due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower production cost.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Wow. Horrible game selection, but better than nothing, I suppose. In the ONLY DX9 game, check out the lead the x1900xt has over the x1800xt: 59 to 46 (a ~28% lead).
 

CKXP

Senior member
Nov 20, 2005
926
0
0
Not enough benches. The x1800xt looks more and more impressive to me (the x1900xt isn't much faster, and definitely not a GTX512 killer), and it looks like they didn't have much OC headroom with their card:
Given our experience with the Radeon X1800 XT graphics cards in the overclocking department, and since the new R580 core is based on the same manufacturing process, we numbed down our hopes for overclocking to avoid undue disappointment. And true enough, the maximum stable overclock we could attain was a mild one at 675/1480MHz, which is not a whole lot more than the default 625/1450MHz. With a few button presses of the calculator, you can tell that the 3% performance gain is certainly not worth risking the warranty of one of the most expensive graphics cards in the market. Such tiny gains are not at all tangible and you could better spend your time and effort on fragging your friends for some satisfaction.
I would rather buy an x1800xt for $469.

edit: why use an A64 3500 to benchmark this card? :confused:
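For scale, here is a quick back-of-the-envelope check of the clock deltas in that quote (a sketch only; the ~3% figure is the review's measured benchmark gain, which sits right in the range these numbers suggest):

```python
# Clock deltas for the quoted X1900 XT overclock:
# stock 625/1450 MHz vs. the 675/1480 MHz maximum the review reached.
stock_core, stock_mem = 625, 1450
oc_core, oc_mem = 675, 1480

core_gain = (oc_core / stock_core - 1) * 100  # +8.0% core
mem_gain = (oc_mem / stock_mem - 1) * 100     # +2.1% memory

print(f"core: +{core_gain:.1f}%, memory: +{mem_gain:.1f}%")
# Game performance typically lands between the core and memory
# deltas, so the review's measured ~3% gain is unsurprising.
```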
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Yeah, pretty unimpressive showing for the hype... But all benchmarks done with Nvidia-favored games?? Using a hacked 5.13 driver?? Terrible overclocking results... I wonder... One guy on another forum had this card - it was running 9400 or so in 3DMark05 until he discovered his card was defaulting to a 500MHz core speed. He used ATITool to raise it up to 600 core and 700 memory and scored over 11,000 in 3DMark05. Results on these cards have been all over the place. I want to see the 6.2 CAT driver released!!
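As a rough sanity check on that anecdote (the numbers come from the forum story, not measured data), the score scales almost linearly with the core clock, which is why a card silently defaulting to 500MHz would skew results so badly:

```python
# Scaling check for the anecdote: card stuck at a 500 MHz core
# scored ~9400 in 3DMark05, then ~11,000 after raising the clocks.
clock_gain = (600 / 500 - 1) * 100     # +20% core clock
score_gain = (11000 / 9400 - 1) * 100  # ~+17% 3DMark05 score

print(f"clock: +{clock_gain:.0f}%, score: +{score_gain:.0f}%")
# Near-linear scaling: a card quietly running 100 MHz below spec
# easily explains the "all over the place" early results.
```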
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Horrible review, but it does give a glimpse.

It seems that CPUs are once again becoming the limiting factor, but of course, that could be the games they tested too...

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Not trying to be crude, but that review sucked more A$$ than a liposuction clinic. /crude.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: Killrose
Catalyst 5.13 drivers still being used :confused:


I see. Seems as though the drivers are hard to come by for this specific card at the moment. Hopefully they will hit tomorrow so we can see the real difference.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
So, if they launch the card as promised, have them on the shelves and everything, but do not have a driver available to the public to run the card the way it's supposed to run, is it considered a paperweight or a paper launch? :D

I mean, the 5.13s don't run it right, and most people are saying the newly released 6.1s don't support it either. Is the latter true?
 

videopho

Diamond Member
Apr 8, 2005
4,185
29
91
Originally posted by: Frostwake
What a crappy selection of games to bench with.

Not only that, but they used an Athlon 64 3500+ in their test... next one please :p

What's wrong with using an AMD 64 3500+ in their test here? Is it not a mainstream CPU? Am I missing something?

Another power-hungry beast and "oven-hot" special by ATI. I'm glad I switched :D
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: videopho
Originally posted by: Frostwake
What a crappy selection of games to bench with.

Not only that, but they used an Athlon 64 3500+ in their test... next one please :p

What's wrong with using an AMD 64 3500+ in their test here? Is it not a mainstream CPU? Am I missing something?

Everyone wants to see it benched on an FX-60. They have no concern for people with lesser systems. ;)

 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: videopho
Originally posted by: Frostwake
What a crappy selection of games to bench with.

Not only that, but they used an Athlon 64 3500+ in their test... next one please :p

What's wrong with using an AMD 64 3500+ in their test here? Is it not a mainstream CPU? Am I missing something?



I agree, using a more mainstream CPU is better because most don't own a damned FX-60! It will reach a wider audience. Although, those drivers would have been nice.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: videopho
Originally posted by: Frostwake
What a crappy selection of games to bench with.

Not only that, but they used an Athlon 64 3500+ in their test... next one please :p

What's wrong with using an AMD 64 3500+ in their test here? Is it not a mainstream CPU? Am I missing something?


There's nothing wrong with that processor. HardOCP got very similar results for Quake 4 when testing the X1800 XT, and they were using an FX-55. CPUs don't make that much of a difference.
 

Kalessian

Senior member
Aug 18, 2004
825
12
81
What a load of hooey.

You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Genx87

And if my hunch is correct and the G71 sees a minimal transistor increase, due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower production cost.

LINK ME
 

RichUK

Lifer
Feb 14, 2005
10,341
678
126
Originally posted by: Kalessian
What a load of hooey.

You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?

I think you've hit the nail on the head, since the proper release isn't supposed to be until tomorrow (or at least that's what I heard).

In reality those extra shader units are not getting utilised properly (I think), because of the lack of correct driver support.

I'll take this as a taster, and await reviews with the proper driver being used. Everyone knows that drivers can make a huge difference with GFX cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Genx87
Reading it now, I am surprised that the R580 has a 30% increase in transistor count compared to the 7800GTX. I am curious what the G71's transistor count will be.

I agree with Todd33, the game selection sucked: three synthetics and only two actual games.
AFAIK Splinter Cell has never been good to ATI, and Q4 is of course Nvidia-friendly.

I'll await more reviews to pass judgement; if, however, these benchmarks hold typical in other games, ATI failed to deliver. 10% higher than the GTX 512 most likely won't put it in a very good position against the G71.

And if my hunch is correct and the G71 sees a minimal transistor increase, due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower production cost.

If I remember correctly, this was discussed to death and a lot of people came up with many good reasons why the 7800GTX could not have had 8 quads on its core. There is always the slight possibility it could be true, but I seriously doubt that is the case.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: RichUK
Originally posted by: Kalessian
What a load of hooey.

You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?

I think you've hit the nail on the head, since the proper release isn't supposed to be until tomorrow (or at least that's what I heard).

In reality those extra shader units are not getting utilised properly (I think), because of the lack of correct driver support.

I'll take this as a taster, and await reviews with the proper driver being used. Everyone knows that drivers can make a huge difference with GFX cards.

Sounds like that is what ATI is doing. This card damn well better beat the stuffing out of anything that exists today.
 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: keysplayr2003
Originally posted by: RichUK
Originally posted by: Kalessian
What a load of hooey.

You don't suppose that ATi is withholding driver support until the last moment in order to prevent these kinds of previews?

I think you've hit the nail on the head, since the proper release isn't supposed to be until tomorrow (or at least that's what I heard).

In reality those extra shader units are not getting utilised properly (I think), because of the lack of correct driver support.

I'll take this as a taster, and await reviews with the proper driver being used. Everyone knows that drivers can make a huge difference with GFX cards.

Sounds like that is what ATI is doing. This card damn well better beat the stuffing out of anything that exists today.


And if it doesn't show more than a marginal gain, then what? Is it a failure for ATI? I mean, did this thing get overhyped or what? We'll see in about 20 hours or so.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: JAG87
Originally posted by: Genx87

And if my hunch is correct and the G71 sees a minimal transistor increase, due to the current G70 already having 32 pixel pipes with 2 quads disabled, Nvidia has a small die for lower production cost.

LINK ME

It has been my guess since the original 7800 showed up. I have no proof outside of my hunch.