HARDOCP 2900xt Review! (a proper review)

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
HD2900XT consistently loses across the board to both the 8800GTS 640MB and the 8800GTX.

From their conclusion:

The ATI Radeon HD 2900 XT however is more akin to NVIDIA's GeForce FX 5800. It does not seem like this will have a very long life span in comparison. NVIDIA quickly answered the GeForce FX 5800 by introducing the GeForce FX 5900 (NV35). ATI really needs to do something similar in this situation, or they may lose some loyal fans in the enthusiast community and you can bet they are going to continue to lose sales to NVIDIA's 8000 series products.

EDIT: Another tidbit from their conclusion:

Despite what the numbers in 3DMark are showing our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actually gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
All I can say is, I'm glad I didn't wait for the 2900XL. If the 2900XT is just "about" as fast "but not quite" as the GTS, then how would the XL be?

I'm grinning from ear to ear now knowing I got the GTS 640 2 months ago.

And I'm no fanboy, 'cause I've "always" owned ATI up till my current card, but if this is the kind of crap they're gonna try to sell me... well, you can figure out the rest based on my current card.

Things could quite possibly change in the future with driver revisions, but I'm not holding my breath for the XT. Maybe the refresh.
 

sisq0kidd

Lifer
Apr 27, 2004
17,043
1
81
:( I guess some people owe DT a huge apology.

I don't get it. The specs, they're great. Wah hoppin?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I guess we'll soon see "hardocp rigged the review", "that dirty no good nvidia loving Kyle cheated" style claims flying thick and fast.
 

CrystalBay

Platinum Member
Apr 2, 2002
2,175
1
0
Originally posted by: Matt2
HD2900XT consistently loses across the board to both the 8800GTS 640MB and the 8800GTX.

From their conclusion:

The ATI Radeon HD 2900 XT however is more akin to NVIDIA's GeForce FX 5800. It does not seem like this will have a very long life span in comparison. NVIDIA quickly answered the GeForce FX 5800 by introducing the GeForce FX 5900 (NV35). ATI really needs to do something similar in this situation, or they may lose some loyal fans in the enthusiast community and you can bet they are going to continue to lose sales to NVIDIA's 8000 series products.

EDIT: Another tidbit from their conclusion:

Despite what the numbers in 3DMark are showing our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actually gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!

 

MadBoris

Member
Jul 20, 2006
129
0
0
There is a saving grace.

The Radeon HD 2900 XT overclocks like a friggin' mad man! It also drinks down power like a mad man and produces heat like a mad man....
Many of you that want to overclock this video card are going to expose some incredible amounts of performance. I am thinking that with a cool case temperature and 300 watts of power feeding the card, you should see 250MHz overclocks on the core. Reaching the 1GHz+ core clock mark should be doable with the stock air cooler.

You can also use the AA modes (where Kyle said "Don't Use Narrow Tent and Wide Tent AA"), washed-out appearance and all, to raise performance.
This is a bold title, but we feel it is the correct one from a gamer's perspective. While the above power line screenshots demonstrated the benefits of these filtering modes for edge antialiasing, they themselves did not show you the deal-breaker of a consequence. Since these filter modes are post processes, we have the potential for messing with texture detail, and that is currently what is happening with both modes. Think 'Quincunx' here.
Some reviews will probably do that BTW, and numbers will reflect better performance.
I have to agree, filtering my whole screen by softening it in post processing is not the graphical fidelity that I have been looking forward to for the last decade. :thumbsdown:
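(To illustrate why a tent-style resolve softens textures: it pulls samples in from neighboring pixels, so high-frequency texture detail gets averaged away along with edge aliasing. A minimal 1D sketch of the general idea, not ATI's actual CFAA filter:)

```python
# Toy 1D example: a tent (triangle) filter that reaches into neighboring pixels
# blurs sharp texture detail as well as geometry edges. Purely illustrative;
# this is NOT ATI's actual CFAA resolve.
row = [0, 0, 0, 255, 255, 255, 0, 0, 0]   # a crisp texture feature
weights = [0.25, 0.5, 0.25]               # tent weights over pixels i-1, i, i+1

filtered = []
for i in range(len(row)):
    left = row[max(i - 1, 0)]
    mid = row[i]
    right = row[min(i + 1, len(row) - 1)]
    filtered.append(round(weights[0] * left + weights[1] * mid + weights[2] * right))

print(filtered)  # [0, 0, 64, 191, 255, 191, 64, 0, 0] -- the sharp transitions are softened
```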

EDIT: Now we also know why this is ATI's flagship: a higher-end part would take too much wattage. :(
 

Golgatha

Lifer
Jul 18, 2003
12,394
1,062
126
All I could think of when reading the few reviews posted on these forums is OUCH!

All my DX9 cards (from the 9800 Pro to my current X1900XTX) have been ATI, except for a nicely OCed 7800GT I owned for a while. AMD/ATI really dropped the ball on this release, and even though it's late in the game, it still feels rushed to me. The drivers are still in major need of tweaking, but I imagine no amount of tweaking will increase IQ and raise frame rates simultaneously.

I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
It seems the X2900 XT is great at being a "big numbers" card in its specifications.

320 Stream Shaders vs 96!
742MHz vs 500MHz!
11K vs 9K 3DMark 2006!
24xAA vs 16xAA!
512-bit vs 320-bit!
105GB/s Bandwidth vs 64!
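(For anyone curious where the bandwidth figures come from, here's a quick back-of-envelope sketch; the 828MHz and 800MHz GDDR3 clocks below are assumed stock values, not numbers from the review.)

```python
# Back-of-envelope memory bandwidth: (bus width in bytes) x (effective transfer rate).
# The 828MHz (HD 2900 XT) and 800MHz (8800 GTS 640) GDDR3 clocks are assumed stock
# values, not figures from the HardOCP review.
def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, ddr_multiplier=2):
    bytes_per_transfer = bus_width_bits / 8                    # 512-bit bus -> 64 bytes/transfer
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_multiplier   # DDR: two transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(round(bandwidth_gb_s(512, 828)))  # HD 2900 XT   -> ~106 GB/s
print(round(bandwidth_gb_s(320, 800)))  # 8800 GTS 640 -> 64 GB/s
```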
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
The Radeon HD 2900 XT overclocks like a friggin' mad man! It also drinks down power like a mad man and produces heat like a mad man....
Many of you that want to overclock this video card are going to expose some incredible amounts of performance. I am thinking that with a cool case temperature and 300 watts of power feeding the card, you should see 250MHz overclocks on the core. Reaching the 1GHz+ core clock mark should be doable with the stock air cooler.
You'd better have an asbestos plate behind your PC if you attempt this, lest you burn your house down.
 

MadBoris

Member
Jul 20, 2006
129
0
0
Originally posted by: Golgatha
I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.

In all honesty, I'm not sure Nvidia needs to worry too much about revisions like the 8900.
Maybe they can do a dual-GPU card in the coming months, but it will only be Nov./Dec. before the next gen comes around again. Maybe you can wait till then.

It was good to finally read a proper review, and HardOCP even did an apples-to-apples comparison :thumbsup:
 

MadBoris

Member
Jul 20, 2006
129
0
0
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.

320 Stream Shaders vs 96!
742MHz vs 500MHz!
11K vs 9K 3DMark 2006!
24xAA vs 16xAA!
512-bit vs 320-bit!
105GB/s Bandwidth vs 64!

This sure didn't help them...

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units, 32, and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the facts are we never really saw any major specific examples of this new memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.
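Multiplying those unit counts by the core clocks shows how that plays out; a rough sketch (the 742/500/575MHz core clocks are assumed stock values, not from the quote):

```python
# Rough peak texture and Z rates = units per clock x core clock.
# Unit counts are taken from the HardOCP quote above; the core clocks
# (742 / 500 / 575 MHz for the XT / GTS / GTX) are assumed stock values.
cards = {
    "HD 2900 XT": {"clock_mhz": 742, "fp16_tex_per_clk": 16, "z_per_clk": 32},
    "8800 GTS":   {"clock_mhz": 500, "fp16_tex_per_clk": 24, "z_per_clk": 40},
    "8800 GTX":   {"clock_mhz": 575, "fp16_tex_per_clk": 32, "z_per_clk": 48},
}

for name, c in cards.items():
    tex_gps = c["clock_mhz"] * c["fp16_tex_per_clk"] / 1000  # GTexels/s (FP16 bilinear)
    z_gps = c["clock_mhz"] * c["z_per_clk"] / 1000            # GSamples/s (Z-only)
    print(f"{name}: {tex_gps:.1f} GTex/s, {z_gps:.1f} GZsamples/s")
```

Under those assumed clocks the XT only roughly ties the GTS on peak FP16 texturing (~12 GTex/s each) and trails the GTX (~18 GTex/s), which fits the "unbalanced GPU" point: lots of bandwidth, not enough texture/ROP hardware to feed.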
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: MadBoris
Originally posted by: Golgatha
I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.

In all honesty, I'm not sure Nvidia needs to worry too much about revisions like the 8900.
Maybe they can do a dual-GPU card in the coming months, but it will only be Nov./Dec. before the next gen comes around again. Maybe you can wait till then.

It was good to finally read a proper review, and HardOCP even did an apples-to-apples comparison :thumbsup:

I would think it would still be a good idea to do an 80nm shrink of the G80, as that would provide higher margins for Nvidia at easier binning levels. The least they could do is a transparent 80nm replacement of the G80 core on existing SKUs.

It is in Nvidia's best interest to become more profitable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Pretty Harsh
The Bottom Line

"A day late and a dollar short." Cliché, but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.

This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course we do not know about DX10 games yet, and there is no way to make any predictions how that comparison will turn out. As it stands right now the Radeon HD 2900 XT, in our opinion, is a flop. ATI needs to get its act together quickly. It needs to push out the mainstream cards soon and it needs to deliver a high end card that can actually compete at the high end of the market.

If the above is true - the final word - then we will see $300 HD 2900 XTs.

the market will "adjust" the price vs perf 'naturally'
 

Laminator

Senior member
Jan 31, 2007
852
2
91
I think this is what most people have been expecting. A few people were making most of the noise and posting all over the boards, but it was getting pretty easy to see what the case was here. It was hard to read some of the conspiracy theories here (ATI waiting to drop the bomb on nVidia, etc.), especially after the multiple launch reschedulings and the preliminary numbers by DailyTech.

Hopefully we'll have some DirectX 10 games coming out soon (Halo 2...ha, ha, ha) so that we can see how G80 and R600 really perform for the games that they're meant to play.
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
I agree. It's painful reading these reviews. They all used the 8.37 driver, so hopefully the 8.38 driver helps out a lot on all fronts. Bad news all around except for the price. But even then...

Also, no mention of these doing DX10.1 in addition to 10.0? Bad speculation?

 

Golgatha

Lifer
Jul 18, 2003
12,394
1,062
126
Originally posted by: MadBoris
Originally posted by: Golgatha
I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.

In all honesty, I'm not sure Nvidia needs to worry too much about revisions like the 8900.
Maybe they can do a dual-GPU card in the coming months, but it will only be Nov./Dec. before the next gen comes around again. Maybe you can wait till then.

It was good to finally read a proper review, and HardOCP even did an apples-to-apples comparison :thumbsup:

Sad thing is, if I thought they had proper drivers for the HD 2900XT I would probably take the plunge on it if it gets down to the $350 level. I really don't like to pay much more than $300 for any video card anyway. I'm thinking that the 8900GTS will be very nice at stock and also overclockable. Kind of like my switch to the 7800GT back in the day from my X800XT.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.

320 Stream Shaders vs 96!
742MHz vs 500MHz!
11K vs 9K 3DMark 2006!
24xAA vs 16xAA!
512-bit vs 320-bit!
105GB/s Bandwidth vs 64!

This sure didn't help them...

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units, 32, and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the facts are we never really saw any major specific examples of this new memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.

Yeah, it's more like a marketing piece than an actually great-performing part. It's a lot like the Pentium 4 vs. Athlon 64 philosophy. 320 shader units sounds great, but if each one only has 1/4 of the performance of a G80 shader unit, then, well... blah.