Originally posted by: Matt2
The HD 2900 XT consistently loses across the board to both the 8800 GTS 640MB and the 8800 GTX.
From their conclusion:
The ATI Radeon HD 2900 XT, however, is more akin to NVIDIA's GeForce FX 5800. It does not seem like it will have a very long life span in comparison. NVIDIA quickly answered the GeForce FX 5800 by introducing the GeForce FX 5900 (NV35). ATI really needs to do something similar in this situation, or they may lose some loyal fans in the enthusiast community, and you can bet they are going to continue to lose sales to NVIDIA's 8000 series products.
EDIT: Another tidbit from their conclusion:
Despite what the numbers in 3DMark are showing, our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actual gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!
Some reviews will probably do that BTW, and the numbers will reflect better performance.
This is a bold title, but we feel it is the correct one from a gamer's perspective. While the above power line screenshots demonstrated the benefits of these filtering modes for edge antialiasing, they themselves did not show you the deal-breaker of a consequence. Since these filter modes are a post-process, we have the potential for messing with texture detail, and that is currently what is happening with both modes. Think 'Quincunx' here.
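To see why a shared-sample, post-process resolve costs texture detail, here is a minimal sketch of a Quincunx-style blend (1/2 weight on the pixel's own sample, 1/8 on each diagonal neighbour). The weights and the tiny test image are illustrative assumptions for this thread, not ATI's CFAA or NVIDIA's actual resolve code.

# Minimal sketch: each output pixel blends its own sample with samples shared
# with neighbouring pixels, which smears fine texture detail as well as edges.
def quincunx_resolve(img):
    """img: 2D list of grayscale values; returns the blended (blurred) result."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.5 * img[y][x]  # the pixel's own sample
            # four diagonal neighbours, clamped at the image border
            for dy, dx in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
                ny = min(max(y + dy, 0), h - 1)
                nx = min(max(x + dx, 0), w - 1)
                acc += 0.125 * img[ny][nx]
            out[y][x] = acc
    return out

# A one-pixel-wide bright texture detail loses half its contrast after the blend:
texture = [
    [0, 0, 0],
    [255, 255, 255],
    [0, 0, 0],
]
print(quincunx_resolve(texture)[1][1])  # 127.5 -> the crisp 255 line is cut in half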
You'd better have an asbestos plate behind your PC if you attempt this, lest you burn your house down.
The Radeon HD 2900 XT overclocks like a friggin' mad man! It also drinks down power like a mad man and produces heat like a mad man....
Many of you that want to overclock this video card are going to expose some incredible amounts of performance. I am thinking that with a cool case temperature and 300 watts of power feeding the card, you should see 250MHz overclocks on the core. Reaching the 1GHz+ core clock mark should be doable with the stock air cooler.
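As a quick sanity check on that claim, the arithmetic works out if you assume the 742 MHz stock core clock quoted later in this thread (an assumption, not a measured result):

# Rough arithmetic behind the quoted overclocking claim; the stock clock is
# taken from the spec comparison elsewhere in the thread, not from measurement.
stock_core_mhz = 742
claimed_headroom_mhz = 250
overclocked_mhz = stock_core_mhz + claimed_headroom_mhz
print(overclocked_mhz)                                   # 992 -> roughly the "1GHz+" mark
print(f"{claimed_headroom_mhz / stock_core_mhz:.0%}")    # ~34% over stock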
Originally posted by: Golgatha
I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.
320 Stream Shaders vs 96!
742MHz vs 500MHz!
11K vs 9K in 3DMark06!
24xAA vs 16xAA!
512-bit vs 320-bit!
105GB/s bandwidth vs 64GB/s! (rough arithmetic just below)
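Those headline bandwidth numbers fall straight out of bus width times effective memory clock. A quick sketch, assuming roughly 1656 MHz effective GDDR3 on the 2900 XT's 512-bit bus and 1600 MHz effective on the GTS's 320-bit bus; the clock figures are assumptions from memory, not taken from the review:

# Back-of-the-envelope memory bandwidth: (bus width in bits / 8) * effective clock.
# The effective memory clocks below are assumed, not quoted in this thread.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8 * effective_clock_mhz / 1000  # bytes per clock * GHz = GB/s

print(round(bandwidth_gb_s(512, 1656), 1))  # Radeon HD 2900 XT -> ~106.0 GB/s
print(round(bandwidth_gb_s(320, 1600), 1))  # GeForce 8800 GTS  -> 64.0 GB/s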
Originally posted by: MadBoris
Originally posted by: Golgatha
I honestly wanted to upgrade my video card, but I think I'll just wait for the 8900 series from nVidia now.
In all honesty, I'm not sure Nvidia needs to worry too much about revisions like the 8900.
Maybe they can do a dual-GPU card in the coming months, but it will only be Nov./Dec. before the next gen comes around again. Maybe you can wait until then.
It was good to finally read a proper review, and HardOCP even did an apples-to-apples comparison :thumbsup:
The Bottom Line
"A day late and a dollar short." Cliché but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.
This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course, we do not know about DX10 games yet, and there is no way to predict how that comparison will turn out. As it stands right now, the Radeon HD 2900 XT is, in our opinion, a flop. ATI needs to get its act together quickly. It needs to push out the mainstream cards soon, and it needs to deliver a high-end card that can actually compete at the high end of the market.
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.
320 Stream Shaders vs 96!
742MHz vs 500MHz!
11K vs 9K in 3DMark06!
24xAA vs 16xAA!
512-bit vs 320-bit!
105GB/s bandwidth vs 64GB/s!
This sure didn't help them...
The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units (32) and does 32 FP16 pixels per clock, and the GTS has 50% more (24), doing 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z; the GeForce 8800 GTS can do 40, and the GTX does 48.
...
All of this sounds great on paper, but the fact is we never really saw any major examples of this new memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed, the point is lost.
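Multiplying those unit counts by the core clocks quoted in this thread (742 MHz for the 2900 XT, 500 MHz for the GTS) and an assumed ~575 MHz for the 8800 GTX gives a rough idea of why the clock advantage does not close the gap. This is a sketch under those assumptions, not review data:

# Rough theoretical rates = units * core clock. The 575 MHz GTX clock is an
# assumption; the 742 MHz and 500 MHz figures come from the spec comparison above.
cards = {
    # name: (texture units, Z samples per clock, core clock in MHz)
    "Radeon HD 2900 XT": (16, 32, 742),
    "GeForce 8800 GTS":  (24, 40, 500),
    "GeForce 8800 GTX":  (32, 48, 575),
}
for name, (tex_units, z_per_clock, core_mhz) in cards.items():
    texel_rate = tex_units * core_mhz / 1000   # GTexels/s
    z_rate = z_per_clock * core_mhz / 1000     # GSamples/s
    print(f"{name}: {texel_rate:.1f} GTexels/s, {z_rate:.1f} GZ-samples/s")

Despite the higher core clock, the 2900 XT's texture rate lands at roughly GTS level (about 11.9 vs 12.0 GTexels/s) and well short of the GTX, which fits the review's "unbalanced GPU" reading.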