HardOCP 2900 XT Review! (a proper review)


CrystalBay

Platinum Member
Apr 2, 2002
2,175
1
0
Nothing like when the leafblower failed at DX9 versus the leafblower at DX10. There is a difference ...
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
Originally posted by: Matt2
Originally posted by: apoppin
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" line. LOL. Gstanfor, where is that pic you posted the other night?

HardOCP didn't test with the latest ATI drivers ;)

and Vijay Sharma told a lot of other lies in that interview :p

in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months

LOL.

Tune for the latest spin. Hehehe. Final silicon.

I can't believe Hector Ruiz still has a job.

I also love how we've been waiting for this GPU forever, but had to wait so AMD could "launch the entire family" and yet we still have to wait another month at least for the rest of the family.


I agree. I don't like this quote either:

"Here is the catch. There is only one video card being launched today in the high-end, the Radeon HD 2900 XT at $399, which 'competes' with the GeForce 8800 GTS 640MB. Here is a GeForce 8800 GTS 640 MB from PNY that can be had for $329.99 with a rebate, which is the lowest we've seen yet. This is as fast as it is going to get for ATI's current lineup. The Radeon HD 2900 XT is available at online retailers today. The Radeon HD 2600 and 2400, however, will not be available until late June. After waiting so long it is sad to see even more delays."

It sounds like the Radeon 2900XT drivers are worse off than the GeForce 8800 drivers were at launch. It doesn't sound like the card was ready to be released at any time, as ATI claimed earlier. And what was the point of putting off the Radeon 2900XT launch so all products could launch at the same time, if the products aren't actually launching at the same time? AMD has really let me down hard on this one. I am more disappointed in ATI now than I was over their old quake/quack lies.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Laminator
Originally posted by: apoppin
what makes it *all* suspect is the older drivers

and that the reviewer is "irritated" that r600 is late ... and has zero clue about MSRP vs market prices

I think it was appropriate to factor lateness into his final conclusion about the card. This thing should have been out a long time ago if it wanted the same recognition as the 8800 series.

wrong

you compare what is in your hand with the competition - at the moment ... you do not let your personal irritations interfere with your objectiveness

that is "impartial"

this reviewer is NOT impartial

THEREFORE his "best" at this resolution is clouded ... i want to see reviews that are benched in the *normal* way ... too much is dependent on the reviewer's distorted perception ... as to what "looks good"

this one is 'ok' ... with the understanding that the reviewer intends to "fault" the card.

and i guess it is time to change your nick, Amd Inside
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
"We asked ATI why there is no higher-end version and they pretty much told us that no one would buy it when you take the CrossFire into consideration. Well, we know that is simply false because there are people that buy $499 and $599 single video cards, case in point the GeForce 8800 GTX. ATI's answers were a painful copout to not being able to get the job done and it was obvious."

Man, what happened to ATI facing the truth? They really need to fire this AMD person, or all that talk about facing the truth and making changes within the company was just a lie.
 

Laminator

Senior member
Jan 31, 2007
852
2
91
apoppin's post

True. It is a different story when there are cancelled launches involved, though. It is comparable to the driver problems with the G80. Unfortunately for ATI and fortunately for nVidia, reviewers seemed placated enough by the generational performance leap from G7X/X19XX that they did not make a big hubbub about the numerous compatibility issues with the G80 (and some people blamed the game companies).
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: apoppin
Originally posted by: Golgatha
Originally posted by: coldpower27
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in specifications.

320 Stream Shaders vs 96!
742MHZ vs 500MHZ!
11K vs 9K 3D Mark 2006!
24xAA vs 16xAA!
512Bit vs 320Bit!
105GB Bandwidth vs 64!

This sure didn't help them...

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units, 32, and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the facts are we never really saw any major specific examples of this new memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.
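A quick sanity check on those figures: here is a minimal back-of-the-envelope sketch (Python) that turns the per-clock rates from the excerpt above into theoretical throughput numbers. The 742MHz and 500MHz core clocks match the spec list earlier in this thread; the 8800 GTX's 575MHz core and the memory data rates are the commonly cited figures and should be treated as assumptions, not something stated here.

```python
# Back-of-the-envelope GPU throughput check (illustrative sketch only).
# Per-clock FP16 texel and Z rates come from the HardOCP excerpt above;
# core clocks and memory data rates are the commonly cited ones (assumed).

cards = {
    # name: (core MHz, FP16 texels/clk, Z samples/clk, bus bits, mem MT/s)
    "HD 2900 XT":   (742, 16, 32, 512, 1656),
    "8800 GTS 640": (500, 24, 40, 320, 1600),
    "8800 GTX":     (575, 32, 48, 384, 1800),
}

for name, (clk, tex, z, bus, mem) in cards.items():
    texel_rate = clk * tex / 1000       # GTexels/s, bilinear FP16
    z_rate = clk * z / 1000             # GSamples/s, Z-only
    bandwidth = bus / 8 * mem / 1000    # GB/s
    print(f"{name:13s} {texel_rate:5.1f} GTex/s  {z_rate:5.1f} GZ/s  {bandwidth:6.1f} GB/s")
```

Run it and the imbalance the excerpt describes jumps out: the HD 2900 XT's ~106 GB/s of bandwidth dwarfs the GTS's 64 GB/s, yet its ~11.9 GTexels/s FP16 texture rate trails even the GTS's ~12.0, let alone the GTX's ~18.4.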

Yeah, it's more like a marketing piece than an actually amazing performer. It's a lot like the Pentium 4 vs. Athlon 64 philosophy. 320 shader units sounds great, but if each has only 1/4 the performance of a G80 shader unit, then, well, blah.

That's a good analogy. It's like a Prescott vs an Athlon 64.

Prescott = Hot, hard to cool without a lot of noise, higher clock speeds, lower performance, and more expensive.

Prescott was simply a *bad decision* on intel's part ...

the older NW was cooler and much more efficient

after the Prescott, intel *dumped* P4 architecture ... back to the M ... and C2D is kicking A64 butt.

this is *different* ... r600 is a NEW architecture ... it is the basis for the next gen of AMD GPUs ... this time AMD was blindsided by G80 much the same way intel's P4 was smacked by Athlon

i expect the refresh to be a major improvement

this is AMD's turn at a "x1800" kind of launch :p

but they are not 'giving up' on the r600 architecture - it is *sound* ... unlike intel and the P4

and thanks for the *clarification*, coldpower27

... so HardOCP used the *very latest* nvidia drivers and the older Cats ...
sounds 'fair' :p

The Pentium 4 was sound for what it was engineered to do, and that was to provide big marketable numbers. I don't wish to discuss the reasoning behind this again, as we would just be going around in circles. This idea works for a company with the vast resources and market position of Intel.

Unfortunately for AMD, it seems ATI isn't all that great at marketing either compared to Nvidia, so I am not sure how well this would work. Let's see how many people ATI can dupe into buying this card.

Considering the Cat 4.38 drivers were provided only 3 days before the NDA expired, according to VR-Zone, it's possible that HardOCP didn't have enough time to use them. The 158.22 drivers are over a week old in comparison, so there was sufficient time to run the benches.

Of course they can't give up on R600; they just got it to market, and it will still be the basis for ATI's products for some time to come. Even Intel took some time to transition to a new architecture. NetBurst is shelved for now, but elements are to be reincorporated when the time is right and it makes sense. A version of HyperThreading is to be reintroduced with Nehalem.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: AmdInside
"We asked ATI why there is no higher-end version and they pretty much told us that no one would buy it when you take the CrossFire into consideration ..." (snip)

that is not what AMD is telling anyone else ... who knows which marketing clown talked to him

--or find a link :p

let's look for supporting reviews ... before i accept this one's conclusion - based on OLD drivers - anyway :p

again, coldpower27 ... who cares about the reason ... out of date is out of date

invalid

move along to a newer set of more orthodox benches, please
 

Laminator

Senior member
Jan 31, 2007
852
2
91
Aside from the power consumption, the X2900XT offers decent performance. If priced a little lower than the 8800GTS, I think it would be very competitive in both DX9 and DX10. It seems to be on par with the 8800GTS in the DX10 Call of Juarez test...
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
I made my decision, and it is very painful for me ... honestly it is, I feel very sad right now, but I will not buy the HD 2900XT, nor the XL, nor R600 at all. I will have to flip the coin over and stare at the new side I'll be looking at for the months and perhaps years to come. I don't see anything positive about the 2900XT at all so far.

Inferior A-F: A major disappointment for me, because it would have been the second most important point to consider in my book. It looks like ATi did not even try to noticeably improve their texture-filtering algorithm over the X1K's HQ one. WHAT the reasoning behind that is I don't know, WHY they decided it was the right path to take I don't know, but GeForce 8's algorithm is near perfect, and I doubt anything could be better than it unless more samples are used, which clearly didn't happen for this generation, and for some reason I don't see a 24x or 32x A-F mode appearing with R700 and/or G90 either.

Inferior efficiency for power requirements: The single most important point for me, and it fails at it, and it fails hard. GeForce 8's GTS 640MB offers equal or better performance compared to the HD 2900XT and consumes less power.

Inferior performance: Don't mention driver issues, I won't believe it this time around. It's disappointing, and it has issues. I doubt that mere driver updates will suddenly increase performance across the board in any game by 30% or 40% or more. Just look at the S.T.A.L.K.E.R. charts, and what's up with the issue concerning grass in Oblivion at high resolutions ... and all that while the GeForce 8 sleeps soundly, laughing about it in its dreams.

Competitive price? I don't think it is, since the GTS 640MB can already be found at the same price.

SLi or CrossFire? Give me the choice and I'd go with 2x 8800GTS 320MB in SLi over 2x HD 2900XT in CrossFire. Not only will such a setup perform similarly or better, it will consume less power and generate less heat, not to mention cost me less money.

A-A: It seems rather equal to my eyes, nothing very disappointing but nothing jaw-dropping either. I wouldn't feel "inferior" using any of the GeForce 8's A-A methods over any of the HD 2900XT's methods. This point is perhaps the only neutral one I have. No significant benefits nor disadvantages on either side. It's a "no lose" situation, so if I go with GeForce 8 instead it shouldn't be disappointing in regards to A-A, but I myself almost never use A-A in my games (with the sole exception of Source-based games).

I can't believe what I'm typing here.

I always supported ATi, I always wanted them to show nVidia how to do things right, but they have nothing to prove this time around, and they even need to learn a couple of things from the green team. I am certainly not going to have a smile on my face when I hand the money to the seller for my GeForce 8, and I will surely walk out of the store still wondering what in the holy heck is happening, but I know when I have to be objective and see the differences between two products operating in the same domain.

NVIDIA clearly wins this round, good job guys.

ATI, please, I beg you to improve on ALL fronts for R700, except for A-A, maybe. The rest is all yours to improve, you have lessons to learn, so please do your homework. Is this pretentious of me? Probably, but you guys have a job to do, IMPROVE against the competition, compete in terms of prices, give us as-good-as or better methods and technologies, and then the consumers will do their own job, to choose whether they want to support you or not. I have been a loyal consumer (though not blindly so) with you guys for FOUR years of my life, and now, for the very first time ever, I am disappointed ... disappointed enough to change to that extent, to a new company that has LEARNED from their past mistakes.

I'll give myself maybe a month or so before doing it, before buying the GeForce 8 GTS/X, or maybe I will wait for the 8900 series to be there, but my system is crying for an upgrade and I ain't a patient consumer, nor a patient person at all in real life. But I know that the decisions I take are based on facts and not just blind misconceptions or false hopes shaped as "reality" in my head to give myself a clear conscience over a product purchase just because I "believed" in the brand name. No thanks, that ain't me.

Nvidia ... here I come.
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: Zenoth
I made my decision, and it is very painful for me ... honestly it is, I feel very sad right now, but I will not buy the HD 2900XT, nor the XL, nor R600 at all. (snip)

Nvidia ... here I come.


:Q
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Zenoth
I made my decision, and it is very painful for me ... honestly it is, I feel very sad right now, but I will not buy the HD 2900XT, nor the XL, nor R600 at all. (snip)

Nvidia ... here I come.

that's pretty silly, imo .. making a decision on 'which GPU' in the first few hours of half-finished reviews

i hope you don't buy a car like that ... or a home :p

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: apoppin
(snip)

that's what it IS

nvidia has had their 8800GTX "out" in users' hands and couldn't put out a worthwhile driver for over TWO MONTHS :p

after 4 months it was becoming tolerable according to the reports here

SIX months later there are still big issues - with some users

AMD has JUST released their card today

if their drivers are still a *clusterF&@k* after two months, like nvidia's drivers were,
--THEN you can talk.

I think I'll talk now.

I've sat here while everyone tried to debunk every single leak and every single review that we saw before today.

I've sat here while people called me a fanboy because I didn't believe in the pretty picture AMD kept painting for us for seven months.

I've heard every excuse in the book from, "DT is full of crap" to "OMG I refuse to believe that ATI can be worse than Nvidia" to "R600 sucks in DX9, but will kill in DX10" and now "it's using an alpha driver".

Where are all those people that said drivers would be the reason to buy R600? R600 is this late and the best AMD can deliver is an alpha driver? A month or two ago we heard that R600 could ship any day. Am I really supposed to believe that this is the best driver AMD could deliver when they supposedly had working, final silicon months ago?
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
Well, I actually read the whole review now. I am glad I bought my Geforce 8800GTS, but I might pick up a Radeon 2000 series card or whatever low-end solution provides the best HTPC experience for my HTPC. Right now I have a lowly Geforce 6200TC and badly want to replace it.
 

Laminator

Senior member
Jan 31, 2007
852
2
91
Listen... there is no reason to support nVidia or ATI over the other. They are companies. Their goal is to make money, and they have both acted unethically to try to achieve that goal.

3dfx was arrogant and nVidia kicked it into the pooper. nVidia had the favor returned in 2002/2003 with R300. Now it's ATI's turn to feel the pain. I am continually surprised at the affection people on this board have for ATI.
 

palindrome

Senior member
Jan 11, 2006
942
1
81
I really don't like HardOCP that much, but the review didn't seem unfair. I do believe that performance will increase as new drivers come out. At least the card is stable and still seems to be a great overclocker's card (though HardOCP neglected to touch on that)...
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
I read the review, but it didn't answer the biggest question I have - the stability of the drivers under XP AND Vista. Blah..

The most intriguing part was the anecdotal evidence of monstrous clockspeed headroom when enough power was supplied. Hmm.. I wonder how high it'll clock with good cooling and unlimited power? Also very interesting, and good news, that the card has an audio pass-through which delivers audio signals through the DVI-HDMI converter. I'm liking it a lot. I knew that the card comes with a converter, but I wasn't sure how they dealt with the audio part, since HDMI = Video + Audio.

Image quality looks to be a wash, as expected. There is a technical difference, but it doesn't amount to a practical difference. Still, NV gets the kudos for getting it right.

I'd like to read TechReport's review more than anyone else's.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
that's pretty silly, imo .. making a decision on 'which GPU' in the first few hours of half-finished reviews

i hope you don't buy a car like that ... or a home :p

What difference will it make if I wait a month?

R600's inferior A-F is hardware-based; that review, as recent as it is, just confirms a technical fact that is as true right now as it will be in ten years.

R600's inferior power-consumption efficiency is hardware-based; same logic as above.

Prove otherwise and I will honestly reconsider.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://www.guru3d.com/article/Videocards/431/26/
It is what it is, and the HD 2900 XT performance-wise ended up in the lower to mid part of the high-end segment. Sometimes it has a hard time keeping up with a 320MB 8800 GTS, and in other scenarios we see performance close or equal to the GeForce 8800 GTX. Now that would be weird if we all had to pay 600 USD/EUR for it. AMD knows this, and knows it very well. This is why, and honestly this is fantastic, the product is launched at a 399 MSRP. Now I have been talking with a lot of board partners already, and here in Europe the product will launch at 350 EUR, and that's just golden.

So we need to leave the uber-power-frag-meister-performance idea behind us and mentally position the product where it is and compare it with the money you have to pay for it. For your mental picture, performance-wise I'd say the GeForce 8800 GTS 640 MB is a good comparative product. Then the opinion will change, as you'll receive a lot of bang for your buck here. At 350 EUR you'll have a good-performing DirectX 10 compatible product with a new programmable tessellation unit. It comes with 512 megs of memory. It comes with a state-of-the-art memory controller, offers HDCP straight out of the box, all cards have HDMI connectivity with support for sound, and if that alone is not enough, you receive a Valve game bundle with some very hip titles in there for free. So yeah, you really can't go wrong there.

The negatives then; only two really. First off is power consumption. I know that power is a cheap commodity, yet the card is rated at a peak of give or take 215 Watts, and that's just a lot. In a high-end system our power draw was 400 Watts for the entire PC. So chances are that with the HD 2900 XT also comes a PSU upgrade. To make it slightly worse, if you want to overclock you'll need a PSU with the new 8-pin power connector as well. And at this time these are still rare (review here).

The second negative just has to be the fan. It's not dramatic in 2D desktop mode, really. But once you start gaming, you'll definitely hear it. And God forbid, if your card overheats to 100 degrees C (which is not that uncommon) you'll have a local hair-dryer available to you. So my recommendation here is, in overclocked situations, to be on the lookout for water-cooling or just wait for board partners like HiS to come up with another solution. For generic usage however, please don't worry.

The positives: Image quality of the HD 2900 XT is just excellent. I could have gone in-depth, but at one point this article has to stop, right? The new HQ anisotropic filtering is just looking great. Then again, it already was exceptionally good on previous-generation products. The new CSAA modes are looking to be promising as well. Surely we did not get all modes to work properly, but hey, we had to deal with an alpha driver. ATI has a good reputation for fixing this stuff in a timely manner, so we'll look at these modes closer once they work properly. Beta drivers or not, pretty much any game we tested, no, re-phrase, all games as tested worked without any issues. The one thing that initially had an issue was 3DMark. Yet FutureMark will release a patch for this soon, as it has to do with Futuremark's system diagnostic scan and thus is not an ATI issue.

What I also have to mention is that the new Avivo HD, or more precisely the UVD decoder, is looking to be awesome. Once you decode high-definition movies on the GPU you'll be surprised by the excellent video quality; it's really that good. The huge positive here is obviously that the entire decoding process is managed on the graphics card. Your CPU cycles are left alone; watching a 1080P 20 MBit/s HD movie full screen on a 2560x1600 monitor and still having the CPU utilized only 10-15% is really very interesting to observe.

So there you have it, guys and gals. The positives and negatives all sum up to an opinion, one that you have to make your decisions on. You will be charmed by the new Radeon HD 2900 XT if you can live with the knowledge of that power consumption and the fact that it's a noisy product. Performance-wise, at 350 EUR this Radeon HD 2900 XT offers an incredible amount of bang for your buck, heaps of functionality and extremely good image quality. For whatever reason, what caught my attention is that whenever a game makes a lot of use of pixel shaders the overall performance starts to rise massively. And considering the future of games, that's a very good thing.

26 pages ... now that's (a proper review)
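Guru3D's power numbers above (a ~215 W peak rating, 400 W at the wall for the whole system) also explain the 8-pin remark. Here is a minimal sketch of the PCIe power-delivery budget, using the standard 75 W slot / 75 W 6-pin / 150 W 8-pin limits; the card's 6-pin plus 8-pin socket layout is assumed from the review's remark rather than stated in this thread.

```python
# PCIe power-budget sketch for a ~215 W card (standard connector limits).
PCIE_SLOT_W = 75    # deliverable through the x16 slot
SIX_PIN_W   = 75    # per 6-pin PCIe connector
EIGHT_PIN_W = 150   # per 8-pin PCIe connector

card_peak_w = 215   # Guru3D's peak rating for the HD 2900 XT

six_six   = PCIE_SLOT_W + 2 * SIX_PIN_W            # 225 W total budget
six_eight = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W total budget

print(f"6+6-pin: {six_six} W budget, {six_six - card_peak_w} W headroom")
print(f"6+8-pin: {six_eight} W budget, {six_eight - card_peak_w} W headroom")
```

With two 6-pins the card has only about 10 W of headroom at its rated peak, which is why Guru3D points overclockers at PSUs with the new 8-pin connector.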
 

Laminator

Senior member
Jan 31, 2007
852
2
91
There was a time when Guru3D didn't label their pages for the drop-down menu. Trying to find a particular section in one of their predictably long reviews was a PITA.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://www.tweaktown.com/articles/1100/15/page_15_final_thoughts/index.html
AMD's new series of R600 DX10 graphics cards is finally here and we have had plenty of time to examine them, but we are left with a tough decision as to whether we should recommend the HD 2900 XT or not. It impressed us considering the price tag and the extras, such as sound output through the DVI-to-HDMI adapter and the Valve software bundle, but it leaves us wondering if AMD could have done a little better...

AMD's Radeon HD 2900 XT is going to cost between US$350 - US$399, closer to the latter in most places. That makes it at least US$150 cheaper than most GeForce 8800 GTX 768MB cards. Sure, Nvidia's 8800 GTX green champ won most of our tests in the 1280 x 1024 to 1920 x 1200 range, but not by margins as big as we expected. What is interesting are our results at the 2560 x 1600 resolution - both products have their share of driver issues (which need to be sorted out quickly!) but from our testing at this point in time, the HD 2900 XT makes a really good showing at 2560 x 1600. In games like Prey, Company of Heroes and Far Cry, the more expensive 8800 GTX really only has a small lead over the XT. In real-world terms, both offer basically the same level of gameplay experience in terms of smoothness, and we put that down to the massive 106GB/s of memory bandwidth of the R600, which is able to be properly utilized at this ultra-high resolution. With some overclocking of the core and memory, the HD 2900 XT could probably match or better the more expensive 8800 GTX, and that is with 256MB less memory. This in turn tells us that the engineers of the R600, while slow and indecisive, were smart and produced a GPU which is very efficient - but at the right settings.

We expect factory-overclocked HD 2900 XT cards to start selling in less than one month from now. AIB partners currently have the option of ordering 1GB GDDR-4 models with faster clock speeds, but it is unclear whether this product will be called HD 2900 XT 1GB GDDR-4 or HD 2900 XTX - you may end up seeing these types of cards appear in early June (around Computex Taipei show time). If we saw a product like this with a slightly faster core clock and obviously a much faster memory clock (2000 - 2100MHz DDR vs. 1656MHz DDR), we think it would compete very nicely against the GeForce 8800 GTX as far as price vs. performance goes. Sadly we did not have a GeForce 8800 GTS 640MB handy for testing, but matching up with our previous testing on similar test beds, the HD 2900 XT will beat it quite considerably, by around the 20% mark in 3DMark06, for example. This is rather interesting since the HD 2900 XT is in the same price range as the GeForce 8800 GTS 640MB - we will test against this card shortly!

Summing it up, we are happy to see a new high-end Radeon graphics card from AMD - it literally is a red-hot flaming monster, but it manages to offer a good amount of performance and an impressive feature set with full DX10 and Shader Model 4.0, CrossFire and Windows Vista support, and a host of others which we did not even have enough time to cover in full today, such as improved anti-aliasing and UVD. It was a long time coming, but it is able to offer very good bang for buck against the equivalent from Nvidia - the GeForce 8800 GTS.

a rather *different conclusion* from the HardOCP reviewer ... but not as enthusiastic as Guru3D
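TweakTown's point about the 106GB/s of bandwidth being "properly utilized" at 2560 x 1600 can be illustrated with rough arithmetic. Below is a sketch of raw framebuffer traffic at that resolution; every figure other than the resolution is an illustrative assumption, not something taken from the reviews.

```python
# Rough framebuffer-traffic estimate at 2560x1600 (illustrative only).
width, height = 2560, 1600
aa_samples = 4          # 4x multisample AA (assumption)
bytes_per_sample = 8    # 32-bit colour + 32-bit Z
overdraw = 2.5          # average depth complexity (assumption)
rmw_factor = 2          # read-modify-write for Z-test/blending (assumption)
fps = 60                # target frame rate (assumption)

bytes_per_frame = width * height * aa_samples * bytes_per_sample
traffic_gb_s = bytes_per_frame * overdraw * rmw_factor * fps / 1e9
print(f"~{traffic_gb_s:.0f} GB/s of raw framebuffer traffic")   # roughly 39 GB/s
```

Real hardware compresses MSAA samples and caches aggressively, so actual traffic is lower, but the scaling is the point: framebuffer demand grows with resolution, so a card with a 106GB/s bus has more headroom left for texturing at 2560 x 1600 than one with 64GB/s.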
 

Laminator

Senior member
Jan 31, 2007
852
2
91
I'm guessing it'll be another hour before the "Big Two" post their reviews (Tom's Hardware and Anandtech).
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
If the midrange is this bad, amd is hooped. If the midrange shows promise on the 65nm process - then amd has some hope for the future. :beer:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://www.techpowerup.com/reviews/ATI/HD_2900_XT
* ATI's Radeon HD 2900 XT is priced at a mere $399, which is an extremely competitive price for a highest-class performance product.

Pros:
* Breathtaking $399 price point
* Support for DirectX 10, Shader Model 4.0
* HDMI + audio output included
* New video acceleration features
* New AA modes

Cons:
* Not the fastest GPU in the world, as speculated by many
* Fan can be noisy
* High power consumption

Score: 8.9

I have to admit I was a bit disappointed after seeing the benchmark results of the Radeon HD 2900 XT. Even though the card is fast, it cannot beat the NVIDIA GeForce 8800 GTX; it should have no problems beating the GeForce 8800 GTS 320 MB though. Having a product like this, AMD did the only right thing - make it affordable. With a price point of $399 USD this card is taking the price/performance crown without breaking a sweat.
Even though ATI has employed a number of power-saving techniques and heat-reducing features, the heat output is still fairly high, especially when you are overclocking at a higher voltage. This also means that the fan has to run faster and louder than on competing products. On the other hand, watercooling will provide seriously improved overclocking potential; companies like Danger Den and Thermaltake already have the first coolers ready.
The new features like the integrated Tessellator and CFAA offer new ways to improve image quality while not hurting performance too much. Full HDCP with audio support and H.264 decode acceleration make this card an excellent choice for media PC systems where image quality matters. I really hope that AMD can improve their drivers to offer increased performance, because at this price point this card has the potential to be a card that everybody can afford.
looking good
HardOCP is *out of line* -- along with DT :p

so far
 

Laminator

Senior member
Jan 31, 2007
852
2
91
I think Hexus.net's conclusion pretty much sums it up:

"Ultimately, the AMD Radeon HD 2900 XT is a good product for £250. The innate problem in launching several months after your competitor is that being only as fast as its third-rung GPU is damaging to a company's prestige.

We had hoped for a GeForce 8800 Ultra-killer. The only killing that has gone on with the AMD Radeon HD2900 XT is in relation to its price and performance prestige.

Worth £250 of your money, yes. As fast as we had hoped for, absolutely not."

http://www.hexus.net/content/item.php?item=8687&page=11