HARDOCP 2900xt Review! (a proper review)

Page 2

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
yes, the other reviews did mention a solid performance increase with the newer drivers

and it is *still* hard to read their weird "runs best at" benchmarks

i doubt they ran the nvidia ones again ... just reused them to "compare"

with STALKER the drivers are clearly F*ed up
Here is what we found out, with Full Dynamic Lighting enabled the game is simply unplayable on the ATI Radeon HD 2900 XT at 1280x1024 and 1600x1200. We experienced an average of 15 FPS at 1600x1200 in the game. When we dropped to 1024x768 we found the game more playable, but the framerates were still in the upper 20s to mid 30s in performance. That is quite disheartening to experience on a $400 video card. The only way you are going to be able to play smoothly at Full Dynamic Lighting is going to be to drop to 1024x768 and lower in-game settings.

it plays better on an X1950 Pro

 

Golgatha

Lifer
Jul 18, 2003
Originally posted by: coldpower27
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.

320 Stream Shaders vs 96!
742MHZ vs 500MHZ!
11K vs 9K 3D Mark 2006!
24xAA vs 16xAA!
512Bit vs 320Bit!
105GB/s Bandwidth vs 64GB/s!

This sure didn't help them...

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units (32) and does 32 FP16 pixels per clock, and the GTS has 50% more (24) with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the fact is we never really saw any major examples of this kind of memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.

Yeah, it's more of a marketing piece than an actually amazing-performing part. It's a lot like the Pentium 4 vs. Athlon 64 philosophy. 320 Shader Units sounds great, but if they only have 1/4 of the performance of each G80 Shader Unit, then well, blah.

That's a good analogy. It's like a Prescott vs an Athlon 64.

Prescott = Hot, hard to cool without a lot of noise, higher clock speeds, lower performance, and more expensive.
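To put the per-clock and bandwidth figures quoted above into rough numbers, here is a quick back-of-the-envelope sketch. The 575/500 MHz core clocks and the GTX's ~86 GB/s are assumed reference-spec values rather than figures from the quote, so treat the output as illustrative only.

```python
# Rough throughput math for the per-clock figures quoted above.
# Assumed reference clocks: 742 MHz (HD 2900 XT), 575 MHz (8800 GTX), 500 MHz (8800 GTS).
cards = {
    # name: (core clock MHz, FP16 texels/clock, Z samples/clock, memory bandwidth GB/s)
    "HD 2900 XT": (742, 16, 32, 105),
    "8800 GTX":   (575, 32, 48, 86),
    "8800 GTS":   (500, 24, 40, 64),
}

for name, (clk, tex, z, bw) in cards.items():
    texel_rate = clk * tex / 1000.0  # billions of filtered FP16 texels per second
    z_rate = clk * z / 1000.0        # billions of Z samples per second
    print(f"{name}: {texel_rate:.1f} GTexel/s, {z_rate:.1f} GSample/s Z, {bw} GB/s")
```

Even with the biggest bandwidth number of the three, the 2900 XT's filtered texel rate works out below the GTS's, which is basically the "unbalanced GPU" point the quoted review is making.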
 

aclim

Senior member
Oct 6, 2006
wow, just wow is all I gotta say. Looks like I'll be ordering my 8800 GTX now
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: apoppin
yes, the other reviews did mention a solid performance increase with the newer drivers

and it is *still* hard to read their weird "runs best at" benchmarks

i doubt they ran the nvidia ones again ... just reused them to "compare"

No, they used a newer driver set: ForceWare 158.22, compared to the older 158.19 used in the 8800 Ultra review. These are most definitely newly run benches. They even used reference-clocked cards to be fairer.
 

Matt2

Diamond Member
Jul 28, 2001
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Golgatha
Originally posted by: coldpower27
Originally posted by: MadBoris
Originally posted by: coldpower27
It seems the X2900 XT is great at being a "big numbers" card in its specifications.

320 Stream Shaders vs 96!
742MHZ vs 500MHZ!
11K vs 9K 3D Mark 2006!
24xAA vs 16xAA!
512Bit vs 320Bit!
105GB/s Bandwidth vs 64GB/s!

This sure didn't help them...

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units (32) and does 32 FP16 pixels per clock, and the GTS has 50% more (24) with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the fact is we never really saw any major examples of this kind of memory subsystem making a specific impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.

Yeah, it's more of a marketing piece than an actually amazing-performing part. It's a lot like the Pentium 4 vs. Athlon 64 philosophy. 320 Shader Units sounds great, but if they only have 1/4 of the performance of each G80 Shader Unit, then well, blah.

That's a good analogy. It's like a Prescott vs an Athlon 64.

Prescott = Hot, hard to cool without a lot of noise, higher clock speeds, lower performance, and more expensive.

Prescott was simply a *bad decision* on intel's part ...

the older NW was cooler and much more efficient

after Prescott, intel *dumped* the P4 architecture ... went back to the M ... and C2D is kicking A64 butt.

this is *different* ... r600 is a NEW architecture ... it is the basis for the next gen of AMD GPUs ... this time AMD was blindsided by G80 much the same way intel's P4 was smacked by the Athlon

i expect the refresh to be a major improvement

this is AMD's turn at an "x1800" kind of launch :p

but they are not 'giving up' on the r600 architecture - it is *sound* ... unlike intel and the P4

and thanks for the *clarification*, coldpower27

... so HardOCP used the *very latest* nvidia drivers and the older Cats ...
sounds 'fair' :p
 

swtethan

Diamond Member
Aug 5, 2005
how come they use 2xAA for the 2900 and 2x TR MSAA and 8x TR MSAA for the 8800's? is there a difference?
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

HardOCP didn't test with the latest ATI drivers ;)

and Vijay Sharma told a lot of other lies in that interview :p

in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months
 

fierydemise

Platinum Member
Apr 16, 2005
Question for MadBoris: what makes HardOCP's review proper? I wouldn't call it improper, but it is decidedly unscientific, being based on an opinion of what is playable and what looks best.
 

Matt2

Diamond Member
Jul 28, 2001
Originally posted by: apoppin
Prescott was simply a *bad decision* on intel's part ...

the older NW was cooler and much more efficient

after Prescott, intel *dumped* the P4 architecture ... went back to the M ... and C2D is kicking A64 butt.

this is *different* ... r600 is a NEW architecture ... it is the basis for the next gen of AMD GPUs ... this time AMD was blindsided by G80 much the same way intel's P4 was smacked by the Athlon

i expect the refresh to be a major improvement

this is AMD's turn at an "x1800" kind of launch :p

but they are not 'giving up' on the r600 architecture - it is *sound* ... unlike intel and the P4

If they're not giving up on the architecture, then they're going to have to do something to improve performance. They've already got the memory bus as wide as it can go, it's got a 1024-bit ring bus, and it has the theoretical shader power; I just can't put my finger on what's wrong.

Since I doubt there will be any architectural changes, they're going to have to shrink it and boost the clocks by a lot. Depending on how much they have to boost the clocks, it could end up sucking up almost as much power as it does now.
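The intuition behind "sucking up almost as much power" can be sketched with the usual first-order dynamic-power relation, P ≈ C·V²·f: a shrink cuts switched capacitance, but a big clock bump (and the extra voltage it usually needs) claws most of that back. All of the numbers below are hypothetical, purely to illustrate the trade-off.

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f (all values relative / made up).
def dynamic_power(c_rel, volts, freq_mhz):
    """Relative dynamic power in arbitrary units."""
    return c_rel * volts ** 2 * freq_mhz

# Current 80nm part at 742 MHz / 1.2 V (capacitance normalized to 1.0).
p_now = dynamic_power(1.0, 1.20, 742)

# Hypothetical shrink: ~30% less switched capacitance, clocked higher and
# needing slightly more voltage to hit the new frequency.
p_shrunk = dynamic_power(0.7, 1.25, 825)

print(f"relative power after shrink + clock bump: {p_shrunk / p_now:.2f}x")
# ~0.84x here; push the clock/voltage harder and the shrink's savings disappear.
```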
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

You mean this one? (You'll notice I simulated CFAA when creating it, also) :D
 

fierydemise

Platinum Member
Apr 16, 2005
Originally posted by: swtethan
all the benches used different AA settings??? :( what!
It's HardOCP's "benchmarking" method: they determine the maximum quality settings at a "playable" frame rate, then compare what they could enable on each card.
 

swtethan

Diamond Member
Aug 5, 2005
Originally posted by: Gstanfor
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

You mean this one? (You'll notice I simulated CFAA when creating it, also) :D


dx10 bench 2900 vs 8800 http://www.guru3d.com/article/Videocards/431/17/




 

Matt2

Diamond Member
Jul 28, 2001
Originally posted by: apoppin
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

HardOCP didn't test with the latest ATI drivers ;)

and Vijay Sharma told a lot of other lies in that interview :p

in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months

LOL.

Tune for the latest spin. Hehehe. Final silicon.

I can't believe Hector Ruiz still has a job.

I also love how we've been waiting for this GPU forever, but had to wait so AMD could "launch the entire family" and yet we still have to wait another month at least for the rest of the family.
 

Matt2

Diamond Member
Jul 28, 2001
Originally posted by: Gstanfor
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

You mean this one? (You'll notice I simulated CFAA when creating it, also) :D

Priceless. :D

I think you could write anything over that pic and it be hilarious.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: fierydemise
Question for MadBoris: what makes HardOCP's review proper? I wouldn't call it improper, but it is decidedly unscientific, being based on an opinion of what is playable and what looks best.

what makes it *all* suspect is the older drivers

and that the reviewer is "irritated" that r600 is late ... and has zero clue about MSRP vs market prices

his bias shows ... as badly as the other review that tried to paint AMD in a good light

i want to see his next benches
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: fierydemise
Question for MadBoris: what makes HardOCP's review proper? I wouldn't call it improper, but it is decidedly unscientific, being based on an opinion of what is playable and what looks best.

Scroll to the end of the review for the Apples to Apples benches if you prefer; it still isn't pretty.
 

Laminator

Senior member
Jan 31, 2007
Originally posted by: swtethan
all the benches used different AA settings??? :( what!

HardOCP focuses on "best settings" as opposed to highest FPS. They try to go for the best gaming experience - highest settings at a predetermined, acceptable framerate. Sometimes the average framerate will be different because one card will have a higher minimum framerate than the other and will thus provide an equivalent gaming experience at a lower framerate. Either way, the game with the highest settings wins.

The 8800 cards are faster so they can use higher resolutions/levels of AA at a fixed framerate, subsequently providing a better gaming experience. Thus, they "win". If you want to see direct framerate comparisons, go to their "Apples to Apples" section near the end of the review.
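In other words, the method boils down to walking up the quality ladder and keeping the highest rung where the minimum framerate stays acceptable. A minimal sketch of that logic, with a made-up 30 FPS floor and made-up numbers (the real reviews rely on human judgement, not a script):

```python
MIN_PLAYABLE_FPS = 30  # hypothetical playability floor

def highest_playable(results):
    """Return the most demanding config whose minimum FPS stays playable.

    `results` is ordered from lowest to highest quality: (label, minimum FPS).
    """
    best = None
    for label, min_fps in results:
        if min_fps >= MIN_PLAYABLE_FPS:
            best = label
    return best

card_a = [("1280x1024 2xAA", 48), ("1600x1200 2xAA", 36), ("1600x1200 4xAA", 24)]
card_b = [("1280x1024 2xAA", 55), ("1600x1200 2xAA", 42), ("1600x1200 4xAA", 33)]

print("Card A runs best at:", highest_playable(card_a))  # 1600x1200 2xAA
print("Card B runs best at:", highest_playable(card_b))  # 1600x1200 4xAA
```

So two cards can post similar average framerates and still "win" or "lose" here, because the comparison is about which settings survive the playability cutoff.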
 

MadBoris

Member
Jul 20, 2006
Originally posted by: fierydemise
Question for MadBoris: what makes HardOCP's review proper? I wouldn't call it improper, but it is decidedly unscientific, being based on an opinion of what is playable and what looks best.

Simple. It was the first review that I consider trustworthy. If you don't, no problem, you can make your own topic and title it improper. ;)
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Matt2
Originally posted by: apoppin
Originally posted by: Matt2
I'm patiently waiting for someone to come in here and say it's because the drivers are immature.

What happened to ATI's drivers being the bestest?

Didn't Vijay Sharma tell us that the drivers were in "great condition" or something along those lines?

Then there is the whole "G80 is not optimized for DX10!" thing. LOL. Gstanfor, where is that pic you posted the other night?

HardOCP didn't test with the latest ATI drivers ;)

and Vijay Sharma told a lot of other lies in that interview :p

in case you still don't get it

the SW guys have to "tune" the drivers for the LATEST spin of the FINAL silicon

and they will continue to do so - as nvidia is still doing - for the next few months

LOL.

Tune for the latest spin. Hehehe. Final silicon.

I can't believe Hector Ruiz still has a job.

I also love how we've been waiting for this GPU forever, but had to wait so AMD could "launch the entire family" and yet we still have to wait another month at least for the rest of the family.

that's what it IS

nvidia has had their 8800GTX "out" in users' hands and couldn't put out a worthwhile driver for over TWO MONTHS :p

after 4 months it was becoming tolerable according to the reports here

SIX months later there are still big issues - with some users

AMD has JUST released their card today

IF their drivers are still a *clusterF&@k* after two months, like nvidia's drivers were, THEN you can talk.
 

Laminator

Senior member
Jan 31, 2007
Originally posted by: apoppin
what makes it *all* suspect is the older drivers

and that the reviewer is "irritated" that r600 is late ... and has zero clue about MSRP vs market prices

I think it was appropriate to factor lateness into his final conclusion about the card. This thing should have been out a long time ago if it wanted the same recognition as the 8800 series.