
BFG 8800 GTX OC vs. Diamond HD 2900 XT 1 GB mini-review

Page 2
Originally posted by: tuteja1986
lol 🙂 you're a mature benchmarker :)

apoppin and keysplayr2003 are still trying to get all their 1st gpu benchmarking done.

Hey you should also do IQ testing with HL 2
ATI 8x AA vs Nvidia 8x AA vs Nvidia 8x QAA vs ATI CFAA 6x vs Nvidia 16x QAA , Nvidia 16xAA vs AT

hehe, very funny. My 2900XT scores are now showing up in the other thread.

Nice N7!!! You are going to include some "mere-mortal" resolutions yes? hehe.
 
Not too happy right now...

I have the HD 2900 XT installed (i sneaked pics into the second post) 😉

But it doesn't come clocked at the advertised 825/1050 :|

The really frustrating thing is that i basically expected the clock speeds to be wrong, since Diamond has 743/1100 listed, reviews of said card have 743/1000 listed, & NCIX was listing 825/1050.

Against my better judgement, i decided to trust their numbers, & heck, even the damn sticker on the card reads 825/2100 :frown:
But the actual 3D clock speeds?

743/1000 :|

So i left a pretty unhappy post on NCIX's forums about the card.

I discovered that AMD GPU Clock Tool works nicely for manual OCing, so i have "OCed" the card from its 743/1000 to the advertised 825/1050, but i don't know of any tools that will set that speed on bootup, like Rivatuner did so well for the 8800.
ATi Tool doesn't work at all (doesn't recognize the card), & Rivatuner sees the card but doesn't let me do anything with it.

The only other app i know of is ATI Tray Tools, which i didn't like last time i used it.

I guess the CCC overclocking is unlocked if i get the 8-pin PCI-e adaptor?

Still though, it's stupid i have to OC to get the rated speed :frown:

Results will be added as soon as i can, but i can already say that basically, the 8800 GTX > HD 2900 XT 1 GB in all the games i ran through quickly, & i doubt the extra GPU/RAM speed is going to change that.
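For anyone confused by the sticker reading 825/2100 while NCIX listed 825/1050: GDDR memory is double data rate, so vendors quote either the real memory clock or the effective transfer rate (2x). A quick sanity check, using only the clock figures posted in this thread:

```python
def effective_rate(mem_clock_mhz):
    """GDDR transfers data on both clock edges, so the
    'effective' (marketing) rate is twice the real clock."""
    return 2 * mem_clock_mhz

# NCIX's listing: 825 core / 1050 memory (real clock)
print(effective_rate(1050))  # 2100 -> matches the "825/2100" sticker

# Core OC needed to go from the actual 743 MHz to the advertised 825 MHz
oc_pct = (825 - 743) / 743 * 100
print(round(oc_pct, 1))  # 11.0 (percent)
```

So the sticker and the NCIX listing actually agree; the real discrepancy is the core clock shipping at 743 instead of 825.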
 
Originally posted by: n7
Not too happy right now...

I have the HD 2900 XT installed (i sneaked pics into the second post) 😉

But it doesn't come clocked at the advertised 825/1050 :|

The really frustrating thing is that i basically expected the clock speeds to be wrong, since Diamond has 743/1100 listed, reviews of said card have 743/1000 listed, & NCIX was listing 825/1050.

Against my better judgement, i decided to trust their numbers, & heck, even the damn sticker on the card reads 825/2100 :frown:
But the actual 3D clock speeds?

743/1000 :|

So i left a pretty unhappy post on NCIX's forums about the card.

I discovered that AMD GPU Clock Tool works nicely for manual OCing, so i have "OCed" the card from its 743/1000 to the advertised 825/1050, but i don't know of any tools that will set that speed on bootup, like Rivatuner did so well for the 8800.
ATi Tool doesn't work at all (doesn't recognize the card), & Rivatuner sees the card but doesn't let me do anything with it.

The only other app i know of is ATI Tray Tools, which i didn't like last time i used it.

I guess the CCC overclocking is unlocked if i get the 8-pin PCI-e adaptor?

Still though, it's stupid i have to OC to get the rated speed :frown:

Results will be added as soon as i can, but i can already say that basically, the 8800 GTX > HD 2900 XT 1 GB in all the games i ran through quickly, & i doubt the extra GPU/RAM speed is going to change that.

first of all, i would be really annoyed with the advertised clockspeed being totally wrong.

beyond annoyed

as i understand it, you can jury-rig your card by running a molex ground to the upper pin of the two left uncovered by the 6-pin plug.
i'd say do it at your own risk ... the 8-pin is made to carry extra current over the 6-pin PCIe, & the ground trick just appears to "unlock" CCC without adding that extra safeguard ... anyway, i want to see more than a few guys risking it in another forum before i risk my card ... i might try a mild manual OC with AMD GPU Clock Tool, i guess


i guess you didn't need to do a clean install ... just used Driver Cleaner?
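For context on why the 8-pin matters at all: per the standard PCIe power budgets (not specific to this card), the slot itself supplies up to 75 W, a 6-pin plug another 75 W, and an 8-pin plug 150 W. The two extra pins on the 8-pin are grounds used for presence sensing, which is why the ground-wire trick above can fool the card. A rough budget sketch, assuming the HD 2900 XT's one 6-pin plus one 8-pin socket as described in this thread:

```python
# Standard PCIe power budgets in watts (from the PCIe CEM spec)
SLOT = 75        # delivered through the PCIe slot itself
SIX_PIN = 75     # 6-pin auxiliary connector
EIGHT_PIN = 150  # 8-pin auxiliary connector

# The 8-pin socket physically accepts a 6-pin plug, so two configs exist:
six_plus_six = SLOT + 2 * SIX_PIN            # 6-pin plug in each socket
six_plus_eight = SLOT + SIX_PIN + EIGHT_PIN  # with a true 8-pin plug

print(six_plus_six)    # 225 W available
print(six_plus_eight)  # 300 W available
```

The sense pins tell the card which budget it can assume, so grounding them claims the 300 W budget without the cabling actually rated for it.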

 
Final results are up, & see my second post for my thoughts on the card, & pics.

/n7 is pretty tired & frustrated right now...$1200 spent on video cards & i don't like either of them 🙁
 
Originally posted by: tuteja1986
lol 😉 n7 is the sucker 🙂

Has anyone ever told you that you really don't make any sense 😕

I'm a sucker because i decided to see for myself which card is going to suit my needs best, rather than just relying on reviewers?

You don't seem to appreciate the effort involved, or even have much comprehension of the reasoning behind it.
 
Originally posted by: n7
I guess the CCC overclocking is unlocked if i get the 8-pin PCI-e adaptor?

Still though, it's stupid i have to OC to get the rated speed

I do recall having heard/seen/read that the 8-pin cable is necessary in order to reach the stated clock speeds.

The 8-pin adapters can be had for a relatively low price. A quick Google search revealed a link to this site (with credit due to HardOCP's forum readers for finding it).

I suggest that you give an 8-pin adapter a try before you completely write off the card and see if that changes anything (or even allows an easy OC to the "rated" clock speeds). Good luck, either way.. and thanks for providing this set of benchmark results!!
 
Originally posted by: n7
Originally posted by: tuteja1986
lol 😉 n7 is the sucker 🙂

Has anyone ever told you that you really don't make any sense 😕

I'm a sucker because i decided to see for myself which card is going to suit my needs best, rather than just relying on reviewers?

You don't seem to appreciate the effort involved, or even have much comprehension of the reasoning behind it.

lol 😉

"/n7 is pretty tired & frustrated right now...$1200 spent on video cards & i don't like either of them"... dude, i could have told you that easy!

The person who upgrades from an X800 XL to a 7900 GT to an 8800 GTS 320MB is happier than the person who always buys the best. You fell into the numbers game; buying the best doesn't guarantee happiness, since the best doesn't always live up to expectations.

I have 8800 GTX SLI, an X6800 @ 3.9GHz, 2GB Crucial PC8500 @ 1300MHz 5-5-5-12, 4x 320GB in RAID 0. It doesn't feel any better than my other computer: an AMD X2 3800+ @ 2.4GHz, 2GB of cheapo DDR1 value RAM, & X1900 XTX 512MB CrossFire.

The only thing that gives me happiness is my file server 😉
Celeron 2.8GHz, 512MB RAM, 1x 40GB for the OS & stuff, 6x 320GB in RAID 5, 3x 500GB in RAID 5 = awesomeness 🙂 better than a $1200 video card 🙂
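As an aside for anyone sizing a similar file server: RAID 5 gives you (n - 1) drives of usable space, while RAID 0 gives you all n with no redundancy. A quick check on the arrays above:

```python
def raid5_usable(n_drives, drive_gb):
    """RAID 5 spends one drive's worth of space on parity,
    so usable capacity is (n - 1) * drive size."""
    return (n_drives - 1) * drive_gb

def raid0_usable(n_drives, drive_gb):
    """RAID 0 stripes with no parity: all capacity is usable."""
    return n_drives * drive_gb

print(raid5_usable(6, 320))  # 1600 GB usable from the 6x 320GB array
print(raid5_usable(3, 500))  # 1000 GB usable from the 3x 500GB array
print(raid0_usable(4, 320))  # 1280 GB from the 4x 320GB RAID 0 (no fault tolerance)
```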
 
Originally posted by: Brent
Originally posted by: n7
I guess the CCC overclocking is unlocked if i get the 8-pin PCI-e adaptor?

Still though, it's stupid i have to OC to get the rated speed

I do recall having heard/seen/read that the 8-pin cable is necessary in order to reach the stated clock speeds.

The 8-pin adapters can be had for a relatively low price. A quick Google search revealed a link to this site (with credit due to HardOCP's forum readers for finding it).

I suggest that you give an 8-pin adapter a try before you completely write off the card and see if that changes anything (or even allows an easy OC to the "rated" clock speeds). Good luck, either way.. and thanks for providing this set of benchmark results!!

http://www.bjorn3d.com/forum/showthread.php?t=13891
not much different than sticking a ground wire in it 😛

uh, no thanks ... i don't believe this is what AMD engineered the 8-pin plug for ... to be 'bypassed'
 
Wait, are you guys saying it's actually downclocking 3D speeds because i don't have the 8-pin adaptor?

I thought that was only needed for OCing, so i would assume if the card is set to 825/1050 in the bios, it should be fine even w/o said connector?
 
Yeah, i guess.

I finally ran the CoJ DX10 bench on the 2900 XT.
It doesn't crash like on the 8800 GTX, but it can't run @ 825, or it artifacts.
I had to lower the core to ~775 to get it to run w/o any artifacting, which is certainly interesting, since everything else i ran @ 825 had no issues.
 
Originally posted by: n7
Wait, are you guys saying it's actually downclocking 3D speeds because i don't have the 8-pin adaptor?

My thoughts exactly. I wonder how much juice this sucker is drawing with that 8-pin connector attached...

BTW, thanks for the benches.

 
Originally posted by: apoppin
do you like CoJ?

I couldn't say, sorry.

I just have the DX10 benchmark demo, that's all.

It's not really my style of game though, so even if i did have the full game, i doubt i'd like it too much.
 
Hey, I really appreciate the review.

I was so tired of shoddy reviews that I didn't know what to believe. But your thread along with Keys & Apop really showed me that the card still has some severe performance issues. On the performance end of things, I believe AMD/ATI has failed. I was hoping that 7.6 cats would have turned the tide, but the card just isn't what it should be.

So it basically comes down to this: Stability.
 
Originally posted by: ArchAngel777
Hey, I really appreciate the review.

I was so tired of shoddy reviews that I didn't know what to believe. But your thread along with Keys & Apop really showed me that the card still has some severe performance issues. On the performance end of things, I believe AMD/ATI has failed. I was hoping that 7.6 cats would have turned the tide, but the card just isn't what it should be.

So it basically comes down to this: Stability.
well, i am just starting to seriously bench my cards now ... but i can't see much difference from Keys' or n7's benches in my preliminary testing

cat 7.6 made an incredible difference in actually PLAYING Stalker ... although the frame rate still dips unsatisfactorily, the difference IS like night and day ... i can play STALKER now - without cringing
... what benches don't yet tell is how the card performs *overall* - the driver issues, the IQ, the noise ... and how MANY of my favorite games run well on which GPU ... and of course my best guess as to future performance in the kind of games i like, and upgradeability - if any.

as i said in the other thread, i am SO looking forward to actually loading up my games to just PLAY them on each card - after the benching is over ... the OVERALL impression ... i only got to play STALKER for a few minutes ... and it actually 'plays', unlike with 7.5 ... so there IS improvement

stay tuned

 
Originally posted by: ArchAngel777
Hey, I really appreciate the review.

I was so tired of shoddy reviews that I didn't know what to believe. But your thread along with Keys & Apop really showed me that the card still has some severe performance issues. On the performance end of things, I believe AMD/ATI has failed. I was hoping that 7.6 cats would have turned the tide, but the card just isn't what it should be.

So it basically comes down to this: Stability.

??

The card was built to beat the 8800 GTS, and here it's being compared to the GTX, and it isn't much slower at all. Both cards are overclocked, though the XT has the larger overclock (10% vs 3%)... yes, the XT has 1GB of memory and much more bandwidth, but I don't think anyone expected that to make much of a difference, as no games (yet) use more than 512MB of VRAM and the R600 is not memory bandwidth limited.

I don't see how it is a failure that this card is slightly slower than the GTX. I have a very, very strong feeling that people with a 1GB HD 2900XT are going to be playing Crysis and future DX10 games better than the 8800GTX, however. If you look at Beyond3D's recent benchmarks, the HD 2900XT 512MB beats the GTX in Company of Heroes DX10 with the newest drivers.
 
I have the supcom demo, just installed it yesterday. I'll check that out. I just bought a corsair hx520, too! I only have 2 hd's but I'm planning on a quad core in a few months, so it's good to see that I didn't make a mistake by getting the 520 instead of 620.

edit: no dice on the demo. got an error message, then added a space before /perf and it just loaded up like a normal game.
 
Originally posted by: Extelleron
Originally posted by: ArchAngel777
Hey, I really appreciate the review.

I was so tired of shoddy reviews that I didn't know what to believe. But your thread along with Keys & Apop really showed me that the card still has some severe performance issues. On the performance end of things, I believe AMD/ATI has failed. I was hoping that 7.6 cats would have turned the tide, but the card just isn't what it should be.

So it basically comes down to this: Stability.

??

The card was built to beat the 8800GTS, and here it's being compared to the GTX, and it isn't much slower at all.


The card was most certainly NOT built to go head to head with the GTS. That is what AMD wants you to believe. They created the card hoping to beat the GTX, and probably thought it would... After they saw how it performed, they decided to pit it against the GTS.

Apop, Keys both know that I wanted to see this card succeed, but at this point, my mind is made up. The card has failed in the performance department. We can always say "wait 2 months for better drivers," or "wait a year!" That isn't the point, though... The point is, the card right now has serious issues.

The only thing going for it at this point, is driver stability (perhaps) which will be tested by N7 and Apop & Keys.
 
Originally posted by: Extelleron

I have a very, very strong feeling that people with a 1GB HD 2900XT are going to be playing Crysis and future DX10 games better than the 8800GTX, however.

Feelings come and go, bro... A feeling isn't something I would want to bet on. You might be right; the card might actually beat the GTX in future titles. But by the time that's the case, they will both be slow! So really... it would be like saying the tortoise is faster than the snail... It may be true, but in the end they're both slow.



 
Originally posted by: Extelleron
Originally posted by: ArchAngel777
Hey, I really appreciate the review.

I was so tired of shoddy reviews that I didn't know what to believe. But your thread along with Keys & Apop really showed me that the card still has some severe performance issues. On the performance end of things, I believe AMD/ATI has failed. I was hoping that 7.6 cats would have turned the tide, but the card just isn't what it should be.

So it basically comes down to this: Stability.
??

The card was built to beat the 8800 GTS, and here it's being compared to the GTX, and it isn't much slower at all. Both cards are overclocked, though the XT has the larger overclock (10% vs 3%)... yes, the XT has 1GB of memory and much more bandwidth, but I don't think anyone expected that to make much of a difference, as no games (yet) use more than 512MB of VRAM and the R600 is not memory bandwidth limited.

I don't see how it is a failure that this card is slightly slower than the GTX. I have a very, very strong feeling that people with a 1GB HD 2900XT are going to be playing Crysis and future DX10 games better than the 8800GTX, however. If you look at Beyond3D's recent benchmarks, the HD 2900XT 512MB beats the GTX in Company of Heroes DX10 with the newest drivers.

I think the "failure" label is a matter of perspective. In terms of price and performance, it isn't so much of a failure; I think the card performs pretty well for its price point. But that doesn't make this card a complete success either.

I believe people call this card a failure (a pretty rough choice of words) because it was several months late, draws more power, and is louder than the competition's offering. And let's not be naive enough to think that AMD didn't want to compete with Nvidia's 8800 GTX. No one is in the game to make a graphics card that's just slightly slower than the competition's high-end card. I'm pretty sure AMD's engineers didn't think to themselves, "hey, let's go for Nvidia's second-best card this time around!"

Anywho, excellent, excellent job n7. Thank you for your time and effort benchmarking the two cards. It is much appreciated by all of us.
 