CrossFire Woes - need *Help* 2900xt/2900p


Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: superbooga
There are certain games that the 2900xt and HD 38xx excel at, but the 8800 cards still give you the most consistent performance across all games. Is it NV's developer relations, good architecture, or both? Who knows. But the games that are the most taxing are the ones the 8800 does the best at. For older games I expect a dual-GPU from ATI to beat a single high end Nvidia GPU, but does 90 vs 110 fps really matter that much?

Ah yes, I am willing to send my GTS.

That's a rather subjective statement - a quick Google search turned up This review of 2900 Crossfire vs an Ultra, also here. *Consistent* performance, maybe ... but if you can get Crossfire on the cheap as Apoppin has, then there's no reason not to go dual GPU over a more expensive single-GPU solution. I wish I had held onto my GTX longer so I could put up some comparison results (sold it to a friend), but what I do remember is 30-40FPS walking around in Stalker with the GTX and 40-50FPS with my current Crossfire rig, plus similar performance between the two in COH - dual GPU was cheaper, so why not.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: superbooga
There are certain games that the 2900xt and HD 38xx excel at, but the 8800 cards still give you the most consistent performance across all games. Is it NV's developer relations, good architecture, or both? Who knows. But the games that are the most taxing are the ones the 8800 does the best at. For older games I expect a dual-GPU from ATI to beat a single high end Nvidia GPU, but does 90 vs 110 fps really matter that much?

Ah yes, I am willing to send my GTS.

not true ... when Keys and i each benchmarked our 2900xt vs GTS-640 early this Summer, we both concluded that they were more or less equivalent cards ... each with its own strengths and weaknesses ... Keys ended up keeping his GTS and i kept my 2900xt - each of us with no regrets.
--i would have regretted not trying Xfire, however ... someday i may be privileged to try SLi

Of course, i'd love to benchmark your card ... do you need my 2900xt in the meantime?
:Q


--

anyway ... more that will make its way into the Showdown thread:


CPU Crysis demo - 32-bit Vista [run 1]

CrossFire ----- Average FPS: 13.89, Min FPS: 4.46, Max FPS: 21.68
Single2900xt - Average FPS: 11.59, Min FPS: 4.95, Max FPS: 14.62


CPU Crysis demo - 32-bit Vista [run 2]

Single2900xt - Average FPS: 10.60, Min FPS: 0.74, Max FPS: 17.16
CrossFire ----- Average FPS: 13.96, Min FPS: 1.98, Max FPS: 17.60



finally CoJ -16x10- High Shadows/Shader Map - 2048x2048

Vista 32 -- 14.4 Min/38.5 Avg/85.5 Max *Crossfire*
Vista 32 -- 14.7 Min/24.9 Avg/52.3 Max 1-2900xt
Vista 64 -- 15.9 Min/20.7 Avg/49.3 Max 1-2900xt

===


Seriously, i think i got a nice performance increase with the Crossfired Pro - despite its being 256-bit, despite its being slower, despite its being stuck in a 4x PCIe slot .. that is still +30% and it sure looks like 'Ultra territory' to me
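
for anyone who wants to check my math, here is a rough back-of-the-envelope pass over just the averages from the runs above [min/max bounce around too much to scale like this]:

# CrossFire-vs-single scaling, computed from the averages posted above
runs = {
    "CPU Crysis demo, run 1": (11.59, 13.89),  # (single 2900xt, CrossFire)
    "CPU Crysis demo, run 2": (10.60, 13.96),
    "CoJ 16x10":              (24.9, 38.5),
}
for name, (single, xfire) in runs.items():
    print(f"{name}: +{(xfire / single - 1) * 100:.0f}%")
# prints +20%, +32% and +55% ... call it +30% in Crysis, with CoJ
# scaling far better than that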

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
TWO 2900XTs in Crossfire are a lot better than what you'd get from any Ultra, GTS or GTX.
Except he isn't running two 2900XTs in Crossfire, he's running a 2900 XT and a 2900 Pro (which likely makes the 2900 XT behave like a Pro), plus he's running one card on PCIe x4 which is known to cripple Crossfire.

Not to mention his cards are overclocked and the noise from both of them will drive the average person insane.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
I'll send you a PM sometime for your address. I'll just use my 7800GTX in the meantime (I don't do that much gaming atm).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
TWO 2900XTs in Crossfire are a lot better than what you'd get from any Ultra, GTS or GTX.
Except he isn't running two 2900XTs in Crossfire, he's running a 2900 XT and a 2900 Pro (which likely makes the 2900 XT behave like a Pro), plus he's running one card on PCIe x4 which is known to cripple Crossfire.

Not to mention his cards are overclocked and the noise from both of them will drive the average person insane.

nonsense ... the following review also used the same P35 i use

what *boggles* my mind is that my little bastardized el cheapo XFire nearly equals the much more expensive 2900xt/2900xt Crossfire - or a GTX or Ultra
WHY? Because i can Overclock them SEPARATELY ... CCC has improved that much!!!

in 3DMark06, i fall behind by 600 Marks [13090 vs 13692]
http://www.tweaktown.com/artic...ks_3dmark06/index.html
that is the only thing i can compare, as i evidently use more demanding settings than they do and we do not use the same resolutions.

note our conclusions are the same:

http://www.tweaktown.com/artic...al_thoughts/index.html
The Crossfire setup really kicks butt. But the best thing? It's consistent.

What helps Crossfire is the fact that the P35 is such a fantastic board and only currently supports Crossfire as a multi-card solution. So if you want an excellent motherboard that is going to give your CPU huge overclocks, along with the graphics performance to go with it, Crossfire is the only solution. When compared to Nvidia's SLI arrangement, Crossfire generally manages to scale better as well; everything saw a performance increase and at 2560 x 1600 there were some pretty significant ones. ...

While the Ultra is not a heap of crap, to be brutally honest... it is not far from it! Huge price tags and a ridiculous price-to-performance ratio doesn't make the card that attractive. Throw in the fact that a large amount of people still feel more comfortable using their Intel CPU on an Intel chipset and you really see yourself looking at Crossfire as an enthusiast setup. I know we mentioned this the other week but the bottom line is that thanks to the ridiculous price tag of the 8800Ultra it makes the HD 2900 XT smell like roses.

If you have the money and the motherboard, check out the HD 2900 XT in Crossfire. It is safe to say you won't be disappointed, especially if you're the owner of a screen which is capable of outputting 2560 x 1600. And thanks to more and more of them being released lately, pricing has become more aggressive, making a screen of this calibre a more viable choice.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: superbooga
I'll send you a PM sometime for your address. I'll just use my 7800GTX in the meantime (I don't do that much gaming atm).

this is a really really heavy thing ... ask nullpointerus ... the PO lost his Vista DVD ... it cost him $20 :p

if ANYTHING went wrong, i'd not only feel responsible, i'd BE responsible
:Q

is it that important i test it in my rig? i am pretty sure that it would beat my 2900xt across almost every benchmark and at most resolutions i run. But i seriously doubt that it would even come close to my CrossFire setup ... a GTX or Ultra would probably be a 'toss-up' in my rig

we can get a pretty good picture from the existing benchmarks ... and i am a little behind "real" 2900xt/2900xt CrossFire

But thank you for your extraordinary offer


Not to mention his cards are overclocked and the noise from both of them will drive the average person insane.
it would only bother a noise princess
:lips:

*one* card is significantly overclocked - the Pro isn't even *noticeable* over the XT [which isn't loud unless it gets to 60%]

it appears that 2900xt/2900pro isn't much slower than 2900xt/2900xt crossfire, THANKS to the recent BIG improvements in CCC which no longer let one card penalize the other like before

i want to see nvidia do that :p

didn't they have a full year's head start over ATi .. way back when?
:confused:


 

superbooga

Senior member
Jun 16, 2001
333
0
0
Apoppin, I think the GTS will be a lot closer than you think, especially when overclocked (yours is overclocked, so it's only fair). So how close do you think yours comes to an actual 2900XT CF? 90%?

We can compare benchmarks for any games we both have.
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
It would be interesting to compare the 2900XT versus the 2900Pro separately, with the 2900Pro overclocked to 2900XT speeds. That should test the 512-bit versus 256-bit memory interfaces.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: kmmatney
It would be interesting to compare the 2900XT versus the 2900Pro separately, with the 2900Pro overclocked to 2900XT speeds. That should test the 512-bit versus 256-bit memory interfaces.

But it won't take into consideration the restriction the x4 PCIe slot puts on the bandwidth available to the GPU in that slot. A Pro will be less restricted than an XT. It *appears* from other testing that 256-bit vs. 512-bit only comes into play at resolutions higher than 16x10 .. so it should not affect my gaming much, if at all
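
for anyone curious about the raw numbers, here is a little sketch - NOTE the clocks are the commonly quoted stock specs, an assumption on my part, not something measured on my rig:

# peak memory bandwidth = bus width (in bytes) x transfer rate
def mem_bw_gbs(bus_bits, mts):
    return bus_bits / 8 * mts / 1000  # GB/s

print(mem_bw_gbs(512, 1656))  # 2900xt,  512-bit: ~106 GB/s
print(mem_bw_gbs(256, 1600))  # 2900pro, 256-bit: ~51 GB/s

# PCIe 1.x moves ~250 MB/s per lane per direction, so an x16 slot
# tops out around 4 GB/s and an x4 slot around 1 GB/s - that caps
# bus traffic only; rendering out of on-card memory never touches it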

=================
Originally posted by: superbooga
Apoppin, I think the GTS will be a lot closer than you think, especially when overclocked (yours is overclocked, so it's only fair). So how close do you think yours comes to an actual 2900XT CF? 90%?

We can compare benchmarks for any games we both have.
actually the GTS OC'd is pretty close to an Ultra ... and it only really seems to start to lose at resolutions higher than 16x10 ... and mine is really NOT oc'd - not compared to 2900xt crossfire ... i only OC the Pro to bring it up to near XT speeds ... i am a bit short on the core by 45MHz.

As to my mismatched Xfire, i would say it is at least 90 percent of "real 2900xt" Crossfire. And i will base that partly on the 3DMark06 score - i get 600 Marks less - 13090 vs 13692 - BUT i am using Vista 64 which scores about 200-300 marks lower anyway.
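
the arithmetic, if anyone wants to check it [the 250 is just the midpoint of my 200-300 estimate]:

# my mismatched CrossFire vs 'real' 2900xt/2900xt CrossFire in 3DMark06
mine, real_cf = 13090, 13692
print(f"{(mine + 250) / real_cf:.1%}")  # ~97.4% of a matched pair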

And the only reason i got "lucky" is my "timing" ... timing IS everything ... six weeks ago this would not have worked so well. Cat 8.1 unlocked it. i did a lot of calculating before i jumped for the 256-bit model [i just missed the 512-bit Pro] and i would say i got Lucky. Really lucky.

CrossFire has finally matured to the point where 2 GPUs that have different cores and could have different Memory clocks can work together - at their OWN clocks - with very little performance penalty over a perfect match.

maybe i am just easily impressed :p

==============

when i was away, i replayed Hellgate: London ... and i am blown away ... Full DX10 - no compromise - plus ... plus, get this - 4xAA/16xAF with every in-game detail fully maxed.

And it is smooth ... i played for quite a while - till i got tired, maybe over 1-1/2 hours - without a hitch or stutter - none of the memory-management problems that used to plague it pre-patch - and i am playing it on Vista32!

i could only dream of this Hg:L fluidity with a single 2900xt ..
yep, it is the best $150 GPU upgrade i've made in a long time ...
- i can't remember the last one that was so cheap with so much bang for the buck

.. an AMD step-up indeed :D


bedtime

goodnight
:moon:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
it would only bother a noise princess
It would bother just about anyone that isn't deaf, just like it bothered most reviewers.

utterly ridiculous .. Keys had one .. many others here have posted about it and i HAVE one .. it is LOUD as HELL at full load [period]

That is what the reviewers mentioned - at FULL LOAD it sounds like a mad genie is trying to get out of the bottle all at once ... quite a "whoosh" of air - but no "whine" like the DustBuster.

But it NEVER - ever - gets to full load unless it is about to melt or the fan is locked at maximum by the user ... it is not louder than my x850xt and it is not louder than my x1950p/512M and it is CERTAINLY not louder than my OC'd 6800GS OC which was an annoying little buzzsaw .. and probably no louder - most of the time - than your card at full load.

You only have your interpretation of 2nd hand news to say otherwise ... everyone here who owned or owns one will TESTIFY to the same thing ... 'normal' at normal play - loud at full load.
in other words, you have ... nothing but your imagination :p
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
You only have your interpretation of 2nd hand news to say otherwise ... everyone here who owned or owns one will TESTIFY to the same thing ... 'normal' at normal play - loud at full load.
in other words, you have ... nothing but your imagination
My 8800 Ultra regularly spins up to 100% during gaming and it was definitely loud enough to be annoying before I started using headphones for gaming.

(Incidentally my choice to use headphones was not because of this, I just found I could hear more sounds from the game with them compared to speakers, but I digress).

So given the 2900 XT has thermal characteristics equal to or greater than the 8800 Ultra's, I'd say it would spin up often enough to be a factor.

Now I suppose if the 2900's temperature thresholds are significantly higher before spinning up (AFAIK my Ultra will spin up to 100% somewhere in the range of 75-85 C), it might be less of a factor.

I also suppose if someone ran their 2900 XT at low resolutions with low AA levels and vsync it wouldn't strain it as much either, but that is not my style.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
"what *boggles* my mind is my little bastardized el cheapo XFire nearly equals the much more expensive 2900xt/2900xt Crossfire or a GTX ultra
WHY- Because i can Overclock them SEPARATELY ... CCC has improved that much!!!"



This is why I asked you if you are running AFR. Even if you can independently o/c each card to the moon, both cards will only render scenes as fast as the slowest card can. So, having said that, your Crossfire setup can only be as fast as two 2900 Pros o/c'd in x4 slots. Period. If you were able to run SFR, where there is a load balance (e.g. the 2900XT renders the top 65% of your scene and the 2900pro renders the bottom 35%), the 2900XT would be doing the heavy lifting in the PCI-e x16 slot, and the pro would take up whatever slack the load-balancing mechanism thinks it could handle.

That's why I asked which rendering method was being used. Which is?...
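
To put some toy numbers on it (the frame times below are completely made up, purely to illustrate the gap between the two modes):

# AFR vs SFR with mismatched cards - hypothetical per-frame times only
xt_ms, pro_ms = 20.0, 28.0  # assume the Pro in the x4 slot is the slower card

# AFR: the cards alternate whole frames, so the pair paces at the
# slower card's rate - at best 2x the slower card
afr_fps = 2 * 1000 / max(xt_ms, pro_ms)

# SFR: the screen split is balanced so both cards finish together -
# throughput approaches the sum of the two rates, minus overhead
sfr_fps = 1000 / xt_ms + 1000 / pro_ms

print(f"AFR ~{afr_fps:.0f} fps, SFR ~{sfr_fps:.0f} fps")  # ~71 vs ~86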

P.S. You paid 330 bucks for your XT. Then 150 on the pro. 480 total. GTXs can be had for just under 400 these days with rebates and around 445 without rebates. And with the new GTS, which is on par with a GTX and often beats it at stock clocks, that would be around 279 with rebates and 299 without.

The more I look at it, the more I think you could have done better with a single faster card, only because of the setup you have in particular. Why? Because I don't think your setup can beat a single 8800GTS G92 as it sits. Not with AFR. And unless I'm seriously mistaken, AFR is your only option.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
You only have your interpretation of 2nd hand news to say otherwise ... everyone here who owned or owns one will TESTIFY to the same thing ... 'normal' at normal play - loud at full load.
in other words, you have ... nothing but your imagination
My 8800 Ultra regularly spins up to 100% during gaming and it was definitely loud enough to be annoying before I started using headphones for gaming.

(Incidentally my choice to use headphones was not because of this, I just found I could hear more sounds from the game with them compared to speakers, but I digress).

So given the 2900 XT has thermal characteristics equal to or greater than the 8800 Ultra's, I'd say it would spin up often enough to be a factor.

Now I suppose if the 2900's temperature thresholds are significantly higher before spinning up (AFAIK my Ultra will spin up to 100% somewhere in the range of 75-85 C), it might be less of a factor.

I also suppose if someone ran their 2900 XT at low resolutions with low AA levels and vsync it wouldn't strain it as much either, but that is not my style.

i'd say my 2900xt and your ultra would both be annoying to a noise princess - and certainly annoying to everyone at 100%. For most people, they are not annoying ... and my 2900xt has - again - never EVER - hit 60% during gaming [the threshold of irritation]. --if it hit 100% i would immediately shut down my rig to find out what happened

i don't know about "low resolutions", but i always ran my 2900xt at 16x10 or 16x12 with as much AA as performance would allow ... and now i just find that i can use more AA/details at a higher level of performance with the *same* amount of noise - the 2900p, even OC'd, cannot be heard - or at least distinguished from the other sounds.

============

Originally posted by: keysplayr2003
"what *boggles* my mind is my little bastardized el cheapo XFire nearly equals the much more expensive 2900xt/2900xt Crossfire or a GTX ultra
WHY- Because i can Overclock them SEPARATELY ... CCC has improved that much!!!"



This is why I asked you if you are running AFR. Even if you can independently o/c each card to the moon, both cards will only render scenes as fast as the slowest card can. So, having said that, your Crossfire setup can only be as fast as two 2900 Pros o/c'd in x4 slots. Period. If you were able to run SFR, where there is a load balance (e.g. the 2900XT renders the top 65% of your scene and the 2900pro renders the bottom 35%), the 2900XT would be doing the heavy lifting in the PCI-e x16 slot, and the pro would take up whatever slack the load-balancing mechanism thinks it could handle.

That's why I asked which rendering method was being used. Which is?...

P.S. You paid 330 bucks for your XT. Then 150 on the pro. 480 total. GTXs can be had for just under 400 these days with rebates and around 445 without rebates. And with the new GTS, which is on par with a GTX and often beats it at stock clocks, that would be around 279 with rebates and 299 without.

The more I look at it, the more I think you could have done better with a single faster card, only because of the setup you have in particular. Why? Because I don't think your setup can beat a single 8800GTS G92 as it sits. Not with AFR. And unless I'm seriously mistaken, AFR is your only option.

i told you i did not know :p
-i do not know what type of rendering CCC uses currently. All i know is that it is "enabled" and that my 3DMark06 score nearly *matches* a 2900xt/2900xt Crossfire rig. AND that each GPU can be clocked separately with a noticeable effect on performance.

And i also *believe* that my system will leave a GTS-92/GTX in the dust and compete with the Ultra in my rig ... which *still* cannot be had for $480 - with or without a gaming bundle :p

don't forget ... this is my "step up" - i bought the first Part of it when you got your GTS in early Summer ... i paid $480 over 'time' with no interest charged and no cost to have it "exchanged"

i think you would SLI your own GTS IF you could get another brand-new matching one for $150 :p
-and SLi'd G80-GTS640s would also kill an OC'd GTS-92 and GTX just like my rig does - even if your 2nd GPU was held back a little bit by being in a bandwidth-limited slot. It also appears that 256-bit makes no difference at my resolution.

You really think i should have gone thru the hassle of selling my 2900xt - possibly for less than $200 [or chance getting ripped off] and buying an Ultra for $500+ just to get a similar level of performance? Not my style.
:confused:

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: apoppin


============

Originally posted by: keysplayr2003
"what *boggles* my mind is my little bastardized el cheapo XFire nearly equals the much more expensive 2900xt/2900xt Crossfire or a GTX ultra
WHY- Because i can Overclock them SEPARATELY ... CCC has improved that much!!!"



This is why I asked you if you are running AFR. Even if you can independently o/c each card to the moon, both cards will only render scenes as fast as the slowest card can. So, having said that, your Crossfire setup can only be as fast as two 2900 Pros o/c'd in x4 slots. Period. If you were able to run SFR, where there is a load balance (e.g. the 2900XT renders the top 65% of your scene and the 2900pro renders the bottom 35%), the 2900XT would be doing the heavy lifting in the PCI-e x16 slot, and the pro would take up whatever slack the load-balancing mechanism thinks it could handle.

That's why I asked which rendering method was being used. Which is?...

P.S. You paid 330 bucks for your XT. Then 150 on the pro. 480 total. GTXs can be had for just under 400 these days with rebates and around 445 without rebates. And with the new GTS, which is on par with a GTX and often beats it at stock clocks, that would be around 279 with rebates and 299 without.

The more I look at it, the more I think you could have done better with a single faster card, only because of the setup you have in particular. Why? Because I don't think your setup can beat a single 8800GTS G92 as it sits. Not with AFR. And unless I'm seriously mistaken, AFR is your only option.

i told you i did not know :p
-i do not know what type of rendering CCC uses currently. All i know is that it is "enabled" and that my 3DMark06 score nearly *matches* a 2900xt/2900xt Crossfire rig.

And i also *believe* that my system will leave a GTS-92/GTX in the dust and compete with the Ultra in my rig ... which *still* cannot be had for $480 - with or without a gaming bundle :p

don't forget ... this is a "step up" - i bought the first Part of it when you got your GTS ... it is $480 over 'time' with no interest charged and no cost to have it "exchanged"

i think you would SLI your own GTS IF you could get another brand-new matching one for $150 :p
-and it would also kill a GTS-92 and GTX just like my rig does

For $150, I certainly would pick up a second GTS640. But since they can't be found for so little, I am considering selling mine in anticipation of a new 9800GX2. But not before I see how it performs in the games I play. Also, two 8800GTs might be in order. I really haven't decided what to do yet. Or lastly, by some miracle, if a new 9800 emerges. But that may not be for a while yet. I figure I can get $225 for my GTS, and that would go toward a new card. I have a 7600GT to tide me over if that time comes.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: apoppin
...
-i do not know what type of rendering CCC uses currently. All i know is that it is "enabled" and that my 3DMark06 score nearly *matches* a 2900xt/2900xt Crossfire rig.

And i also *believe* that my system will leave a GTS-92/GTX in the dust and compete with the Ultra in my rig ... which *still* cannot be had for $480 - with or without a gaming bundle :p

don't forget ... this is a "step up" - i bought the first Part of it when you got your GTS ... it is $480 over 'time' with no interest charged and no cost to have it "exchanged"

i think you would SLI your own GTS IF you could get another brand-new matching one for $150 :p
-and it would also kill a GTS-92 and GTX just like my rig does

For $150, I certainly would pick up a second GTS640. But since they can't be found for so little, I am considering selling mine in anticipation of a new 9800GX2. But not before I see how it performs in the games I play. Also, two 8800GTs might be in order. I really haven't decided what to do yet. Or lastly, by some miracle, if a new 9800 emerges. But that may not be for a while yet. I figure I can get $225 for my GTS, and that would go toward a new card. I have a 7600GT to tide me over if that time comes.

So we agree? :p

IF you had the SAME path i have, you would definitely consider it. SLi'd GTS-640s or Xfired 2900XTs make for a very fast video solution - at least very close to the fastest of the current single GPUs.

i just took a *chance* on a cheap 'n dirty upgrade that blew my mind
:D
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Because of this thread I'm really considering adding a second 3870. If I can find one for ~$200 or less, that would be a steal.

Btw apoppin plz put CrossFire in your Rig. :)

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: MegaWorks
Because of this thread I'm really considering adding a second 3870. If I can find one for ~200 or less that would be a steal.

Btw apoppin plz put CrossFire in your Rig. :)

CrossFire IS in my rig :p
-also in my sig since i set it up ... Do you think "VT HD2900xt/Sapphire 2900Pro" gives a clue? --or perhaps someone will think i am running them separately and still getting over 13K in 3DMark06?
:confused:

:D

a 2nd 3870 should give impressive results.

Btw, here is my 2900P:
*be careful* ... the advertising might be *NSFW*
[depends on where you work]

http://xtreview.com/addcomment...56-Bit-memory-bus.html
The frequencies are the same - they are equal to 600/1600 MHz. This Radeon HD 2900 Pro version differs from the Radeon HD 2900 GT by the memory capacity and the availability of all 320 stream processors.

ALSO .. here is 2900xt/3870/8800GT compared ... i think either of our setups in Xfire will eat a single GT for breakfast, lunch OR dinner
:cool:

Here