Mixed Crossfire action

Sylvanas

Diamond Member
Not many people like Fud, but I'll post this anyway for those who may be thinking of adding another card to their setup. Looks good for anyone who might want to add another 38xx-series card later down the track... now I'm tempted to combine a 3870X2 with my 3850s :thumbsup:.
 

apoppin

Lifer
i am impressed ... and it appears that you can now 'mix and match' the clock speeds ... they used to have to be identical

Catalyst 8.1 correctly identifies the cards. We could see one HD3870 and one HD3850 card and we could access them separately in the Overdrive section. You can overclock them, and the driver allowed us to change GPU and memory clocks on each card regardless of the other card's settings. This didn't give us much in terms of performance, but we're still pleased to see that you can mix these RV670-based cards and set the clocks independently.

i can overclock my 2900XT separately from my 2900 Pro
-live and learn ... thanks for the link

and it appears that my 3DMark06 score is in line with their mix 'n match
[edit - nope, mine is higher :)]... but what are they measuring in HL2 - average?
-mine is way higher ... but probably not the same bench


damn is it THAT late?
:clock:

g'nite!
:moon:


 

betasub

Platinum Member
apoppin: Are you using Catalyst 8.1 for your 2900 crossfire? Unlinking the GPU clocks should allow you to find how much extra you can gain from the second card before you become totally CPU or PCIe 4x limited.
 

apoppin

Lifer
Originally posted by: betasub
apoppin: Are you using Catalyst 8.1 for your 2900 crossfire? Unlinking the GPU clocks should allow you to find how much extra you can gain from the second card before you become totally CPU or PCIe 4x limited.

yep ... and you no longer have to "unlink" them to clock them separately
[i figured it out at 2AM before i shut down for self-maintenance :D]

CCC is THAT smart :)
[unfortunately i am not when i am exhausted :p]

added -

what *boggles* my mind is that nvidia had a year's head start over AMD with multi-GPU ... and yet AMD's CrossFire has surpassed nvidia's SLI - in *every* way
--i don't think anyone has to worry about future support for X2 GPUs from AMD

CrossFire finished with my Pro at 686/840 and my XT at 743/828, and i settled on 700/840 for the Pro and 752/849 for the XT ... the XT is evidently not held back like in earlier editions of Xfire
 

betasub

Platinum Member
Thanks for posting your results: I'm particularly interested in how well your results scale because I have a very similar set-up (E4500 @ 3.33GHz, MSI P35) with a single 2900pro 512MB/512-bit. The GPU core will run at 825MHz+, but with only minor increases in 3DMark06, whereas pushing the CPU from 3.20 to 3.33GHz gained 200 points to break the 11k mark.

I can add either a 512-bit pro, or a cheaper 256-bit pro, to go crossfire. Either version would put me in your situation: CPU-limited, sure, but what about the effect of the PCIe 4x slot on the second card?
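For a rough sense of what that second slot can move, here is a back-of-the-envelope sketch (assuming these P35/AM2-era boards run PCIe 1.x, where each lane carries roughly 250 MB/s per direction; the numbers are illustrative, not measured):

```python
# Rough PCIe 1.x slot bandwidth (assumption: PCIe 1.x, ~250 MB/s usable per lane per direction)
LANE_MB_S = 250  # 2.5 GT/s with 8b/10b encoding leaves roughly 250 MB/s per lane

for lanes in (16, 4):
    print(f"x{lanes} slot: ~{lanes * LANE_MB_S / 1000:.1f} GB/s per direction")

# x16 slot: ~4.0 GB/s per direction
# x4 slot:  ~1.0 GB/s per direction
```

So the second card has roughly a quarter of the slot bandwidth of the first; how much that hurts presumably depends on how much traffic has to cross the bus each frame rather than stay in local VRAM.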
 

apoppin

Lifer
Overclocking the Pro more in the 2nd slot did nothing for my 3DMark06 score ... still 13K ... i am *guessing* it is pretty much at its point of diminishing returns and also bandwidth limited by the 4x slot

i am further guessing that the only way to get a further performance increase is to OC my XT a lot further and flash the BIOS of the Pro, but then i am at the point of diminishing returns :p
- and i DO plan to sell them on FS/T ... i don't want my buyer to have burned-out junk :p

i think the solid increase in some games i noted makes it a worthy "$150 upgrade" - especially in CoJ and LP - the DX10 pathway actually allowing me to really "play" them now with minimal adjustments downward - and note that i am benchmarking them with 4xAA/16xAF and at 16x10 - they were completely unplayable even at 10x7 back in May. There is nothing else i can do for $150 that will get me into Ultra territory. ;)
:Q

!
 

betasub

Platinum Member
It must be nice to be getting a decent boost from a cheap, secondary "support" card (compared to the XT!). In fact my local price for the 256-bit pro has dropped so low that the 512-bit is now >50% more expensive. Sapphire must have a container-load of R600 GPUs to sell, and they're being priced for the budget market with 256-bit memory instead of the original, expensive 512-bit memory. At this rate I could sell my 512-bit pro and pick up two 256-bit cards... :Q

apoppin, I'm looking for a comparison of similarly clocked R600s with 256-bit vs 512-bit memory. I know you are busy with your crossfire benchmarks, but if you know of any online comparison, could you please point me to the URL?
 

apoppin

Lifer
Originally posted by: betasub
It must be nice to be getting a decent boost from a cheap, secondary "support" card (compared to the XT!). In fact my local price for the 256-bit pro has dropped so low that the 512-bit is now >50% more expensive. Sapphire must have a container-load of R600 GPUs to sell, and they're being priced for the budget market with 256-bit memory instead of the original, expensive 512-bit memory. At this rate I could sell my 512-bit pro and pick up two 256-bit cards... :Q

apoppin, I'm looking for a comparison of similarly clocked R600s with 256-bit vs 512-bit memory. I know you are busy with your crossfire benchmarks, but if you know of any online comparison, could you please point me to the URL?

it is nice ... and very unexpected

just google your keywords

examples:

http://www.theinquirer.net/gb/...01/ati-rv670-beat-r600
This will probably be the first time in history that the part with the narrower memory controller (256-bit) beats the wider part (512-bit), at least in some tests. We were told to pay special attention to titles that are heavy on longer shader code. Expect the Radeon HD 2900XT to speed past the RV670 in all tests that require heavy bandwidth load, such as FullHD (1920x1200) or XHD (2560x1600) variants. Bear in mind that AMD will push for multi-GPU configurations for these users, and if you opt for a single-slot board, it should be no problem to put 2-3-4 boards in the same system.

ARS technica went into it a bit here, i think:

http://arstechnica.com/reviews...n-hd-3850-review.ars/2
In addition to its DirectX 10.1 support, the HD 3800 series offers several other features that differentiate it from its R600-based brethren. The new core retains the R600's 512-bit ring bus, but substitutes a higher-efficiency 256-bit memory bus for the old 512-bit configuration. ATI maintains that the increased efficiency of the new memory bus will generally compensate for the bandwidth loss.

The HD 3800 series has also been updated to compute double-precision floating point data, as opposed to the single precision the R600 was capable of.

This compares 3870 with 2900xt and 8800GT ... i am certain CrossFire will eat the GT for breakfast.

http://it-review.net/index.php?option=com_content&task=view&id=2290&Itemid=91&limit=1&limitstart=6

this review might be worth looking at also:

http://www.extremetech.com/art.../0,1697,2217138,00.asp

i remember the discussions here that the 512-bit for the 2900xt was OverKill ... evidently :p
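The arithmetic behind the 256-bit vs 512-bit debate is straightforward: peak memory bandwidth is just the effective (double-pumped) memory clock times the bus width in bytes. A quick sketch, using commonly quoted clocks as assumptions:

```python
# Peak memory bandwidth ~= memory clock (MHz) x data-rate multiplier x bus width (bytes)
def peak_bw_gb_s(mem_clock_mhz, bus_bits, data_rate=2):
    """Rough peak bandwidth in GB/s for a DDR-type memory bus."""
    return mem_clock_mhz * data_rate * (bus_bits // 8) / 1000

print(peak_bw_gb_s(849, 512))   # 2900 XT at the 849MHz memory used in this thread: ~108.7 GB/s
print(peak_bw_gb_s(825, 512))   # 2900 XT reference memory clock:                   ~105.6 GB/s
print(peak_bw_gb_s(1125, 256))  # 3870 reference (256-bit GDDR4):                   ~72.0 GB/s
```

On paper the RV670 cards give up roughly a third of the XT's bandwidth and rely on the more efficient memory controller (per the Ars quote above) to close the gap, which is presumably why the difference only really shows at very high resolutions.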
 

Killrose

Diamond Member
I have a Gigabyte AM2 board that supports crossfire but only has one 16x and one 4x PCIe slot. I wonder what the penalty would be in that scenario? Do both slots become 4x only?
 

Stoneburner

Diamond Member
so when they get crossfire working for more than 2 GPUs (i believe they just need to release the driver), then i can run a 3870X2 plus my 3870 and have triple crossfire and run starcraft at unheard-of levels of detail?
 

apoppin

Lifer
Originally posted by: Killrose
I have a Gigabyte AM2 board that supports crossfire but only has one 16x and one 4x PCIe slot. I wonder what the penalty would be in that scenario? Do both slots become 4x only?

no ... not at all ... what happens is that the bandwidth of the *2nd* card is restricted - 10-25% in extreme cases; you get the "horsepower" of the first card [clocked to its highest OC], with the 2nd card contributing whatever it can - also on its own separate OC.

we are still unsure of what a 256-bit card coupled with a 512-bit card does to the wider-bus card ... but i am surmising it is *moot* in my case since my resolutions only go up to 16x12.
 

manimal

Lifer
If driver updates continue on this path, then ATI will increase market share. This type of flexibility and ease of use can only help them. Dual screen over xfire is compelling in its own right.
 

Killrose

Diamond Member
Originally posted by: apoppin
Originally posted by: Killrose
I have a Gigabyte AM2 board that supports crossfire but only has one 16x and one 4x PCIe slot. I wonder what the penalty would be in that scenario? Do both slots become 4x only?

no ... not at all ... what happens is that the bandwidth of the *2nd* card is restricted - 10-25% in extreme cases; you get the "horsepower" of the first card [clocked to its highest OC], with the 2nd card contributing whatever it can - also on its own separate OC.

we are still unsure of what a 256-bit card coupled with a 512-bit card does to the wider-bus card ... but i am surmising it is *moot* in my case since my resolutions only go up to 16x12.

Nice to know, thanks. It would be interesting to see the crossfire performance difference between two 16x PCIe slots vs one 16x and one 4x, to know if it is even an option to consider. Certainly not into buying another 3850 to try it out. Even then, not sure my X2 5000+ would be able to push it.
 

LOUISSSSS

Diamond Member
yea it's amazing how you can mix and match cards now to create your own level of performance.
dual monitors are great for gamers that also like to be productive, and the driver support looks great.

if they keep this up i'll be upgrading sooner than i thought.
 

betasub

Platinum Member
Originally posted by: apoppin
i remember the discussions here that the 512-bit for the 2900xt was OverKill ... evidently :p

Thinking about this, I realised that all I needed to do to simulate the 256-bit card was run my 512-bit card at half memory clock. Using 3DMark06 as a convenient bench, the results were surprising to me - I had accepted the conventional wisdom - but they agree with some of your links. (CPU scores included just to show the underlying platform is unchanged.)

2900 clocks (core/mem)....850/850 ... 850/425
Bandwidth (GB/s)..........108.8 ... 54.4
3DMark Score..............11127 ... 9791
SM2.0 Score...............4526 ... 3801
GT1 (fps).................34.568 ... 28.001
GT2 (fps).................40.868 ... 35.347
HDR/SM3.0 Score...........5314 ... 4542
HDR1 (fps)................48.507 ... 42.028
HDR2 (fps)................57.777 ... 48.819
CPU Score.................2889 ... 2907
CPU1 (fps)................0.916 ... 0.923
CPU2 (fps)................1.458 ... 1.466

OK - no more than 10 fps difference in any one game test, but enough of a difference to show that bandwidth does matter at 1280x1024. And for gaming at higher resolutions, while feeding two GPUs in crossfire, the effect is only going to get larger.
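To put those numbers in proportion, here is a quick sketch that just recomputes the percentage change of each subtest from the table above when the memory clock (and hence bandwidth) is halved - the GPU-bound tests drop roughly 12-19%, while the CPU tests are flat:

```python
# Percentage change in each 3DMark06 subtest at half memory bandwidth
# (values copied from the table above: 850/850 vs 850/425 on a 512-bit 2900)
results = {
    "3DMark Score": (11127, 9791),
    "SM2.0 Score":  (4526, 3801),
    "GT1 (fps)":    (34.568, 28.001),
    "GT2 (fps)":    (40.868, 35.347),
    "HDR/SM3.0":    (5314, 4542),
    "HDR1 (fps)":   (48.507, 42.028),
    "HDR2 (fps)":   (57.777, 48.819),
    "CPU Score":    (2889, 2907),
}

for name, (full_bw, half_bw) in results.items():
    change = (half_bw - full_bw) / full_bw * 100
    print(f"{name:13} {change:+5.1f}%")

# GPU tests: -12.0% to -19.0%; CPU Score: +0.6% (noise)
```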

Now to try out that PCIe 4x slot :D
 

apoppin

Lifer
Originally posted by: betasub
Originally posted by: apoppin
i remember the discussions here that the 512-bit for the 2900xt was OverKill ... evidently :p

Thinking about this, I realised that all I needed to do to simulate the 256-bit card was run my 512-bit card at half memory clock. Using 3DMark06 as a convenient bench, the results were surprising to me - I had accepted the conventional wisdom - but they agree with some of your links. (CPU scores included just to show the underlying platform is unchanged.)

2900 clocks (core/mem)....850/850 ... 850/425
Bandwidth (GB/s)..........108.8 ... 54.4
3DMark Score..............11127 ... 9791
SM2.0 Score...............4526 ... 3801
GT1 (fps).................34.568 ... 28.001
GT2 (fps).................40.868 ... 35.347
HDR/SM3.0 Score...........5314 ... 4542
HDR1 (fps)................48.507 ... 42.028
HDR2 (fps)................57.777 ... 48.819
CPU Score.................2889 ... 2907
CPU1 (fps)................0.916 ... 0.923
CPU2 (fps)................1.458 ... 1.466

OK - no more than 10 fps difference in any one game test, but enough of a difference to show that bandwidth does matter at 1280x1024. And for gaming at higher resolutions, while feeding two GPUs in crossfire, the effect is only going to get larger.

Now to try out that PCIe 4x slot :D

huh? why half the memory clock?
:confused:

my XT is at 752/849MHz and my Pro is at 700/840MHz; their separate clocks are maintained by CCC without restricting each other in any way.

However, the bandwidth of the Pro IS restricted by the 4x PCIe slot - maybe ~10% - and the 256-bit "penalty", if it were also applied to the XT, would only be severe at high resolutions ... maybe another 10% @ 16x10
-- after all is said and done, when CrossFire scales - and it has in the popular titles i am playing - it scales very well in my system; i am playing at higher detail PLUS AA/AF, with a better FPS bottom/average/top than with a single 2900xt, which was sufficient until i decided i needed to run DX10 games in the DX10 pathway ... my success with Hg:L is amazing ... the graphics have actually "hooked" me back into replaying it for about the 5th time :p