New 3870X2: Crossfire a no-show


nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
you will get the "deluxe" experience with Hg:L with Vista 64

hmmm... I dunno about all that. It plays in DX10 mode decently enough, but I'm not able to use all the goodies. It probably plays about like an 8800GTX.

GoW plays pretty well though. So far, it's about what I expect from a $450 card. I do hope the drivers improve over time though.

I'm honestly just kind of stoked about running a completely non-NVIDIA machine. It's been years. Not that there's anything wrong with NVIDIA, but I like change sometimes as well.

The card is really quiet, which is nice. I was a little worried about it being noisy, but it's not at all. Just the steady hum of a quality video card. :)
 

Evenkeel

Member
Sep 3, 2004
189
0
0
Originally posted by: Sylvanas
Although the X2 is essentially a Crossfire card, AMD hid the 'Crossfire' tab in the CCC because the card acts as a single-card solution- there's no point having a 'Crossfire' tab when both GPUs are used by default. To use Overdrive you have to click the lock at the top left; once you do, you'll have access to frequency control. However, to use Overdrive you'll need an 8-pin connector plus a 6-pin connector plugged in- not just two 6-pins. There are ways to get Overdrive with only two 6-pins, but it involves shorting out two pins on the 8-pin connection and is only really recommended if you know exactly what you're doing :).

EDIT: GPU-Z will show you two cores, but CCC won't- that's fine, normal, and how things work. Additionally, the core/mem clocks are unified between both GPUs.

Thanks... I did catch the "Overdrive" lock button tonight--that's what happens when I try to solve problems at 3 a.m.

BTW, I do have the proper power connectors connected to the card: one PCIe 8-pin, and one PCIe 6-pin, coming off a PC Power & Cooling Turbo-Cool 860.

On the Crossfire issue... Okay, I sort of get it that a Crossfire tab isn't needed if both GPUs are acting as one. Notice I said if. I'm still not sure they are--it still looks like CCC is showing me 2 cores. (But if they are, please chalk up my questioning to inexperience.)

Anyway, here's what I'm seeing:

1) In CCC, in "ATI Overdrive", the "Select GPU to configure" dropdown list shows 2 entries for the card, both reading the same thing: "ATI Radeon HD 3870 X2". To my inexperienced eyes, it looks like there are 2 separate GPUs to configure.

2) In the "Information Center", in "Graphics Hardware", I see "Memory Size" listed as 512 MB. There's supposed to be 1 GB on the card--why does it only say 512 MB?

3) When I right-click the CCC icon down in the task tray for the popup menu, there are 2 listings for the card, numbered "1" and "2", and both again say "ATI Radeon HD 3870 X2". But the flyout menu for #1 has sub-flyouts for "Rotate Display", "Set desktop area to", and "3D Settings" with its large sub-flyout menu of additional 3D options, while the flyout menu for "card" #2 only has "Extended desktop" available. If the 2 GPUs are acting as one, why do I see 2 listings?

4) I know you said GPU-Z shows 2 cores, but shouldn't the data it shows on both cores be pretty much the same? For example, on one core, it says the "Default Clock" is running at 825 MHz, and the "Memory" at 901 MHz. But on the other core, it says the "Default Clock" is running at 411 MHz, and the "Memory" at 450 MHz. And at the bottom, it says "ATI Crossfire" is disabled. Under the "Sensors" tab, the "Fan Speed" for one GPU is currently running at 35%, while the fan speed for the other GPU is listed at 0%.

I know I'm a terrible n00b asking stuff like this, but I want to learn. And I also want to make sure the card is operating the way it's supposed to. I appreciate your answering these questions.
 

Evenkeel

Member
Sep 3, 2004
189
0
0
Originally posted by: BFG10K
Yep, having no Crossfire setting for the 3870 X2 is normal. You can confirm it's Crossfired by the presence of 16xAA.

Again, pardon my inexperience--what is "16xAA" and where do I find it in CCC?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Evenkeel
Originally posted by: BFG10K
Yep, having no Crossfire setting for the 3870 X2 is normal. You can confirm it's Crossfired by the presence of 16xAA.

Again, pardon my inexperience--what is "16xAA" and where do I find it in CCC?

AA = anti-aliasing. If you can set the slider in CCC to 16x samples (all the way to the right), that confirms Crossfire is operational.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Evenkeel
Originally posted by: BFG10K
Yep, having no Crossfire setting for the 3870 X2 is normal. You can confirm it's Crossfired by the presence of 16xAA.

Again, pardon my inexperience--what is "16xAA" and where do I find it in CCC?

In the CCC > 3D > Anti-Aliasing... (make sure you enable 'Advanced View' in the View menu)

"16x" refers to the level of anti-aliasing (AA) available. With a single card/singe gpu setup the max value on the slider is 8x.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: Evenkeel
Originally posted by: Sylvanas
Although the X2 is essentially a Crossfire card, AMD hid the 'Crossfire' tab in the CCC because the card acts as a single-card solution- there's no point having a 'Crossfire' tab when both GPUs are used by default. To use Overdrive you have to click the lock at the top left; once you do, you'll have access to frequency control. However, to use Overdrive you'll need an 8-pin connector plus a 6-pin connector plugged in- not just two 6-pins. There are ways to get Overdrive with only two 6-pins, but it involves shorting out two pins on the 8-pin connection and is only really recommended if you know exactly what you're doing :).

EDIT: GPU-Z will show you two cores, but CCC won't- that's fine, normal, and how things work. Additionally, the core/mem clocks are unified between both GPUs.

Thanks... I did catch the "Overdrive" lock button tonight--that's what happens when I try to solve problems at 3 a.m.

BTW, I do have the proper power connectors connected to the card: one PCIe 8-pin, and one PCIe 6-pin, coming off a PC Power & Cooling Turbo-Cool 860.

On the Crossfire issue... Okay, I sort of get it that a Crossfire tab isn't needed if both GPUs are acting as one. Notice I said if. I'm still not sure they are--it still looks like CCC is showing me 2 cores. (But if they are, please chalk up my questioning to inexperience.)

Anyway, here's what I'm seeing:

1) In CCC, in "ATI Overdrive", the "Select GPU to configure" dropdown list shows 2 entries for the card, both reading the same thing: "ATI Radeon HD 3870 X2". To my inexperienced eyes, it looks like there are 2 separate GPUs to configure.

2) In the "Information Center", in "Graphics Hardware", I see "Memory Size" listed as 512 MB. There's supposed to be 1 GB on the card--why does it only say 512 MB?

3) When I right-click the CCC icon down in the task tray for the popup menu, there are 2 listings for the card, numbered "1" and "2", and both again say "ATI Radeon HD 3870 X2". But the flyout menu for #1 has sub-flyouts for "Rotate Display", "Set desktop area to", and "3D Settings" with its large sub-flyout menu of additional 3D options, while the flyout menu for "card" #2 only has "Extended desktop" available. If the 2 GPUs are acting as one, why do I see 2 listings?

4) I know you said GPU-Z shows 2 cores, but shouldn't the data it shows on both cores be pretty much the same? For example, on one core, it says the "Default Clock" is running at 825 MHz, and the "Memory" at 901 MHz. But on the other core, it says the "Default Clock" is running at 411 MHz, and the "Memory" at 450 MHz. And at the bottom, it says "ATI Crossfire" is disabled. Under the "Sensors" tab, the "Fan Speed" for one GPU is currently running at 35%, while the fan speed for the other GPU is listed at 0%.

I know I'm a terrible n00b asking stuff like this, but I want to learn. And I also want to make sure the card is operating the way it's supposed to. I appreciate your answering these questions.

1) Even better- when there are two dropdowns in the Overdrive tab and you can configure both (that is, move the core and memory sliders for each), it means you can adjust the clocks individually for each core on your card. That's a great new feature; I wasn't aware you could adjust both cores individually on the X2.

2) 512 MB is allocated to each core, so you have 1 GB on the card physically, but it's *really* 512 MB per core.

3) It's the same for my Crossfire setup. I'd guess the second listing with 'extended desktop' is the means to enable dual-monitor Crossfire acceleration. It's of no consequence if you aren't using dual monitors; all settings within the CCC will affect both cores.

4) GPU-Z reads 'default clocks' as the speed the BIOS determines the GPUs will run at in a 3D application. First you have to understand that when you're on the desktop, all modern ATI GPUs drop into a '2D PowerPlay mode' in which the clock speeds are lowered, as are certain voltages. This cuts power consumption as well as heat, since you don't need full power all the time if you're just loading a Word document. These 2D clocks are what the CCC reports as 'Current clock settings' in Overdrive. When you start a 3D application, the clock speeds and voltages are restored to their full-power 3D settings.

As for the GPU-Z Crossfire reading, the application is buggy, so don't trust it all of the time. It's a known bug that it reports Crossfire as disabled when in fact it's enabled; I've come across this (others have as well), primarily in Vista 64- see the TechPowerUp GPU-Z forum for more details. The Sensors tab is only really worth it to view temps, but then again the CCC does that as well. Fan speed is best adjusted in RivaTuner if you really need to; you can download it from Guru3D. The fan speed is also 'dynamic' in that it spins faster or slower depending on load and temperatures, so expect it to be slow in 2D mode, when nothing is happening and the card isn't getting hot.
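The clock readings quoted earlier actually fit this explanation: 411/825 MHz core and 450/901 MHz memory are roughly half clocks, meaning one core was idling in 2D PowerPlay mode when GPU-Z sampled it. A minimal sketch (the function and threshold are hypothetical, using the numbers from the thread):

```python
# Rough sketch: infer whether a GPU core is idling in 2D PowerPlay mode
# by comparing its current clock to the BIOS default (3D) clock.
# Illustrative only - not part of GPU-Z, CCC, or any real tool.

def powerplay_state(current_mhz, default_mhz, threshold=0.9):
    """Return '3D' if the core is at (or near) its full default clock,
    otherwise assume it has throttled down to a 2D power state."""
    return "3D" if current_mhz >= threshold * default_mhz else "2D"

# Core 1: at full 3D clocks (825 MHz core, per the GPU-Z reading above)
print(powerplay_state(825, 825))  # -> 3D
# Core 2: throttled to roughly half clocks on the desktop (411/825)
print(powerplay_state(411, 825))  # -> 2D
```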

I will edit my post later if I made any mistakes- I am pretty tired ATM :p, I hope that clears a few things up anyway. Enjoy your X2!
 

dadach

Senior member
Nov 27, 2005
204
0
76
i tried those Chinese drivers... they don't recognize my X2 boards... does anyone have a beta driver for CrossfireX? that's the only time the Crossfire tab should show up in Catalyst... ok, it shows even with current drivers and two X2s, but there is no option to enable it... or do i have to wait a week :D
 

Evenkeel

Member
Sep 3, 2004
189
0
0
Originally posted by: Sylvanas
Originally posted by: Evenkeel
Originally posted by: BFG10K
Yep, having no Crossfire setting for the 3870 X2 is normal. You can confirm it's Crossfired by the presence of 16xAA.

Again, pardon my inexperience--what is "16xAA" and where do I find it in CCC?

AA = anti-aliasing. If you can set the slider in CCC to 16x samples (all the way to the right), that confirms Crossfire is operational.

Okay, I found it. It was set to "Use application settings"--i.e., the checkbox was checked. I unchecked it and was indeed able to move the slider all the way to the right, to 16x. However, once I got past 8x and reached 16x, the "Temporal anti-aliasing" checkbox and the "Filter" drop-down both grayed out. Is that normal when the slider is at 16x?

Also, is it better at this point to just leave it checked to "Use application settings", at least until I can learn what the hell I'm doing?
 

Evenkeel

Member
Sep 3, 2004
189
0
0
Originally posted by: Sylvanas

1) Even better- when there are two dropdowns in the Overdrive tab and you can configure both (that is, move the core and memory sliders for each), it means you can adjust the clocks individually for each core on your card. That's a great new feature; I wasn't aware you could adjust both cores individually on the X2.

2) 512 MB is allocated to each core, so you have 1 GB on the card physically, but it's *really* 512 MB per core.

3) It's the same for my Crossfire setup. I'd guess the second listing with 'extended desktop' is the means to enable dual-monitor Crossfire acceleration. It's of no consequence if you aren't using dual monitors; all settings within the CCC will affect both cores.

4) GPU-Z reads 'default clocks' as the speed the BIOS determines the GPUs will run at in a 3D application. First you have to understand that when you're on the desktop, all modern ATI GPUs drop into a '2D PowerPlay mode' in which the clock speeds are lowered, as are certain voltages. This cuts power consumption as well as heat, since you don't need full power all the time if you're just loading a Word document. These 2D clocks are what the CCC reports as 'Current clock settings' in Overdrive. When you start a 3D application, the clock speeds and voltages are restored to their full-power 3D settings.

As for the GPU-Z Crossfire reading, the application is buggy, so don't trust it all of the time. It's a known bug that it reports Crossfire as disabled when in fact it's enabled; I've come across this (others have as well), primarily in Vista 64- see the TechPowerUp GPU-Z forum for more details. The Sensors tab is only really worth it to view temps, but then again the CCC does that as well. Fan speed is best adjusted in RivaTuner if you really need to; you can download it from Guru3D. The fan speed is also 'dynamic' in that it spins faster or slower depending on load and temperatures, so expect it to be slow in 2D mode, when nothing is happening and the card isn't getting hot.

I will edit my post later if I made any mistakes- I am pretty tired ATM :p, I hope that clears a few things up anyway. Enjoy your X2!

Yes this does clear up quite a bit. So bottom line, my X2 is functioning normally?

I think I'm pretty tired ATM too, but it's still a bit hazy: is the card actually running in Crossfire mode? I know the 16xAA test says it's capable, but is it actually running as a Crossfire board right now?

I'm also a bit unclear on why ATI doesn't show the full 1 GB of memory in the "Information Center | Graphics Hardware" section. Shouldn't they at least say 512 MB x2, or something--so nervous n00bs like me won't keep asking dumb-ass questions?

Also, just to check the quality of my new build and the performance of the card's and case's cooling fans: right now CCC says there is 0% GPU activity, and the GPU temp is 57C. I know these cards run warm, but this being my first experience with one, I just want to make sure everything is working right before I start loading the system up with other stuff.

If I want a good website or two for a crash-course in all this to educate myself, where would you recommend I look?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Yes, everything is functioning normally. You'll want to keep AA set to 'application preference' for now. I suggest you read the TweakGuides Catalyst guide to learn a few more of the ins and outs of the CCC; once you understand that, it'll be easier to play with settings to find your best config. Might as well check out the AnandTech X2 article as well.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Evenkeel
Originally posted by: Borealis7
run PongMark. immediately.

I Googled this, but got no results. What is it, and where do I find it?

He was being rude .. sarcastic at the least.

You should look for 3DMark06 ... it's a graphics benchmark by Futuremark that gives you the relative performance of similar systems.

And it now appears that your rig is running OK .. Sylvanas gave a LOT of very useful info!
:thumbsup:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
apoppin, don't hold out on him.

Pongmark is a very difficult application to use, but it can yield some startling results. One time I beat roger federer at it, in fact. I think my gpu was up to 891 core on that round ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
apoppin, don't hold out on him.

Pongmark is a very difficult application to use, but it can yield some startling results. One time I beat roger federer at it, in fact. I think my gpu was up to 891 core on that round ;)

hey, i still have Pong :p
what was that 1972 or '73?
:Q

i think it still works