First glance at 3870X2 Crossfire X results.

Page 3

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Who is to blame is irrelevant. I get to choose between a buggier, sometimes faster sometimes slower (usually faster), more expensive, more power-consuming (needs a more expensive PSU and costs more in electricity), and more heat-generating beast, or I can go with a consistent, stable, less buggy card. So I make my choice. A year from now the situation might change and I might make a different choice, but in the past 4 years the single GPU has always been the better solution in my opinion.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: nRollo



You and taltamir are both right.

In the main body of my post I was only thinking of 3870X2 vs single GPU; in the edit I added later I was thinking of 3870X2 vs GX2 and how the GX2's multi-card limitations are lessened by the additional driver flexibility. The GX2 would still have some multi-card limitations, though, so a single card offering similar performance would be "better" and/or worth more money.

I see this as the main problem with R6XX and ATi's stance from the beginning that multi-GPU is how they would combat the 8800U and 8800GTX. Multi-GPU has inherent limitations, more so with Crossfire, even though it's come light years from where it was just last year.

To me there are two reasons to go multi-GPU.

Best: To get a level of performance unavailable with single cards, accepting that there will be variable scaling, more tweaking, and more expensive hardware.

Second: You can't afford that level of performance now, but realize that adding a second card down the road, picked up used or discounted, may be cheaper than selling your old card at a big loss and buying new high end at launch price. (In short: flexibility.)

A single 3870X2 doesn't offer consistently (appreciably) higher performance than an 8800U or OC'd GTX, so it's more a hobbyist's solution. An interesting alternative.

This of course may all change when the drivers for Quadfire launch, as the one set of benches I've seen offers good scaling, at which point it may be a more attractive high-end dual-card solution.

My $.02

Is there a feature to block a specific user on this forum? Hearing from nv on nv is ok, but this is crap.
 

gingerstewart55

Senior member
Sep 12, 2007
242
0
0
Originally posted by: taltamir
Who is to blame is irrelevant. I get to choose between a buggier, sometimes faster sometimes slower (usually faster), more expensive, more power-consuming (needs a more expensive PSU and costs more in electricity), and more heat-generating beast, or I can go with a consistent, stable, less buggy card. So I make my choice. A year from now the situation might change and I might make a different choice, but in the past 4 years the single GPU has always been the better solution in my opinion.

Damn, for a minute there I thought you were talking about the 8800 Ultra as the buggy, faster, more expensive, more power-consuming and hotter-running video card.........wait....outside of the buggy snide remark (which I don't think fits either card), you could almost as easily have been speaking of the Ultra instead of the X2.
 

gingerstewart55

Senior member
Sep 12, 2007
242
0
0
Originally posted by: ronnn


Is there a feature to block a specific user on this forum? Hearing from nv on nv is ok, but this is crap.

Ehhh.....what else do you expect from Rollo? He's been giving BJ's to nVidia for years.......and now seems to have a circle jerk going with some of his boy-buds.
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: Gikaseixas

It usually performs better than two 3870 cards in Crossfire due to the PCIe bridge chip on the card and also higher core speeds. This card is not perfect; sometimes it won't excel at certain games due to drivers, but that's only here and there. For the majority of games it trounces the competition, and that's why sites like Anandtech gave it the title of the best single card out there.
Competition is good, I'm loving it. We need another GTX or 9700 Pro; this is not one, but it might ignite ATI or Nvidia to make one.

That is only in the blockbuster games that are commonly benchmarked on sites, and that also only includes things from the last year or two. The thing with these multi-GPU setups is you never know what they will do in a randomly chosen game that is not necessarily popular, or if they will even work properly at all. This could change in the future, if the companies put more resources towards improving the drivers (or find a way to do all the scaling in hardware), but as things stand my experience has been that these setups are more trouble than they're worth.

I want a high end card really badly by this point but am not considering this 3870 X2, as I don't like the idea of a new card making me downgrade my resolution or refresh rate. :p (similar comments apply for the 9800GX2; I use vsync in almost everything and have no intention of giving that up)

And yes, competition is always good. As you said, if nothing else, the X2 should at least spur Nvidia into releasing something new.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: gingerstewart55
Ehhh.....what else do you expect from Rollo? He's been giving BJ's to nVidia for years.......and now seems to have a circle jerk going with some of his boy-buds.

Actually I was comparing the usage of two Ultras to one Ultra, or the X2 or two 3870s to a single 3870, or two 8800GTs to a single GT.
The 3870X2 is actually slower than a single 3870 in some games, and very slightly faster (10% or so) in others... Sure, some games see an amazing 80% benefit over a single 3870, but not all. This has nothing to do with Nvidia vs ATI.
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
I just wish there was a dependable database of optimized settings for games/cards where you could reasonably determine if your preferred setup or upgrade is "supported" or beneficial.

There are clearly folks who are torqued one way or the other and don't really add a whole lot to the discussion or answer any questions ... a little objectivity would go a long way and improve the overall quality of this thread and ATF in general ...

(now shutting up)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Unfortunately there is no such database; there are, however, many individual reviews.
Tom's Hardware has a database, but it's horribly outdated, with the latest multi-GPU solution from AMD being the X2900...
Toms multi GPU database: http://www23.tomshardware.com/graphics_sli2007.html
Toms single GPU database: http://www23.tomshardware.com/graphics_2007.html

The biggest issue, though, is that the exact same hardware setup varies in performance (greatly for multi-GPU, slightly for single GPU) with the driver version used. Each driver version tends to improve multi-GPU performance in specific games a LOT!

Since Tom's Hardware does not retest each game with the latest drivers as they come out, each card ends up tested with a different driver version, and the numbers can't be compared meaningfully. (It could show a multi-GPU solution performing worse than a single-GPU solution, or only 10-20% faster in a specific game, because it was tested with drivers that didn't profile that game, when the current up-to-date drivers might give an 80% boost over a single card... so you never know.)
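To make the point concrete, here's a quick sketch (the frame rates are invented, purely for illustration) of how the same hardware can look like a dud or a winner depending on which driver a review happened to use:

```python
def scaling_gain(multi_fps: float, single_fps: float) -> float:
    """Percent frame-rate gain of a multi-GPU setup over a single card."""
    return (multi_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers for one game on identical hardware:
old_driver = scaling_gain(66.0, 60.0)   # no game profile yet: 10% gain
new_driver = scaling_gain(108.0, 60.0)  # profiled driver: 80% gain
print(f"old driver: +{old_driver:.0f}%, new driver: +{new_driver:.0f}%")
```

A database that mixes driver versions is effectively mixing those two rows, which is why the numbers end up meaningless for multi-GPU comparisons.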
 

NinjaJedi

Senior member
Jan 31, 2008
286
0
0
Originally posted by: ronnn
Is there a feature to block a specific user on this forum? Hearing from nv on nv is ok, but this is crap.

Yes go to your *my forums* page and scroll to the bottom to the ignored users section.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Thanks for the info!! Can't get the feature to work for some reason, but soon will I hope.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
CP5670
I use vsync in almost everything and have no intention of giving that up

Actually you don't have to give up Vsync... see this thread - you can use D3D Overrider (doesn't RivaTuner have something like this?) to force Vsync.
 

GonePlaid

Member
Jan 12, 2008
39
0
0
Got my 3870X2 on Thursday and have been putting it through its paces.

F.E.A.R. using FRAPS in-game benchmarking, everything set to maximum, 4x FSAA, 16x anisotropic.

Frames: 116590, Time: 559640 ms, Min: 0, Max: 1556, Avg: 208.33 fps.

Min: 0 is when the game does its auto-save checkpoint.
Max: 1556 is when I went into one of the menus in the game, so the score is a bit lopsided.

I ran the benchmark Thursday night without going into the menu and averaged out to 153fps.

This is with the latest drivers installed. Oh, I did run 3DMark05 and 06; I don't remember the scores offhand except that 06 clocked my card in at almost 11k. I will rerun them and this time write the scores down.

The only game I seem to be running into trouble with so far is Jedi Academy. I get to a certain point in a level and the card just shuts off, and I have to reboot the computer.

I also have a HannsG 19" LCD monitor with a max of 75Hz at 1440x900. My old X1650 registered that fine and ran at it. This new card, probably due to a driver issue, only lets me set it to 60Hz. It recognizes that the monitor is 75Hz, and all the other video modes list 75Hz as well as 85Hz, but noooooo, there's no 75Hz option for my resolution. Grrrr.
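For what it's worth, the FRAPS average is just total frames divided by elapsed seconds, so the numbers above can be sanity-checked directly (a minimal sketch, using only the figures from the FRAPS line quoted in this post):

```python
def fraps_avg_fps(frames: int, time_ms: int) -> float:
    """Average FPS the way FRAPS reports it: total frames / elapsed seconds.

    The Min and Max columns don't enter this average directly, but the
    frames rendered during a 1556 fps menu burst do, which is how the
    average can land at 208 with the menu visit and 153 without it."""
    return frames / (time_ms / 1000.0)

# The run quoted above: 116590 frames over 559640 ms
print(round(fraps_avg_fps(116590, 559640), 2))  # → 208.33
```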
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GonePlaid
Got my 3870X2 on Thursday and have been putting it through its paces.

F.E.A.R. using FRAPS in-game benchmarking, everything set to maximum, 4x FSAA, 16x anisotropic.

Frames: 116590, Time: 559640 ms, Min: 0, Max: 1556, Avg: 208.33 fps.

This is with the latest drivers installed. Oh, I did run 3DMark05 and 06; I don't remember the scores offhand except that 06 clocked my card in at almost 11k. I will rerun them and this time write the scores down.

The only game I seem to be running into trouble with so far is Jedi Academy. I get to a certain point in a level and the card just shuts off, and I have to reboot the computer.

I also have a HannsG 19" LCD monitor with a max of 75Hz at 1440x900. My old X1650 registered that fine and ran at it. This new card, probably due to a driver issue, only lets me set it to 60Hz. It recognizes that the monitor is 75Hz, and all the other video modes list 75Hz as well as 85Hz, but noooooo, there's no 75Hz option for my resolution. Grrrr.

what is with the minimum of zero?

i also have a 'stutter' - twice, when you turn a corner and when you come up out of the water ... but it only drops to 30FPS with my 2900XT/Pro xfire ... but i am not getting anywhere near your averages or maximums.

i got 13090 in 3DMark06
 

GonePlaid

Member
Jan 12, 2008
39
0
0
Originally posted by: apoppin
what is with the minimum of zero?

i also have a 'stutter' - twice, when you turn a corner and when you come up out of the water ... but it only drops to 30FPS with my 2900XT/Pro xfire ... but i am not getting anywhere near your averages or maximums.

i got 13090 in 3DMark06

Oh, sorry about that - I was just editing my post to explain the min and max fps. The explanation is in my post above now.

Min: 0 is when the game does its auto-save checkpoint.
Max: 1556 is when I went into one of the menus in the game, so the score is a bit lopsided.

I ran the benchmark Thursday night without going into the menu and averaged out to 153fps.

 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: Sylvanas
CP5670
I use vsync in almost everything and have no intention of giving that up

Actually you don't have to give up Vsync... see this thread - you can use D3D Overrider (doesn't RivaTuner have something like this?) to force Vsync.

I think vsync works in Crossfire; I haven't heard of any problems with it, at least. I was referring to SLI there, and I know from experience that vsync is impossible to get in a large number of games, even with this or other similar programs. Not only that, but the resulting tearing effects are often much worse than on a single card, which makes motion look surprisingly choppy and largely negates the benefit of the increased framerates beyond a point. I ran into a few games that actually looked smoother on a single card despite having a lower framerate, because this tearing wasn't present.
 

GonePlaid

Member
Jan 12, 2008
39
0
0
Okay, I went into my benchmark log for Thursday night, and this is what FRAPS registered for the very first mission in F.E.A.R.:

Frames: 93024, Time: 591256 ms, Min: 0, Max: 386, Avg: 157.333 fps
 

GonePlaid

Member
Jan 12, 2008
39
0
0
3DMark05 scores, which I just ran. I also forgot to mention that the card is running at 825MHz and is not overclocked.

1440x900, 8x AA, 16x anisotropic, 3 tests (just the games): 14576. All six tests: 14671.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GonePlaid
Oh, sorry about that - I was just editing my post to explain the min and max fps. The explanation is in my post above now.

Min: 0 is when the game does its auto-save checkpoint.
Max: 1556 is when I went into one of the menus in the game, so the score is a bit lopsided.

I ran the benchmark Thursday night without going into the menu and averaged out to 153fps.

i see you are using FRAPS -- press "F11" to start the benchmark after you see the FPS start to roll ... press it again just before you exit ... do 3 runs, remembering to press F11 at the same frames.
... and why not use the "test settings" built into FEAR? - no FRAPS is needed at all for min/max/average
:confused:


edited

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: CP5670
I think vsync works in Crossfire; I haven't heard of any problems with it, at least. I was referring to SLI there, and I know from experience that vsync is impossible to get in a large number of games, even with this or other similar programs. Not only that, but the resulting tearing effects are often much worse than on a single card, which makes motion look surprisingly choppy and largely negates the benefit of the increased framerates beyond a point. I ran into a few games that actually looked smoother on a single card despite having a lower framerate, because this tearing wasn't present.

How many years ago was that?

Vsync for all games since October 2005:
Also, with this new driver release, SLI gamers will be allowed to enable vertical sync for ALL games!

 

GonePlaid

Member
Jan 12, 2008
39
0
0
Originally posted by: apoppin


i see you are using FRAPS -- press "F11" to start the benchmark after you see the FPS start to roll ... press it again just before you exit ... do 3 runs, remembering to press F11 at the same frames.
... and why not use the "test settings" built into FEAR? - no FRAPS is needed at all for min/max/average
:confused:


edited


Oh? I didn't even realize F.E.A.R. had one built in. I got the game not that long ago, finished it, and didn't really look around in it for goodies. Thanks for the heads-up on that.

I did run the Half-Life 2: Lost Coast video stress test and got 131fps, with 6x AA, 16x aniso, and everything set as high as I could.
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
Originally posted by: nRollo
How many years ago was that?

Vsync for all games since October 2005:
Also, with this new driver release, SLI gamers will be allowed to enable vertical sync for ALL games!

LOL, that was a complete joke. I actually read that exact page before buying my cards, confident that it would work, only to find that it was still broken in about 80% of my games for numerous driver revisions after that. Other people have had the same experience with it, and from the comments I've seen since then it appears that it's still broken in a lot of games.

Apparently Nvidia only tested it in a handful of big name games before releasing that statement, or of course, they're simply lying.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
nice ... mine is 35/70/115 at 16x10 with Vista 64 - everything in-game fully maxed 0xAA/16xAF - SS on

-i am not sure AA works with SS ... and i will take SS any day ... those particle effects are spectacular
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: CP5670
LOL, that was a complete joke. I actually read that exact page before buying my cards, confident that it would work, only to find that it was still broken in about 80% of my games for numerous driver revisions after that. Other people have had the same experience with it, and from the comments I've seen since then it appears that it's still broken in a lot of games.

Apparently Nvidia only tested it in a handful of big name games before releasing that statement, or of course, they're simply lying.

Somebody seems to have some misinformation going on, that much is sure.

I just turned on FRAPS, fired up six games, and vsync worked in every one. I had DX9, DX10, and one OpenGL game in the mix.

Go figure.