I suggest you read this before going with Crossfire


KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
Why does everyone only look at FPS when looking at multi-GPU...is microstutter really THAT big a threat against people's e-Peen?

Certain people around here do have such a diplomatic way of wording things...

Anyway, like I had said:
So yes, CF has problems, but so does SLI. Personally I am not in the market for top-end dual GPUs, but I may want to try Hybrid CF if Trinity is good enough for my next CPU.

Even if Nvidia were to have better IQ or MS, I'm not fussy about IQ while gaming and can only go by my experience: Nvidia sold me and people I know defective parts because, despite being a company that sells silicon chips, their actual engineering competence at manufacturing is rather poor. And the G84/G86/G92 fiasco affected millions of parts. 'Fool me once, shame on you; fool me twice...' So I simply will not buy Nvidia again, although I do welcome them to bring AMD's prices down :)
 

tulx

Senior member
Jul 12, 2011
257
2
71
I am very satisfied with my 5870s in Crossfire. I bought the first one when it launched and the second one about two years later - a few months ago. I'll be well equipped for at least a year, probably till the next console generation comes out and allows PC games tech to progress.

Having said that, some games have issues with crossfire. Skyrim, for example. Though that game was technically pretty much broken - like all TES games. But in general, performance is great and I have never had problems with AMD drivers regarding CFX. I've also never noticed this "micro-stuttering" that's being mentioned so often as a dual-GPU issue.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Thanks for responding.

I should probably send AMD a CV/resume. For a mere 100K USD per annum (and swag) I'll coordinate with all of the review sites for them. ;)

That would be awesome. Just update your sig if it works out!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Certain people around here do have such a diplomatic way of wording things...

Anyway, like I had said:


Even if Nvidia were to have better IQ or MS, I'm not fussy about IQ while gaming and can only go by my experience: Nvidia sold me and people I know defective parts because, despite being a company that sells silicon chips, their actual engineering competence at manufacturing is rather poor. And the G84/G86/G92 fiasco affected millions of parts. 'Fool me once, shame on you; fool me twice...' So I simply will not buy Nvidia again, although I do welcome them to bring AMD's prices down :)


Doesn't matter if you are in the market or not...you only look at FPS graphs...making your "opinion" both uninformed...and likely flawed.

That you have personal feelings towards hardware companies and think they are different...well, that is very naive...putting it mildly here, since you seem like a very touchy person.
 

flopper

Senior member
Dec 16, 2005
739
19
76
A single card is always the best choice.
Crossfire works well when there is support and the game developer did their job right.
However, that is often severely lacking due to the console mania games have, BF3 included; DICE should be sued for providing such a shitty experience on the PC. They sold out.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
I appreciate it when sites like Xbit or HardOCP publish reviews geared towards getting AMD off their lazy behinds.
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
Doesn't matter if you are in the market or not...you only look at FPS graphs...making your "opinion" both uninformed...and likely flawed.

That you have personal feelings towards hardware companies and think they are different...well, that is very naive...putting it mildly here, since you seem like a very touchy person.

Well, whatever one can say about the flaws of FPS graphs, they are at least halfway scientific. Like I said, I'm not in the market for dual GPUs, but micro-stutter has not been measured in any scientific way by any site as far as I'm aware. As has been pointed out on this forum numerous times, a bunch of FRAPS graphs don't tell anywhere near the whole MS story, and the only scientific analysis would be for someone to use a high-speed video camera and then analyze all the frames. No site does that. Certainly some site should, but considering that most sites cannot even be bothered to measure GPU power usage rather than total system draw, I for one will not be holding my breath for a proper MS methodology.
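To put a number on that (with made-up frame times, purely to illustrate): two runs can post the identical average FPS yet have completely different frame pacing, which is exactly what an FPS bar chart hides:

```python
# Hypothetical frame-time traces in milliseconds (invented numbers,
# just for illustration). Both work out to the same average FPS, but
# the second alternates short and long frames -- classic microstutter.
steady = [16.7] * 60
stutter = [8.0, 25.4] * 30  # same total time, wildly uneven pacing

def avg_fps(frametimes_ms):
    # Average FPS: frames rendered divided by total elapsed time.
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def p99_ms(frametimes_ms):
    # 99th-percentile frame time: what the worst frames feel like.
    s = sorted(frametimes_ms)
    return s[int(0.99 * (len(s) - 1))]

print(avg_fps(steady), p99_ms(steady))    # ~59.9 FPS, 16.7 ms
print(avg_fps(stutter), p99_ms(stutter))  # ~59.9 FPS, 25.4 ms
```

Same bar on an FPS graph, very different experience - which is why frame times (or that high-speed camera) are needed.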

And yes, I'm well aware that AMD might be the next to sell broken hardware but if that happens I'll just have to swap back to NV - or maybe Intel will have finally made a decent GPU by then. But from past experience, I'll stick with my prejudice since I did pay £140 for that self-destructing 8800GT and would be stupid not to have learned a lesson from that.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
After using two Crossfire and SLI setups, I've decided to go with a powerful single-GPU setup for my next upgrade. I admire those who can put up with the issues, but it's just too frustrating for me now.

Microstutter, poor profiles (like the performance-reducing one for 6900 cards in GTA 4), the occasional lack of support on release day, and the added heat and noise (my Twin Frozr IIIs are not quiet, at all) are all just too much of a compromise for me.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Well, whatever one can say about the flaws of FPS graphs, they are at least halfway scientific. Like I said, I'm not in the market for dual GPUs, but micro-stutter has not been measured in any scientific way by any site as far as I'm aware. As has been pointed out on this forum numerous times, a bunch of FRAPS graphs don't tell anywhere near the whole MS story, and the only scientific analysis would be for someone to use a high-speed video camera and then analyze all the frames. No site does that. Certainly some site should, but considering that most sites cannot even be bothered to measure GPU power usage rather than total system draw, I for one will not be holding my breath for a proper MS methodology.

And yes, I'm well aware that AMD might be the next to sell broken hardware but if that happens I'll just have to swap back to NV - or maybe Intel will have finally made a decent GPU by then. But from past experience, I'll stick with my prejudice since I did pay £140 for that self-destructing 8800GT and would be stupid not to have learned a lesson from that.

FPS graphs on multi-GPU are useless...without attention to microstutter.

I'd rather have 60 STEADY FPS...than 200 microstuttering FPS...so for multi-GPU...FPS are useless.
They don't give a scientific answer...it only seems that way to the uninformed.

A scientific way would look BOTH at the FPS AND the frametimes...it's the Achilles heel of multi-GPU...microstutter.

And funny you choose to rag on the 8800GT...it was one of the best GPUs ever...I still use mine as a PhysX card in a secondary PC...it's really an own goal to blame that GPU...
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
FPS graphs on multi-GPU are useless...without attention to microstutter.

I'd rather have 60 STEADY FPS...than 200 microstuttering FPS...so for multi-GPU...FPS are useless.
They don't give a scientific answer...it only seems that way to the uninformed.

A scientific way would look BOTH at the FPS AND the frametimes...it's the Achilles heel of multi-GPU...microstutter.

Utterly and completely agree. There was a Q&A yesterday at PCGH.de with Nvidia Germany, and I asked their rep specifically whether one could measure Nvidia's frame metering with FRAPS. They said no. So even IF people measure frametimes like some websites did - the most recent being TechReport - it will not allow any statements about what is presented to the gamer. At least not when comparing Nvidia with AMD. Subjective judgement is all that we currently have.
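For anyone wondering what frame metering could even mean in practice, here's a toy sketch - my own guess at the concept only, since Nvidia hasn't published how their scheme actually works: present frames on a smoothed schedule instead of at the raw render intervals, trading a little latency for even pacing.

```python
# Toy model of frame metering (an assumption/sketch only -- the real
# implementation is not public). Raw render intervals are uneven;
# presenting on a moving-average schedule evens out the pacing at the
# cost of a small amount of added latency.

def meter(render_intervals_ms, window=3):
    """Smooth presentation intervals with a sliding-window average."""
    presented = []
    for i in range(len(render_intervals_ms)):
        lo = max(0, i - window + 1)
        recent = render_intervals_ms[lo:i + 1]
        presented.append(sum(recent) / len(recent))
    return presented

raw = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]   # alternating microstutter
print(meter(raw))  # intervals pulled toward the ~16.7 ms average
```

If the real mechanism works anywhere near the display end of the pipeline, that would also explain why FRAPS can't see it: FRAPS logs timestamps earlier, before any such pacing is applied.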
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
And funny you choose to rag on the 8800GT...it was one of the best GPUs ever...I still use mine as a PhysX card in a secondary PC...it's really an own goal to blame that GPU...

Well, let me see: the 8800GT sure got great reviews at the time, but I didn't buy it until later (at launch it was more like £200 than £140). However, mine died with classic bumpgate symptoms, and a few months later my brother's died too with the exact same symptoms. Furthermore, I had an 8400M MXM die on me, as well as an Asus 7150-chipset motherboard. I've also seen a large number of dead laptops with Nvidia chipsets, although I'd throw in one caveat there: most of them were HP DVs, which also had notoriously bad cooling.

Funnily enough, your favourite website and journalist (Inq/SA) had exactly those parts listed as being affected by bumpgate, and Charlie even listed the exact same batch numbers as went bad on me. And yes, I'm well aware that not all G92s are affected, but yours still being good is irrelevant: mine are all dead, and no, I did not overclock them and always ran a reasonably cool case.
 
Feb 19, 2009
10,457
10
76
Utterly and completely agree. There was a Q&A yesterday at PCGH.de with Nvidia Germany, and I asked their rep specifically whether one could measure Nvidia's frame metering with FRAPS. They said no. So even IF people measure frametimes like some websites did - the most recent being TechReport - it will not allow any statements about what is presented to the gamer. At least not when comparing Nvidia with AMD. Subjective judgement is all that we currently have.

This should have been obvious from the original TechReport frame-time article; they even stated that their frame-time measurements are pointless, as they do not reflect what the user actually sees on the monitor.

As to the bogus claims of a lack of CF scaling in BF3 and Crysis 2: sorry, but this is a case of stupid user error, as these games have been fine since release (the BF3 beta had issues). A non-story.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Utterly and completely agree. There was a Q&A yesterday at PCGH.de with Nvidia Germany and I asked their rep specifically if one could measure Nvidias frame metering with fraps. They said no. So even IF people measure frametimes like some websites did - the most current being techreport - it will not allow any statements about what is presented to the gamer. At least not when comparing Nvidia with AMD. Subjective judgement is all that we currently have.

Would it be possible for you to ask NVIDIA if they have an accurate way of measuring frametimes?
It seems they at least put some effort into the issue, unlike AMD, so if asked correctly...it might give them incentive to release some API instruction that could be useful.

It would also help developers in QA testing.

But I somehow doubt the openness in this area...it takes the "halo cards"...and puts a big stain on them.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Personally I have no direct contact with Nvidia. I know someone who does, though. The Q&A yesterday was rather uninformative overall - these are marketing people. Anyway, I was told that there is more to come in regard to flip metering (as they call it).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Personally I have no direct contact with Nvidia. I know someone who does, though. The Q&A yesterday was rather uninformative overall - these are marketing people. Anyway, I was told that there is more to come in regard to flip metering (as they call it).

That would be welcome news.
Because today...you have to trust the biology of the reviewer in regards to multi-GPU, as microstutter is not perceived by everyone...and that is really not helpful.

It's like having an FPS graph where the graph has been replaced by a label with the text:

"ENOUGH FPS"

Really not informative...or helpful.

So all news and features to combat alternating frametimes are welcome.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
That would be welcome news.
Because today...you have to trust the biology of the reviewer in regards to multi-GPU, as microstutter is not perceived by everyone...and that is really not helpful.

It's like having an FPS graph where the graph has been replaced by a label with the text:

"ENOUGH FPS"

Really not informative...or helpful.

So all news and features to combat alternating frametimes are welcome.
That would be a welcome change in the reviews :thumbsup:
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So, I was looking at the article...

And it has CF results without AA. Seems like when they turned on AA for Crysis the thing bottomed out haha.

Not even sure about BF3, because I remember seeing tons of BF3 CF articles with good scaling.

Glad I'm not in the market for CFX. My one card is doing fine on its own so far.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Actually, in the Hexus review Crossfire scales fine in BF3, but it has issues in Batman: AC and Crysis 2.
http://hexus.net/tech/reviews/graphics/39013-asus-geforce-gtx-680-directcu-ii-top/
graph-03.png


graph-05.png


graph-04.png
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I'd like to see what hurdles Adam the Giant runs into. Not that it affects me directly (the last multi-GPU system I used was a 4870X2); I'm just curious whether the drivers are borked or the installation is borked, but what is a given is that AMD needs to fix their drivers.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
You said 4870X2; I had that gem :p
I like [H]'s approach: they ask AMD for a resolution and provide the full explanation in their reviews.
 

AdamK47

Lifer
Oct 9, 1999
15,780
3,601
136
I played both Crysis 2 and Battlefield 3 recently with the 12.4 drivers @ 2560x1600. Butter smooth! CrossFire was most definitely working across all three of my cards.
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
Actually, in the Hexus review Crossfire scales fine in BF3, but it has issues in Batman: AC and Crysis 2.

graph-05.png

Strange, I've seen quite good CF scaling in Crysis 2 from ComputerBase and hardware.fr, for instance:

IMG0036438.gif


Makes me wonder which reviews are really trustworthy. Also

IMG0036423.gif


Has 7970 CF beating both 680 SLI and the 690 in everything except 1080p, while other sites did not. They can't all be right...

Oh, and that's it: I'm not looking at these dual-GPU threads again. Too much confusion, plus the occasional bit of FUD from both posters and reviewers.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Strange, I've seen quite good CF scaling in Crysis 2 from ComputerBase and hardware.fr, for instance:

IMG0036438.gif


Makes me wonder which reviews are really trustworthy. Also

IMG0036423.gif


Has 7970 CF beating both 680 SLI and the 690 in everything except 1080p, while other sites did not. They can't all be right...

Oh, and that's it: I'm not looking at these dual-GPU threads again. Too much confusion, plus the occasional bit of FUD from both posters and reviewers.
The strange thing is that some of them contradict each other; AMD should really step in to clear up this mess.