CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!


BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
Thanks for the results Keys as they confirm the issue.

Look at frame 8 & 9 for example:

  • Single card variation: 40.30 ms - 36.49 ms = 3.81 ms.
  • SLI Variation: 31.77 ms - 17.94 ms = 13.83 ms.
So between frame 8 & 9 SLI has a variation of more than 3.5 times the single card.

If you plotted those figures you'd see more zig-zagging on average with SLI than the single card, just like the German website observed.

AFR delivers a higher framerate but it comes at a cost of less consistency than a single card.
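For anyone wanting to redo this arithmetic, the "variation" being discussed is just the frame-to-frame change in frame duration. A minimal sketch (the two-element lists hold only the frame 8/9 durations quoted above; a real run would read a full frametime log):

```python
# Per-frame durations in ms for frames 8 and 9, taken from the post above.
single = [36.49, 40.30]  # single card
sli = [17.94, 31.77]     # SLI

def variation(durations):
    """Absolute frame-to-frame change in duration, in ms."""
    return [round(abs(b - a), 2) for a, b in zip(durations, durations[1:])]

print(variation(single))  # [3.81]
print(variation(sli))     # [13.83]
```

Dividing the two (13.83 / 3.81 ≈ 3.6) gives the "more than 3.5 times" figure quoted above.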
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BFG10K
Thanks for the results Keys as they confirm the issue.

Look at frame 8 & 9 for example:

  • Single card variation: 40.30 ms - 36.49 ms = 3.81 ms.
  • SLI Variation: 31.77 ms - 17.94 ms = 13.83 ms.
So between frame 8 & 9 SLI has a variation of more than 3.5 times the single card.

If you plotted those figures you'd see more zig-zagging on average with SLI than the single card, just like the German website observed.

AFR delivers a higher framerate but it comes at a cost of less consistency than a single card.

You are most welcome. This may confirm "an" issue (one that also exists with single cards), but it also undercuts the claimed "severity" of it.

Ah, but you focus on only the first 11 frames. Look at frames 12 through 50. It seems to stabilize. I can only attribute the early spikes to disk thrashing or similar factors.
These results were taken from the second pass of the four in the Crysis bench.
I am looking right now at the pass #4 results and the discrepancy in the first 11 frames is not there. They appear to be 20 to 25 ms differences throughout, with an occasional spike to 30 every once in a while.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
This may confirm "an" issue (that also exists in single cards), but also denies the "severity" of it.

Ah, but you focus on only the first 11 frames. Look at frames 12 through 50.
Are we looking at the same figures here?

For starters try 11->12, 17->18, 18->19, 21->22...the list goes on. Well in excess of double the variation of the single card.

Seems to stabilize
Stabilizing is great but the question is how do the variations compare to a single card? That's the key.

They appear to be 20 to 25ms in differences throughout with an occasional spike to 30 every once in a while.
Again how do these figures compare to a single card?

I've calculated the variations and the averages here:

http://img149.imageshack.us/im...60/microstuttervu8.png

The final average is 5.85 ms (SLI) vs 4 ms (single), which is a ~46% difference so it?s quite significant.
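The calculation behind that comparison is simple to redo. The lists below are hypothetical stand-ins chosen to land near the quoted averages (the linked image holds the real per-frame numbers):

```python
# Hypothetical per-frame variations in ms; the real data is in the linked
# spreadsheet screenshot, these merely reproduce averages near 4.0 and 5.86.
single_var = [3.81, 4.2, 3.9, 4.1]
sli_var = [13.83, 2.1, 5.0, 2.5]

avg_single = sum(single_var) / len(single_var)  # ≈ 4.0 ms
avg_sli = sum(sli_var) / len(sli_var)           # ≈ 5.86 ms
pct_diff = (avg_sli / avg_single - 1) * 100     # ≈ 46%
```

Note how one large spike (13.83 ms) can dominate the SLI average even when most of its variations are small.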
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
And what is the average when you do not include the first 11 frames of the multicard result? And is a 4.00 ms to 5.85 ms difference in averages really that significant?
The single card had the highest single duration of 18.28 ms, and the SLI setup had a highest single duration of 16.00 ms. Now what's that percentage?
According to the results, the single 8800GTS had a "larger" microstutter than the SLI setup. But then again, it all depends on what the cards are rendering
at the time.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: BFG10K
This isn't some imaginary conspiracy theory that just popped up today; this phenomenon has been known for years, but the general public is oblivious to it and/or unbothered by it, much like they were with shimmering issues.

So, since only a portion of the few people on Earth who have two or more video cards in one computer experience this/can notice the difference, and there are settings that will either completely eradicate it or make it far less noticeable, how could this even be considered a "problem"?
 

Jessica69

Senior member
Mar 11, 2008
501
0
0
Hey, Datenschleuder, are you the same guy on XS that was screaming about this to no end? Think your name over there was Katzenschleuder. You pretty much did the same thing when you started that thread over there on Apr. 15th......screamed, yelled at, belittled everyone that questioned or disagreed with your assessment of how problematic this was.....and then started calling names in the end....think ignorant was the latest one. It almost seems like you simply registered there to stir up a commotion........and then, when a whole host of people over on XS don't automatically fall down prostrate and thank you for opening their eyes to this horrible problem (a problem most using Crossfire/SLI have never noticed, btw)......you call them ignorant.....such as in this quote from your thread on XS:

"The ignorance in this forum is of galactic dimensions."

Sure, that's a great way to make friends and influence others.......call those that don't see a problem with what they're using, or don't have a problem using their particular hardware setup and games combinations.......ignorant........
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Jessica69
Hey, Datenschleuder, are you the same guy on XS that was screaming about this to no end? Think your name over there was Katzenschleuder. You pretty much did the same thing when you started that thread over there on Apr. 15th......screamed, yelled at, belittled everyone that questioned or disagreed with your assessment of how problematic this was.....and then started calling names in the end....think ignorant was the latest one. It almost seems like you simply registered there to stir up a commotion........and then, when a whole host of people over on XS don't automatically fall down prostrate and thank you for opening their eyes to this horrible problem (a problem most using Crossfire/SLI have never noticed, btw)......you call them ignorant.....such as in this quote from your thread on XS:

"The ignorance in this forum is of galactic dimensions."

Sure, that's a great way to make friends and influence others.......call those that don't see a problem with what they're using, or don't have a problem using their particular hardware setup and games combinations.......ignorant........

i dunno .. i notice it is happening all over - *suddenly* .. and it reminds me of "intel PR" - back in the day when they were fibbing about the K7 :p
... and then again with P4's NetBust .. i am NOT saying it is happening again .. but whenever they have a campaign based on FUD - perhaps like for the upcoming Octo-beast that will be USELESS for graphics for the next ten years
... just be aware

 

Jessica69

Senior member
Mar 11, 2008
501
0
0
What slays me is there was an "original" thread at XS referencing this same "problem", and it was from February last year. The original "discovery" which was posted on a German "testing" website was "made" and posted about back on February 8, 2008....seems very odd that a year would pass before the screaming, gnashing of teeth, and dire warnings and alerts are suddenly being posted all over the internet's enthusiast and tech forums by one poser, eerrrr, poster, that seems to be using variations of a single name....Datenschleuder, Katzenschleuder.....you go figure it out.
 

ChrisRay

Junior Member
Oct 21, 2005
20
0
0
Originally posted by: BFG10K
i did say *educated* .. NVIDIA doesn't "push" nHancer very well .. that is an awesome tool!
Not only do they not push it, nVidia implicitly tried to shut it down by introducing a profile checksum into Vista's drivers.

As a result changing preset settings through the likes of nHancer would cause the entire profile to stop working.

Fortunately Grestorn managed to work around it. :thumbsup:

This is not true. Nvidia actually has little problem with nHancer, and some of the people there actually use the tool. Nvidia has been working to change the way profiles are generated/restored for a while now. And unfortunately Grestorn has had to stay on top of the problem too, as Nvidia has been altering its profiling system to get to where it is today with the profile restore functions. These checksums were put in place based upon feedback ((by me and others in the SLI community)) that Nvidia needed some kind of checksum restore point. And on that point they have actually been listening.

I actually have nHancer stickied over at slizone and I completely endorse the program, and Nvidia has absolutely zero problem with that. I have done my very best to keep Grestorn updated with the latest goings-on of SLI and driver changes.

Chris
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BFG10K
Thanks for the results Keys as they confirm the issue.

Look at frame 8 & 9 for example:

  • Single card variation: 40.30 ms - 36.49 ms = 3.81 ms.
  • SLI Variation: 31.77 ms - 17.94 ms = 13.83 ms.
So between frame 8 & 9 SLI has a variation of more than 3.5 times the single card.

If you plotted those figures you'd see more zig-zagging on average with SLI than the single card, just like the German website observed.

AFR delivers a higher framerate but it comes at a cost of less consistency than a single card.

Yes, but 31.77 and 17.94 are still less than 40.30 and 36.49, respectively. This is the case with almost every SLI duration when you compare each frame's duration to its single card counterpart. Essentially, no matter how great the variation in the size of the gap between frames, the gap is smaller with the SLI setup. I would expect that for micro stutter to occur, the gap would have to be larger.

Originally posted by: keysplayr2003
And what is the average when you do not include the first 11 frames of the multicard result? And is a 4.00 ms to 5.85 ms difference in averages really that significant?
The single card had the highest single duration of 18.28 ms, and the SLI setup had a highest single duration of 16.00 ms. Now what's that percentage?
According to the results, the single 8800GTS had a "larger" microstutter than the SLI setup. But then again, it all depends on what the cards are rendering
at the time.

Keys makes an interesting point... micro stutter is not a function of averages, but of spikes (or dips, if you prefer). You have to look at the maximum interval between frames.
 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
Yeah Jessica, I was posting on this in other forums as well - big deal huh?
Congratulations on this earth shattering scandal! :D
So I must be an evil Intel agent trying to indoctrinate the internet in a harmful way! ROFL


keysplayr:

Thanks for making the test!
Yes, the problem isn't that distinct at every point in this sample.
It can of course happen that the AFR frame times become somewhat regular, because the timing alignment is random.
You will see that the problem can get very significant if you run more tests.


Here is another review site reporting on this issue.
You can clearly see the asynchronous frame buffer refreshes at the "Frametimes mit zwei Grafikkarten" (frame times with two graphics cards) data tab.

The funny thing is that they got a response on the issue from ATI, who confirm the problem and say that they assume it comes from lazy scene updates, which clearly isn't the case, because single card setups are not affected.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Datenschleuder
Yeah Jessica, I was posting on this in other forums as well - big deal huh?
Congratulations on this earth shattering scandal! :D
So I must be an evil Intel agent trying to indoctrinate the internet in a harms way! ROFL

It is your timing that we are questioning - along with the manner in which you one-sidedly and aggressively presented your findings as though they were something "new"

We are well aware of it .. and i guess i thank you that we have the technical expertise and incentive - spurred by your "challenge" - to actually have figured it out!

so thanks, i guess

You also need to realize that there are HW manufacturers' "Viral" interests present on all forums that are not completely "owned" by a manufacturer.

We busted a major viral marketing program aimed at the forums a couple of years ago and we are very sensitive to obvious incursions that appear to be the beginning of a repetition of the chaos we experienced then.

it is to our advantage to have a completely free and open debate as we are able to work together and discover much - that makes us a leader in my book.

 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
apoppin: Whatever. I am not interested in arguing with characters like you.
I put a "I am an evil Intel agent" sticker on and you are happy - OK?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Datenschleuder
apoppin: Whatever. I am not interested in arguing with characters like you.
I put a "I am an evil Intel agent" sticker on and you are happy - OK?

nothing makes me happy ... but i like you
- can i please get one of your "evil intel agent" stickers
.. please



What is a "character like me" ?
- i have found no other :p

i am "happy" that we solved your problem; that we nipped intel PR-FUD in the bud is a "plus" - that may or may not be related to your OP. i don't want to speculate any further

Welcome to AnandTech and also my world
.. i have been here 8 years

:)
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
And what is the average when you do not include the first 11 frames of the multicard result?
It was still about 18% IIRC. Anyway, why are we cherry picking your results? If you don't think your results are suitable for the discussion then don't post them.

The single card had the highest single duration of 18.28ms, and the SLI setup had a highest single duration of 16.00.
Yep, the single card does have the highest spike, but again it's all about averages. Plot the variations on a graph and you'll see the single card will be flatter on average than the multi-card.

Again this is nothing new or magical; the German website has already done this along with providing video evidence of the issue.

According to the results, the single 8800GTS had a "larger" microstutter than the SLI setup.
How did you arrive at that conclusion? The % variation on the SLI system is on average higher than the single card system.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
So, since only a portion of the few people on Earth who have two or more video cards in one computer experience this/can notice the difference, and there are settings that will either completely eradicate it or make it far less noticeable, how could this even be considered a "problem"?
I'm not sure what you're saying here exactly. Do you think micro-stutter exists or not? Yes or no?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
Yes, but 31.77 and 17.94 are still less than 40.30 and 36.49, respectively. This is the case with almost every SLI duration when you compare each frame's duration to its single card counterpart.
Of course they're less, because the multi-GPU system has a higher framerate on average than the single card system. But that's not what we're talking about.

The issue is the difference in time taken for each frame to arrive compared to the previous one. On average the multi-card system fluctuates more than the single card, so a given framerate on a single card seems smoother than the same framerate on a multi-card.

Essentially, no matter how great the variation in the size of the gap between frames, the gap is smaller with the SLI setup.
It's not smaller. On average the SLI setup has a higher variance than the single card framerate. I calculated it right here:

http://img149.imageshack.us/im...60/microstuttervu8.png

Again you're looking at the wrong figure; you need to look at the variation column, not the duration column.

I would expect that for micro stutter to occur, the gap would have to be larger.
The gap is larger - look at the variation column averages in the link above.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
These checksums were put in place based upon feedback ((by me and others in the SLI community)) that Nvidia needed some kind of checksum restore point.
I don't buy that response for a second.

If it's for a "checksum restore point" why do the profiles cease to work as soon as someone modifies one of nVidia's preset values? At that point I'm not restoring anything, I just want my modified profile to work, but it doesn't.

If the original checksum isn't modified (which it isn't) you can always restore back to the default profile, so why disable the profile entirely?

It's painfully obvious nVidia didn't want people modifying their preset values so they tried to put a lock on them, much like when they started encrypting their drivers to stop Unwinder defeating their application detection back during the FX days.

I actually have nHancer stickied over at slizone and I completely endorse the program, and Nvidia has absolutely zero problem with that. I have done my very best to keep Grestorn updated with the latest goings-on of SLI and driver changes.
The question is, does nVidia completely endorse the program? More specifically do they endorse modifying preset values?

Based on the checksum fiasco it appears not.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BFG10K
Yes, but 31.77 and 17.94 are still less than 40.30 and 36.49, respectively. This is the case with almost every SLI duration when you compare each frame's duration to its single card counterpart.
Of course they're less, because the multi-GPU system has a higher framerate on average than the single card system. But that's not what we're talking about.

The issue is the difference in time taken for each frame to arrive compared to the previous one. On average the multi-card system fluctuates more than the single card, so a given framerate on a single card seems smoother than the same framerate on a multi-card.

Essentially, no matter how great the variation in the size of the gap between frames, the gap is smaller with the SLI setup.
It's not smaller. On average the SLI setup has a higher variance than the single card framerate. I calculated it right here:

http://img149.imageshack.us/im...60/microstuttervu8.png

Again you're looking at the wrong figure; you need to look at the variation column, not the duration column.

I would expect that for micro stutter to occur, the gap would have to be larger.
The gap is larger - look at the variation column averages in the link above.

The duration is the gap; the variance is the difference in size from one duration to the next. The duration is consistently shorter for the SLI system. I understand that we're essentially talking about the time between each frame, and I get that the SLI system is less uniform... However, a decrease in uniformity is irrelevant if the largest gap for the SLI system is still smaller than the largest gap for the single card.

With your math, you could have frames at 0 ms, 500 ms, and 1000 ms, making the duration 500 ms between all frames and giving you a variation of 0 - a 'perfect' variation by your logic... Yet still an unplayable 2 fps. What you need to look at is the size of the duration, which is obviously directly related to fps. The variance is only relevant if it causes the duration between frames to become larger with SLI than without.

I'm not saying that micro stutter doesn't exist... As a matter of fact, I think I mentioned earlier on this thread how I could see it happening. What I'm saying is that on the example that Keys posted, I don't think you would experience any stutter or slowdown in SLI that you would not with a single card, since the actual gap between frames is still consistently smaller with the SLI setup.
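The 0/500/1000 ms thought experiment above is easy to check numerically; a quick sketch:

```python
# Hypothetical frame timestamps in ms: perfectly even delivery, yet 2 fps.
frametimes = [0, 500, 1000]

# Durations between consecutive frames.
durations = [b - a for a, b in zip(frametimes, frametimes[1:])]
# Frame-to-frame variation in duration: zero, i.e. "perfectly smooth".
variation = [abs(b - a) for a, b in zip(durations, durations[1:])]
# Average framerate implied by those durations.
fps = 1000 / (sum(durations) / len(durations))

print(durations, variation, fps)  # [500, 500] [0] 2.0
```

So zero variation says nothing about playability on its own; both the variation and the absolute durations matter.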
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nitromullet
Originally posted by: BFG10K
Yes, but 31.77 and 17.94 are still less than 40.30 and 36.49, respectively. This is the case with almost every SLI duration when you compare each frame's duration to its single card counterpart.
Of course they're less, because the multi-GPU system has a higher framerate on average than the single card system. But that's not what we're talking about.

The issue is the difference in time taken for each frame to arrive compared to the previous one. On average the multi-card system fluctuates more than the single card, so a given framerate on a single card seems smoother than the same framerate on a multi-card.

Essentially, no matter how great the variation in the size of the gap between frames, the gap is smaller with the SLI setup.
It's not smaller. On average the SLI setup has a higher variance than the single card framerate. I calculated it right here:

http://img149.imageshack.us/im...60/microstuttervu8.png

Again you're looking at the wrong figure; you need to look at the variation column, not the duration column.

I would expect that for micro stutter to occur, the gap would have to be larger.
The gap is larger - look at the variation column averages in the link above.

The duration is the gap; the variance is the difference in size from one duration to the next. The duration is consistently shorter for the SLI system. I understand that we're essentially talking about the time between each frame, and I get that the SLI system is less uniform... However, a decrease in uniformity is irrelevant if the largest gap for the SLI system is still smaller than the largest gap for the single card.

With your math, you could have frames at 0 ms, 500 ms, and 1000 ms, making the duration 500 ms between both pairs of frames and giving you a variation of 0 - a 'perfect' variation by your logic... Yet still an unplayable 2 fps. What you need to look at is the size of the duration, which is obviously directly related to fps. The variance is only relevant if it causes the duration between frames to become larger with SLI than without.

I'm not saying that micro stutter doesn't exist... As a matter of fact, I think I mentioned earlier on this thread how I could see it happening. What I'm saying is that on the example that Keys posted, I don't think you would experience any stutter or slowdown in SLI that you would not with a single card, since the actual gap between frames is still consistently smaller with the SLI setup.


I pretty much agree with this.

When I looked at Keys' data, my first thought was: "Errr....the frames are on the screen half as long and we're worried about a 1.5 ms variance in average switch time?"

I also don't get the war on Crossfire and SLi in general, "Let's return to lower IQ -yay!"????

At 25X16, I guarantee you what happens with single GPU is more like "full tilt stutter" than "micro stutter". I've pretty much got to have Crossfire or SLi- I'm not going back to low res just to avoid "micro stutter".
 

Angriff

Junior Member
Apr 20, 2008
1
0
0
Most SLI/XFire users increase the graphics settings and target the same frame rate as with a single card. So the micro stutters are a big deal there.

And this problem is relevant even if you only bought SLI/XFire to get from 60 to 100 FPS, because the quality of those 100 FPS is of course reduced by this as well.

Keysplayr's log is really at the bottom end of the possible extent of micro stutter you can get.
In 3DMark and in many games with high quality settings in AFR mode, I regularly get frames that take more than twice as long as the previous ones.
But this can be somewhat dependent on the settings and the test run itself.

The computerbase.de article shows this quite well.

I definitely agree that much more attention should be brought to this matter!
Hopefully we will see an article at Anandtech!
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,976
126
The duration is the gap; the variance is the difference in size from one duration to the next. The duration is consistently shorter for the SLI system.
Yep, because the framerate is higher, so it must be. If the single card scores 60 FPS then the duration is ~16 ms. If SLI is scoring 90 FPS then the duration must be ~11 ms.

Again, it must be shorter on average simply because the framerate is higher. But that isn't the issue; the issue is the variance in the durations, or rather the framerate swings.

However, a decrease in uniformity is irrelevant if the largest gap for the SLI system is still smaller than the largest gap for the single card.
I don't follow. The largest gap will almost always be smaller on SLI because the framerate is higher. By that reasoning micro-stuttering doesn't exist, but clearly that's been proven otherwise.

With your math, you could have a frame at 0ms, 500ms, and 1000ms, making the duration 500 and 500 between the two frames giving you a variation of 0 - a 'perfect' variation by your logic... Yet still an unplayable 2fps.
That's true, but likewise you can't only use the durations either.

For example, if the single card system gets 40/45/50 min/avg/max but the SLI system gets 45/75/105 min/avg/max, the SLI system clearly has a higher framerate and hence lower durations, but it's not necessarily smoother because it has larger fluctuations.

I would however agree Keys' figures aren't the best, because he provided durations instead of frametimes, which don't really tell the full story and can't demonstrate AFR input lag.

Which is why I've done something similar using the Call of Juarez figures provided here:

http://www.computerbase.de/art...-way-sli_triple-sli/7/

Look at the figures & graph here:

http://img353.imageshack.us/im...3/microstutter2ls7.png

Some things to note:

  • The multi-GPU system has a higher framerate in the benchmarks, but the variances between frames clearly fluctuate more wildly than the single card.
  • The frametimes (to the left of the variance) show the multi-GPU system consistently behind the single GPU system. This is also evidence of AFR input lag, as the frames always arrive later than on the single card despite the average framerate being higher. An average framerate will not show this, nor will the format of Keys' data, which uses durations instead of frametimes.
  • The average variance difference between the two systems is "only" 19%, but look at the huge variance swings on the graph. Again, averages can be deceptive with this sort of thing.
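The frametime → duration → variance pipeline described above can be sketched as follows. The two timestamp sequences are invented purely to illustrate how an AFR-style short/long cadence inflates average variation even while average duration (and hence framerate) improves:

```python
def analyse(frametimes):
    """frametimes: cumulative timestamps in ms, one per frame."""
    durations = [b - a for a, b in zip(frametimes, frametimes[1:])]
    variations = [abs(b - a) for a, b in zip(durations, durations[1:])]
    return {
        "avg_duration_ms": sum(durations) / len(durations),
        "avg_variation_ms": sum(variations) / len(variations),
    }

# Hypothetical single card: a frame every 20 ms, perfectly even.
single = analyse([0, 20, 40, 60, 80, 100])
# Hypothetical AFR pair: shorter average gap, but alternating 5 ms / 25 ms.
sli = analyse([0, 5, 30, 35, 60, 65])

print(single)  # {'avg_duration_ms': 20.0, 'avg_variation_ms': 0.0}
print(sli)     # {'avg_duration_ms': 13.0, 'avg_variation_ms': 20.0}
```

The SLI-style sequence wins on average duration (13 ms vs 20 ms) yet every frame-to-frame transition swings by 20 ms, which is exactly the pattern the graphs linked above show.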
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BFG10K
The duration is the gap; the variance is the difference in size from one duration to the next. The duration is consistently shorter for the SLI system.
Yep, because the framerate is higher, so it must be. If the single card scores 60 FPS then the duration is ~16 ms. If SLI is scoring 90 FPS then the duration must be ~11 ms.

Again, it must be shorter on average simply because the framerate is higher. But that isn't the issue; the issue is the variance in the durations, or rather the framerate swings.

However, a decrease in uniformity is irrelevant if the largest gap for the SLI system is still smaller than the largest gap for the single card.
I don't follow. The largest gap will almost always be smaller on SLI because the framerate is higher. By that reasoning micro-stuttering doesn't exist, but clearly that's been proven otherwise.

With your math, you could have a frame at 0ms, 500ms, and 1000ms, making the duration 500 and 500 between the two frames giving you a variation of 0 - a 'perfect' variation by your logic... Yet still an unplayable 2fps.
That's true, but likewise you can't only use the durations either.

For example, if the single card system gets 40/45/50 min/avg/max but the SLI system gets 45/75/105 min/avg/max, the SLI system clearly has a higher framerate and hence lower durations, but it's not necessarily smoother because it has larger fluctuations.

I would however agree Keys' figures aren't the best, because he provided durations instead of frametimes, which don't really tell the full story and can't demonstrate AFR input lag.

Which is why I've done something similar using the Call of Juarez figures provided here:

http://www.computerbase.de/art...-way-sli_triple-sli/7/

Look at the figures & graph here:

http://img353.imageshack.us/im...3/microstutter2ls7.png

Some things to note:

  • The multi-GPU system has a higher framerate in the benchmarks, but the variances between frames clearly fluctuate more wildly than the single card.
  • The frametimes (to the left of the variance) show the multi-GPU system consistently behind the single GPU system. This is also evidence of AFR input lag, as the frames always arrive later than on the single card despite the average framerate being higher. An average framerate will not show this, nor will the format of Keys' data, which uses durations instead of frametimes.
  • The average variance difference between the two systems is "only" 19%, but look at the huge variance swings on the graph. Again, averages can be deceptive with this sort of thing.

You're actually agreeing with me... You just renamed 'duration' in Keys' graph to 'variance' in the screenshot you posted (which is different from the variance that you derived from Keys' numbers).

In looking at the CoJ numbers I would agree that the perceived fps would be lower for the SLI setup, but so are the actual fps... 30 frames/1050 ms for the single card ≈ 29 fps, whereas 30 frames/1250 ms for the SLI setup ≈ 24 fps.

If you look at the page where they actually compare 3-way and 2-way SLI to a single 8800 Ultra, you see a different story with regards to frame rate:

http://www.computerbase.de/art...way-sli_triple-sli/14/

This begs the question: what cards are they comparing in SLI to a single 8800 Ultra to get 29 fps for the single card and 24 fps for the dual card setup? If they aren't comparing two Ultras to the single Ultra, then the point is moot. I doubt that anyone would argue that, say, dual 9600GTs running at 24 fps are better than a single Ultra running at 29 fps...

Notice that in Keys' comparison he compared a single 9800GTX vs dual 9800GTXes.

If my German was a bit better, I could probably answer this question myself.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
you can't only use the durations either.

Yes, you can... the duration (the time between frames) is the only thing that really matters. The greater the duration between any two frames, the lower the perceived fps becomes, regardless of the average fps. The difference (variance) between different sets of durations is only interesting in the comparison between multi and single cards if the variance causes the duration between frames for the multi-card setup to exceed the duration between frames for the single card setup. This is the definition of "micro-stutter". As long as the duration between frames for the multi-card setup does not exceed the duration for a single card, no adverse effect will be noticed for the multi-card setup.
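That criterion reduces to comparing worst-case gaps. A tiny sketch with made-up duration lists (on this view, stutter is only "noticed" when the multi-card's worst gap exceeds the single card's worst gap):

```python
def worst_gap_ms(durations):
    """Longest time between two consecutive frames, in ms."""
    return max(durations)

# Hypothetical per-frame durations (ms); not from any log in this thread.
single = [36.49, 40.30, 38.00, 37.20]
sli = [17.94, 31.77, 18.50, 30.10]

# By this criterion the SLI run never shows a longer gap than the single
# card's worst, so no stutter would be perceived relative to the single card.
noticeable = worst_gap_ms(sli) > worst_gap_ms(single)
print(noticeable)  # False
```

The disagreement in the thread is exactly whether this worst-gap test or the average variation is the right measure of micro-stutter.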