CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!


ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Originally posted by: BFG10K
The duration is the gap, the variance is the difference in size between one duration to the next. The duration is consistently shorter for the SLI system.
Yep, because the framerate is higher so it must be. If the single card scores 60 FPS then the duration is ~16.7 ms. If SLI is scoring 90 FPS then the duration must be ~11.1 ms.

Again it must be shorter on average simply because the framerate is higher. But that isn't the issue; the issue's the variance in the durations, or rather the framerate swings.

However, a decrease in uniformity is irrelevant if the largest gap for the SLI system is still smaller than the largest gap for the single card.
I don't follow. The largest gap will almost always be smaller on SLI because the framerate is higher. By that reasoning micro-stuttering doesn't exist, but clearly that's been proven otherwise.

With your math, you could have a frame at 0 ms, 500 ms, and 1000 ms, making both durations 500 ms and giving you a variation of 0 - a 'perfect' variation by your logic... yet still an unplayable 2 fps.
That's true, but likewise you can't only use the durations either.

For example, if the single card system gets 40/45/50 min/avg/max but the SLI system gets 45/75/105 min/avg/max, the SLI system clearly has a higher framerate and hence lower durations, but it's not necessarily smoother because it has larger fluctuations.
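
To make that concrete, here's a quick sketch (Python, using those same hypothetical min/avg/max figures) converting framerates to frame durations:

# Hypothetical min/avg/max framerates; duration (ms) = 1000 / fps
single_fps = (40, 45, 50)
sli_fps = (45, 75, 105)

for label, fps in (("single", single_fps), ("SLI", sli_fps)):
    durations = [round(1000.0 / f, 1) for f in fps]
    print(label, "durations (ms):", durations)

# single durations (ms): [25.0, 22.2, 20.0] -> ~5 ms spread
# SLI durations (ms): [22.2, 13.3, 9.5] -> ~12.7 ms spread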

I would however agree Keys' figures aren't the best because he didn't provide frametimes but instead provided durations, which don't really tell the full story and can't demonstrate AFR input lag.

Which is why I've done something using the Call of Juarez figures provided here:

http://www.computerbase.de/art...-way-sli_triple-sli/7/

Look at the figures & graph here:

http://img353.imageshack.us/im...3/microstutter2ls7.png

Some things to note:

  • The multi-GPU system has a higher framerate in the benchmarks but the variances between frames clearly fluctuate more wildly than the single card.
  • The frametimes (to the left of the variance) show the multi-GPU system consistently behind the single GPU system. This is also evidence of AFR input lag as the frames are always coming later than the single card despite the average being higher. An average framerate will not show this, nor will the format of Keys' data using durations instead of frametimes.
  • The average variance difference between the two systems is 'only' 19%, but look at the huge variance swings on the graph. Again averages can be deceptive with this sort of thing.
I'm not a Statistics major, but it seems to me what you really want to do is plot a bell curve and calculate the standard deviations for both scenarios. If there is micro-stutter, then the curve for the SLI setup will be wider to encompass the spread out nature of the data, and the standard deviation should be greater with the SLI setup (or perhaps > 1/2 the SD of the single card setup, I'm not sure what the proper protocol is here).
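
Something like this toy sketch (Python, with made-up frametime lists standing in for real FRAPS logs) would do it:

import statistics

# Made-up per-frame durations (ms); real numbers would come from a FRAPS frametime log
single = [20, 21, 20, 22, 21, 20, 21, 22, 20, 21]
sli = [9, 18, 10, 19, 9, 17, 10, 18, 9, 19]

for label, durations in (("single", single), ("SLI", sli)):
    print(f"{label}: mean {statistics.mean(durations):.1f} ms, "
          f"stdev {statistics.pstdev(durations):.1f} ms")

# A wider spread (larger stdev relative to the mean) is the micro-stutter
# signature, even though the SLI durations here are shorter on average.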
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: BFG10K
And what is the average when you do not include the first 11 frames of the multicard result?
It was still about 18% IIRC. Anyway, why are we cherry-picking your results? If you don't think your results are suitable for the discussion then don't post them.

The single card had the highest single duration of 18.28 ms, and the SLI setup had a highest single duration of 16.00 ms.
Yep, the single card does have the highest spike, but again it's all about averages. Plot the variations on a graph and you'll see the single card will be flatter on average than the multi-card.

Again this is nothing new or magical; the German website has already done this along with providing video evidence of the issue.

According to the results, the single 8800GTS had a "larger" microstutter than the SLI setup.
How did you arrive at that conclusion? The % variation on the SLI system is on average higher than the single card system.

How else could I have arrived at that conclusion? By looking at the numbers you came up with. The single card had 18.28 ms and the SLI setup had 16.00 ms. This is not about averages. If it was about averages, like fps benchmarking, we wouldn't be having this conversation. This is about one frame to the next. Also, the durations in ms on the SLI setup are much lower than on a single card. So the percentages would only mean something if the two setups were identical, e.g. a single 8800GTS 640 against a single 8800GTS 640.

Our graphs show that there isn't any "perfect" synchronization. They also show that it isn't the end of the world either, or as serious a problem as some make it out to be. I posted the 2nd run results intentionally because of the first 11 frames and the discrepancy. That was the "worst run". And again, I mentioned it could have been due to disk thrashing. Runs 3 and 4 were, as I mentioned before, more in line with frames 12 through 50 on my graph. And I'll post runs 3 and 4 later tonight. It takes quite a bit of time.
 

BFG10K

Lifer
Aug 14, 2000
In looking at the CoJ numbers I would agree that the perceived fps would be lower for the SLI setup, but so are the actual fps... 30/1050ms for the single card ~ 29fps, whereas 30/1250ms for the SLI setup ~ 24fps.

This begs the question of what cards they are comparing in SLI to a single 8800 Ultra to get 29fps for the single card and 24fps for the dual card setup.
I used an online translator and they stated they used a single 8800 Ultra followed by using two cards. While they don't specifically state they used two 8800 Ultras, they'd be pretty stupid not to, and that website doesn't have a track record in stupidity.

My theory is that it's related to AFR, due to the system rendering ahead in order to feed both GPUs before starting to output frames. This offsets the first frame compared to a single card system, which in turn offsets subsequent frametimes.

For this reason I don't think you can use absolute frames/time = framerate across both columns, only relative comparisons.

Yes, you can... the duration (the time between frames) is the only thing that really matters. The greater the duration between any two frames, the lower the perceived fps becomes, regardless of the average fps. The difference (variance) between different sets of durations is only interesting in the comparison between multi and single cards if the variance causes the duration between frames for the multi-card setup to exceed the duration between frames for the single card setup. This is the definition of "micro-stutter". As long as the duration between frames for the multi-card setup does not exceed the duration for a single card, no adverse effect will be noticed for the multi-card setup.
Feel free to correct my interpretation, but you're saying it's okay for the AFR system to vary its durations in any way as long as said durations don't exceed the single card's durations?

If so I wouldn't necessarily agree with this.

Say for example the single card alternates between 45 FPS and 50 FPS each frame which translates into 22 ms vs 20 ms durations, or a 2 ms variance.

The AFR system alternates between 55 FPS and 110 FPS each frame which translates into 18 ms vs 9 ms durations, or a 9 ms variance.

Because the AFR system's durations never exceed that of the single card's, you're saying the AFR system is better off?

If so I wouldn't necessarily agree, as those kinds of framerate swings are far more likely to be noticed even if the durations are always lower with AFR.

Which do you think you're more likely to notice: 2 ms swings between each frame, or 9 ms swings that constantly cut your framerate in half and then rocket back up again?
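
Worked out explicitly (Python, same hypothetical numbers as above):

# Single card alternates 45/50 FPS; AFR alternates 55/110 FPS.
# Per-frame duration (ms) = 1000 / instantaneous fps.
single = [1000 / f for f in (45, 50) * 4]   # 22.2, 20.0, 22.2, ...
afr = [1000 / f for f in (55, 110) * 4]     # 18.2, 9.1, 18.2, ...

print(f"single swing: {max(single) - min(single):.1f} ms per frame")  # ~2.2 ms
print(f"AFR swing: {max(afr) - min(afr):.1f} ms per frame")           # ~9.1 ms
# AFR's durations are always shorter, yet its frame-to-frame
# variation is roughly four times larger.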

This is the definition of "micro-stutter".
No, the definition of micro-stutter is the lack of perceived smoothness (such as through irregularities in frametimes) even if the framerate is high. It may be that the durations are higher with AFR but that's not necessarily the case.
 

BFG10K

Lifer
Aug 14, 2000
The single card had 18.28ms and the SLI setup had 16.00ms.
Uh-huh, and what exactly do those two figures tell you? How can you make an inference about all of the frames based on those two figures?

Also, the durations in ms on the SLI setup are much lower than on a single card.
Of course they're lower, because the framerate is higher. FPS = frames / time. Since AFR has a higher framerate and you're rendering the same number of frames in both columns, the only way for the FPS to increase is for the time to decrease, hence the lower durations.

But that isn't the issue here, like I've repeatedly stated. The issue is the fluctuations between the durations.

Our graphs show that there isn't any "perfect" synchronization.
It was never claimed a single card would have a perfectly flat graph; such a thing isn't possible unless you cap the game below the system's minimum framerate.

The claim that was made was that the single card system delivers a flatter and more consistent framerate, and this has been repeatedly proven by both myself and two other websites.

They also show that it isn't the end of the world either, or as serious a problem as some make it out to be.
Does everyone notice the problem? Probably not.
Does micro-stutter exist in every game? Probably not.

But that doesn't mean the issue doesn't exist and that we should sweep it under the rug. It's a very real problem that some people notice, enough to put them off AFR.

I posted the 2nd run results intentionally because of the first 11 frames and the discrepancy. That was the "worst run". And again, I mentioned it could have been due to disk thrashing. Runs 3 and 4 were, as I mentioned before, more in line with frames 12 through 50 on my graph. And I'll post runs 3 and 4 later tonight. It takes quite a bit of time.
I think at this point you need to decide if you want us to use your figures or not. Don't post up something and then say 'well, the first 11 frames don't count so don't look at those'.

Heck, it could well be that Crysis isn't as badly affected by the problem, so I wouldn't bother anyway as I've already done Call of Juarez using the figures from ComputerBase:

http://img353.imageshack.us/im...3/microstutter2ls7.png

The graph confirms the issue clearly: the AFR system has much larger swings while the single card system delivers a flatter and more consistent line.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: BFG10K
The single card had 18.28ms and the SLI setup had 16.00ms.
Uh-huh, and what exactly do those two figures tell you? How can you make an inference about all of the frames based on those two figures?

Also, the durations in ms on the SLI setup are much lower than on a single card.
Of course they're lower, because the framerate is higher. FPS = frames / time. Since AFR has a higher framerate and you're rendering the same number of frames in both columns, the only way for the FPS to increase is for the time to decrease, hence the lower durations.

But that isn't the issue here, like I've repeatedly stated. The issue is the fluctuations between the durations.

Our graphs show that there isn't any "perfect" synchronization.
It was never claimed a single card would have a perfectly flat graph; such a thing isn't possible unless you cap the game below the system's minimum framerate.

The claim that was made was that the single card system delivers a flatter and more consistent framerate, and this has been repeatedly proven by both myself and two other websites.

They also show that it isn't the end of the world either, or as serious a problem as some make it out to be.
Does everyone notice the problem? Probably not.
Does micro-stutter exist in every game? Probably not.

But that doesn't mean the issue doesn't exist and that we should sweep it under the rug. It's a very real problem that some people notice, enough to put them off AFR.

I posted the 2nd run results intentionally because of the first 11 frames and the discrepancy. That was the "worst run". And again, I mentioned it could have been due to disk thrashing. Runs 3 and 4 were, as I mentioned before, more in line with frames 12 through 50 on my graph. And I'll post runs 3 and 4 later tonight. It takes quite a bit of time.
I think at this point you need to decide if you want us to use your figures or not. Don't post up something and then say 'well, the first 11 frames don't count so don't look at those'.

Heck, it could well be that Crysis isn't as badly affected by the problem, so I wouldn't bother anyway as I've already done Call of Juarez using the figures from ComputerBase:

http://img353.imageshack.us/im...3/microstutter2ls7.png

The graph confirms the issue clearly: the AFR system has much larger swings while the single card system delivers a flatter and more consistent line.

You are taking my entire meaning as if I am saying "microstutter does not exist". Why?
Who is sweeping anything under the rug? I am fully participating in this convo.

The graph confirms the issue as it really is. An issue, and not a serious issue.
And as for the graphs, sometimes the SLI setup has "less" of a swing than a single card. So what you are saying can be used both ways.

And as for my posting results and not expecting anyone to use them rhetoric, why do you only focus on the first 11 frames anyway, when I explained that runs 3 and 4 did not display this? Can you not see "after" frame 11, where things stabilize? I NEVER said the first 11 frames don't count, but I did express my thought about why they occurred. You need to look at the entire picture instead of just what you need, BFG. Do you have time to post graphs of your own? I know you don't have SLI, but you can at least run a single.

Furthermore, the OP stated that this severe microstutter exists in every game he tried. And the first game I tried, Crysis, proved that in reality, it's not anywhere near as bad as the OP states. I did nothing special. Just ran FRAPS and ran the bench. Recorded it. Done. And another member chimes in that my graphs are at the bottom end of microstutter examples. Interesting. I thought ALL games, no matter what drivers or settings, had the most severe case of it. Doesn't seem to be that way.
 

BFG10K

Lifer
Aug 14, 2000
The graph confirms the issue as it really is. An issue, and not a serious issue.
It's debatable how serious it is. It's obviously not serious for you, but it's serious enough for me to avoid AFR-style rendering.

And as for the graphs, sometimes the SLI setup has "less" of a swing than a single card.
Possibly, but on average all of the graphs we've seen so far demonstrate AFR has more fluctuations.

And as for my posting results and not expecting anyone to use them rhetoric, why do you only focus on the first 11 frames anyway, when I explained that runs 3 and 4 did not display this? Can you not see "after" frame 11, where things stabilize? I NEVER said the first 11 frames don't count, but I did express my thought about why they occurred. You need to look at the entire picture instead of just what you need, BFG.
I didn't focus on just the first 11 frames; I pointed out repeated examples of the issue happening past frame 11, plus I also calculated averages for all frames.

I even dropped the first 11 frames and still calculated a higher average variance with AFR (which coincidentally was quite similar to CoJ's average).

I'm not going to waste my time graphing it because (1) I'm still unsure whether you'd 'count' the graph and (2) I've already posted a graph for Call of Juarez that more than proves the issue.

Do you have time to post graphs of your own?
How many times do I need to link this before you click on it?

http://img353.imageshack.us/im...3/microstutter2ls7.png

I'd appreciate an acknowledgement you've seen it, especially the graph. Thanks. :)

Furthermore, the OP stated that this severe microstutter exists in every game he tried. And the first game I tried, Crysis, proved that in reality, it's not anywhere near as bad as the OP states.
The OP probably was exaggerating but the Crysis figures you posted did show a higher variance on average with AFR than without.

Now it could be that runs 3 & 4 won't show this, which is fair enough, but then I never claimed it happens in every game.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Crysis has nasty micro-stutter .. worse than any game i have explored
--even noted with a single GPU on Very-high :p
[unfortunately that 30 hr d/l was corrupt, so i will try again tomorrow :( .. it was with the ooB Crysis ]

 

nitromullet

Diamond Member
Jan 7, 2004
I used an online translator and they stated they used a single 8800 Ultra followed by using two cards. While they don't specifically state they used two 8800 Ultras, they'd be pretty stupid not to, and that website doesn't have a track record in stupidity.

I would agree with that, but why is the fps for the single card 29fps and the fps for the dual card setup 24 when their own benchmarks don't mirror this? They are either not comparing the same card in the single and multi-card setup, or they are using a specific snippet of the frame log to illustrate their point.

Because the AFR system's durations never exceed that of the single card's, you're saying the AFR system is better off?

I'm saying that if the duration for AFR never exceeds that of a single card, the AFR system is no worse off than the single card. They may both in fact stutter, depending on the length of a given duration.

But that isn't the issue here, like I've repeatedly stated. The issue is the fluctuations between the durations.

The issue is the length of the duration. Period. The fluctuation is only worth examining if the duration of the AFR system exceeds that of the single card system. If the duration is shorter for the AFR system than the single card, you will NEVER experience a drop in frames or micro-stutter with the AFR system that you would not also experience with the single card system.

Which do you think you're more likely to notice: 2 ms swings between each frame, or 9 ms swings that constantly cut your framerate in half and then rocket back up again?

It depends on the duration... If the durations between frames for the 2 ms swing were 30 ms and 32 ms, and the durations between frames for the 9 ms swing were 20 ms and 29 ms, the one with the lower durations is still providing you with the better experience.
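
In code form (Python, same hypothetical durations), that worst-gap argument looks like this:

# Claim: the longest gap between frames decides the experience.
# Worst-case perceived fps = 1000 / longest gap (ms).
steady = [30, 32] * 5   # 2 ms swing, but longer gaps
swingy = [20, 29] * 5   # 9 ms swing, but every gap is shorter

for label, durations in (("steady", steady), ("swingy", swingy)):
    worst = max(durations)
    print(f"{label}: worst gap {worst} ms -> worst-case ~{1000 / worst:.0f} fps")

# steady: worst gap 32 ms -> ~31 fps
# swingy: worst gap 29 ms -> ~34 fps (shorter gaps win despite the bigger swing)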

This is the definition of "micro-stutter".
No, the definition of micro-stutter is the lack of perceived smoothness (such as through irregularities in frametimes) even if the framerate is high. It may be that the durations are higher with AFR but that's not necessarily the case.

What do you think causes these irregularities? Hint: a longer than normal duration between frames.

If you were experiencing micro stutter with a multi gpu setup and the durations were consistently lower than a single gpu setup, the single gpu setup would be stuttering continuously.
 

v8envy

Platinum Member
Sep 7, 2002
Originally posted by: nitromullet


If you were experiencing micro stutter with a multi gpu setup and the durations were consistently lower than a single gpu setup, the single gpu setup would be stuttering continuously.

I don't think anyone's arguing this point -- outside of a few unsupported games where multi-GPU performance is lower than single GPU, your worst-case performance is equivalent to a single GPU, and it can only get better from there.

The problem is: the multi-GPU setup drops down to a single-GPU-like frame rate (read: duration between frames) fairly regularly. This problem is exacerbated when a 60 Hz fixed-frequency LCD is added into the mix.

Some people aren't sensitive to fast fast fast fast fast slow fast fast fast fast fast slow. Others find it worse than just slow slow slow slow slow in the first place.

It's one thing to know your GPU isn't capable of your target resolution and settings. It's quite another to think it is and occasionally experience otherwise.
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: v8envy
Originally posted by: nitromullet

If you were experiencing micro stutter with a multi gpu setup and the durations were consistently lower than a single gpu setup, the single gpu setup would be stuttering continuously.

I don't think anyone's arguing this point -- outside of a few unsupported games where multi-GPU performance is lower than single GPU, your worst-case performance is equivalent to a single GPU, and it can only get better from there.

The problem is: the multi-GPU setup drops down to a single-GPU-like frame rate (read: duration between frames) fairly regularly. This problem is exacerbated when a 60 Hz fixed-frequency LCD is added into the mix.

Some people aren't sensitive to fast fast fast fast fast slow fast fast fast fast fast slow. Others find it worse than just slow slow slow slow slow in the first place.

It's one thing to know your GPU isn't capable of your target resolution and settings. It's quite another to think it is and occasionally experience otherwise.


I guess in that case I'll have to side with Rollo...

Originally posted by: nRollo

At 25X16, I guarantee you what happens with single GPU is more like "full tilt stutter" than "micro stutter". I've pretty much got to have Crossfire or SLi- I'm not going back to low res just to avoid "micro stutter".

I'm starting to get the impression that this whole debate is more appropriately targeted towards those that would opt for dual 9600GTs or HD3850s over a single 9800GTX, and not, say, 9800GTX vs 9800GTX SLI.

I would 100% agree that if you have some random multi-gpu setup that has roughly the same fps as some random single card, then the single card would be the better bet (for more reasons than just duration between frames). However, the best use for multi-gpu setups is when there is no single gpu that can keep up with the multi-gpu setup. This is known, and has been the consensus since multi-gpu was re-introduced to the mainstream by NVIDIA a few years ago.
 

Datenschleuder

Junior Member
Apr 17, 2008
nitromullet:

I think this post already made the point...

Originally posted by: Angriff
Most SLI/XFire users increase the graphics settings and target the same frame rate as with a single card. So the micro stutters are a big deal there.

And this problem is relevant for you even if you only bought SLI/XFire to get from 60 to 100 FPS, because the quality of those 100 FPS is of course reduced by this as well.

Keysplayr's log is really at the bottom end of the possible extent of micro-stutter you can get.
I often, even regularly, get frames that take more than twice as long as the previous ones in 3DMark and in many games with high quality settings in AFR mode.
But this can be somewhat dependent on the settings and the test run itself.
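
A trivial way to flag those events in a FRAPS-style frametime log (Python, made-up numbers):

# Flag any frame gap that is more than 2x the gap before it.
gaps = [12, 11, 25, 10, 12, 30, 11, 13]   # hypothetical per-frame durations (ms)
spikes = [(i, g) for i, (prev, g) in enumerate(zip(gaps, gaps[1:]), start=1)
          if g > 2 * prev]
print(spikes)   # [(2, 25), (5, 30)] -> gaps 2 and 5 are >2x their predecessor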

The computerbase.de article shows this quite well.

I definitely agree that much more attention should be brought to this matter!
Hopefully we will see an article at Anandtech!

Inhomogeneous frame rates reduce the quality of the nominal frame rate.
So judging the potency of SLI/CrossFire just by frame rate alone is not enough.

That's the whole point.
 

idiotekniQues

Platinum Member
Jan 4, 2007
i hope attention is brought to this so that ati and nvidia are forced to fix it with driver updates. the less attention is on this, the more they think they can get away with letting it slide.

 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: Datenschleuder
nitromullet:

I think this post already made the point...

Originally posted by: Angriff
Most SLI/XFire users increase the graphics settings and target the same frame rate as with a single card. So the micro stutters are a big deal there.

And this problem is relevant for you even if you only bought SLI/XFire to get from 60 to 100 FPS, because the quality of those 100 FPS is of course reduced by this as well.

Keysplayr's log is really at the bottom end of the possible extent of micro-stutter you can get.
I often, even regularly, get frames that take more than twice as long as the previous ones in 3DMark and in many games with high quality settings in AFR mode.
But this can be somewhat dependent on the settings and the test run itself.

The computerbase.de article shows this quite well.

I definitely agree that much more attention should be brought to this matter!
Hopefully we will see an article at Anandtech!

Inhomogeneous frame rates reduce the quality of the nominal frame rate.
So judging the potency of SLI/CrossFire just by frame rate alone is not enough.

That's the whole point.

see .. we already knew this .. we just had to give a "reason" to the "why"
- it is a "quality" thing .. and SLi or Crossfire AA modes reduce or remove it; Xfire AA works for me at 16x10 except on Crysis - which is a "micro jitter-fest" on 'very high' - even with a single GPU [amd or nv]; multi-GPU makes me nauseous .. i play it with the details down and it is OK; turn on AA and it is better but FPS tank on the 2900.

thanks for the Thread .. it's the thought that counts

 

imported_Alx

Junior Member
Apr 27, 2005
Whew, having finally read through the whole thread, I must extend heartfelt thanks to the OP for posting this and saving me some money. I've contemplated SLi/Crossfire many times, and probably would have gotten it already if only I had a beefier PSU.

At this point I'm ready to file the statement "I can't see microstutter!" alongside all-time greats such as "I can't see ghosting on my LCD!" and "I can't see the difference beyond 30fps!". That should be a studied phenomenon on its own, how eyes seem to change function when looking at hard-earned possessions.

One point that's been brought up and that I want to dispute is that multi-GPU will give you higher minimum FPS at the cost of consistent FPS. As bad as that is in itself, it looks clear to me that in the worst-case scenario your minimum FPS will actually be worse than on a single card. Most games don't benefit from a 100% increase in FPS when running in SLi, but 100% of the frames are split evenly between two cards. So if you're only getting a 50% increase from the second GPU, but the delay between the slow frames is 70% longer, your minimum FPS is in fact lower than without SLi!
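
To put rough numbers on it (Python, assumed figures), here's how average FPS can rise while the instantaneous minimum FPS falls:

# Toy numbers: single card renders uniformly at 30 fps.
single_gap = 1000 / 30                    # ~33.3 ms between every frame

# AFR averages 45 fps (+50%), but frames arrive in uneven pairs.
pair_ms = 2 * 1000 / 45                   # ~44.4 ms per pair of frames
short_gap = 8.0                           # assumed split within each pair
long_gap = pair_ms - short_gap            # ~36.4 ms

print(f"single: every gap {single_gap:.1f} ms -> min ~{1000 / single_gap:.0f} fps")
print(f"AFR: longest gap {long_gap:.1f} ms -> min ~{1000 / long_gap:.0f} fps")
# AFR averages 45 fps, yet its worst gap (~36 ms) exceeds the single card's
# 33.3 ms, so the instantaneous minimum fps is actually lower (about 27).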

This thread has also been educational in demonstrating that when fiercely defending a purchase, no distance is too far and no eye is too blind. Watching apoppin jump from "ha! tinfoil hat theory" to "it's just your setup" to "you don't get it, it's about IQ" to "this is just a PR war" has been both thoroughly entertaining and illuminating. Here's a hug apoppin, it'll be ok.
 

ChrisRay

Junior Member
Oct 21, 2005
Originally posted by: BFG10K
These checksums were put in place based upon feedback (by me and others in the SLI community) that Nvidia needed some kind of checksum restore point.
I don't buy that response for a second.

If it's for a "checksum restore point", why do the profiles cease to work as soon as someone modifies one of nVidia's preset values? At that point I'm not restoring anything; I just want my modified profile to work, but it doesn't.

If the original checksum isn't modified (which it isn't) you can always restore back to the default profile, so why disable the profile entirely?

It's painfully obvious nVidia didn't want people modifying their preset values so they tried to put a lock on them, much like when they started encrypting their drivers to stop Unwinder defeating their application detection back during the FX days.

I actually have Nhancer stickied over at slizone and I completely endorse the program, and Nvidia has absolutely zero problem with that. I have actually done my very best to keep grestorm updated with the latest goings-on of SLI and driver changes.
The question is, does nVidia completely endorse the program? More specifically do they endorse modifying preset values?

Based on the checksum fiasco it appears not.

You can choose not to believe it, but that doesn't make it true. Nvidia has checksums in place to allow modification and a direct restore. They obviously want you to modify it. You can do this through nvapps.xml amongst other things.

The drivers also support a default rendering profile, which allows the profile restore function to work. You can choose to modify this at your will. Currently Nhancer doesn't let you "modify" the default values because there's no reason to, as it would break profile restoration. These values are also held in the registry in case the profile is lost/deleted.

All profiles now look like this.

<APPLICATION Label="prey.exe"/>
<PROPERTY Label="multichip_rendering_mode" Value="0x00000001" Default="0x00000001" Itemtype="predefined"/>
<PROPERTY Label="multichip_ogl_options" Value="0x00080001" Default="0x00080001" Itemtype="predefined"/>
<PROPERTY Label="get_error" Value="0x00000008" Default="0x00000008" Itemtype="predefined"/>

The Default value represents the restore point, and the first multi-GPU rendering value represents the modifiable point. Why add a restore function if they did not anticipate users changing these values?
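
If you want to see how mechanical the restore is, here's a rough sketch (Python; it assumes profiles live in an nvapps.xml shaped like the snippet above, which may not match the real file exactly):

import xml.etree.ElementTree as ET

# Rough sketch: reset every predefined property back to its Default value,
# assuming PROPERTY elements carry Value/Default attributes as shown above.
tree = ET.parse("nvapps.xml")
for prop in tree.iter("PROPERTY"):
    if prop.get("Itemtype") == "predefined" and prop.get("Value") != prop.get("Default"):
        prop.set("Value", prop.get("Default"))
tree.write("nvapps.xml")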

Anyway, believe what you want; it doesn't make you right. Grestorm and I communicate with each other regularly, and I use Nhancer as a baseline for communicating SLI functionality with Nvidia.

 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: Datenschleuder
Inhomogeneous frame rates reduce the quality of the nominal frame rate.
So judging the potency of SLI/CrossFire just by frame rate alone is not enough.

That's the whole point.

If that is the whole point, what is the point? :)

We already know that buying a multi-gpu setup that benchmarks roughly equal to a single gpu is not preferable to buying the single gpu. Not only is performance not as consistent with a multi-gpu setup, but there is an additional level of overall system and driver complexity that makes the multi-gpu setup less desirable. The only time I have ever heard anyone knowledgeable recommend a multi-gpu setup is when there was no single gpu available that could offer the same level of performance.

If that is the extent of this great service you are trying to provide here, all I have for you in response is, "/yawn".
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: n7
Not sure it's quite as important to people as framerate, simply because not everyone can even see the stutter.

And i wouldn't entirely agree with wording it as a rip-off, even though SLI/CF are generally indeed ripoffs :p
I'd say a better way to say this is that due to numerous drawbacks, SLI/CF isn't for everyone.

It's like my DLP projector.
Sure, some people can see the rainbow effect, but i can't.
So to some, it's crap, or certainly not worth what i paid.
To me, it's definitely worth it.

Now yes, i returned my HD 3870 X2 due to being unhappy with how it performed, & kept my 8800 GTX.

I also don't really recommend SLI/CF to people for various reasons, but for some people, it's a good fit, & if they're happy with it, that's all that matters.

I don't understand the reasoning behind this reply. Who CARES what "some people" (aka the mythical "they") can or cannot see? YOU cannot see the downside, but there isn't an upside either: it's an artificial FPS counter enhancer that increases the number reported in tests while not improving actual smoothness, so why would you pay extra for nothing? And according to you, you cannot perceive either.

This reminds me of a test I heard about with $20K audio cables... they tested them electrically and found that the $20K audio cables are identical to $200 cables, and both are significantly better than cheaper cables... but a lot of people insisted that the cable they were TOLD cost $20K sounded better. (They lied to half of them, telling them the $200 cable was the $20K one.)
 

Datenschleuder

Junior Member
Apr 17, 2008
Originally posted by: nitromullet
If that is the whole point, what is the point? :)

We already know that buying a multi-gpu setup that benchmarks roughly equal to a single gpu is not preferable to buying the single gpu. Not only is performance not as consistent with a multi-gpu setup, but there is an additional level of overall system and driver complexity that makes the multi-gpu setup less desirable. The only time I have ever heard anyone knowledgeable recommend a multi-gpu setup is when there was no single gpu available that could offer the same level of performance.

If that is the extent of this great service you are trying to provide here, all I have for you in response is, "/yawn".
Maybe YOU know that now, but this issue is unknown to most people.
And even those who are affected and did notice the issue are mostly unaware of the cause.

Or can you show me even a single English-speaking hardware review site that measures and points out the problem of inhomogeneous frame times with AFR?!
 

Lithan

Platinum Member
Aug 2, 2004
Originally posted by: phexac
This is interesting, but the reason I have stayed away from SLI is that 1 high-end card is enough to run pretty much every game out there at max settings at a good framerate. So why do I need a 2nd card? The one exception is Crysis. Good job making a game that no one can play at max settings for the next 1-2 years, SLI or not. I'll be buying that game when there are cards that can actually run it at max.

Good luck. Farcry is STILL among the lowest-fps games I own, and it looks like garbage by today's standards. Crytek's engines are like a 9000 lb gorilla: they put a bunch of eye candy in 'em, don't care that nothing runs 'em, and six months later a better-optimized (or tweaked old) engine comes out that looks better and actually runs. The fact that they are so caught up in impressing you with their engine that they render a FoV of about forty square miles probably doesn't help. Bioshock looks better than Crysis, and I get 100+ fps in it with max settings. It's all indoors, but even so, that's clearly a much better engine (UT3) than Crytek's.
 

Jax Omen

Golden Member
Mar 14, 2008
I definitely second the "Bioshock looks better than Crysis, and runs infinitely better while doing it" motion.
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: Datenschleuder
Originally posted by: nitromullet
If that is the whole point, what is the point? :)

We already know that buying a multi-gpu setup that benchmarks roughly equal to a single gpu is not preferable to buying the single gpu. Not only is performance not as consistent with a multi-gpu setup, but there is an additional level of overall system and driver complexity that makes the multi-gpu setup less desirable. The only time I have ever heard anyone knowledgeable recommend a multi-gpu setup is when there was no single gpu available that could offer the same level of performance.

If that is the extent of this great service you are trying to provide here, all I have for you in response is, "/yawn".
Maybe YOU know that now, but this issue is unknown to most people.
And even those who are affected and did notice the issue are mostly unaware of the cause.

Or can you show me even a single English-speaking hardware review site that measures and points out the problem of inhomogeneous frame times with AFR?!

I gotta agree with this point... I have NEVER heard of the problem... I performed a detailed analysis of vsync and came to the conclusion that AFR is non-beneficial, and posted it last week. I have NEVER heard of it ANYWHERE before.
There were many people who complained about issues with micro-stutter... but before that, nobody that I know of ever made the accusation that microstutter is an inherent fault of AFR.

It is nice to see more threads dissecting the issue. This is definitely something that Anand should perform an in-depth investigation of. But as far as I can tell you should NEVER EVER use AFR.

SFR is still perfectly safe and usable though...
 

Datenschleuder

Junior Member
Apr 17, 2008
Yes, but I never found SFR useful with my SLI system.
It always performs much worse or doesn't work at all.

Also nearly all game profiles I can see use AFR.
 

taltamir

Lifer
Mar 21, 2004
ah... better to get a better numerical score than better actual performance... go figure...
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: taltamir


It is nice to see more threads dissecting the issue. This is definitely something that anand should perform an in depth investigation of. But as far as I can tell you should NEVER EVER use AFR.

SFR is still perfectly safe and usable though...

Get yourself a second card and SLI them. Then tell me you should never ever use AFR.

Don't have an SLI mobo? Or a power supply that can handle a second card? (I believe you have a 8800GT or GTS 512 right?) Well then, I'm not going to tell you to spend extra money for the whole SLI setup.

You'd have to experience it for yourself, I guess, to see how blown out of proportion this whole thing is. Like I said before, there is an issue, just nowhere near as terrible as the OP, or others, suggest.


 

taltamir

Lifer
Mar 21, 2004
the issue is that AFR provides artificially increased FPS with the same smoothness as a single card. It is a waste of money. I am waiting to actually see someone contradict some of the SLI theory I posted in a way that invalidates this conclusion.

I don't see how you can blow out of proportion something like "AFR gives higher MEASURED performance than one card, but the same ACTUAL performance". This indicates that the entire thing is all smoke and mirrors and one big ripoff and waste of money.