CrossFire and SLI frame rates do not reflect reality because of a lack of synchronization!


Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Datenschleuder
Originally posted by: keysplayr2003
More fps is more fps no matter how you slice it.
The whole point is that the FPS gain becomes insignificant when the frames are not updated homogeneously.

Frame #0: 00.0 ms
Frame #1: 00.1 ms should be: 16.6 ms
Frame #2: 33.3 ms

So in this (exaggerated example):

- the frame rate says 60 FPS (1000 ms / (33.3 ms / 2))
- but the reality is that it is only 30 FPS (1000 ms / 33.3 ms)

Because Frame #1 update is not significant (it follows almost immediately after the previous frame), and Frame #2 takes 33.2ms to update, which equals ~30 FPS.

Read my first post, where the issue is explained with a real example.

That's terrific. Now what does this all mean to the end user, realistically?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CP5670
Well, I think the OP is correct and am pretty sure I've seen this issue myself, but I do agree that the way he presented this topic and is arguing with everyone is hurting his credibility.

Has it occurred to anyone that if AFR actually performed worse, one reputable tech site would have noticed it by now? This may be being discussed by some guys "on the German forums", but all I have to do is disable GPUs or take out a card to see things run choppy at my settings with a single GPU and run great with Multi.

They probably have not actually looked. It's fairly subtle if you don't know about it and not something that would be readily apparent while just letting the benchmarks run. The reviewer would have to be sitting at the computer the whole time and looking for things like this. It's certainly worth bringing to Derek's attention in any case. I would like to see one of the major hardware sites look into this and related issues, whether they confirm or debunk them.

/\ This.

The OP has a point about how stats are reported, but I believe most of this information has already been widely available. It's unrealistic for most sites/magazines/etc that review hardware to take up time, energy, and column inches breaking down the basics of SLI again and again.

I've never seen this reported by any hardware site until that PCGH article. I had my suspicions for a while and some other posters here had made comments to the effect over the years (most recently n7 in his 3870 X2 thread), but that was about it.

PM DerekWilson if you are really interested :p
-or use the email address in his profile

i think there is a LOT more interesting things to review first for 95% of us and the General Public.

However, it would be nice to have a list of "fixes" in one place
- i am NOT volunteering as i do not see ANYTHING like you guys are reporting with AMD drivers .. although - in the PAST - they were much more prominent

which then leads me to *speculate* that AMD's multi-GPU rendering is NOT "strict AFR" .. as i believe they hinted .. a long time ago

 

CP5670

Diamond Member
Jun 24, 2004
5,517
592
126
Originally posted by: apoppin
PM DerekWilson if you are really interested :p
-or use the email address in his profile

I just did, actually. I think he'll agree that it's at least worth looking into, but it will probably be some time before we see the results in an actual article.
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
I don't understand alternate frame rendering. How can the game know what the next frame is going to be like when it hasn't reached that point yet? If it's just rendering the same frame again, why render anything?
Is it just me or does everyone ignore this?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Throckmorton
I don't understand alternate frame rendering. How can the game know what the next frame is going to be like when it hasn't reached that point yet? If it's just rendering the same frame again, why render anything?
Is it just me or does everyone ignore this?

It's not like that. The driver can report that the frame is done and that it's ready for another frame while GPU1 is halfway done with its work. Then GPU 2 can start rendering whatever the current frame is while GPU1 is not done yet, knowing full well that GPU1's frame will be displayed before GPU 2 finishes.

_ _ _ _ _ _ _ _
*_ _ _ _ _ _ _ _

Of course the next frame could be more or less complex than the previous one, so you're not guaranteed to have this perfect interleaving. Also the driver may not be able to guesstimate perfectly where the half-way mark of the current frame rendering is. That's why you get results like the OP posted, where the subsequent frame comes much sooner than desired, and doesn't really add much to the experience while still showing a very high frame rate to FRAPS or the benchmark. Then the following frame shows up much later than wanted, running at basically single GPU smoothness for that frame interval and making the user experience stutter.

Add in LCDs running at a fixed 60 Hz and you get a really difficult problem to solve perfectly.
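
To make the pacing point concrete, here is a minimal toy model (my own sketch with hypothetical 33.3 ms render times, not anything measured in this thread): two GPUs alternating frames only produce evenly spaced output when the second GPU's dispatch is offset by roughly half a frame time, while a badly guessed offset gives exactly the kind of bunched output the OP logged.

# Toy AFR pacing model (hypothetical numbers): each GPU needs 33.3 ms per frame,
# so two GPUs can deliver ~60 FPS only if their dispatches are offset by ~half a frame.
def afr_display_times(frame_time_ms, offset_ms, n_frames=8):
    """Completion times when GPU 0 starts at 0 ms and GPU 1 starts at offset_ms."""
    times = []
    for i in range(n_frames):
        start = (i // 2) * frame_time_ms + (offset_ms if i % 2 else 0.0)
        times.append(start + frame_time_ms)
    return times

for offset in (16.65, 1.0):  # well-paced offset vs. badly guessed offset
    t = afr_display_times(33.3, offset)
    gaps = [round(b - a, 1) for a, b in zip(t, t[1:])]
    print(f"offset {offset:5.2f} ms -> frame gaps {gaps}")
    # offset 16.65 -> all gaps ~16.6 ms (smooth 60 FPS)
    # offset  1.00 -> gaps alternate ~1 ms / ~32 ms (60 FPS on paper, feels like 30)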

SFR is the only way to fly for multi-GPU. The sooner GPU makers realize this, the sooner picky buyers like myself will accept multi-GPUs.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: keysplayr2003
Originally posted by: Datenschleuder
Originally posted by: keysplayr2003
More fps is more fps no matter how you slice it.
The whole point is that the FPS gain becomes insignificant when the frames are not updated homogeneously.

Frame #0: 00.0 ms
Frame #1: 00.1 ms should be: 16.6 ms
Frame #2: 33.3 ms

So in this (exaggerated example):

- the frame rate says 60 FPS (1000 ms / (33.3 ms / 2))
- but the reality is that it is only 30 FPS (1000 ms / 33.3 ms)

Because Frame #1 update is not significant (it follows almost immediately after the previous frame), and Frame #2 takes 33.2ms to update, which equals ~30 FPS.

Read my first post, where the issue is explained with a real example.

That's terrific. Now what does this all mean to the end user, realistically?

You probably should look at that again...

Basically, what he is showing is that if the frames aren't generated at roughly the same interval, the effect is that the perceived fps drops even though mathematically the fps remains the same.

He's saying that for you to have 60fps, you need to have a frame every 16.6ms on average over the time span of 1 second. You will only perceive the 60fps as 60fps if every frame is rendered as closely to the 16.6ms interval as possible. You want your frames to be like this (in ms):

1...16.6...33.3...49.9...66.5 this gives you a constant 60fps

what he is saying can happen with AFR is that your frames might look like this:

1...33.2...33.3...66.4...66.5 on average this gives you 60fps, but the interval between every second frame is as long as the interval for 30fps, so your fps would feel like 30fps.

This is an extreme example of this and most likely you won't have such a variance in intervals between frames, but this does help to illustrate his point.

As far as what this translates to in the real world... I doubt the intervals are actually that large. I also agree with apoppin that the whole thing is a compromise, and that the increase in overall fps with dual card setups allows for greater eye candy and thus outweighs the negative effect of micro stutter in my experience.
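
To put nitromullet's two sequences into code, here is a minimal sketch (my own, using the hypothetical timestamps quoted above) that computes the average FPS and the frame-to-frame gaps for the even and the AFR-bunched cases:

# Hypothetical frame timestamps (ms) from the example above.
even_ms    = [1.0, 16.6, 33.3, 49.9, 66.5]   # evenly paced frames
bunched_ms = [1.0, 33.2, 33.3, 66.4, 66.5]   # AFR-style bunched frames

def summarize(times_ms):
    gaps = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg_fps = 1000.0 * len(gaps) / (times_ms[-1] - times_ms[0])
    worst_fps = 1000.0 / max(gaps)             # pacing of the slowest gap
    return gaps, avg_fps, worst_fps

for name, seq in (("even", even_ms), ("bunched", bunched_ms)):
    gaps, avg_fps, worst_fps = summarize(seq)
    print(f"{name:8s} gaps={gaps} avg={avg_fps:.0f} fps, worst gap pace={worst_fps:.0f} fps")

Both sequences average roughly 60 FPS, but the bunched one contains ~33 ms gaps, i.e. it paces like 30 FPS on every second frame, which is exactly the micro-stutter argument.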

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nitromullet
Originally posted by: keysplayr2003
Originally posted by: Datenschleuder
Originally posted by: keysplayr2003
More fps is more fps no matter how you slice it.
The whole point is that the FPS gain becomes insignificant when the frames are not updated homogeneously.

Frame #0: 00.0 ms
Frame #1: 00.1 ms should be: 16.6 ms
Frame #2: 33.3 ms

So in this (exaggerated example):

- the frame rate says 60 FPS (1000 ms / (33.3 ms / 2))
- but the reality is that it is only 30 FPS (1000 ms / 33.3 ms)

Because Frame #1 update is not significant (it follows almost immediately after the previous frame), and Frame #2 takes 33.2ms to update, which equals ~30 FPS.

Read my first post, where the issue is explained with a real example.

That's terrific. Now what does this all mean to the end user, realistically?

You probably should look at that again...

Basically, what he is showing is that if the frames aren't generated at roughly the same interval, the effect is that the perceived fps drops even though mathematically the fps remains the same.

He's saying that for you to have 60fps, you need to have a frame every 16.6ms on average over the time span of 1 second. You will only perceive the 60fps as 60fps if every frame is rendered as closely to the 16.6ms interval as possible. You want your frames to be like this (in ms):

1...16.6...33.3...49.9...66.5 this gives you a constant 60fps

what he is saying can happen with AFR is that your frames might look like this:

1...33.2...33.3...66.4...66.5 on average this gives you 60fps, but the interval between every second frame is as long as the interval for 30fps, so your fps would feel like 30fps.

This is an extreme example of this and most likely you won't have such a variance in intervals between frames, but this does help to illustrate his point.

As far as what this translates to in the real world... I doubt the intervals are actually that large. I also agree with apoppin that the whole thing is a compromise, and that the increase in overall fps with dual card setups allows for greater eye candy and thus outweighs the negative effect of micro stutter in my experience.


This is correct, but it is also true that a single GPU has the problem, just to a lesser extent. If you draw 57 frames in the first 500ms and 3 more frames in the second 500ms, it will look extremely jerky, despite the fact that you are running 60 frames per second. This CAN and DOES happen in both single and multi GPU scenarios. This is why frames per second can be misleading, even when talking minimums. You would probably get far more accuracy if you dropped it down to 1/10th of a second intervals and used the term frames per 100ms or something like that. It doesn't roll off the tongue quite the same, but it would be far better for painting the 'real' picture on smoothness. Something to aim for would be 6 frames per 100ms and have it never drop below 2 frames per 100ms.
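
ArchAngel777's per-window idea is easy to mock up. Here is a minimal sketch (my own, with hypothetical timestamps rather than a real frame log) that buckets frame display times into 100 ms windows, so the "57 frames, then 3 frames" case shows up immediately even though the one-second average is still 60 FPS:

# Count frames per 100 ms window from a list of frame display timestamps (ms).
def frames_per_window(timestamps_ms, window_ms=100.0):
    if not timestamps_ms:
        return []
    n_windows = int(timestamps_ms[-1] // window_ms) + 1
    counts = [0] * n_windows
    for t in timestamps_ms:
        counts[int(t // window_ms)] += 1
    return counts

# Hypothetical log: 57 frames bunched into the first 500 ms, then 3 frames in the next 500 ms.
jerky = [i * (500.0 / 57) for i in range(57)] + [500.0, 700.0, 900.0]
print(frames_per_window(jerky))   # -> [12, 11, 12, 11, 11, 1, 0, 1, 0, 1]

The per-second count still says 60 frames, but the per-window counts make the stall in the second half obvious.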
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: nitromullet
Originally posted by: keysplayr2003
Originally posted by: Datenschleuder
Originally posted by: keysplayr2003
More fps is more fps no matter how you slice it.
The whole point is that the FPS gain becomes insignificant when the frames are not updated homogeneously.

Frame #0: 00.0 ms
Frame #1: 00.1 ms should be: 16.6 ms
Frame #2: 33.3 ms

So in this (exaggerated example):

- the frame rate says 60 FPS (1000 ms / (33.3 ms / 2))
- but the reality is that it is only 30 FPS (1000 ms / 33.3 ms)

Because Frame #1 update is not significant (it follows almost immediately after the previous frame), and Frame #2 takes 33.2ms to update, which equals ~30 FPS.

Read my first post, where the issue is explained with a real example.

That's terrific. Now what does this all mean to the end user, realistically?

You probably should look at that again...

Basically, what he is showing is that if the frames aren't generated at roughly the same interval, the effect is that the perceived fps drops even though mathematically the fps remains the same.

He's saying that for you to have 60fps, you need to have a frame every 16.6ms on average over the time span of 1 second. You will only perceive the 60fps as 60fps if every frame is rendered as closely to the 16.6ms interval as possible. You want your frames to be like this (in ms):

1...16.6...33.3...49.9...66.5 this gives you a constant 60fps

what he is saying can happen with AFR is that your frames might look like this:

1...33.2...33.3...66.4...66.5 on average this gives you 60fps, but the interval between every second frame is as long as the interval for 30fps, so your fps would feel like 30fps.

This is an extreme example of this and most likely you won't have such a variance in intervals between frames, but this does help to illustrate his point.

As far as what this translates to in the real world... I doubt the intervals are actually that large. I also agree with apoppin that the whole thing is a compromise, and that the increase in overall fps with dual card setups allows for greater eye candy and thus outweighs the negative effect of micro stutter in my experience.

Extreme example indeed. I would imagine that this happens with single cards as well.
EDIT: Ah, what apoppin said above.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
my take on the whole thing... if you can't notice the issue, you can't notice the "low" FPS of a cheaper system. The extra cards in multi GPU don't give you a smoother experience, they give you the same smoothness with a higher FRAPS score. The "I don't notice my $1000 system is not any faster than a $300 system, so it's money well spent" argument is flawed at best. The reviews that depict the $1000 system as actually being faster are cheating customers out of their money.

split frame rendering being the only exception where multi-GPU is useful.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: taltamir
my take on the whole thing... if you can't notice the issue, you can't notice the "low" FPS of a cheaper system. The extra cards in multi GPU don't give you a smoother experience, they give you the same smoothness with a higher FRAPS score.


That's not quite right either. On average the multi-GPU system will give you a smoother experience than an equivalent single GPU system, it's only in the worst case where the two are 'tied.' Excluding the case where dual GPUs perform worse than a single in unsupported games, of course.

The multi-GPU fans are saying this is an improvement -- higher frame rates for 9 frames out of 10 are still faster frame rates.

Years ago I preferred a steady yet higher latency of DSL (about 30ms) to a much lower yet wildly fluctuating latency of cable (3-300ms, about 80% of the time under 30ms, on average outperforming DSL by a landslide). I could compensate for a known latency, but fluctuations are hard to deal with. Some people have the same issue with AFR.

After technology improved I'm back with cable -- I get the lower latency and it doesn't vary wildly. That'd be SFR for the multi-GPU equivalent.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
my take on the whole thing... if you can't notice the issue, you can't notice the "low" FPS of a cheaper system. The extra cards in multi GPU don't give you a smoother experience, they give you the same smoothness with a higher FRAPS score. The "I don't notice my $1000 system is not any faster than a $300 system, so it's money well spent" argument is flawed at best. The reviews that depict the $1000 system as actually being faster are cheating customers out of their money.

split frame rendering being the only exception where multi-GPU is useful.

are you using Crossfire or SLi?
:Q



i'd like to compare rendering now .. i have a better experience with AMD multi-GPU to report than you do

btw, i *also* noticed very minor micro-instability even with a *single GPU* - and more with my GTS and even a GTX than with AMD cards i owned. It's a tiny bit more with NVIDIA than i noticed with my 2900xt or x1950p/-512M - Particularly in canned benches like 3DMark06 - but you have to look SO closely .. and it also could just be me and my old inclination to just favor ATI IQ; btw - as to IQ, NOT anymore!
--that makes it "subjective" in my book, unless someone wants to record hundreds of GBs of FRAPS and try to set them side-by-side - also while analyzing the Graphs and checking all the Windows tools [hoping they don't interfere with your testing results]

Sounds like my kind of "fun" .. mmm self-torture ... with the constantly burning red-eyes being a specialty of mine
.. not!
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
I agree with the OP for the most part.

I'm not saying it's a big deal to everyone, but i noticed very quickly how unsmooth Crossfire felt vs. my single GTX.

I realize many people (likely the majority actually) do not notice this.

I saw in my own use & testing that it's also not something that can be measured in fps, as fps was higher on average w/ my 3870 X2 vs. my GTX, yet gameplay was not nearly as smooth IMO.

 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
You can't directly compare the 3870X2 and the GTX in this context, although I am sure the micro-stutter exists on the 3870X2 as well. This isn't really about average FPS being high but minimum being low, etc. It's arguably harder to notice this than 'shimmering', in my opinion. A lot of users will not see this until it is pointed out (that was the case for me), but it surely can annoy people with sensitive eyes.
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
Problem is, it felt way off even when fps weren't dropping badly (which was indeed another huge issue also).

I was finding that in Crysis, e.g., i'd be merrily cruising w/ 40 fps, yet it felt like it was chugging like crap.

As i mentioned, i'm pretty sure that most people do not notice what i did w/ my 3870 X2, or they'd be agreeing with the OP, rather than claiming BS.

I suspect it's the same sorta thing as LCD input lag...some people can see it, others can sorta, others cannot at all.

Granted, i might be completely wrong...but wrong or not, there's even less of a chance than ever before that i will be running CF/SLI ever again until it gets "fixed" for crazy people like myself ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
No offense, but I don't buy this.
Why? It's a documented phenomenon.

I can certainly understand some people aren't bothered by it (e.g. I'm generally not bothered by tearing with vsync off), but that doesn't mean it doesn't happen.

The whole "framerates are much higher but the actual performance is much lower" thing just isn't true. I've been running SLi since there's been SLi, on lots of different computers. I know for a fact that I've run many high AA, high res settings that no sinlge GPU solution could come close to, and run them silky smooth.
The problem is the term "performance" is a multi-faceted description. AFR gives a high framerate at the expense of smoothness and input response. With micro-stuttering a given framerate doesn't feel as smooth as it does without, so a single card @ 60 FPS will generally feel smoother than a multi-card @ 60 FPS.

I imagine there is deviation from the time of rendering the frames between AFR and single card, but I don't think there is any "scandal" going on.
I wouldn't call it a scandal either but it's definitely not in the public eye like it should be, much like the shimmering issue or driver problems in TWIMTBP titles that have been going on for almost two years.

Has it occurred to anyone that if AFR actually performed worse, one reputable tech site would have noticed it by now?
You mean like the shimmering issues which were never in the public eye? Ironically it was a German website that first devoted articles to the issue.
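
One way to put a number on BFG10K's point that the same framerate can feel less smooth: a minimal sketch (my own, with hypothetical frame times; the "unevenness" ratio is just the standard deviation of the frame times over their mean, not a metric taken from any review site, just a quick way to show the difference):

from statistics import mean, pstdev

single_card_ms = [16.6] * 60            # steady pacing
afr_card_ms    = [32.2, 1.0] * 30       # same average frame time, bunched pacing

def report(name, frametimes_ms):
    avg_fps = 1000.0 / mean(frametimes_ms)
    unevenness = pstdev(frametimes_ms) / mean(frametimes_ms)   # 0 = perfectly smooth
    print(f"{name}: {avg_fps:.0f} FPS average, unevenness {unevenness:.2f}")

report("single", single_card_ms)   # ~60 FPS, unevenness 0.00
report("AFR",    afr_card_ms)      # ~60 FPS, unevenness ~0.94

Both report about 60 FPS, yet one of them spends half of each pair of frames waiting ~32 ms, which is the gap that a plain FRAPS average hides.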
 

CP5670

Diamond Member
Jun 24, 2004
5,517
592
126
Granted, i might be completely wrong...but wrong or not, there's even less of a chance than ever before that i will be running CF/SLI ever again until it gets "fixed" for crazy people like myself :p

The thing is, picky people like us are also the ones who would be more likely to get these multi GPU setups if we weren't aware of these flaws. :p
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
Originally posted by: CP5670
Granted, i might be completely wrong...but wrong or not, there's even less of a chance than ever before that i will be running CF/SLI ever again until it gets "fixed" for crazy people like myself :p

The thing is, picky people like us are also the ones who would be more likely to get these multi GPU setups if we weren't aware of these flaws. :p

So true :(
 

n7

Elite Member
Jan 4, 2004
21,303
4
81
Okay, i finally actually noticed the video link, & watched the video

That's exactly what i was experiencing w/ my 3870X2!

I imagine not everyone can see it in the video, but to me it's so clear...it looks like little stuttering/chugging constantly with CF on vs. off in that video.

I was driving the hummer in Crysis & was noticing it like crazy...just like how it is watching that NFS part in the video.

Same crap in UT3, which was the real dealbreaker for me, since that's what i play mostly.

Well, i know i'm not crazy anymore.

Now that i've seen that video, it all makes more sense.

I'm going to ask some buddies if they can see the stutter/chugging in that video...will be interesting to see if they notice it like i do.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
None of you seem to really get it

The educated SLi/Multi-GPU buyer doesn't purchase SLI just for the theoretical AFR performance gains - they buy for the opportunity to significantly increase IQ per frame! Using nvidia SLI without also making use of SLI-AA modes is just stupid - same thing imo if you don't use the wide tent features and edge detect of Crossfire AA - it goes up to 48X!!!

Of course, NVIDIA hasn't really helped by taking their time to add the modes on G8x and then virtually requiring you to have nHancer to make use of any of them. To play Oblivion fully-modded up @ 2560x1600 - way above what i can get - with 16x SLI-AA plus all the other standard IQ fare is amazing; same thing with CrossFire AA

i have no problem with the micro stutter - i'd be a fool to deny it - but it does not poke or burn my eyes .. a little compromise for the incredible visual goodies?
NP




I imagine not everyone can see it in the video, but to me it's so clear...it looks like little stuttering/chugging constantly with CF on vs. off in that video.
i think the X2s might suffer more than CrossFire, at least from what i read on the forums; perhaps their defective rate is high. if i switch my Pro with my XT in the PCIe slots i can "make it happen" - a lot - like in the video, but worse - where you'd RMA them!

i'd much rather have a little instability if i can get the amazing IQ AA filtering at 16x12/16x10 with my settings maxed as far as they can go - including Oblivion and HellGate: London! They are just awesome and i can *overlook* any instability that may appear for an instant.

i just got Crysis - i was 'forced to' as the Demo is not relevant anymore - but i have 15 hours left to go to D/L the rest of the patch before i can play with it :p
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: BFG10K
The problem is the term "performance" is a multi-faceted description. AFR gives a high framerate at the expense of smoothness and input response. With micro-stuttering a given framerate doesn't feel as smooth as it does without, so a single card @ 60 FPS will generally feel smoother than a multi-card @ 60 FPS.
I think this sums it up nicely. Although if i could add to it, I thought I felt it in a worse manner when FPS is lower. I was able to get used to it when FPS is 60, but when the FPS dips to 30 it just didn't feel like 30FPS on a single card. (My setup is 8800 GT SLI, BTW)
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: apoppin
The educated SLi/Multi-GPU buyer doesn't purchase SLI just for the theoretical AFR performance gains - they buy for the opportunity to significantly increase IQ per frame!
I agree wholeheartedly. Although G92 SLI has been quite bitter to me thanks to its less than stellar AA performance.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: lopri
Originally posted by: BFG10K
The problem is the term "performance" is a multi-faceted description. AFR gives a high framerate at the expense of smoothness and input response. With micro-stuttering a given framerate doesn't feel as smooth as it does without, so a single card @ 60 FPS will generally feel smoother than a multi-card @ 60 FPS.
I think this sums it up nicely. Although if i could add to it, I thought I felt it in a worse manner when FPS is lower. I was able to get used to it when FPS is 60, but when the FPS dips to 30 it just didn't feel like 30FPS on a single card. (My setup is 8800 GT SLI, BTW)

So in other words, you felt the extra performance - including the enhanced SLi-AA modes - was not worth the "irritation" you experienced?

For me it is the opposite - i'd much rather enjoy the improved Visuals with the extra filtering that multi-GPU CrossFire affords - since i can "see" micro-stuttering with a *single* GPU - anyway - and i still manage to play for hundreds of hours without going completely insane
--hey, maybe that IS what happened to me
:Q


naw


 

mvvo1

Junior Member
Apr 19, 2008
1
0
0
Originally posted by: Datenschleuder
Show us your frame log

:)
Apparently you don't read my posts at all.
I did this in the first post already my friend.

I won't waste my time anymore here on replies that only contain wild claims.

THANKS for keeping this thread intelligent and reasonable!

I'll bet you are happy with your $1000 Quad Core CPUs! :) How come no one takes Intel to task for telling me that Quad Core is better than dual core? What has ever gotten faster in gaming with the 4 cores?

Ugh - this post is some academic. Do you bleed blue?