CrossFire and SLI frame rates do not reflect reality because of a lack of synchronization!

Page 4 - AnandTech Forums

ZtyX

Junior Member
Apr 19, 2008
24
0
0
Datenschleuder.. I am interested in this discussion. Can you answer the people who have posted here, please? I'd like to know what you have to say about the things people have posted to argue with you.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: mvvo1
Originally posted by: Datenschleuder
Show us your frame log

:)
Apparently you don't read my posts at all.
I did this in the first post already my friend.

I won't waste my time anymore here on replies that only contain wild claims.

THANKS for keeping this thread intelligent and reasonable!

I'll bet you are happy with your $1000 Quad Core CPUs! :) How come no one takes Intel to task for telling me that Quad Core is better than dual core? What has ever gotten faster in gaming with the 4 cores?

Ugh - this post reads like it came from some academic. Do you bleed blue?

it is pretty clear - at least to me - that this is "about Intel" and it is designed to cast "doubt" on NVIDIA
- if it walks like a duck .. whatever that means .. but it smacks of "PR" .. and i smell intel.



Originally posted by: ZtyX
Datenschleuder.. I am interested in this discussion. Can you answer the people who have posted here, please? I'd like to know what you have to say about the things people have posted to argue with you.
He has not answered a single one of us and refuses to post his own rig specs. i am certain there is "purpose" in this OP .. and it appears it is not to clarify anything but to obfuscate

what about Quad core being a Bust for gaming?
- a PR joke .. think their Octo-beast will be any better? Very doubtful imo. Intel is clueless about Graphics[period] and they are bitter that NVIDIA is not helping them as ATI helped AMD.

so i expect a dirty PR war .. i have no idea if this is the first shot from intel or not .. but it sure fits in with their history of dealing with AMD

 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
Originally posted by: lopri
Oh and vSync helps if I'm not mistaken.
No, it doesn't.
VSync is about aligning the front buffer swap with the vertical blanking interval of the RAMDAC.

The vertical blanking interval is homogeneous, but the problem is about an inhomogeneous content update of the back buffer, so it doesn't have anything to do with the problem.


And it is not just a few stutter-sensitive people who are affected.
This problem is as important as raw frame rate! So claiming that this isn't significant is like claiming that higher frame rates are not important.
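The vSync point above can be sketched in a few lines: even if every buffer swap is snapped to a vblank boundary, uneven back-buffer completion times still produce uneven displayed gaps. A toy Python model with invented completion times (the interval and timestamps are illustrative, not measured):

```python
import math

# 60 Hz display: a swap can only become visible at a vblank boundary.
VBLANK_MS = 1000.0 / 60.0

# Invented, AFR-style uneven back-buffer completion times (ms).
completions = [10.0, 66.0, 76.0, 132.0, 142.0]

# vSync delays each swap to the next vblank after the frame is ready...
swaps = [math.ceil(t / VBLANK_MS) * VBLANK_MS for t in completions]

# ...but the displayed gaps still alternate long/short instead of evening out.
gaps = [round(b - a, 1) for a, b in zip(swaps, swaps[1:])]
print(gaps)  # [50.0, 16.7, 50.0, 16.7]
```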
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: n7
Okay, i finally actually noticed the video link, & watched the video

That's exactly what i was experiencing w/ my 3870X2!

I imagine not everyone can see it in the video, but to me it's so clear...it looks like little stuttering/chugging constantly with CF on vs. off in that video.

I was driving the hummer in Crysis & was noticing it like crazy...just like how it is watching that NFS part in the video.

Same crap in UT3, which was the real dealbreaker for me, since that's what i play mostly.

Well, i know i'm not crazy anymore.

Now that i've seen that video, it all makes more sense.

I'm going to ask some buddies if they can see the stutter/chugging in that video...will be interesting to see if they notice it like i do.

So, now that you know you were ripped off, are you selling the extra cards and going to a single GPU or SFR double GPU only?
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Not sure it's quite as important to people as framerate, simply because not everyone can even see the stutter.

And i wouldn't entirely agree with wording it as a rip-off, even though SLI/CF are generally indeed ripoffs :p
I'd say a better way to say this is that due to numerous drawbacks, SLI/CF isn't for everyone.

It's like my DLP projector.
Sure, some people can see the rainbow effect, but i can't.
So to some, it's crap, or certainly not worth what i paid.
To me, it's definitely worth it.

Now yes, i returned my HD 3870 X2 due to being unhappy with how it performed, & kept my 8800 GTX.

I also don't really recommend SLI/CF to people for various reasons, but for some people, it's a good fit, & if they're happy with it, that's all that matters.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: n7
Not sure it's quite as important to people as framerate, simply because not everyone can even see the stutter.

And i wouldn't entirely agree with wording it as a rip-off, even though SLI/CF are generally indeed ripoffs :p
I'd say a better way to say this is that due to numerous drawbacks, SLI/CF isn't for everyone.

It's like my DLP projector.
Sure, some people can see the rainbow effect, but i can't.
So to some, it's crap, or certainly not worth what i paid.
To me, it's definitely worth it.

Now yes, i returned my HD 3870 X2 due to being unhappy with how it performed, & kept my 8800 GTX.

I also don't really recommend SLI/CF to people for various reasons, but for some people, it's a good fit, & if they're happy with it, that's all that matters.

I still think you would have been happier with a 9800GX2 n7. (except for the multi-monitor support I know you demand)

The higher minimums of each 9800 core would have held that chugging at bay!
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Datenschleuder
Originally posted by: lopri
Oh and vSync helps if I'm not mistaken.
No, it doesn't.
VSync is about aligning the front buffer swap with the vertical blanking interval of the RAMDAC.

The vertical blanking interval is homogeneous, but the problem is about an inhomogeneous content update of the back buffer, so it doesn't have anything to do with the problem.


And it is not just a few stutter-sensitive people who are affected.
This problem is as important as raw frame rate! So claiming that this isn't significant is like claiming that higher frame rates are not important.

And on the flip side of the coin, claiming this as "significant" is just stepping outside the goal line. If you know what I mean.
You joined at least two forums on the same day "April 17th" to start spreading this gospel. Here, and B3D and maybe other forums as well. To what end? This really hasn't been a major, or even a minor, issue for most people using multicard. Am I wrong? There has been talk of this here and there, but never on the earth shattering scale you'd like to see it.

I truly hope you get your hits out of this. You're working hard for them.
 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
Why do you have a problem with the ambition to make this issue known in the scene?

You haven't understood what this is about if you believe that this problem isn't important.

The performance in a real time application (like a game) is determined by the ability to deliver processed data in a predefined time frame.
For a graphics card, this means to update frames at the highest, homogeneous rate possible.
It is worthless to update a frame buffer a billion times within a microsecond and then deliver no updates for a long time.
The important thing is to deliver these frames evenly distributed over time, so that a smooth scene update occurs.

It might be that you have never noticed the problem, but this is exactly like not seeing a difference between two different frame rates.
So this issue has to be quantified by measurements, not by visual observations.

I have posted a measured frame time log (8800GTX SLI, Crysis) that shows how inhomogeneous frame buffer updates reduce the frame rate from an expected smooth ~30 FPS to an effective ~17.8 FPS, because every second frame arrives late.
And the difference between 30 FPS and 17.8 FPS is very significant of course.

This problem with AFR (which is the only practical dual GPU mode) exists since the beginning of SLI and CrossFire.
I (and others) confronted Nvidia and ATI with this years ago. They confirmed it, but never fixed it.

I hope that everyone agrees that it is in the best interest of the consumers to make this well known via forums and review sites, so that the hardware vendors are forced to finally fix the problem.
Saying that it is wrong to spread this knowledge is simply ridiculous.
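The ~30 FPS vs ~17.8 FPS arithmetic can be reproduced from any frame time log. A minimal sketch, using an invented alternating log rather than the actual values from the first post:

```python
# Invented AFR-style frame times (ms between buffer swaps): the pairs
# average out to ~30 FPS, but every second gap is long.
frame_times_ms = [10.5, 56.2] * 25

# Headline number: total frames over total time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Rate implied by the long gaps, which is what the eye actually tracks.
worst_gap_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps, 1))        # 30.0
print(round(worst_gap_fps, 1))  # 17.8
```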
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
The scene? You are trying to "make" a scene is what I'm trying to say. homogeneous or not, how does this affect the end user, as I have asked you since the beginning of this thread?
If you have frames rendered as:

16 ms, 32 ms, 34 ms, 48 ms, 49 ms, 66 ms, 67 ms, etc. Isn't the number of frames rendered in 10 seconds' time still quite a bit larger than that of a single card?
What about games that multi GPU solutions can render at Hundreds of frames per second? Or are you just voicing the worst case scenario?

Multicard: || || || || || || || || || ||
Single: | | | | | | | | | |

Each vertical line above represents a rendered frame at a given time in ms. You are talking about the longer duration (microstutter) between the larger gaps. Above is probably the absolute worst case.

Instead of:

Multicard: | | | | | | | | | | | etc.
Single: | .. | .. | .. | .. | .. | etc.

Correct? Do I understand?

Now let me ask. How is this worse than a single card?

And finally, what, or should I say where, exactly do you measure the rendered frames? The front buffer? or the ROPs?
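The two ASCII timelines above can be put into numbers. A sketch with illustrative timestamps: both setups deliver ten frames over roughly the same span, but the bunched multicard pairs leave a long pause between them:

```python
# Timestamps in ms, illustrative only. Both lists hold 10 frames; the
# multicard ones arrive in bunched pairs ("||" in the diagram above).
single = [i * 33.3 for i in range(10)]
multicard = []
for i in range(5):
    multicard += [i * 66.6, i * 66.6 + 10.0]  # 10 ms gap inside a pair

def gaps(timestamps):
    # Per-frame intervals: differences between consecutive timestamps.
    return [round(b - a, 1) for a, b in zip(timestamps, timestamps[1:])]

print(max(gaps(single)))     # 33.3 -> even spacing
print(max(gaps(multicard)))  # 56.6 -> the pause between pairs (microstutter)
```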
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Datenschleuder
Why do you have a problem with the ambition to make this issue known in the scene?

You haven't understood what this is about if you believe that this problem isn't important.

The performance in a real time application (like a game) is determined by the ability to deliver processed data in a predefined time frame.
For a graphics card, this means to update frames at the highest, homogeneous rate possible.
It is worthless to update a frame buffer a billion times within a microsecond and then deliver no updates for a long time.
The important thing is to deliver these frames evenly distributed over time, so that a smooth scene update occurs.

It might be that you have never noticed the problem, but this is exactly like not seeing a difference between two different frame rates.
So this issue has to be quantified by measurements, not by visual observations.

I have posted a measured frame time log (8800GTX SLI, Crysis) that shows how inhomogeneous frame buffer updates reduce the frame rate from an expected smooth ~30 FPS to an effective ~17.8 FPS, because every second frame arrives late.
And the difference between 30 FPS and 17.8 FPS is very significant of course.

This problem with AFR (which is the only practical dual GPU mode) exists since the beginning of SLI and CrossFire.
I (and others) confronted Nvidia and ATI with this years ago. They confirmed it, but never fixed it.

I hope that everyone agrees that it is in the best interest of the consumers to make this well known via forums and review sites, so that the hardware vendors are forced to finally fix the problem.
Saying that it is wrong to spread this knowledge is simply ridiculous.

What do you program? .. i programmed games equivalent to Pong back in the 80s :p
- if it is for "intel" .. i completely "understand" .. they are clueless about Graphics .. look at their IG sh!t; it barely runs Aero today! ... what happened to Quad Core support for new games? .. slow coming, huh? and unless my memory has gone to crap wasn't Intel supposed to have dx10 before the g80? - like 2-1/2 years ago? Get Real. Intel will take 10 years to catch up to NVIDIA and AMD and by then, they will be 20 years behind.
:roll:


How about if i go onto some Intel Forums and "spread the truth" as you are apparently doing here? You seem to know a lot about intel; just not a clue about the way SLI or even CrossFire actually *works*
--What is your system? - for the millionth time = what drivers, what OS, what setting, wtF?

How the hell can ANY of us take you seriously when you don't *appear* to even know what you are talking about. We expect a hell of a lot more than a messed-up video and a chugging 2nd GPU's log to make ANY serious answer to your "facts"

We have much higher standards here at AnandTech Video than at most video tech forums [for our intelligent discussions] and we expect you to meet them too - if you are to be taken seriously
i'm certain there is good info also at Intel's own site .. you seem to also be unaware of how graphics 'works' .. especially for a programmer; please explain and give us your rig specs.
 

Datenschleuder

Junior Member
Apr 17, 2008
23
0
0
keysplayr:

Yes, you are correct on the inhomogeneous frame rates.

I have not said that the AFR performance is worse than with single card mode because of this problem, but it eliminates most of its benefits.

The frame times are measured with the frame buffer swaps.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Datenschleuder
keysplayr:

Yes, you are correct on the inhomogeneous frame rates.

I have not said that the AFR performance is worse than with single card mode because of this problem, but it eliminates most of its benefits.

The frame times are measured with the frame buffer swaps.

So what is the problem?
:confused:
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: apoppin
Originally posted by: Datenschleuder
keysplayr:

Yes, you are correct on the inhomogeneous frame rates.

I have not said that the AFR performance is worse than with single card mode because of this problem, but it eliminates most of its benefits.

The frame times are measured with the frame buffer swaps.

So what is the problem?
:confused:

Highlighted it for you ;)

I believe his contention is that SLI is not worthwhile if it performs like a single card every other frame, which appears as microstuttering. I've had a couple of SLI systems and while they provided a framerate advantage over single cards, the microstuttering was apparent at times and was rather annoying. Like others have mentioned, it doesn't affect all people, but it seems the issue is worth looking into. That's all I get out of Datenschleuder's posts, so I'm not sure why you guys are railing on him instead of PMing Derrick (sp?) and asking him to investigate. If the matter is found to be insignificant then great, but if not then it might provide some sort of catalyst for AMD and Nvidia to do something about it.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
I PMed Derek on this one :)

We'll see if he responds i guess...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The educated SLi/Multi-GPU buyer doesn't purchase SLI just for the theoretical AFR performance gains - they buy for the opportunity to significantly increase IQ per frame!
You have a point about multi-GPU AA modes but realistically most multi-GPU users don't even know they exist, much less use them. Couple this with the fact that this generation, support and innovation for such AA modes is piss-poor from both vendors.

Neither vendor does it right. If they were smart they'd combine 2xAA from each board which would allow 4 GPUs to provide 8xMSAA and 4xSSAA at the same performance as 2xAA on a single board. Instead they deliver gimped solutions that aren't usable except in old games.

Using nvidia SLI without also making use of SLI-AA modes is just stupid - same thing imo if you don't use the wide tent features and edge detect of Crossfire AA - it goes up to 48X!!!
Do you mean 44xAA? In any case that's a slideshow except in very old games. My idea would be each board doing 2xAA which is doable even for a single 3870 in new games. That would be a tangible benefit which wouldn't rely on brittle application profiles or multi-GPU scaling.

i have no problem with the micro stutter - i'd be a fool to deny - but it does not poke or burn my eyes .. a little compromise for the incredible visual goodies?
With multi-GPU AA modes AFR issues cease since the boards behave as one. If you've been using such AA modes that explains why you can't see micro-stuttering.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i did say *educated* .. NVIDIA doesn't "push" nHancer very well .. that is an awesome tool!

Actually, NVIDIA *should* buy AMD - keep working on Fusion with the ATi boys [don't fire them and let them go to intel] and keep NVIDIA Graphics/AMD CPU .. and the ATi name can go bye-bye .. i i think they would finally get a well rounded graphics solution - but then Intel would have to figure how to do Graphics right .. tough for a CPU company .. but we would have *two giants*

With multi-GPU AA modes AFR issues cease since the boards behave as one. If you've been using such AA modes that explains why you can't see micro-stuttering.
:light:
Eureka! .. mahalo!
-- the ANSWER!




Yeah .. 44x .. for ancient games .. SOMEDAY Crysis will run on a single GPU "Very High" at 25x16 and with 44XAA ..
.. how soon?

8 years? i'll "guess"

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Now let me ask. How is this worse than a single card?
As has already been explained, multi-GPU rendering is subjected to more fluctuations between frames than single GPU rendering, hence the potential for micro-stuttering.

A single GPU running @ 60 FPS will generally be smoother than a multi-GPU running @ 60 FPS because the single GPU's time between frames doesn't fluctuate as much as it does between two boards working on alternate frames.

Again it's all right there in the article including videos and the framerate graph. The graph clearly demonstrates the ripples with the multi-GPU setup while the single GPU delivers a flatter and more consistent line.

This isn't some imaginary conspiracy theory that just popped up today; this phenomenon has been known for years but the general public is oblivious and/or unbothered by it, much like they were with shimmering issues.

Again I can understand if someone isn't bothered by it but let's not pretend it doesn't exist. I've seen it and it's one of the main reasons that I stay away from SLI/Crossfire. I'm not paying more to get a framerate counter to show higher figures only to find it doesn't paint a true picture of my gameplay experience.
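The "60 FPS vs 60 FPS" comparison can be made concrete with invented logs: both average the same frame time, but the multi-GPU one fluctuates. A sketch (the numbers are illustrative, not from the article):

```python
import statistics

# One-second logs of frame times in ms; both average ~16.7 ms (~60 FPS).
single = [16.7] * 60       # steady single-GPU pacing
multi = [8.0, 25.4] * 30   # AFR-style alternating short/long gaps

for name, log in (("single", single), ("multi", multi)):
    # Same mean, very different spread: the spread is the stutter.
    print(name, round(statistics.mean(log), 1), round(statistics.pstdev(log), 1))
# single 16.7 0.0
# multi 16.7 8.7
```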
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
i did say *educated* .. NVIDIA doesn't "push" nHancer very well .. that is an awesome tool!
Not only do they not push it, nVidia implicitly tried to shut it down by introducing a profile checksum into Vista's drivers.

As a result changing preset settings through the likes of nHancer would cause the entire profile to stop working.

Fortunately Grestorn managed to work around it. :thumbsup:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
no way .. nHancer should be to their advantage .. it is cool!

i am getting an education, thanks . . . NO WONDER i am NEVER [almost never] bothered by Micro Stutter

i play at 16x10 and can use Crossfire AA modes - all of the time! .. and i still have 3-1/2 hours to go to finish that 1.02 patch for Crysis that i started yesterday early AM :|
- so i will look for it there .. come to think of it, the Demo was annoying as hell .. i just thought it was the 7 FPS on "very high"
:eek:

Yech!

all the more reason to get a Single GT200
--hurry up with r700 AMD! .. i want GT 200 :p



OK, OP .. you win ... CrossFire and SLI frame rates DO reflect reality BUT micro stutter drives some of us NUTS - especially on the big displays where there is no option to use AA modes practically

Solution: Use SLi or CrossFire AA modes or get a single GPU
--*compromise* ??!?

Now i am heading to your forum to talk about intel's shortcomings .. i think i need to post about 10 *controversial* topics .. will i be welcome? i think i might be able to brush up on my German at the same time.
:D
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: n7
Okay, i finally actually noticed the video link, & watched the video

That's exactly what i was experiencing w/ my 3870X2!

I imagine not everyone can see it in the video, but to me it's so clear...it looks like little stuttering/chugging constantly with CF on vs. off in that video.

It's obvious to me in that video too. The car is not moving at a uniform rate. I don't know how anyone can miss it.

Although to be fair, there are a variety of other motion-related problems out there that I find blatant but other people don't seem to notice at all. (like motion blur on any LCD, or the frameskipping effect in the Doom 3 engine)

If people aren't bothered by it, then it's great that they are enjoying their cards, but it is certainly something that prospective buyers should be aware of.

I PMed Derek on this one :)

We'll see if he responds i guess...

I haven't received a response to mine yet. He is probably just busy.

Originally posted by: apoppin
So in others words, you felt the extra performance - including the enhanced SLi-AA modes - was not worth the "irritation" you experienced?

I did. I want to have completely smooth motion before I start cranking up graphical settings. SLI could not give me the same level of fluidity that I was getting on a single card in most games, regardless of what graphical settings I used.

As for SLI AA, that is actually a separate rendering mode that I seem to recall was SFR based. I think it had the fluctuating load balancing line in the middle like SFR. I don't remember for sure though.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
As for SLI AA, that is actually a separate rendering mode that I seem to recall was SFR based. I think it had the fluctuating load balancing line in the middle like SFR. I don't remember for sure though.

according to BFG, that makes all the difference and evidently explains it .. at 16x10 i DO use Crossfire AA .. now *all* of the time as i am looking at IQ rather closely
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
As for SLI AA, that is actually a separate rendering mode that I seem to recall was SFR based.
It's SFR in terms of not having AFR characteristics but there's no load balancing like regular SFR.

With SLI/Crossfire AA both boards render the same image but each has a different AA mode.

nVidia offsets the images slightly while ATi adjusts the positions of the sample patterns on each board. Afterwards the images are combined with the final AA level effectively equaling the combined level of both boards.

I think it had the fluctuating load balancing line in the middle like SFR. I don't remember for sure though
Like I said above I wouldn't expect SLI AA to have a load balancing line since each board renders (the same) full frame, just with different AA.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
It's SFR in terms of not having AFR characteristics but there's no load balancing like regular SFR.

With SLI/Crossfire AA both boards render the same image but each has a different AA mode.

nVidia offsets the images slightly while ATi adjusts the positions of the sample patterns on each board. Afterwards the images are combined with the final AA level effectively equaling the combined level of both boards.

That makes sense and is how I would expect it to work. I do remember seeing the line in the middle in it, but that is probably just my fuzzy memory as I didn't use the SLI AA modes that often. (on the 7 series cards back then, it was generally too slow for modern games while older games required single card mode for proper vsync)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CP5670
It's SFR in terms of not having AFR characteristics but there's no load balancing like regular SFR.

With SLI/Crossfire AA both boards render the same image but each has a different AA mode.

nVidia offsets the images slightly while ATi adjusts the positions of the sample patterns on each board. Afterwards the images are combined with the final AA level effectively equaling the combined level of both boards.

That makes sense and is how I would expect it to work. I do remember seeing the line in the middle in it, but that is probably just my fuzzy memory as I didn't use the SLI AA modes that often. (on the 7 series cards back then, it was generally too slow for modern games while older games required single card mode for proper vsync)

*applause*

take a bow, BFG10K
:thumbsup:

forget the article ... Derek can go to a party or back to bed

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
PHEW!!! Ok. Take a look guys/gals.

Crysis SLI and non-SLI frametimes using an 8800GTS 640 (x2 for SLI)

You'll see the molehill that somebody wants to make into a mountain. In my honest opinion that is.

If anybody cannot bring my link up in their browser, let me know.

I only did the math for 50 frames, because I didn't want to sit here with a calculator all night.
But I assure you, that the rest of the frametimes were similar.
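For anyone who wants to skip the calculator: a frametimes log of this kind is a column of cumulative millisecond timestamps, and the per-frame times are just consecutive differences. A sketch with invented numbers standing in for the linked log:

```python
# Invented cumulative timestamps (ms), as a FRAPS-style frametimes log
# records them; the real log's column would be read in instead.
timestamps_ms = [0.0, 15.2, 45.8, 60.1, 91.0, 105.5]

# Per-frame times: difference between consecutive timestamps.
frame_times = [round(b - a, 1) for a, b in zip(timestamps_ms, timestamps_ms[1:])]
avg_ms = sum(frame_times) / len(frame_times)

print(frame_times)       # [15.2, 30.6, 14.3, 30.9, 14.5]
print(round(avg_ms, 1))  # 21.1
```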