CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!


BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
and is there any way to use Crossfire AA - not AFR - without the performance hit of actually 'enabling' AA?
Not currently, but in theory there is.

Without AA every board has to capture one super-sample and one multi-sample (located at the pixel's centre) anyway, so all you'd need to do is offset the samples on each board and then each GPU in the system would add one level of AA for free (minus any overhead to combine the images).

So in theory four boards could provide 4xAA at the same performance hit as one board @ 0xAA.

In fact I proposed something like this a few pages back.
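
To illustrate the idea, here is a rough one-dimensional Python sketch (not anything ATI has published - the sub-pixel offsets and the compositing step are assumptions): each board renders the same pixel with its one sample at a different offset, and averaging the results gives multi-level edge coverage at single-sample cost per board.

```python
def coverage(sample_x, edge_x=0.4):
    """Return 1.0 if this sample lands on the polygon (left of the edge), else 0.0."""
    return 1.0 if sample_x < edge_x else 0.0

# Single board: one sample at the pixel centre (x = 0.5) gives a hard 0/1 result.
single = coverage(0.5)

# Four boards: each offsets its single sample differently inside the pixel
# (offset positions are assumed for illustration).
offsets = [0.125, 0.375, 0.625, 0.875]
per_board = [coverage(x) for x in offsets]   # each board still does one sample of work
combined = sum(per_board) / len(per_board)   # compositing step blends the four images

print(single)    # 0.0 -> aliased edge on one board
print(combined)  # 0.5 -> four-level gradation, like 4xAA on this pixel
```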
 

Blazer7

Golden Member
Jun 26, 2007
1,099
5
81
Does anyone have any idea whether nVidia or ATI are working on this? nVidia has been releasing all sorts of drivers for their latest 9xxx series but there is no mention of any attempt to fix this. ATI has been releasing monthly drivers for quite some time now but no mention of a fix from them either.

In fact, there is no indication from either nVidia or ATI that they even consider this a significant enough problem that they will have to do something about it, or at least they are not saying anything in public. All the talk is fine, as many of us get educated on this, but what about a fix?

If we have thought of adding a delay mechanism within the driver as a possible way to improve on this, then the guys at ATI and nVidia have probably thought of it years ago. XFire and SLI have been out there for quite some time, and the people at ATI and nV have resources and knowledge on this kind of thing that are way out of our reach.

So either this is a very small but complicated issue for them that's not worth their time, or it is so huge, complex and time-consuming that correcting it would take too much time, money and effort - effort that can clearly be put to better use getting at each other's throats. A "let's use everything we've got to win the war and screw the customers" kind of thing.

Either way, keeping these kinds of details away from the public eye is not very nice. The only thing that's worse is doing nothing about it after the issue has been exposed.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
Crossfire-AA without AA is just like disabling CF completely.
No, if you run an AA level other than a Crossfire one (including disabling AA) the system reverts back to AFR (or whatever scaling is currently in place for that app).
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
Crossfire-AA without AA is just like disabling CF completely.
No, if you run an AA level other than a Crossfire one (including disabling AA) the system reverts back to AFR (or whatever scaling is currently in place for that app).

I'm not talking about what the CCC does, I'm talking about what it would mean technically to turn off the AA in Crossfire-AA ... :)

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
If the scene is highly dynamic, the micro stuttering isn't much of an issue anyway, because in such scenes the framerates are fluctuating as are the delays caused by the CPU and GPU, and this will conceal the problem completely.
I've thought about this a bit more and I don't necessarily agree. There's no reason why the effects of micro-stutter cannot stack with the effects of regular fluctuations, thereby causing even greater fluctuations. Indeed, it's often the case that a given framerate with AFR doesn't feel as smooth as it does on the single card.

Only in slowly moving scenes will the user notice the stuttering, and those are exactly the scenes which can be fixed easily by adding a small delay every now and then.
But unless the scene has constant CPU and GPU render times you'll potentially always be required to change the delay. Even taking two steps forward in the game - causing a 2 ms change in render times - could necessitate a change in the delay.

Anyway, the delay doesn't need to be added constantly, just every now and then.
I've thought about this as well. If you're not adding the delay constantly, how do you change it when required?

For example, if I've issued a delay of 6 ms at frame 20 and it's somehow permanently offset all subsequent frames by 6 ms, how do I reduce that delay to 4 ms at frame 40?

Surely the value must be stored and/or forced somewhere, thereby delaying every frame 6 ms until the value is changed to 4 ms? If not, how do you reverse the effects of the first 6 ms delay?

I'm not talking about what the CCC does, I'm talking about what it would mean technically to turn off the AA in Crossfire-AA ...
Technically the definition of Crossfire AA is 8x or higher in the old system and 16x/44x (or whatever they call the settings these days) in the new system.

Crossfire AA without the AA is an oxymoron. :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Crossfire AA without the AA is an oxymoron.

sorry .. i was about 3 hours late to bed when i suggested that
i ran out of OXYgen and i became a babbling MORON, this AM :p
:eek:

what i guess i meant is: is there a way to mimic it, FORCING CrossFire's sequential rendering - instead of actually turning on AA - instead of using AFR .. and the answer is evidently "yes" .. if they wanted to - it still needs to be recognized as an "issue", and "whatever the hell they use" should be better than the MS we suffer with now

BFG says it IS possible .. and this doesn't look too difficult to implement
Without AA every board has to capture one super-sample and one multi-sample (located at the pixel's centre) anyway, so all you'd need to do is offset the samples on each board and then each GPU in the system would add one level of AA for free (minus any overhead to combine the images).

So in theory four boards could provide 4xAA at the same performance hit as one board @ 0xAA.


My Proposal is to let everyone know. The OP made a big noise .. so thanks!

As for me, i am dumping Crossfire/Multi-GPU for a single card solution and will not look back till i need it or till it really improves the basics; vote with your wallet - they will notice; and i *guarantee* their Focus Group members are really aware of it [as they weakly attempt to minimize it, in case you haven't noticed; please FG members, we are not going to tolerate this sloppy sh!t from your sponsors - please let them know we don't like it - at all - you know who you are from AMD, also! :p] Also we are not going to stop talking about it .. and this message is appearing on all the tech forums .. shortly

they need some real *damage control* .. not - "ignore it on my 36" LCD i can't notice it anyway" Bullcrap!
-if they want to do that, we can just hit them harder and harder
- they aren't that stupid; but they are lazy

rose.gif


i have a lot of hope for Multi GPU, personally .. i think Multi-GPU is a decent alternative .. and it is still pretty new .. Xfire began almost a year after nVidia announced SLi, with the ATi bastage x800 CrossFire, and the state of their art isn't half-bad, improving slowly - and sometimes in a leap - over the last few years.
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
Only in slowly moving scenes will the user notice the stuttering, and those are exactly the scenes which can be fixed easily by adding a small delay every now and then.
But unless the scene has constant CPU and GPU render times you'll potentially always be required to change the delay. Even taking two steps forward in the game - causing a 2 ms change in render times - could necessitate a change in the delay.

No, because in this scenario, you have to add the delay only ONCE. After that - if the GPU/CPU rendering times stay constant - there will be no stuttering, because the frames are spread out evenly.

Originally posted by: BFG10K
Anyway, the delay doesn't need to be added constantly, just every now and then.
I've thought about this as well. If you're not adding the delay constantly, how do you change it when required?

For example, if I've issued a delay of 6 ms at frame 20 and it's somehow permanently offset all subsequent frames by 6 ms, how do I reduce that delay to 4 ms at frame 40?

There is no "somehow".

You can see it like this (each frame takes the same time to be rendered):
First GPU:
x--------x--------x--------x--------x--------x--------x--------
Second GPU, started to render slightly later, because of the time the CPU took to prepare the second frame:
--x--------x--------x--------x--------x--------x--------x--------

Both GPUs used together you'll get what I already illustrated before:
x-x------x-x------x-x------x-x------x-x------x-x------x-x------


So, now add a delay to the rendering of the second GPU only ONCE, so it starts a little later:
-----x--------x--------x--------x--------x--------x--------x--------

The combination of both GPUs would now look like this:
x----x----x----x----x----x----x----x----x----x----x----x----


In the case that the rendering times of the frames would stay constant, you'd never have to add another delay.

But since that's not realistic, the driver has to keep track of the frame times and if the time difference between the two GPUs starts to differ too much, it just has to calculate and add another delay ONCE again to fix that, until it gets out of alignment again.
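
To make the diagrams concrete, here is a small Python simulation of the same idea (the numbers are illustrative, not measured, and this is just a sketch of the proposal, not how any shipping driver actually works):

```python
frame_time = 9.0   # per-GPU render time, assumed constant as in the example
cpu_offset = 2.0   # the second GPU starts this much later (CPU prepare time)

def frame_starts(first_start, count=8):
    return [first_start + i * frame_time for i in range(count)]

def gaps(times):
    times = sorted(times)
    return [round(b - a, 2) for a, b in zip(times, times[1:])]

gpu1 = frame_starts(0.0)
gpu2 = frame_starts(cpu_offset)
print(gaps(gpu1 + gpu2))            # [2.0, 7.0, 2.0, 7.0, ...] -> the x-x------x-x pattern

# Delay GPU 2 ONCE so its frames land halfway between GPU 1's frames.
delay = frame_time / 2 - cpu_offset           # 2.5 units, applied a single time
gpu2_fixed = frame_starts(cpu_offset + delay)
print(gaps(gpu1 + gpu2_fixed))      # [4.5, 4.5, 4.5, ...] -> evenly spread, x----x----x
```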
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
But since that's not realistic, the driver has to keep track of the frame times and if the time difference between the two GPUs starts to differ too much, it just has to calculate and add another delay ONCE again to fix that, until it gets out of alignment again.

the way you describe it, it IS such an easy fix

next month if AMD or NVIDIA wanted to

rose.gif


They are really thick as a brick .. it is obvious to me that their FG members are just defending their sponsor's sloppiness .. is that what they are for; just to promote their SLI/Xfire propaganda or are they going to attempt to get some real help - and answers - for us?
- let them know that we don't like it a bit and we want it addressed. I think it will be an article on some main site soon!

if it is propaganda - i propose we not allow them here at all
- i am sick to death of denial . . denial .. denial
-jump into da'Nile if you are going to do that nonsense ... again!; we are wiser now
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: apoppin
But since that's not realistic, the driver has to keep track of the frame times and if the time difference between the two GPUs starts to differ too much, it just has to calculate and add another delay ONCE again to fix that, until it gets out of alignment again.

the way you describe it, it IS such an easy fix

next month if AMD or NVIDIA wanted to

rose.gif


They are really thick as a brick .. it is obvious to me that their FG members are just defending their sponsor's sloppiness .. is that what they are for; just to promote their SLI/Xfire propaganda or are they going to attempt to get some real help - and answers - for us?
- let them know that we don't like it a bit and we want it addressed. I think it will be an article on some main site soon!

if it is propaganda - i propose we not allow them here at all
- i am sick to death of denial . . denial .. denial
-jump into da'Nile if you are going to do that nonsense ... again!; we are wiser now

Where in this thread, or anywhere else, do you see ANYone denying microstutter exists? The OP stated it was an extreme case in every game. Well, I tried two games so far: Crysis and World In Conflict. I posted both results. There are variations in the time the frames arrive. We are discussing the issue with Nvidia.
I know MS exists, but if I see something I think might be FUD like the OP started, I will conduct my own testing (as I have done, and can do a lot more) to see if he/she was right. If I find they aren't right, I'm gonna say something. AS YOU WOULD. Not because I'm a FG member! This is about SLI/Xfire both! And I'm posting the only results I can because I don't have Xfire. You do, however, so why not do some tests of your own?

/edited

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i see attempts at obfuscation

everywhere .. like once-dead, now-returning posters
rose.gif


i am not saying *you* for i did it at first - also
-it is easy to NOT "see it"
. . . and it is way too easy to minimize it by saying "the experience is better with SLI/xFire"

We don't care .. we want the respective manufacturers to take us seriously - and i expect - let me try demand - a progress report from them on what they are going to do about it
-If they don't, i will stop buying and recommending ANY SLi/xFire without making a "big deal" out of it to everyone - from now on

if I see something I think might be FUD like the OP started, I will conduct my own testing
at first i agreed .. but i am witness that it does exist!! .. just strain the graphics sub-system to make it more obvious to anyone

so .. i think it looks good .. a matter of perception for the Focus Groups to ally with us, not disagree or say it isn't an "issue" - it is an issue if you look for it OR are "sensitive"; i know "you" - Keys - you are just as eager for real knowledge as i am - i am sorry to be so general as to include you in my rant, i am sorry!
rose.gif


dare to disagree here with the people giving you info - ultimately it is to us!
- they will love you for independent thinking .. eventually; we have a real issue

We are discussing the issue with Nvidia.

Good progress report! First time i am aware a FG member said anything; i wish the AMD guys would have the courtesy to drop out of their Viral hiding and join us openly with AMD's view.
--i hate talking to "you" that way although i am pretty sure i know 'who'

As to my own crossfire, i am working on another project with another Single GPU .. so it is bad timing for me to stick it back in .. maybe on Monday, i will update to Cat 4 and run an analysis for you with my own FrankenFire and AFR compared with alternate AA rendering; i might even be able to work it into my own research
[:light: - eureka! .. OK .. next Week, expect a 3-way micro stutter repost; a single nV GPU for benchmark; Crossfire AA and HD2900 Crossfire AFR Micro-stutter without AA
- now i am intrigued to do it .. i can only hope i have time as i am under 3 concurrent deadlines!
:clock:

i am benching Crysis right now [but am pretty lost with BioShock; that camera is SO jerky; i need to slow it down] and am just getting the hang of the Crysis frame-by-frame tool; i finally updated it to 1.02 only this week

rose.gif




 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Sorry to post some benches that show what real stuttering is about in the thread about micro stuttering Dug- seemed applicable.
No, they aren't applicable. We know AFR is faster than single card. We know it can provide playable framerates over a single card. That isn't under debate. You don't have to keep providing benchmarks to prove something that isn't under question.

What is under debate is what cost those extra frames come at and whether that cost is worth it over a single card.

So you're really just debating whether it's "worth it" to have higher image quality and framerates all the time, with slightly slower average refresh, as opposed to lower image quality and framerates, but with slightly higher average refresh rate? Not even really a question to me.

Originally posted by: BFG10K
This thread is not representative of multi card gaming performance, a reader might think "Holy Cow! With that 1.4ms average variance in frame to frame refresh all multicard gain is lost! Oh noes- yet another example of corporations stealing our hard earned money!"
If someone was handing me hardware on a silver platter I might well be more lenient towards potential issues.

But if I'm going to fork over hard-earned cash on these expensive setups I deserve to know about potential problems and others deserve the same. The evidence has been presented and they can make their own decisions.

I've experienced micro-stutter and it's one of the reasons I don't want to commit to AFR.

The thing about that is, I have all NVIDIA hardware BFG- I have an 8800Ultra, and I have 9800GX2s. If this were as big an issue as you say, why wouldn't I just use the Ultra to game with? No one has had a gun to my head to use SLi all these years, and oftentimes I've bought second cards with my own money to have SLi like the XFX 9800 GX2 in my box right now, the XFX 7950 GX2 I first went quad with, or the Leadtek 8800GTX I bought to have GTX SLi when that product launched. (this is $1800 for these three cards alone, if this "micro stuttering" was such an issue, I must really be a glutton for punishment)

Originally posted by: BFG10K
My neighbor was over a couple nights ago, I showed him UT3 at 25X16. (he's not a gamer) I played a whole round of vehicle CTF while he watched and at one point during my demo he said "This is just amazing how smooth this runs- I've seen people play computer games before and they were all jerky".
Wonderful, so you provide an example of someone not even playing the game who also appears to be a layman with regards to gaming? He didn't notice micro-stutter so it's not an issue?
That's my point exactly- if you have to look for it, and have to be trained to look for it, the higher image quality and resolutions are obviously worth it.

Originally posted by: BFG10K
You have a 30" Dell, don't you?

http://www.behardware.com/arti...5/dell-3008wfp-hc.html

The newest version of that display has an input lag of about 3 frames. If you aren't bothered by that then it's no wonder you aren't bothered by micro-stutter. That's fine (I'm personally not bothered by tearing without vsync for example) but some people are bothered by these things and deserve to know about them.

I bought a 3007WFP-HC, rather than the recalled 3008. AFAIK it has no "input lag" issues, just much faster response times and higher color gamut. You really should see games on this thing BFG- IMHO- resolution + screen size trumps all, and I'm pretty thankful multicard gives me the power to drive it. :)




BTW- I agree with you single card is always preferable if it can do what you need it to, unfortunately, single card doesn't offer the image quality I require.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
So you're really just debating whether it's "worth it" to have higher image quality and framerates all the time, with slightly slower average refresh, as opposed to lower image quality and framerates, but with slightly higher average refresh rate? Not even really a question to me.

i am not so sure you really get it
--we KNOW, that is OBVIOUS - my old argument!!!

we are talking *specifically* about Micro Stutter
--we don't want to talk with you here about the "advantages of SLi"

Please, start another thread to sing nvidia's praises .. i will join you THERE later to discuss it with you

but THAT is BESIDE the POINT - right HERE

either discuss "micro stutter" or cease and desist with the propaganda
-you are repeating your argument over and over - it has NOTHING to do with this - look at the title please and try to stay on-topic
CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!
^^This is the subject^^ Micro stutter as it relates to "reality"; some of us ARE really bothered by it ... we ALL "get it" you don't have to say it over and over - you and i are not bothered by it - cool!
- i see it but i use Xfire AA so it is gone; you don't see it - awesome, more power to you!! Don't ignore the more "sensitive" amongst us, please .. they have a REAL "issue"


rose.gif


please ... i am having a bad day .. sorry if i am coming off rude but my mother - after making excellent progress - suddenly relapsed, went off in the ambulance a few hours ago and i do not expect her to ever come back to my home and my care

:(
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
No, because in this scenario, you have to add the delay only ONCE. After that - if the GPU/CPU rendering times stay constant - there will be no stuttering, because the frames are spread out evenly.
But they won't stay constant. They'll only stay constant when you're standing still, not touching the mouse and nothing else is going on.

That is my whole point! Your scenario is unrealistic and unsustainable!

Even a slight 2 ms change in frametimes would require a new delay to be added.

In the case that the rendering times of the frames would stay constant, you'd never have to add another delay.
I don't think you understand the question I'm asking. I understand that a single delay offsets all subsequent frames, but that isn't what I'm asking.

Let me rephrase it. First I issue a single 6 ms delay at frame 20 which then delays all subsequent frames by 6 ms. Are you with me so far?

How do I later change that delay to 4 ms? Using your explanation I can't simply issue a 4 ms delay because that will stack with the 6 ms delay I issued earlier which would then delay all frames by 10 ms from their original starting point.

In either case it has no bearing on my original example anyway:

1 10 ms
2 12 ms
3 22 ms
4 24 ms
5 34 ms
6 36 ms

Here we need the gap between frames to be 6 ms. To achieve that a single 4 ms delay added to frame 2 will offset all even frames by 4 ms and make the gap 6 ms.

Do you agree with this?

1 10 ms
2 16 ms
3 22 ms
4 28 ms
5 34 ms
6 40 ms

After doing this, we are still left with 6 frames in 40 ms instead of 6 frames in 36 ms which increases total render times by 11%.

Do you agree with this?
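
For anyone who wants to check the arithmetic, here is a quick Python sketch of the example above, using the same made-up frame times:

```python
# Delaying every even frame by 4 ms evens out the gaps,
# but pushes the last frame from 36 ms to 40 ms.
original = [10, 12, 22, 24, 34, 36]                 # frame completion times in ms

delayed = [t + 4 if i % 2 == 1 else t               # add 4 ms to frames 2, 4, 6
           for i, t in enumerate(original)]
print(delayed)                                      # [10, 16, 22, 28, 34, 40]

gaps = [b - a for a, b in zip(delayed, delayed[1:])]
print(gaps)                                         # [6, 6, 6, 6, 6] -> even pacing

increase = (delayed[-1] - original[-1]) / original[-1]
print(f"{increase:.0%}")                            # 11% -> longer total render time
```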
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
So you're really just debating whether it's "worth it" to have higher image quality and framerates all the time, with slightly slower average refresh, as opposed to lower image quality and framerates, but with slightly higher average refresh rate? Not even really a question to me.
No, the debate is whether the benefits of AFR offset the disadvantages such as micro-stutter. You posting framerate graphs doesn't address the issue of micro-stutter at all.

I bought a 3007WFP-HC, rather than the recalled 3008. AFAIK it has no "input lag" issues, just much faster response times and higher color gamut.
Almost all LCDs have measurable input lag:

http://www.lesnumeriques.com/d...88&mo2=92&p2=946&ph=12

The 3007 is pretty good but with an average lag of 11.5 ms, it's far from perfect.

2560x1600 is great, no doubts there, but I'm waiting for technology to get a bit better before I take the plunge.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
quick question....

no AA with no AI means only one card in use?

no AA or (2,4) with some AI like standard or advanced means... AFR....

AA at 16x, which is the CFX one... no matter the AI, means both cards working at once on each frame????

is all of that true?


note: gtg to bed...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
Apoppin: best wishes to you and your mother. I hope everything turns out okay for you guys.
rose.gif


Pelu:

no AA with no AI means only one card in use?
I'm not sure about that one so you'd need to check.

AFAIK disabling Cat AI disables application detection but CCC could still be forcing AFR despite not detecting applications specifically.

no AA or (2,4) with some AI like standard or advanced means... AFR....
AFR or legacy modes (scissors, super-tiling) if they're still around in the profiles.

AA at 16x which is the CFX one... no matter the AI is both cards working at once for each frame????
Yep - 8xMSAA on each board I'd wager.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
So you're really just debating whether it's "worth it" to have higher image quality and framerates all the time, with slightly slower average refresh, as opposed to lower image quality and framerates, but with slightly higher average refresh rate? Not even really a question to me.
No, the debate is whether the benefits of AFR offset the disadvantages such as micro-stutter. You posting framerate graphs doesn't address the issue of micro-stutter at all.

I posted the benches by way of showing the benefits of AFR do indeed outweigh the disadvantage of delay induced by the load balancing in the driver overhead.

Obviously I can't post benches that show AFR has less average screen refresh time than single card, so really all that's left is subjective experience and showing benches where it's likely AFR is smoother and the image quality higher.

Originally posted by: BFG10K
I bought a 3007WFP-HC, rather than the recalled 3008. AFAIK it has no "input lag" issues, just much faster response times and higher color gamut.
Almost all LCDs have measurable input lag:

http://www.lesnumeriques.com/d...88&mo2=92&p2=946&ph=12

The 3007 is pretty good but with an average lag of 11.5 ms, it's far from perfect.

2560x1600 is great, no doubts there, but I'm waiting for technology to get a bit better before I take the plunge.

The whole CRT vs LCD debate really belongs in another thread as it's off topic here. I'll just note that in the year 2008 I can't even stand to have a tube style device in my home anymore. Those days have been over for a while; they don't even sell decent CRTs in America anymore, or manufacture them AFAIK.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
So you're really just debating whether it's "worth it" to have higher image quality and framerates all the time, with slightly slower average refresh, as opposed to lower image quality and framerates, but with slightly higher average refresh rate? Not even really a question to me.

i am not so sure you really get it
--we KNOW, that is OBVIOUS - my old argument!!!

we are talking *specifically* about Micro Stutter
--we don't want to talk with you here about the "advantages of SLi"

Please, start another thread to sing nvidia's praises .. i will join you THERE later to discuss it with you

but THAT is BESIDE the POINT - right HERE

either discuss "micro stutter" or cease and desist with the propaganda
-you are repeating your argument over and over - it has NOTHING to do with this - look at the title please and try to stay on-topic
CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!
^^This is the subject^^ Micro stutter as it relates to "reality"; some of us ARE really bothered by it ... we ALL "get it" you don't have to say it over and over - you and i are not bothered by it - cool!
- i see it but i use Xfire AA so it is gone; you don't see it - awesome, more power to you!! Don't ignore the more "sensitive" amongst us, please .. they have a REAL "issue"


rose.gif


please ... i am having a bad day .. sorry if i am coming off rude but my mother - after making excellent progress - suddenly relapsed, went off in the ambulance a few hours ago and i do not expect her to ever come back to my home and my care

:(

There's no need to be argumentative Apoppin.

A. I haven't "sung any NVIDIA praises" here, I believe in every post I referenced SLi and Crossfire, not SLi alone.

B. See my reply to BFG. What you are demanding is a debate of the difference between single card and AFR refresh times alone, and it seems there's nothing to really talk about there, as all brands of AFR available seem to have some delay for the load balancing. This thread would only need one post if that's all we're "allowed" to discuss in it; not much point for others to say "yes indeed, the average refresh time seems a tad longer", is there?

I've made NVIDIA aware of this issue, as is my job as a focus group member.

I guess I don't see what your fear is of someone saying "Wait a minute- I use AFR every day and there may be some screen refresh variance, but the gameplay is smooth for the most part and I'm getting settings I wouldn't be able to play at."

Why is the only permissible response "There's a 1.5ms average refresh variance, it MUST be a bad gaming experience because it's different!"?

I think I can link to a multitude of reviews of AFR gaming on the web with both ATi and NVIDIA hardware where they say it kicks ass and they never even noticed this "microstutter" at all, let alone called it a deal breaker.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
I posted the benches by way of showing the benefits of AFR do indeed outweigh the disadvantage of delay induced by the load balancing in the driver overhead.
You posted benches showing a performance gain, but whether they outweigh micro-stutter is up to each person to decide. For me they don't outweigh the total disadvantages of AFR (heat, noise, cost, micro-stutter, input lag and driver glitches).

Obviously I can't post benches that show AFR has less average screen refresh time than single card, so really all that's left is subjective experience and showing benches where it's likely AFR is smoother and the image quality higher.
Again "smoother" is such a relative term. A higher framerate doesn't automatically imply it's smoother. That is the whole point of this thread.

The whole CRT vs LCD debate really belongs in another thread as it's off topic here,
Perhaps, but it demonstrates you're happy with a device that adds measurable input lag which potentially explains why micro-stutter doesn't bother you. For some others that isn't the case.

I'll just note in the year 2008 I can't even stand to have a tube style device in my home anymore.
The same applies to AFR in my home. :p

Those days have been over for a while, they don't even sell decent CRTs in America anymore, or manufacture them AFAIK.
An unfortunate and classic example of the masses being brainwashed by marketing IMO: "the CRT is bigger and hotter so the LCD must be better!"

Much like AFR really: "the framerate is higher, so it must be better!"

Yes, it can be better but there are issues to be aware of. That's what this thread is about.
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
Let me rephrase it. First I issue a single 6 ms delay at frame 20 which then delays all subsequent frames by 6 ms. Are you with me so far?

How do I later change that delay to 4 ms? Using your explanation I can't simply issue a 4 ms delay because that will stack with the 6 ms delay I issued earlier which would then delay all frames by 10 ms from their original starting point.

Well, the answer is so easy I never figured that it would be a problem...

You do that by adding a one-time delay of 2 ms to the OTHER card. Then both cards are aligned again.
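
In code form, the point looks something like this (a toy Python sketch with assumed numbers, not anything from an actual driver): only the relative offset between the cards matters, so the second correction simply goes on the other card.

```python
def starts(first_start, frame_time=20.0, count=6):
    return [first_start + i * frame_time for i in range(count)]

gpu1 = starts(0.0)
gpu2 = starts(6.0)                                  # GPU 2 got a one-time 6 ms delay earlier
print([b - a for a, b in zip(gpu1, gpu2)])          # [6.0, 6.0, ...] current relative offset

# We now want a 4 ms offset: instead of "undoing" 2 ms on GPU 2,
# give GPU 1 a one-time 2 ms delay of its own.
gpu1 = [t + 2.0 for t in gpu1]
print([b - a for a, b in zip(gpu1, gpu2)])          # [4.0, 4.0, ...] cards aligned again
```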
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: mvvo1
I'll bet you are happy with your $1000 Quad Core CPUs! :) How come no one takes Intel to task for telling me that Quad Core is better than dual core? What has ever gotten faster in gaming with the 4 cores?

Nobody calls them on this because nobody markets this idea. Emphasis has always been on more tasks, rather than faster tasks. That and the quad core actually does improve frame rate some of the time, but the game needs to be patched/designed for SMP. UT is a good example, as shown by this Anandtech article. The 2.4GHz quad core is running close to 10% faster than the 3.0GHz dual core.


Relating back to this thread, it seems like SLI should still improve performance if the frame rate is under 60. The numbers reported on an SLI rig might be very misleading (60fps looks like 50fps), but it would still look smoother than a single GPU reporting half as many frames (30) at equal time intervals. If the game runs fast on a single GPU system, then the single GPU would look better because the vsync actually works.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
so 16x puts both cards to work on each frame.... the drawback of this option is the high AA, which can slow down the whole process from the root???

no baddies but 16x AA sounds high, not to mention it can cause blurriness in the image...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Grestorn
Originally posted by: BFG10K
Let me rephrase it. First I issue a single 6 ms delay at frame 20 which then delays all subsequent frames by 6 ms. Are you with me so far?

How do I later change that delay to 4 ms? Using your explanation I can't simply issue a 4 ms delay because that will stack with the 6 ms delay I issued earlier which would then delay all frames by 10 ms from their original starting point.

Well, the answer is so easy I never figured that it would be a problem...

You do that by adding a one-time delay of 2 ms to the OTHER card. Then both cards are aligned again.

told you so :p

what do i know ?

rose.gif


.. the answer .. nothing
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
I have a question since we're talking about the "delay induced by the load balancing in the driver overhead" that nRollo brought up. Would faster ram and a faster CPU help speed up the process so there's less of a gap between frames making it on screen? If so, then since SLI is a "high end" thing and people who use it usually have high end rigs, shouldn't this be less noticeable in, let's say, a 3.6ghz quad vs a 2.4ghz dual?
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: bfdd
I have a question since we're talking about the "delay induced by the load balancing in the driver overhead" that nRollo brought up. Would faster ram and a faster CPU help speed up the process so there's less of a gap between frames making it on screen? If so, then since SLI is a "high end" thing and people who use it usually have high end rigs, shouldn't this be less noticeable in, let's say, a 3.6ghz quad vs a 2.4ghz dual?

The faster the CPU is (relative to the GPUs), the more likely you'll see micro-stuttering.

If the load distribution between CPU and GPU is balanced, there will be almost no micro-stuttering.

If the CPU is much slower than the GPUs, the games become CPU limited and SLI won't give you any advantage anymore.