CrossFire and SLI frame rates do not reflect reality because of lack of synchronization!

Page 7 - AnandTech Forums

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: taltamir
the issue is that AFR provides artificially increased FPS with the same smoothness of a single card. It is a waste of money. I am waiting to actually see someone contradict some of the SLI theory I posted in a way that invalidates this conclusion.

I don't see how you can blow out of proportion something like "AFR gives higher MEASURED performance than one card, but the same ACTUAL performance". This indicates that the entire thing is all smoke and mirrors and one big ripoff and waste of money.

Then how do you explain a game falling below playable framerates on a single card, yet becoming playable when you add a second card? If the ACTUAL performance "was" the same, you would have an unplayable game in either situation, SLI/Crossfire or not. Hmmm?

How do you explain being able to "up" a resolution and maybe add more AA? If you had the same "ACTUAL" performance as a single card, you wouldn't be able to do this.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AFR or SFR?

And I can't explain it; that's the whole point. If it is really the case then something in the theory is wrong and needs correcting. I will look it over and see if I can find such an error.
Did you personally test AFR vs. a single card and find it much smoother?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Datenschleuder
Originally posted by: nitromullet
If that is the whole point, what is the point? :)

We already know that buying a multi-gpu that benchmarks roughly equal to a single gpu is not preferable compared to buying the single gpu. Not only is performance not as consistent with a multi-gpu setup, but there is an additional level of overall system and driver complexity that make the multi-gpu setup less desirable. The only time I have ever heard anyone knowledgeable recommend a multi-gpu setup is when there was no single gpu available that could offer the same level of performance.

If that is the extent of this great service you are trying to provide here, all I have for you in response is, "/yawn".
Maybe YOU know that now, but this issue is unknown to most people.
And even those who are affected and did notice the issue are mostly unaware of the cause.

Or can you show me even a single English-speaking hardware review site that measures and points out the problem of inhomogeneous frame times with AFR?!

Our Editor is now aware of it .. it is up to them ... in the future, may i also suggest a PM or an e-mail to the main site. These guys are pretty cool and i am pretty sure they will respond to you

However, you have to realize that even a large tech site doesn't have 50 reviewers waiting for things .. more like each reviewer has 50 things to review waiting :p

it's "priority" .. this is not a "scandal" .. we were aware of it for a long time .. and *most* of us put up with it without complaining [getting "more" overall FPS, smoothness - and if you are like me, AA with no jitter].

So perhaps you want to do all the testing and reviewing - if it is *really professional*, perhaps they might consider a "guest submission" from you; email the main site if you think you have what it takes, Mr Programmer :)

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
So, basically what I take away from this:

With a high end multi-gpu setup you're still getting better overall performance than a single high end card, although you'll never scale 100% and you might not be getting quite as much of an increase as your fps suggest.

If you're looking at a dual mid-range card setup vs. a single high end gpu that's comparably priced, and the benchmarks are just too close to call, definitely go for the single high end gpu.

I think where you'll really see micro stutter become an issue is with technologies like NVIDIA's GeForce Boost that uses SLI to run games in a dual gpu configuration consisting of a lower end discrete card and an onboard gpu.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: taltamir
AFR or SFR?

And I can't explain it; that's the whole point. If it is really the case then something in the theory is wrong and needs correcting. I will look it over and see if I can find such an error.
Did you personally test AFR vs. a single card and find it much smoother?

I don't think AFR or SFR matters to make this particular point. Both offer performance gains. You could look over your data as long as you wish, but it will not allow you to claim that SLI'd cards' ACTUAL performance is that of a single card. No matter how you look at it. My two questions above can only be answered one way. Not by design, but by the only answer possible.

And finally, I noticed only this: My games played the same, only without the slowdowns I would have with a single card.

Crysis for example. Single card all settings on high at 1280x1024 would cause my 8800GTS 640 to chug in a few heavy spots. By adding a second 640, I was able to play (without AA in both cases BTW) and not have these slowdowns. The game was smooth as can be.
In the very beginning of the game, where the team is in the jet getting briefed, you see the jet come into view. That was choppy on a single 8800GTS640. Same settings under SLI, the jet coming into view was smooth. That was something I was really looking out for. That jet.

CoD4, while nowhere near as taxing as Crysis, allowed smoother gameplay under heavy fire as well. Framerates never dropping below 50 as it often did with a single 640.

Taltamir, I'm in no way, shape or form saying that microstutter doesn't happen. We all know it does. But to think that it reduces SLI performance to that of a single card is just not happening.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,959
126
I would agree with that, but why is the fps for the single card 29fps and the fps for the dual card setup 24 when their own benchmarks don't mirror this? They are either not comparing the same card in the single and multi-card setup, or they are using a specific snippet of the frame log to illustrate their point.
Like I said, I only have a theory that it's somehow related to AFR, simply because the first frame takes longer to arrive and then offsets the rest.

I can almost certainly say they didn't do something stupid like compare 9600 GT SLI to a single 8800 Ultra, because that wouldn't prove anything.

I'm saying that if the duration for AFR never exceeds that of a single card, the AFR system is no worse off than the single card. They may both in fact stutter depending on the length of a given duration.
If the duration is shorter for the AFR system than the single card, you will NEVER experience a drop in frames or micro stutter with the AFR system that you would not also experience with the single card system.
Okay, but my example shows a scenario where AFR could be worse despite the durations never exceeding the single card. What is your response to this?

It depends on the duration... If the total duration between frames for the 2ms swing was 30ms and 32ms, and the total duration between frames for the 9ms swing was 20ms and 29ms, the one with the lower duration is still providing you with the better experience.
Huh? Where did you get those figures from?

I already explained that the single card alternates between 45 FPS and 50 FPS while the AFR card alternates between 55 and 110 FPS.

So the single card would look something like this:

1 22 ms
2 20 ms
3 22 ms
4 20 ms

While the AFR card looks like something like this:

1 18 ms
2 9 ms
3 18 ms
4 9 ms

So according to you the AFR system is better off than the single card system because at no time do its frame durations exceed those of the single card? Again, I wouldn't necessarily agree with this, not when AFR is constantly cutting your framerate in half and then rocketing back up again.

With those kinds of constant swings it's quite possible someone may notice them more than the framerate jumping between 45 FPS and 50 FPS on the single card.

What do you think causes these irregularities? Hint: a longer than normal duration between frames.
What causes the irregularities is a fluctuating framerate that isn't necessarily picked up by framerate counters, given they typically poll only once per second.

If you were experiencing micro stutter with a multi gpu setup and the durations were consistently lower than with a single gpu setup, the single gpu setup would be stuttering continuously.
No, like I explained above, that isn't necessarily true at all.
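The traces being argued over here are easy to put numbers on. A small Python sketch (an illustrative example only; the durations are the figures quoted in this post, and "swing" follows this thread's usage for the gap between the longest and shortest frame):

```python
def summarize(durations_ms):
    """Summarize a frame-time trace: average FPS plus the figures argued over here."""
    return {
        "avg_fps": round(1000.0 * len(durations_ms) / sum(durations_ms), 1),
        "max_duration_ms": max(durations_ms),
        "swing_ms": max(durations_ms) - min(durations_ms),
    }

single = summarize([22, 20, 22, 20])  # single card: alternating 45-50 FPS
afr = summarize([18, 9, 18, 9])       # AFR: alternating 55-110 FPS
```

On these figures AFR wins on average FPS (74.1 vs. 47.6) and on worst frame (18 ms vs. 22 ms), but has the larger swing (9 ms vs. 2 ms), which is precisely the point of contention: whether the worst-case duration or the fluctuation dominates perceived smoothness.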
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
So, basically what I take away from this:

With a high end multi-gpu setup you're still getting better overall performance than a single high end card, although you'll never scale 100% and you might not be getting quite as much of an increase as your fps suggest.

If you're looking at a dual mid-range card setup vs. a single high end gpu that's comparably priced, and the benchmarks are just too close to call, definitely go for the single high end gpu.

I think where you'll really see micro stutter become an issue is with technologies like NVIDIA's GeForce Boost that uses SLI to run games in a dual gpu configuration consisting of a lower end discrete card and an onboard gpu.

That is a pretty good summary although the last part *depends* on Driver implementation
- for me i am looking to dump crossfire for a single GPU .. but i am stuck in the same position as GTX-Ultra owners - wait.:(

What i have noticed, is the more 'strain' on the graphical sub-system, the more likely micro stuttering is to be noticeable .. as in Crysis for me and to a single GPU. [that damn 1.02 D/L musta been corrupt .. another G-D 30 hours :|]

anyway, if Micro stutter is bothering you, try turning down the res a bit or lower details and see if it "looks better"

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,959
126
Nvidia has checksums in place for modification, and a direct restore. They obviously want you to modify it. You can do this through nvapps.xml amongst other things.
Then why does nvapps.xml get rejected if a preset value is changed without the checksum being correct?

Why did Grestorn resort to using ghost profiles to work around the issue? Was that a suggestion from you, or from nVidia?

The facts don't quite paint the picture you're trying to paint.

The default value represents the restore point. And the first multi GPU rendering value represents the modifiable point. Why add a restore function if they did not anticipate users changing these values?
You tell me. Why add a global checksum that rejects nvapps.xml if it isn't valid? And why make the checksum break only when pre-defined values are changed?

Me and Grestorn communicate with each other regularly.
I've talked to him about the issue too (not with this handle though) and what you're saying wasn't the impression I got from him.

But take a look here:

http://www.nvnews.net/vbulletin/showthread.php?t=96604

Originally posted by Grestorn

Hi everybody,

I guess some of you have been wondering when the next version of nHancer will be available. Well, it's about to be finished, but I've got a serious problem I can't solve by myself.

Maybe somebody, who has some insider contacts to nVidia could help me out there. Here's the problem:

The DX10 SLI compatibility values for each game are not read from the registry like all the other values have been so far. The driver reads this value directly from the nvapps.xml in system32 (or syswow64 in Vista64).

That wouldn't be a problem at all, but the driver only accepts the nvapps.xml file as being valid if it contains a checksum that matches all contained values. Check out the start of nvapps.xml:


Code:
<?xml version="1.0" encoding="UTF-8"?>
<FILE>
<INFO Number="3276289231"/>
<PROFILESET>
...
This "INFO" Number is the checksum. As soon as you change any numeric value in the nvapps.xml, DX10 SLI stops working. Also, if you change but one digit of the INFO Number, DX10 SLI will stop working as well. It seems that the driver doesn't accept the whole nvapps.xml file as soon as the checksum doesn't match.

The problem is: I have no idea how to calculate that checksum. If I knew the algorithm, I could adapt the checksum when changing the file...

BTW, an interesting note is that nVidia's own panel doesn't know how to calculate the number either. Therefore they changed it so it never tries to write the file in System32; it holds a private copy in ProgramData\NVidia (Vista) or in All Users\Application Data\NVIDIA (XP) instead.

nHancer could do the same (in fact the current, unreleased version does it this way), but you couldn't change any DX10 SLI compatibility values then. And I'm afraid that in the future, the DX10 driver might read more and more values from the nvapps.xml and none of them could be changed by nHancer. Which would kind of defeat nHancer's purpose.

So, again, if anybody has some insider contacts to nVidia and could forward this plea of help or something like that, I'd very much appreciate that!

Otherwise I fear that nHancer will come to an end soon...

Originally posted by Grestorn
YAPE... Hehe. There's actually someone who remembers that!

Well, yes of course, the INFO has been there from the beginning, but it never carried any significance as far as I could tell.

I tried to use some contacts to get in touch with the driver team, but I never got any response from them (wasn't my first try). So my guess is, that they don't really care much for nHancer.

Well, I guess I have to live with that. In the end, I might very well be forced to stop working on nHancer, if they keep making it more and more difficult for third party apps to work with their drivers...

Originally posted by Unwinder
NVIDIA never provided even a single bit of info for RivaTuner. They are crazy about security and I wouldn't be too optimistic about getting any info from their side.

Grestorn

Don't give up; find the encoding algorithm yourself. It is not that hard and requires just a few hours (or days) depending on your reverse engineering experience. If I'd given up a few years ago during the Detonator 23.xx launch, when NVIDIA encrypted the names of all D3D registry entries and many of the OpenGL ones, we'd never have gotten control of many things in tools like RT, aTuner and nHancer.

Again, this isn't painting the picture you're trying to paint. Furthermore, the idea to use ghost profiles was garnered from the findings in that thread, not from nVidia's "support".
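For anyone tempted by Unwinder's suggestion to reverse engineer the checksum: a reasonable first step would be ruling out the standard 32-bit checksums. The Python sketch below is purely illustrative; the actual algorithm is never identified in this thread, and the idea of excluding the INFO element itself from the hash is an assumption of this example:

```python
import re
import zlib

def candidate_checksums(xml_bytes):
    """Compute common 32-bit checksums over nvapps.xml with the INFO element removed.

    If one of these matched the stored Number the algorithm would be identified;
    in practice NVIDIA's scheme may be proprietary and none of these may match.
    """
    # Strip the INFO element before hashing (assumption: the stored number
    # presumably isn't part of its own checksum).
    body = re.sub(rb'<INFO Number="\d+"/>', b"", xml_bytes)
    return {
        "crc32": zlib.crc32(body) & 0xFFFFFFFF,
        "adler32": zlib.adler32(body) & 0xFFFFFFFF,
        "byte_sum": sum(body) & 0xFFFFFFFF,
    }
```

One would read the real file's bytes, compute these candidates, and compare each against the parsed INFO Number; a mismatch on all of them means moving on to disassembly, as Unwinder did for the Detonator registry encryption.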
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I just had an epiphany in regards to the theory. I think I know why it will make AFR a lot smoother than a single card in ultra low FPS situations, but give the opposite result at higher res. I need some time to go over the numbers, but I will get back to you on this.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,959
126
I performed a detailed analysis of vsync and came to the conclusion that AFR is non-beneficial, posted it last week.
If you're talking about your vsync thread then that was full of mistakes.

It's one thing to complain about micro-stutter and input lag but another thing entirely to claim AFR provides no benefit; such a claim is false.

It's objectively and subjectively provable that AFR can provide a higher framerate than a single card and make situations playable that weren't on a single card.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
I performed a detailed analysis of vsync and came to the conclusion that AFR is non-beneficial, posted it last week.
If you're talking about your vsync thread then that was full of mistakes.

It's one thing to complain about micro-stutter and input lag but another thing entirely to claim AFR provides no benefit; such a claim is false.

It's objectively and subjectively provable that AFR can provide a higher framerate than a single card and make situations playable that weren't on a single card.

Ding ding ding ding! We have a winner!

To me the benefits of SLi (higher resolutions, AA) have always far outweighed the negatives. I'm sure I'd feel the same about Crossfire.

It's always preferable to achieve a level of performance with a single GPU, but two high end gpus will always be preferable to a single one for flexibility in settings (and with high end monitors, necessary).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Actually, that thread about vsync was an early look at the theory, where I asked for input to correct my understanding of SLI theory. Yours and others' input helped. I really should rewrite that one when I get the chance (i.e., some free time).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: BFG10K
I performed a detailed analysis of vsync and came to the conclusion that AFR is non-beneficial, posted it last week.
If you're talking about your vsync thread then that was full of mistakes.

It's one thing to complain about micro-stutter and input lag but another thing entirely to claim AFR provides no benefit; such a claim is false.

It's objectively and subjectively provable that AFR can provide a higher framerate than a single card and make situations playable that weren't on a single card.

Ding ding ding ding! We have a winner!

To me the benefits of SLi (higher resolutions, AA) have always far outweighed the negatives. I'm sure I'd feel the same about Crossfire.

It's always preferable to achieve a level of performance with a single GPU, but two high end gpus will always be preferable to a single one for flexibility in settings (and with high end monitors, necessary).


Whoa .. i thought i dropped out of the matrix for just a moment and i had to actually look to see who i was agreeing with
:Q


damn


Well .. except for the Big Ass LCD; i like my little one with lots of AA lovin'




 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Originally posted by: apoppin
If you don't play twitch shooters and don't notice microstutter then SLI + large LCD is not a bad way to fly. For the rest of us I'm glad there are single card solutions and CRTs.

i do notice and i also play twitch shooters :p ..
... everything is compromise .. but the *cure* for most of us - except to buy a faster single GPU [which for example, could not happen if you had purchased a 8800GTX-Ultra back last - last November. You still can't buy a single faster GPU - it will be TWO years!! .. So ..
If you want "faster" then your choices are - (1) to lower your resolution, (2) sacrifice details - OR (3) buy a second one

Problem Solved for most of us by #3
- the least compromise



I know I'm quoting you from the first page, but I totally agree. I used to play competitive Counter-Strike, going to tourneys and everything, even Quake 3 Arena tourneys back in the day. Thing is, not a single person I met or played with used ANY high settings. Everything was kept to a minimum to make sure you didn't get any stutter or frame drops, so you could be at the top of your game. So anyone who is playing "twitch shooters" for any sort of competitive reason will have the best hardware and run it near minimum settings; they're not going to be running Crysis at 1920x1200 with everything on high.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Originally posted by: apoppin
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q

Haha yeah you run the bare minimum you need to get by and you usually have WAY more card than you need to get by with. Even for the Source engine.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BFG10K
I would agree with that, but why is the fps for the single card 29fps and the fps for the dual card setup 24 when their own benchmarks don't mirror this? They are either not comparing the same card in the single and multi-card setup, or they are using a specific snippet of the frame log to illustrate their point.
Like I said, I only have a theory that it's somehow related to AFR, simply because the first frame takes longer to arrive and then offsets the rest.

I can almost certainly say they didn't do something stupid like compare 9600 GT SLI to a single 8800 Ultra, because that wouldn't prove anything.

I'm saying that if the duration for AFR never exceeds that of a single card, the AFR system is no worse off than the single card. They may both in fact stutter depending on the length of a given duration.
If the duration is shorter for the AFR system than the single card, you will NEVER experience a drop in frames or micro stutter with the AFR system that you would not also experience with the single card system.
Okay, but my example shows a scenario where AFR could be worse despite the durations never exceeding the single card. What is your response to this?

It depends on the duration... If the total duration between frames for the 2ms swing was 30ms and 32ms, and the total duration between frames for the 9ms swing was 20ms and 29ms, the one with the lower duration is still providing you with the better experience.
Huh? Where did you get those figures from?

I already explained that the single card alternates between 45 FPS and 50 FPS while the AFR card alternates between 55 and 110 FPS.

So the single card would look something like this:

1 22 ms
2 20 ms
3 22 ms
4 20 ms

While the AFR card looks like something like this:

1 18 ms
2 9 ms
3 18 ms
4 9 ms

So according to you the AFR system is better off than the single card system because at no time do its frame durations exceed those of the single card? Again, I wouldn't necessarily agree with this, not when AFR is constantly cutting your framerate in half and then rocketing back up again.

With those kinds of constant swings it's quite possible someone may notice them more than the framerate jumping between 45 FPS and 50 FPS on the single card.

What do you think causes these irregularities? Hint: a longer than normal duration between frames.
What causes the irregularities is a fluctuating framerate that isn't necessarily picked up by framerate counters, given they typically poll only once per second.

If you were experiencing micro stutter with a multi gpu setup and the durations were consistently lower than with a single gpu setup, the single gpu setup would be stuttering continuously.
No, like I explained above, that isn't necessarily true at all.

Let me make this simple.

  • I agree that ideally the durations between frames should be uniform. The more uniform the durations, the more accurately the average fps reflects the actual user experience. This is true for both AFR and single gpu setups. It has been shown that duration uniformity is more difficult to achieve with AFR than a single gpu. I do in fact find this discovery quite interesting, and Datenschleuder does deserve some credit for bringing this to my attention.
  • I however contend that the absolute maximum value of the duration is a greater determining factor with regard to user experience than the fluctuation in duration length, provided that the maximum duration for a given setup never exceeds that of the other setup.
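Both criteria in the bullets above can be computed from the same frame log. A short illustrative Python sketch (the metric names are invented for this example; the durations are BFG10K's figures from earlier in the thread):

```python
import statistics

def experience_metrics(durations_ms):
    """Two competing smoothness metrics from this discussion:
    worst-case instantaneous FPS (the maximum-duration criterion) and
    frame-time uniformity (the fluctuation criterion)."""
    return {
        "worst_instant_fps": round(1000.0 / max(durations_ms), 1),
        "uniformity_stdev_ms": round(statistics.pstdev(durations_ms), 2),
    }

# single card [22, 20, 22, 20] ms: worst ~45.5 FPS, stdev 1.0 ms
# AFR         [18,  9, 18,  9] ms: worst ~55.6 FPS, stdev 4.5 ms
```

By the first metric the AFR trace is strictly better; by the second it is strictly worse, which is why the two sides of this thread keep talking past each other.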
 

xylem

Senior member
Jan 18, 2001
621
0
76
I definitely appreciate the fact that this issue has been brought into discussion here. I tend to be sensitive to smoothness in frame-rates, and this type of information (specifically, whether it has been satisfactorily addressed) will definitely enter into my research if i consider a high-end gaming system in future.

Of equal importance, having as many facts as possible can be very helpful in situations where a person is responsible for recommending and/or performing high-end system builds for clients. I might not have a SLI setup for testing, or might not play a particular game where an issue is particularly noticeable, so bringing issues such as these to light can help determine where time and resources are spent for a build. It seems like a good thing for people to know as much as possible about the hardware they are considering for purchase.

Hmm, 'the facts'... What are they? I have checked out the data on the German site, and the data in the link which was provided by Keysplayr2003, and i think it might be worth pointing out that there should be a lot more data made available (a large sample of system configurations tested, multiple driver revisions tested, *many* games and other 2d and 3d apps tested), and more study so the conditions under which the issue becomes severe can be noted. Maybe irregularity in the delay between rendered frames would be more or less uniform, and maybe it wouldn't be (the German data and Keysplayr2003's bit of data seem to paint a different picture, but the German site shows results from more apps). At least, i would be interested in studying such data.

Feedback from people who *do* or did have SLI setups and who were subjectively bothered by the issue is particularly interesting and helpful, since it implies that the information provided by the OP is materially relevant.

Anyway, thanks to Datenschleuder for bringing it up, despite the verbal hammering he's taken for sticking it out.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Well, to me it seems a moot point. In most situations a multi gpu setup gives you a 50% or better frame bump, while his numbers show a 30% degradation in the time per frame. You're STILL seeing more frames; 60 doesn't equal 30, that would be a 50% degradation. Also, everyone knows you can pump things like AA and AF much higher in multi gpu setups. The positives far outweigh the negatives, and if you can afford it, by all means do it; if you're regretting your purchase because you're "sensitive" to these things, then you most likely were doubting the purchase before you even made it. To me this seems like people LOOKING for a problem rather than actually seeing one. I'm not saying it doesn't exist, or that in extreme situations it's unnoticeable, but in the long run it really shouldn't matter. I run SLI 9600 GTs now and tbh I'm quite happy with it. I went from a single 8800 GTX to a single 9600 GT till my other 9600 GT got here, and really, things seem smoother when I'm trying to push the cards with a newer game. I'm happy with my purchase.

EDIT - Would also like to note I just watched the videos and I can't see wtf you're talking about. You're talking about some sort of flicker or something? I don't see it; look at the static images in the race and you'll notice that doesn't happen. Smoothness?? It's hard to judge smoothness in a racing game, since the car jerks every time it switches gears and when he hits nitro or whatever he's doing. To me it looks perfectly playable and fine, and looks as good as I've ever seen it. I think you're REACHING for something here.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
It's good to see Video can't resist pounding someone, even in these cuddlier and fluffier times!

;)

EDIT: Like many others in this thread I can see 'stutter' in that video, but I guess if you're fine with that, or don't notice it, more power to you :)

If you want to look for it, the video (to me) clearly 'two-steps', which I assume is the two quick frame updates followed by the lag, repeat.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: dug777
It's good to see Video can't resist pounding someone, even in these cuddlier and fluffier times!

;)

EDIT: Like many others in this thread I can see 'stutter' in that video, but I guess if you're fine with that, or don't notice it, more power to you :)

If you want to look for it, the video (to me) clearly 'two-steps', which I assume is the two quick frame updates followed by the lag, repeat.

of course, if any brand new person shows up and is extremely aggressive - they get a bit more than they give; we have to find out "who" is trying to bring "what" into Video
--And it looks like the OP passed our little "test" .. how often do i apologize here?


Most of us are willing to suspend disbelief and enjoy the faster experience and more AA that multi-GPU allows; for me it is mostly 'moot' as i use the "extra" GPU to give me CrossFire AA which does not have this problem so obviously - heck, even a single GPU can "micro stutter"

so this isn't "revolutionary" by any means - it just allowed us to examine it in better detail than any other tech forum up till now and come up with some interesting theories

Anyone hear back from Derek?

i am really proud of Video - 3 years ago, this thread would have deteriorated into flames and name-calling

Heck, even P&N has mellowed - and i am back there with the 'boys'; my controversial posting is mostly confined there now
[heck you can even insult someone there [really bad] and get away with it - don't tell!]
:Q
 

idiotekniQues

Platinum Member
Jan 4, 2007
2,572
0
71
Originally posted by: bfdd
Originally posted by: apoppin
If you are talking *competition* you want to be running GTX SLI with just enough details to see what you *need*, and the resolution of choice was probably 10x7 .. or a low WS resolution - to get 300FPS and everything flowing silky smooth

if you pause to admire the gfx someone will frag your ass
:Q

Haha yeah you run the bare minimum you need to get by and you usually have WAY more card than you need to get by with. Even for the Source engine.

there have been a few times where shadows have helped me see somebody from around a corner or above me.

other eye candy settings may also be helpful in a tactical way as well. it is your own problem if you stop to smell the flowers instead of utilizing these things for a competitive edge. as long as fps is 60 or above it is fine for shooting, you don't need 300fps; that is just ridiculous.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i am talking about competition at the highest level

perhaps you are unusual

and "shadows" are important - it would be part of what i "need" .. on my Multi-gpu at 10x7 CRT .. or scale down my LCD
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: idiotekniQues

other eye candy settings may also be helpful in a tactical way as well. it is your own problem if you stop to smell the flowers instead of utilizing these things for a competitive edge. as long as fps is 60 or above it is fine for shooting, you don't need 300fps; that is just ridiculous.

More frames is better than fewer frames. Let's say you're getting a solid 240 fps with a 120 Hz refresh CRT. If you make a turn to check out your surroundings (call it a nice leisurely 16.7 ms turn), with an LCD and 60 fps you would see no new frames during the turn.

With 240 fps you would have up to 3 frames during that turn (with tearing), which may let your mind register something you would not otherwise see.

I like both -- high frame rates AND eye candy. If I have to sacrifice resolution so be it. Besides, heads are HUUUUUGE at 320x200. =)
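The frame counting above is easy to sanity-check with a one-liner. This back-of-the-envelope sketch assumes perfectly even frame pacing and ignores refresh rate and tearing, so it counts rendered frames rather than frames that necessarily reach the screen:

```python
def frames_rendered(turn_ms, fps):
    # Whole frames rendered during a turn of the given length,
    # assuming perfectly even frame pacing.
    return int(turn_ms * fps / 1000.0)

# A quick 16.7 ms flick turn: roughly one frame rendered at 60 fps versus
# about four at 240 fps, some of which can reach a 120 Hz screen as torn
# partial updates.
```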
 

idiotekniQues

Platinum Member
Jan 4, 2007
2,572
0
71
when i played haloPC and 2142 competitively (not CPL level, TWL & CAL) i also did not mind lowering resolution for FPS. FPS was very important; i was just saying that some eye candy is actually tactically helpful - it is your own fault if you stop to stare at it instead of focusing on the match at hand :)

on my LCD i targeted a straight 60fps at all times. on my CRT i used a refresh rate of 75hz, so i targeted a constant 75FPS