100-150fps???? What's the purpose of that?
Always have fun answering this one
🙂
First off, you can't even perceive anything higher than 100fps....
Wrong. The USAF has conducted studies in which pilots shown images of jets for 1/200th of a second could not only identify that they had seen a jet, but could accurately state what type it was.
Second, most monitors don't have a refresh rate that high, so you'll get vertical tearing unless you use vsync.
My monitor does at the settings I play multiplayer FPSs at: 135Hz, actually. Vertical tearing is no big deal when I'm focusing on not getting my @ss fragged.
Third.... Why the hell do you think you need 100-150fps?
Because with an average of 150FPS your minimum framerate will likely stay above 70FPS most of the time, and even that isn't ideal.
I play games at 1280x1024 w/ 4xAA and 8xAF. And if my frames drop down into the thirties, it doesn't bother me. It never goes above 60fps, because I use vsync.
Don't know what kind of games you are talking about, so that doesn't mean much. Having WC3 drop into the 30s is no problem at all; same with Civ3, C&C Generals, SimCity 4, MS FS 2K4, etc. Talk about Quake 3, though, and it is a major issue.
In conclusion, I believe you're an idiot. I mean.... 150fps? For what?
Let's run with your false assumption that 100FPS is the max you can see. You know your video card draws frames that aren't the one you're currently seeing? Frames get drawn to a back buffer first, and then, if you are using double buffering, get 'flipped' and displayed on your monitor. The mouse input you see on screen reflects what you were doing a frame ago, not at that exact point in time. The situation is amplified with triple buffering, because then you are two frames behind.

With an average framerate of 100FPS your minimum is likely to be about 47FPS, give or take. The latency from frame buffer operations effectively cuts that in half with double buffering, giving you 23.5FPS from input to display (assuming perfect reflexes), and a whopping 15.67FPS with triple buffering. In layman's terms, if 100FPS really were the maximum you could see, you would need a 200FPS minimum to eliminate perceptible input latency running double buffered, and a 300FPS minimum running triple buffered. That means around 460FPS or so average. Check out the settings the pros run some time; a 100FPS average is far too slow for any really competitive FPS gamer.
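The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The 0.47 minimum-to-average ratio and the "one frame of latency per buffered frame" model are the rough assumptions used in this thread, not fixed constants:

```python
def effective_input_fps(avg_fps, buffered_frames, min_ratio=0.47):
    """Rough effective input-to-display rate.

    Assumes the minimum framerate is about min_ratio of the average,
    and that each buffered frame adds one full frame of input latency
    (so the effective rate is the minimum divided by the buffer depth).
    """
    min_fps = avg_fps * min_ratio
    return min_fps / buffered_frames

# Average 100FPS: minimum ~47FPS
print(round(effective_input_fps(100, 2), 1))   # double buffered -> 23.5
print(round(effective_input_fps(100, 3), 2))   # triple buffered -> 15.67
```

Plugging in a 30FPS minimum instead gives the 15FPS and 10FPS figures quoted below for the other rig.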
In the example you gave of how your rig runs, you are looking at an effective input rate of 15FPS in double buffered mode and 10FPS in triple buffered. If you can't feel how slow that is, more power to you. I certainly wouldn't be questioning other people's stated framerate needs if I were you.
🙂