I am laughing at those who think they can see a difference between like 70fps and 120fps
I always feel compelled to respond to these
Don't forget to distinguish between average framerate, instantaneous framerate, and minimum framerate.
If you are talking about a timedemo average, then I would be willing to bet that in actual gameplay many people can tell the difference between a system averaging 70fps and a system averaging 120fps.
This is because minimums can be half the timedemo average, and even lower in some cases (support fire explosions in Return to Castle Wolf multiplayer are a good example). Using that 1/2 math, a 70fps timedemo could dip to 35fps minimums, while a 120fps average would still bottom out around 60fps. It also depends on the CPU and other factors.
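To make the average-vs-minimum distinction concrete, here's a quick sketch. The frame times are made-up numbers for illustration, not real benchmark data, but they show how a run that "averages 70fps" can still dip to 35fps when a couple of heavy frames hit:

```c
#include <stdio.h>

/* Toy illustration: hypothetical per-frame times in milliseconds for a
 * run that "averages 70fps" but stalls during an explosion. */
int main(void)
{
    double frames_ms[] = { 9.5, 9.5, 9.5, 28.6, 28.6, 9.5, 9.5, 9.5 };
    int n = sizeof frames_ms / sizeof frames_ms[0];

    double total_ms = 0.0, worst_ms = 0.0;
    for (int i = 0; i < n; i++) {
        total_ms += frames_ms[i];
        if (frames_ms[i] > worst_ms)
            worst_ms = frames_ms[i];
    }

    /* Average fps = frames / total seconds; minimum fps = 1 / worst frame time. */
    printf("average: %.1f fps\n", n / (total_ms / 1000.0));  /* ~70 fps */
    printf("minimum: %.1f fps\n", 1000.0 / worst_ms);        /* ~35 fps */
    return 0;
}
```

The timedemo score reports the first number; the second one is what you actually notice in a firefight.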
As far as instantaneous FPS goes, I agree that most people couldn't tell the difference between 70fps and 120fps. But in games where input and controls are affected by framerate, I maintain that I can *tell* the difference between the two. Part of it may simply be frame time: 70fps is roughly 14.3ms per frame versus about 8.3ms at 120fps, so inputs get sampled and shown sooner. I don't know if it's a visual thing or a "feel" thing, but I can definitely, 100%, no-BS, I'm-not-trying-to-sell-you-a-video-card, tell the difference in Quake engine games.
Many people who play online claim the same thing, and there have been documented discussions demonstrating that framerate actually changes the physics of Quake engine games.
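For the curious, here's a toy simulation of the mechanism people usually point to. This is my own sketch, not actual Quake source: the per-frame velocity snapping and the 270/800 constants are the commonly cited Quake 3 details, but the exact heights it prints won't match the real engine.

```c
#include <stdio.h>
#include <math.h>

/* Toy model of framerate-dependent jump physics. NOT real engine code;
 * it mimics the oft-cited mechanism where player velocity is snapped to
 * whole units once per movement frame, so the rounding error accumulated
 * over a jump depends on the frame time. */
static double peak_height(double fps)
{
    double dt = 1.0 / fps;
    double z = 0.0, vz = 270.0;   /* jump velocity: quoted Q3 default */
    double peak = 0.0;

    while (vz > 0.0 || z > 0.0) {
        vz -= 800.0 * dt;         /* gravity (quoted Q3 default) for one frame */
        z  += vz * dt;            /* move */
        vz  = rint(vz);           /* per-frame snap to whole units */
        if (z > peak)
            peak = z;
        if (z < 0.0)              /* landed */
            break;
    }
    return peak;
}

int main(void)
{
    double rates[] = { 70.0, 76.0, 120.0, 125.0 };
    for (int i = 0; i < 4; i++)
        printf("%6.0f fps -> peak jump height %.2f units\n",
               rates[i], peak_height(rates[i]));
    return 0;
}
```

Because the snap happens once per frame, how much of each frame's gravity fraction gets rounded away depends on your framerate, which is why certain com_maxfps values got famous for higher jumps.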
Now, if a game *always* ran at 70fps... or 120fps... well, where can I buy that video card?
I think there is a misunderstanding among the people who argue framerates. When I say that I want 400fps or 200 or whatever, it's simply because I want my minimum framerate to NEVER drop below the point where I can detect choppiness. Raising the timedemo average has been the only easy way to raise minimum framerates.
Last point... this has nothing to do with which video card 0WN3RZ and which one sucks. I like them all, because the more of them there are, the lower prices get and the better the hardware us tweakers end up with.