Originally posted by: mwmorph
really though, how much processing power would it take to make pc screens blur? take the difference of frame 1 and the frame before it, determine movement and blur a bit for it.
im sure some programmer can do it.
You would have to take into account that the calculations would have to be done on every single frame, adding a massive amount of overhead to already constrained processes.
One of the benefits of double buffering is being able to draw one frame in a buffer, draw the next frame in another, and swap once rendering of the second picture is complete, as opposed to rendering one frame, deleting it, and rendering the next from scratch. Adding motion blur would require a third buffer at minimum: some algorithm would have to determine the motion differences between the two frames and, from that comparison, draw out a third frame to be processed. Set it as the next frame, draw another frame, compare, inject... etc.
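Just to sketch the idea (hypothetical code, not how any real driver does it): the cheapest version skips true motion estimation entirely and simply blends each new frame with an accumulation of the previous ones, which leaves a faint trail behind moving objects. The `blend_frame` function and `decay` parameter below are made up for illustration, using NumPy arrays as stand-in framebuffers.

```python
import numpy as np

def blend_frame(prev_accum, new_frame, decay=0.6):
    """Cheap motion-blur approximation: weight the new frame
    against the accumulated history of earlier frames."""
    return decay * new_frame + (1.0 - decay) * prev_accum

# Toy 2x2 grayscale "frames": a bright pixel moving one step right.
frame1 = np.array([[1.0, 0.0],
                   [0.0, 0.0]])
frame2 = np.array([[0.0, 1.0],
                   [0.0, 0.0]])

accum = frame1                      # first frame seeds the accumulator
accum = blend_frame(accum, frame2)  # old position keeps a faint trail
print(accum)                        # [[0.4 0.6] [0.  0. ]]
```

Even this trivial version touches every pixel every frame, which is exactly the per-frame overhead being described above; a real difference-based algorithm would cost considerably more.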
It seems like you would add 30%+ more overhead just so you could possibly get a picture that should be able to run at a lower FPS with similar or degraded IQ.
I personally get headaches when I look at monitors at 60Hz or lower. Once I bump the refresh rate up to 70Hz+ I don't experience any discomfort.
I know that early on, when I first started to get into 3D games, I went from a Stealth III to a TI4200; I noticed a big jump in FPS, and the perceived jerkiness in the game went away.
Though those are just my own observations. I'm fairly sure some people are more or less sensitive than others.