FPS vs Response time

videogames101

Diamond Member
Aug 24, 2005
I was thinking. Say I have a monitor with an 8 ms response time, and let's say I'm playing BF2 and pulling 50 fps. Now, 50 fps means there is a new frame every 2 ms. Is that correct? If so, how could an 8 ms response time monitor possibly show a frame every 2 ms? The way I see it, an 8 ms response time would only let you see 12.5 fps. Am I missing something here????

soydios

Platinum Member
Mar 12, 2006
1000 divided by 8 = 125, so an 8 ms response time can in theory keep up with about 125 full pixel transitions per second, well above your 50 fps.

FYI, that 8 milliseconds (1 millisecond = 1/1000 second) is the time for a pixel to change state, typically measured as black-to-white-to-black or gray-to-gray depending on the spec.
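
To put that arithmetic in code, here's a quick Python sketch (the 8 ms figure is just the example value from the original post):

    # Convert a pixel response time (in ms) into the maximum number of
    # full pixel transitions per second -- the calculation above.
    response_time_ms = 8
    transitions_per_second = 1000 / response_time_ms  # 1000 ms in a second
    print(transitions_per_second)  # 125.0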

Sc4freak

Guest
Oct 22, 2004
Originally posted by: videogames101
I was thinking. Say I have a monitor with an 8 ms response time, and let's say I'm playing BF2 and pulling 50 fps. Now, 50 fps means there is a new frame every 2 ms. Is that correct? If so, how could an 8 ms response time monitor possibly show a frame every 2 ms? The way I see it, an 8 ms response time would only let you see 12.5 fps. Am I missing something here????

Yes, your maths is wrong. At 50 FPS, each frame lasts 1000/50 = 20 ms, not 2 ms. A frame time of about 16.7 ms is effectively 60 FPS.
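
Same arithmetic in the other direction, as a small Python sketch (the frame rates are just the examples from this thread):

    # Convert a frame rate into how long each frame stays on screen,
    # which is the conversion being done above.
    def frame_time_ms(fps):
        return 1000.0 / fps

    print(frame_time_ms(50))  # 20.0 ms per frame at 50 fps
    print(frame_time_ms(60))  # ~16.7 ms per frame at 60 fps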