120Hz monitors and motion blur

NickelPlate

Senior member
Nov 9, 2006
652
13
81
Hi All,

I've read that motion blur on true 120Hz monitors is much improved over the common 60Hz panels.

But does your video card have to be capable of rendering at 120 fps or above to really take advantage of this?

The reason I ask is that I recently upgraded to a Samsung PX2370 (60Hz) from my ancient 19" Sony Trinitron. It's a great monitor, but I've had a little trouble getting used to the motion blur, which was non-existent on my CRT. I always gamed on my CRT at 60Hz with V-sync on (can't stand the tearing without it), at whatever resolution and settings allowed me to sustain 60fps or higher, which always gave me a perfect image with no tearing or blur (although maybe a little flicker from the low refresh). For my eyes, 60fps is the magic number, and as long as my video card doesn't dip below it too much while gaming and I'm happy with the resolution and image quality, life is good.

So the real question is: if I'm gaming on a 120Hz flat panel, my video card is rendering at something less than that, say 90fps, and I have V-sync turned on, what will it look like? Smooth as glass? Motion blur? Screen tearing? My gut tells me it will be similar to having my CRT set to a higher refresh than my video card could keep up with, which resulted in a blurring effect. Unfortunately I haven't had the opportunity to try it for myself, since 120Hz flat panels are not very common yet, so I'm looking for personal experiences.

Thanks,

NP
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The thing most LCD monitors list is response time. That is the time it takes to change a pixel from black to white and back to black. Unfortunately they rarely list the other important part, latency. It takes time for the LCD controller to receive the signal, process it, and send it to the actual panel. A response time of 2ms with a latency in the controller of 5ms would still be blurry in gaming or fast action.

A 120Hz LCD may look better not because the panel itself is faster but because the controller processing the signal runs at a higher clock speed and can convert the signal faster. There are also some 60Hz displays that use the exact same controllers as 120Hz models, where the only difference is the firmware. Those panels cannot display at 120Hz because they have a higher response time than their 120Hz cousins, but they can be tricked into providing the same latency benefits.
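
To put rough numbers on that, here's a quick Python sketch using the example figures from this post (they're illustrative, not specs from any real monitor):

# The advertised response time is only part of the story: what you actually
# wait for is controller latency plus pixel response time.
response_time_ms = 2        # pixel transition time (the advertised spec)
controller_latency_ms = 5   # controller processing time (rarely advertised)

total_ms = controller_latency_ms + response_time_ms
refresh_60hz_ms = 1000 / 60
print(f"input-to-visible delay ~{total_ms}ms, "
      f"about {total_ms / refresh_60hz_ms:.0%} of a 60Hz refresh interval")
# ~7ms total, roughly 42% of a 16.7ms refresh: enough to matter in fast action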
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126

Um, wow. Not sure where to start with the misinformation.

First off, 120Hz by itself means nothing. Refresh rate hasn't meant much since CRTs, because LCDs do not refresh anything. For TVs, 120Hz means one thing, and for monitors it means another. I'll explain both in a second after clearing up some other info.

Some of what Modelworks said was correct; quite a bit was misleading and some was false. First off, with monitors there is a response time, as he said. There is grey-to-grey, the time it takes to change along the grey scale, and there is black-to-white response time. Also, as he said, there is latency.

Response time affects the "motion blur" or ghosting effect seen in games. Latency affects how many frames behind you are. Latency has almost nothing to do with how good a given frame looks in terms of quality. That is where he was wrong.

With a high response time, you get motion blur and ghosting in high-action scenes when the changes come faster than the pixels can change color. They typically end up bleeding, where some pixels are almost keeping up and others are not. Also, in most scenes not everything changes rapidly, so there is a bit of this disconnect. It's hard to describe further without witnessing it for yourself. Suffice it to say, response time deals with the quality of the frame in high-action scenes. However, if the response time is fast enough, the human eye can't even see the distortion that occurs and it looks seamless to us, although technically the motion blur is still going on; we just can't see it anymore.

Latency deals with how long the monitor takes to process a frame and display it. Monitors with high latency can be several frames behind what is actually happening. When watching a video this means nothing, except when it comes to matching up sound with what is displayed on the screen. If your monitor has 30 milliseconds of latency, then the sound needs to be delayed by 30 milliseconds as well.

The problem comes with high-latency monitors and gaming. The higher your latency, the farther behind the real action you are. In a first-person shooter this could be a serious handicap, because if your target starts moving from a standstill, you could be shooting at a place they no longer are in the game. Your aim would be off. Good gaming monitors have very low latency, usually 1 or 2 frames. Bad gaming monitors can be 10 or more frames behind.
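
To put those frame counts in scale, here's a trivial Python conversion between latency and frames behind (the millisecond values are my own illustrative picks, not measurements):

def frames_behind(latency_ms, refresh_hz):
    # How many refresh intervals fit inside the display's latency.
    return latency_ms * refresh_hz / 1000.0

for latency_ms in (17, 33, 170):
    print(f"{latency_ms}ms of latency = "
          f"{frames_behind(latency_ms, 60):.1f} frames behind at 60Hz")
# 17ms is about 1 frame, 33ms about 2 frames (a good gaming monitor),
# and 170ms about 10 frames (the bad case described above)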

The whole 120Hz bit is more of a gimmick with TVs than with monitors. Basically, with TVs, in some high-action scenes even a fast response time can still leave enough blurring and pixelation to be noticeable. A good example is watching a sporting event with lots of flash photography. Since black-to-white response time is slower than grey-to-grey on every TV and monitor, you still get visual problems. To fix that and improve visual quality, 120Hz TVs introduce intentional latency. The TV slows down and collects several frames, then actually creates in-between transition frames and uses smoothing algorithms and filters to clean up the image.

This makes the image much nicer and smoother, even if it is a little less true to the source. Most of the time this is still seamless, but it can cause some strange visual effects, especially when the additional morphed frames are trying to smooth really black areas that are moving. You get what I call the black bobbing effect. I see this when I watch movies and, say, a person on screen has a really black beard and is walking. As they walk and bob up and down, the beard seems to slide around a little on their face and generate an after-image. Some TVs are better and this is less noticeable; some are worse.
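
If you want to see where that after-image comes from, here's a deliberately naive Python/NumPy sketch of a "created" in-between frame. Real TVs use motion-vector interpolation, which is far more sophisticated than this simple blend; the point is only that an interpolated frame is something that never existed in the source:

import numpy as np

def interpolate(frame_a, frame_b, t=0.5):
    # Linear blend between two frames: t=0 gives frame_a, t=1 gives frame_b.
    # Real motion interpolation estimates motion vectors instead of blending.
    blend = (1 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blend.astype(np.uint8)

# A bright dot moving one pixel to the right between two frames:
a = np.zeros((1, 4), dtype=np.uint8); a[0, 1] = 255
b = np.zeros((1, 4), dtype=np.uint8); b[0, 2] = 255
print(interpolate(a, b))  # [[  0 127 127   0]], a ghosted double image
                          # much like the after-image described above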

Now that you know what 120Hz means for a TV, it's time to explain 120Hz for an LCD monitor. There it really does map to the frames per second the monitor is capable of rendering. Still, if the response time is too high and you try to render frames too fast, the motion blur will actually be worse. If the response time is very fast, 2ms or less black-to-white, then the motion blur isn't noticeable. The reason to have a higher refresh rate on a monitor is to deal with image tearing without having to use V-sync.

V-sync for an LCD monitor forces the frames per second coming from the video card to match up with the monitor. This introduces lag but increases image quality, because you are not losing frames. If the video card is sending out more frames per second than the monitor can render in time, you can lose frames or parts of frames as the monitor tries to keep up. With a 120Hz monitor you retain all the frames your video card can output, up to 120 frames per second, without having to use V-sync, and without V-sync you don't introduce additional latency just to avoid losing frames.

In short, even if you have a true 120Hz monitor, meaning a monitor capable of rendering up to 120 frames per second from its frame buffer, you do not need a video card outputting 120 frames per second. If the video card is doing 90 frames per second, you are not losing anything at all. If your video card goes above 120 frames per second, you could still be dropping frames and losing image quality to image tearing, in which case the only way to fix that would be to turn on V-sync, but that would cause latency. So basically, refresh rate for LCD monitors really should be called "max frames per second rendering capability." However, that doesn't have the same marketing ring to it.

Hope this answers all your questions.
 

VashHT

Diamond Member
Feb 1, 2007
3,257
1,249
136
You can get screen tearing when running at a lower fps than the refresh rate. It has to do with the frames being sent to the screen not being synced with the refresh of the screen, which can happen whether you are over or under the refresh rate.

For the OP: I've been using a 120Hz monitor and it is a lot smoother than my old LCD. I don't really use V-sync, because the high refresh rate goes a long way toward eliminating screen tearing. With V-sync on, though, the image is incredibly smooth; with a lot of the games I have, I can lock the fps at 120, which looks great. As far as blur goes, I think there is still a little on my screen. I haven't used a CRT in years, so it's hard for me to compare now, but it's definitely a lot smoother than a 60Hz screen.
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126

Uhh, not really.

Image tearing occurs because the frame in the buffer being readied for display has not yet been pulled out completely when the next frame inserts itself into the buffer, flushing out what was left there previously. That means you get parts of the first frame and parts of the second or even third frame mixed together and displayed at once. It has nothing to do with refresh rate.

Turning on V-sync basically tells the image-generating device to slow down until the display has pulled the previous frame from the buffer before it is given the next one. More buffers also help, to an extent.

http://en.wikipedia.org/wiki/Screen_tearing

You typically get more image tearing when you are sending frames faster than the monitor is capable of rendering. I.e., with a 60Hz LCD, anything more than 60 frames per second without V-sync is GUARANTEED to give you image tearing. Even with V-sync on there is still a chance, but it is minimal.

Again, by having a monitor capable of rendering more frames per second, i.e. 120Hz vs 60Hz, all you are doing is lowering the chance of image tearing without having to use V-sync, up to 120 frames per second. Anything over 120 frames per second is going to cause image tearing on either monitor.
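
Since tearing keeps coming up, here's a toy Python model of when tears happen without V-sync. It assumes the card swaps the buffer the instant a frame finishes and that any swap landing mid-scanout tears that refresh; it ignores blanking intervals and driver queues, so the exact percentages are illustrative only:

import math

def torn_fraction(fps, hz, seconds=10):
    # Fraction of refreshes containing a mid-scanout buffer swap (a tear).
    refresh = 1.0 / hz
    torn = set()
    for i in range(int(seconds * fps)):
        t = i / fps                              # moment this frame's swap happens
        phase = t % refresh
        if min(phase, refresh - phase) > 1e-9:   # swap not on a refresh tick
            torn.add(math.floor(t / refresh))    # that refresh shows a tear
    return len(torn) / (seconds * hz)

for fps, hz in [(50, 60), (60, 60), (90, 120), (200, 60)]:
    print(f"{fps}fps on {hz}Hz: ~{torn_fraction(fps, hz):.0%} of refreshes torn")
# Even 50fps on 60Hz tears on most refreshes in this idealized model; only a
# cadence locked exactly to the refresh avoids it.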
 

VashHT

Diamond Member
Feb 1, 2007
3,257
1,249
136

Well, I'm not saying you're wrong, but wouldn't there still be a chance of screen tearing if you were getting, say, 50fps on a 60Hz screen? Wouldn't there still be a chance for the output buffer to update during a monitor refresh, since they are not synced up? I thought it was all about matching the update of the buffer to the monitor's refresh rate (or making it an even multiple of the rate).
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126

There is always a small chance of tearing regardless, even with syncing. However, the chance of tearing while synced is about the same as when displaying fewer frames per second than the "refresh rate."

Meaning, if you are doing 50 frames per second on a 60Hz monitor, you are going to get roughly the same tearing rate as if you synced it. The difference is that syncing will always add more latency.
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
I think the "motion blur" you guys are talking about is really just artifacts of the display not refreshing fast enough.

True "motion blur" is the apparent streaking of fast moving objects in images and video. If you paused video with real motion blur, the moving object in the still image would still be blurred.

The whole 120Hz bit is more of a gimmick with TVs than with monitors. Basically, with TVs, in some high-action scenes even a fast response time can still leave enough blurring and pixelation to be noticeable. A good example is watching a sporting event with lots of flash photography. Since black-to-white response time is slower than grey-to-grey on every TV and monitor, you still get visual problems. To fix that and improve visual quality, 120Hz TVs introduce intentional latency. The TV slows down and collects several frames, then actually creates in-between transition frames and uses smoothing algorithms and filters to clean up the image.

I don't believe this is correct.

Here's my understanding of it. I may have gotten some of the details wrong, but it should be accurate enough for a "layman's" description.

A lot of video and movies are shot at 24FPS. Try shoehorning 24FPS into a 60Hz display and what happens? It doesn't fit evenly. The solution is what is called "2:3 pulldown": basically, half the frames are shown twice and the other half are shown three times. This produces a bit of judder in the video, but the result is that 24FPS becomes 60FPS.

What happens when you use a 120Hz display? Well, 24 goes evenly into 120, so there is no problem. Just show every frame 5 times.
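
In code form, the two repeat patterns look like this (a quick Python sketch; note that real NTSC 2:3 pulldown actually works on interlaced fields, so this is the simplified progressive version):

from itertools import cycle

def pulldown(frames, pattern):
    # Repeat each source frame according to the cycling pattern, e.g. (2, 3).
    out, reps = [], cycle(pattern)
    for frame in frames:
        out.extend([frame] * next(reps))
    return out

source = list(range(24))              # one second of 24FPS film
print(len(pulldown(source, (2, 3))))  # 60: fits a 60Hz display via 2:3 pulldown
print(len(pulldown(source, (5,))))    # 120: fits a 120Hz display evenly
print(pulldown(source, (2, 3))[:10])  # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3], the
                                      # uneven cadence that causes the judder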
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126

You have part of it, but not all of it. Yes, films in America are shot at 24 frames per second (fps) and are converted to 30fps for standard TV viewing. One way to do this is just to double up an extra frame every so often, which is about as simple a job as can be done. However, that does nothing to fix the image quality problems that can show up when fast-moving, high-action scenes are displayed on an LCD that cannot keep up. For tube TVs, or anything using light-based projection to display the image, it's not a problem at all. However, most LCD TVs at least do a "morphed" image using various smoothing algorithms. Instead of just making a copy of a frame, they create a new frame that is a blend of two frames, to make a smoother transition between them. This does help with motion blur on LCD TVs in high-action scenes.

120Hz does work out as a multiple of the 24fps that movies are filmed at; it allows four duplicate frames to be inserted for each original. Again, though, just making the same frame display five times does nothing to enhance image quality or control motion blurring and artifacts on the screen. Instead, all LCD TVs employ a morphing and smoothing method to "create" those extra frames between two of the original frames, allowing a smoother transition and cleaning up the image quality.

Just read this here.

http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I think the "motion blur" you guys are talking about is really just artifacts of the display not refreshing fast enough.

True "motion blur" is the apparent streaking of fast moving objects in images and video. If you paused video with real motion blur, the moving object in the still image would still be blurred.

Yeah, there can be blurring in the source material as well, like when you get your ISO and shutter speed screwed up while filming. Oh, and that Intel Project Offset video where they intentionally blur the shit out of everything like some lame Blair Witch Project wannabe effect. (Can you tell I'm a fan?)
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
You have part of it, but not all of it. Yes, films in America are shot at 24 frames per second (fps) and are converted to 30fps for standard TV viewing. One way to do this is just to double up an extra frame every so often, which is about as simple a job as can be done. However, that does nothing to fix the image quality problems that can show up when fast-moving, high-action scenes are displayed on an LCD that cannot keep up. For tube TVs, or anything using light-based projection to display the image, it's not a problem at all. However, most LCD TVs at least do a "morphed" image using various smoothing algorithms. Instead of just making a copy of a frame, they create a new frame that is a blend of two frames, to make a smoother transition between them. This does help with motion blur on LCD TVs in high-action scenes.

120Hz does work out as a multiple of the 24fps that movies are filmed at; it allows four duplicate frames to be inserted for each original. Again, though, just making the same frame display five times does nothing to enhance image quality or control motion blurring and artifacts on the screen. Instead, all LCD TVs employ a morphing and smoothing method to "create" those extra frames between two of the original frames, allowing a smoother transition and cleaning up the image quality.

Just read this here.

http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm


I just had a WTF moment. This thread derailed massively; the OP was asking how 90fps on a 120Hz monitor with V-sync feels.

To answer the OP (which nobody did): with V-sync on, the video card always waits for the monitor's refresh request before sending out a complete finished frame. The monitor requests a new frame every 1000/120 = 8.33ms. If the frame has been drawn completely, it is sent to the monitor; if not, the previous frame is refreshed again, which gives the card an additional 8.33ms to finish the next frame. So if you are getting 90fps on 120Hz V-sync, you are basically experiencing 90 discrete frames and 30 duplicates. This is not ideal at all in terms of feel, because the perceived frame rate actually feels lower than 90. Do the math: there will be some 16.6ms intervals between certain frames, which feels more like 60fps. The invention called triple buffering remedies the V-sync halving problem (where, without it, if 120fps can't be reached you get 60, if 60 can't be reached you get 30, and so on). But even then, while the counter shows 90 FPS, the feeling is not the same as a true 90 FPS signal, because of the cadences that have to be employed (alternating 8.3ms and 16.6ms intervals instead of even 11.1ms ones).
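
A quick Python timeline bears that cadence math out. The assumptions are mine and deliberately simple: the card finishes a frame exactly every 11.1ms (a steady 90fps, roughly the triple-buffered case), and each finished frame is shown at the next 120Hz refresh tick:

import math

render_ms, refresh_ms = 1000 / 90, 1000 / 120   # 11.1ms frames, 8.33ms ticks

# Frame i finishes at i * render_ms and is displayed at the next refresh tick.
shown = [math.ceil(i * render_ms / refresh_ms - 1e-9) * refresh_ms
         for i in range(1, 13)]
intervals = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print(intervals)
# [8.3, 8.3, 16.7, 8.3, 8.3, 16.7, ...]: the average is 11.1ms (90fps on the
# counter), but the gaps alternate, which is why it doesn't feel like a true 90.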

Second of all, what this guy wrote about HDTVs is almost completely wrong. Most 120Hz LCDs (unless they are 3D) have panels that refresh at 120Hz (119.88Hz actually), but they can only accept 23.976Hz or 59.94Hz signals. If you send 23.976fps content already pulled down to 59.94fps @ 59.94Hz (the 60Hz setting on your player), the TV will simply display each frame for two refreshes. This is what 90% of people watch, and it is frowned upon for the stutter it produces on slow camera pans. If, on the other hand, you output 23.976fps @ 23.976Hz (the 24Hz setting on your player), the TV will display each frame for five refreshes. The crap HumblePie is referring to is called motion interpolation, and it has nothing to do with clearing up the image or reducing artifacts like he would have you believe. It is only active if you enable that feature on your specific TV. You can read about it here:

http://en.wikipedia.org/wiki/Motion_interpolation

What it does is generate in-between frames by analyzing pairs of source frames, adding enough extra frames to reach 120fps and create smooth transitions. Unless you are the type that likes to watch post-processed shit, the type that equalizes his music to make up for his shitty headphones/speakers, or you are just plain stupid (not an insult, just another word for ignorant), you will want to turn this technology OFF. Unless you like it, of course, at which point I back off and let you masturbate.

That's it, that's all there is to it.

3D TV is a whole other beast: it can accept and display 120fps @ 120Hz signals, but it's not much use right now unless you game on a PC or watch 3D Blu-rays.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
There is always a small chance of tearing regardless, even with syncing. However, the chance of tearing while synced is about the same as when displaying fewer frames per second than the "refresh rate."

Meaning, if you are doing 50 frames per second on a 60Hz monitor, you are going to get roughly the same tearing rate as if you synced it. The difference is that syncing will always add more latency.
I have never seen tearing if V-sync is actually working. The framerate most certainly does not have to exceed the refresh rate for you to see tearing with V-sync off, though. Every game is different, but there are plenty where tearing is noticeable no matter how low the framerate is. Sometimes just a scene with flickering lights can show obvious tearing, and even just moving the mouse around in some games will show plenty of tearing at very low framerates. Heck, with my 8600GT I had to turn on V-sync in Crysis because the tearing was bothering me at 20-30fps.
 