60hz vs 120hz LCD

AntiFreze

Golden Member
Oct 23, 2007
1,459
0
0
Is it that much better? Going with the same brand, is it better to get a smaller (32") 120Hz set rather than a larger (37") 60Hz one?

I currently have a 60Hz plasma, but I've heard it matters more on an LCD. Thoughts?
 

sivart

Golden Member
Oct 20, 2000
1,786
0
0
600Hz = Plasma :)

If you don't watch a lot of hockey or golf, you probably won't tell a difference. I could see trails with the 60Hz LCD I owned at one time.

All things being equal, if the price is within 20%, I'd go with the 120Hz.
 

smitbret

Diamond Member
Jul 27, 2006
3,382
17
81
Even the 60Hz LCDs handle motion better than they used to, and I just can't seem to get used to the 120Hz/240Hz look anyway. Plasma runs at 600Hz, not 60Hz; that's why it doesn't matter with plasma. Get a good LCD with a decent pixel response time and you can probably "escape" with a 60Hz. Of course, 120Hz LCDs are coming down in price, and if it's close in price, get the 120Hz. If it doesn't work for you, just turn it off.
 

kalrith

Diamond Member
Aug 22, 2005
6,628
7
81
The only benefit of 120Hz in and of itself is that you don't have to do 3:2 pull-down to display 1080/24p content. That's it!
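To make that concrete, here's a quick sketch (plain Python, purely illustrative) of how 24 film frames land on a 60Hz refresh versus a 120Hz one:

from collections import Counter

# Count how many consecutive refreshes each source frame occupies when
# every refresh slot shows whichever frame is current at its timestamp.
def cadence(fps, refresh_hz):
    return [n for _, n in sorted(Counter(
        int(k * fps / refresh_hz) for k in range(refresh_hz)).items())]

print(cadence(24, 60)[:8])   # [3, 2, 3, 2, ...] -> the 3:2 pulldown cadence
print(cadence(24, 120)[:8])  # [5, 5, 5, 5, ...] -> every frame held evenly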

Companies package 120Hz TVs with motion interpolation so that there's a noticeable difference. A lot of people (including myself) don't even like motion interpolation, and I would personally turn it off if it were on my TV. It seems to give an artificial look to things, and I just don't like it.

If you take motion interpolation out of the picture, then with 120Hz you're left with a small benefit over 60Hz that's only seen when displaying a 1080/24p BD.

If the choice is between a 32" 120Hz TV and a 37" 60Hz TV, then I'd pick the 37" TV every day of the week. You may or may not notice the benefit of the 120Hz, but you'll most definitely notice the 34% increase in screen area over the 32" TV.
 

amdhunter

Lifer
May 19, 2003
23,332
249
106
120Hz makes things look like they are filmed live, or in fast motion. It's annoying to me, especially when you see that "watery" wave following areas with good detail.

I have an LG 42" 1080p 60Hz set, and I prefer it over my friend's 52" Sony Bravia LCD when it's running at 120Hz. (Of course the Bravia destroys my LG in 60Hz mode.)

BUT..! I like my 50" LG plasma (which has an awful clamp on black/white scenes) better than BOTH the 1080p LG and the Sony. I don't think I'll ever buy an LCD again.
 

ICXRa

Diamond Member
Jan 8, 2001
5,924
0
71
I would stick with plasma. I bought a Samsung 120Hz a couple of years ago... it doesn't work as advertised, and the Sonys suffer from the same problems: jerky, stuttering images, etc. Granted, that has improved with current models, but it remains to some extent. Plus, as others mentioned, it tends to look odd on anything but the low setting, and sometimes it's best just off.

Whatever you do, don't buy a Samsung!

I'll be rethinking my decision to go with LCD next time around for sure.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Using a 120Hz-or-above TV isn't going to improve your temporal resolution. All TV signals in North America are 60i. All the TV does is show the same field twice per refresh; it doesn't make the image appear any smoother.

120Hz is used to make TVs compatible with the film standard of 24fps, since 60 and 24 are both divisors of 120. There's a debate in the TV industry about how useful this is. Some videophiles claim that using 24p results in smoother pans in movies compared to the 2:3 pulldown method, which is said to introduce judder. The downside is that 24fps introduces a strobing effect. Only Blu-ray uses 24p, though.
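If you're wondering why 120 in particular, it's the smallest rate that both 24 and 60 divide into evenly. A quick check (illustrative Python):

from math import gcd

# Least common multiple of the film rate and the video rate.
def lcm(a, b):
    return a * b // gcd(a, b)

print(lcm(24, 60))         # 120
print(120 % 24, 120 % 60)  # 0 0 -- both rates fit with no pulldown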

When buying an LCD TV, don't go by its refresh rate. Instead, check its response time: how fast a pixel can go from grey to grey. Higher response times introduce ghosting, so try to find a TV with 5ms or lower for the smoothest image possible. Of course this isn't an issue with plasma, DLP, and CRT-based HDTVs. A CRT produces a spectacularly smooth image even though it's only 60Hz.
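As a rough back-of-envelope (the pan speed below is made up, not from any spec), here's how response time turns into trail length:

# Pixels an object moves while the previous pixel value is still fading.
def trail_px(response_ms, speed_px_per_s):
    return speed_px_per_s * response_ms / 1000

pan = 1920 / 2  # hypothetical pan: the full 1920px width in 2 seconds
for ms in (5, 8, 16):
    print(f"{ms}ms grey-to-grey -> ~{trail_px(ms, pan):.0f}px trail")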
 

thomsbrain

Lifer
Dec 4, 2001
18,148
1
0
mmntech, most 120Hz TVs have some sort of "smoothing" effect that interpolates frames to fit "in between" the actual frames of the input source. It makes a huge difference that is noticeable right off the bat, but whether you like it or not is personal preference. It makes things look like you're looking through a piece of glass, and movies take on a much more video-like appearance because you don't get that constant 24fps judder you usually get. But ultimately I prefer to have it off. For some strange reason, that 24fps judder is part of what makes movies feel "special" to me. I will tell you that when you switch the smoothing effects off, for a while you can actually make out the individual frames in movies and you don't see them as continuous motion anymore, which is disconcerting, but your brain quickly adjusts.
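As a crude illustration of the idea only (a plain cross-fade; real sets do motion-compensated interpolation, which is far more involved):

import numpy as np

# Naive stand-in for a TV's "smoothing": blend two frames to synthesize
# an in-between frame. Real sets estimate motion vectors instead.
def naive_inbetween(frame_a, frame_b, t=0.5):
    return ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

a = np.zeros((4, 4), dtype=np.uint8)   # toy 4x4 "frames"
b = np.full((4, 4), 200, dtype=np.uint8)
print(naive_inbetween(a, b))           # mid-values: the synthetic frame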
 

prism

Senior member
Oct 23, 2004
967
0
0
Would you guys say that 120Hz makes movies look more like home videos or certain sitcoms (Fresh Prince and Saved by the Bell come to mind as having this look)? If so, I'll happily stick with my 60Hz :) Also, can you buy digital camcorders that record at 24fps to make recordings look more like movies? Obviously lighting and camera angles play a big part too, but I love the effect 24fps has.
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
120Hz can be separate from frame interpolation, which is what makes stuff look like video. 24 divides evenly into 120... which is the advantage. And pixel response gets a bit better as well.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
thomsbrain said: mmntech, most 120Hz TVs have some sort of "smoothing" effect that interpolates frames to fit "in between" the actual frames of the input source. It makes a huge difference that is noticeable right off the bat.

I don't know; the improvement is debatable. It's not whether you see it but whether most people see it, and most people don't. 60i was chosen for a reason: it's the rate at which most people stop noticing shutter effects. I have a good 600Hz plasma with "smoothing" at home and honestly, I can't see a huge difference between it and the high-end 60Hz LCD TVs we have at the station.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
60Hz was chosen as the refresh rate because it needed to be compatible with the NTSC format that has existed for decades. CRT sets ran at 60Hz because that was the power line frequency and all the tech could accomplish, interlaced, so it takes two passes for every complete frame. So the USA is 60Hz power / 2 = 30fps for NTSC, and Europe is 50Hz power / 2 = 25fps for PAL. Accurate clock generators would have cost more than the TV back then, and using the power line frequency was free.

Film is shot at 24fps but is not displayed at 24Hz; that would cause serious eye strain. They do not use motion blur to smooth the frames like some think. The projectors have shutters that display each frame 3 times, so you see it at 72Hz minimum; some theaters go up to 96Hz. None are 120Hz. So why no 72Hz LCD? They are still using the power line for the time base: 72Hz would cost more to implement, breaks 30fps compatibility, and 120Hz looks better to marketing. All they do is double the power line frequency, which they can do cheaply.
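The arithmetic, spelled out (nothing fancy, just the numbers above):

# Frame rates fall out of the mains frequency and the shutter count.
for name, mains_hz in (("NTSC (USA)", 60), ("PAL (Europe)", 50)):
    print(f"{name}: {mains_hz}Hz mains / 2 interlaced passes = {mains_hz // 2}fps")
print("film projection:", 24 * 3, "Hz (triple shutter), up to", 24 * 4, "Hz")
print("marketing's pick:", 60 * 2, "Hz (just double the mains-derived rate)")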

The problem with current LCD displays is that they are trying to maintain compatibility with older tech while still adding new tech. In a CRT you have to redraw the screen constantly or the glowing phosphor that shows the image will fade to black, so the CRT is redrawn at the refresh rate, or Hz. A faster refresh means the phosphor has less time to fade, making the image appear clearer.

With LCD, a pixel is turned on and never needs to be refreshed unless the image changes. A picture will look the same 20 seconds from now without the LCD controller ever having to touch it. This is where the confusion starts. LCD adopted the hertz term from CRT because people were familiar with it, but it doesn't mean the same thing. On an LCD, hertz is how many times per second the controller is capable of redrawing the screen, which does not mean a higher Hz gives a better picture like it does on a CRT. The picture is static until changed, unlike a CRT where the picture would fade away without an update.

There are two ways currently shipping LCDs display video.
1. They take each second and divide it into 1/60th slices. So for 30fps content:
frame 1 1/60th sec
frame 1 2/60th sec
frame 2 3/60th sec
frame 2 4/60th sec
frame 3 5/60th sec

The problem is what happens when they are out of sync; you get tearing:
frame 1 2/60th sec
frame 2 3/60th sec
frame 2 4/60th sec
frame 3 5/60th sec
frame 3 6/60th sec

Frame 1 didn't get displayed for its full share of the second; you saw it only once while you see the others twice. So they implemented vsync, which syncs the clock in the source with the sync in the display.
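That slot bookkeeping as a toy sketch (a hypothetical model in Python, not how any real controller works):

from collections import Counter

# Assign each refresh slot in one second to a source frame; an offset
# models the source running out of sync with the display.
def slots_shown(fps, refresh_hz, offset_slots=0):
    return Counter((k + offset_slots) * fps // refresh_hz
                   for k in range(refresh_hz))

print(slots_shown(30, 60)[0])     # 2 -- every frame gets two slots
print(slots_shown(30, 60, 1)[0])  # 1 -- out of sync: a frame gets shorted
print(slots_shown(30, 120)[0])    # 4 -- at 120Hz each frame simply gets four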

24fps film is a problem because even if you double it to 48fps you are still 12 frames short of an even 60 clock cycles. So they display some frames twice and a few three times to make it come out to 60. That is the pulldown people refer to.

60Hz vs 120Hz for 30fps content:
The difference should be zero in a properly designed display. The only change is that a 120Hz display divides the time into 1/120th-second slices.

For a 30fps source:
frame 1 1/120th sec
frame 1 2/120th sec
frame 1 3/120th sec
frame 1 4/120th sec
frame 2 5/120th sec
frame 2 6/120th sec
frame 2 7/120th sec
frame 2 8/120th sec
frame 3 9/120th sec

The reason some people say 120Hz looked blurred or slow is that early 120Hz sets had panels that were simply too slow for what they were being asked to do. They really were redrawing the screen every 1/120th slice of time, and if the panel can't keep up, the frames appear blurred.

The next generation of displays is coming with HDMI 1.5. The hertz concept is finally going away. Displays, controllers, and memory have gotten fast enough that sets are in design that completely ignore hertz. Instead, the set connects to the source and captures a full second of video, all 24, 30, 18, 39, or whatever frames. Along with the frames come bits that tell the display how many frames per second they are to be displayed at.
The display then uses on-board hardware to divide 1 by the fps using floating-point math. So it doesn't matter that 60/24 doesn't come out even; the display merely divides the second into slices equal to 1/24 = .04167 seconds each and updates at that rate. Current displays cannot do this because they lack the processing power and the memory to store the frames. These new displays also do not update the screen every slice, but only when the frame changes; because an LCD pixel holds its image between updates, there is no flicker even though the screen only changes 24 times a second.
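In miniature, that scheme would look like this (speculative, just mirroring the description above):

# Divide one second by whatever frame rate the source declares.
for fps in (24, 30, 18, 39):
    print(f"{fps}fps -> {1 / fps:.5f} s per frame")
# 24fps -> 0.04167 s per frame, matching the figure above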

There are studios using these now for some film work, and I really liked what I saw. They can do things you can't do with film or current displays, like shooting a movie so that some of it is 24fps, some 120fps, and then back down to 45fps, all depending on the action on screen and what they want to show. It's sort of like filming with high-speed cameras to show things like glass breaking in slow motion. I can't wait to see what films using techniques like bullet time will look like in the future.