60 Hz was chosen as the refresh rate because it needed to be compatible with the NTSC format, which had existed for decades. CRT sets ran at 60 Hz because that was the power line frequency, and that was all the technology of the time could manage. The signal is interlaced, so it takes two passes to build one complete frame. So the USA is 60 Hz power / 2 = 30 fps for NTSC, and Europe is 50 Hz power / 2 = 25 fps for PAL. Accurate clock generators would have cost more than the TV back then, and using the power line frequency was free.
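That arithmetic can be sketched in a couple of lines of Python (a toy illustration of the math, not anything a TV actually runs):

```python
# Interlaced broadcast: the power line frequency sets the field rate,
# and two fields (passes) make one complete frame.
def broadcast_fps(line_hz):
    return line_hz / 2

print(broadcast_fps(60))  # NTSC (USA): 30.0 fps
print(broadcast_fps(50))  # PAL (Europe): 25.0 fps
```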
Film is shot at 24 fps but is not displayed at 24 Hz; that would cause serious eye strain. Projectors do not use motion blur to smooth the frames, as some people think. They have shutters that flash each frame 3 times, so you see it at 72 Hz minimum. Some theaters go up to 96 Hz. None are 120 Hz. So why no 72 Hz LCD? Displays are still using the power line as the time base. 72 Hz would cost more to implement, it breaks 30 fps compatibility, and 120 Hz looks better to marketing. All they do is double the power line frequency, and they can do that cheaply.
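The shutter math works out like this (a trivial sketch; the blade counts are the standard 3-blade and 4-blade shutter configurations):

```python
# A film projector's shutter flashes each 24 fps frame several times.
# Flashes per second = film frame rate * flashes per frame.
FILM_FPS = 24
for flashes_per_frame in (3, 4):
    print(flashes_per_frame, "flashes:", FILM_FPS * flashes_per_frame, "Hz")
# 3 flashes -> 72 Hz, 4 flashes -> 96 Hz
```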
The problem with current LCD displays is that they are trying to maintain compatibility with older tech while still adding new tech. On a CRT you have to redraw the screen constantly or the glowing phosphor that forms the image will fade to black. So the CRT is redrawn at the refresh rate, measured in Hz. A faster refresh means the phosphor has less time to fade, making the image appear clearer.
With an LCD, a pixel is turned on and never needs to be refreshed unless the image changes. A picture will look the same 20 seconds from now without ever being touched by the LCD controller. This is where the confusion starts. LCD adopted the hertz term from CRT because people were familiar with it, but it doesn't mean the same thing. On an LCD, hertz is how many times per second the controller is capable of redrawing the screen, which does not mean a higher Hz gives a better picture like it did on a CRT. The picture is static until changed, unlike on a CRT where the picture would fade away without an update.
There are two ways currently shipping LCDs display video.
1. They take each second and divide it into 60 slices of 1/60th of a second each. So for 30 fps content:
frame 1 1/60th sec
frame 1 2/60th sec
frame 2 3/60th sec
frame 2 4/60th sec
frame 3 5/60th sec
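The in-sync schedule above can be written as a small Python sketch (my own illustration of the slicing, with hypothetical helper names):

```python
# Map source frames onto display slices: at 60 Hz, each 30 fps frame
# fills 60 / 30 = 2 consecutive slices.
def schedule(display_hz, source_fps, n_frames):
    """Which source frame is shown in each display slice, in order."""
    slices_per_frame = display_hz // source_fps
    return [frame
            for frame in range(1, n_frames + 1)
            for _ in range(slices_per_frame)]

print(schedule(60, 30, 3))  # [1, 1, 2, 2, 3, 3]
```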
The problem is what happens when the source and display are out of sync. You get tearing:
frame 1 2/60th sec
frame 2 3/60th sec
frame 2 4/60th sec
frame 3 5/60th sec
frame 3 6/60th sec
Frame 1 didn't get displayed in the same second as the new frames; you only saw that frame once while you see the others twice. So they implemented vsync, which syncs the clock in the source with the clock in the display.
24 fps film is a problem because even if you double it to 48 fps you are still 12 frames short of filling all 60 clock cycles. So they display some frames twice and some 3 times to make it come out to 60. That is the pulldown people refer to.
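This is the familiar 3:2 pulldown pattern, and the arithmetic checks out in a few lines (a sketch of the counting only, not of how a real telecine works):

```python
# 3:2 pulldown: alternate showing film frames 3 times and 2 times, so
# 24 film frames fill exactly 60 display slices (12*3 + 12*2 = 60).
def pulldown_counts(n_film_frames=24):
    return [3 if i % 2 == 0 else 2 for i in range(n_film_frames)]

counts = pulldown_counts()
print(counts[:6])    # [3, 2, 3, 2, 3, 2]
print(sum(counts))   # 60 slices, i.e. one full second at 60 Hz
```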
60 Hz vs 120 Hz for 30 fps content:
The difference should be zero in a properly designed display. The only difference is that a 120 Hz display divides time into 1/120 second slices.
For a 30 fps source:
frame 1 1/120th sec
frame 1 2/120th sec
frame 1 3/120th sec
frame 1 4/120th sec
frame 2 5/120th sec
frame 2 6/120th sec
frame 2 7/120th sec
frame 2 8/120th sec
frame 3 9/120th sec
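A quick sanity check on the claim that the difference should be zero: each frame just occupies twice as many slices, so its on-screen time is identical (my own arithmetic sketch):

```python
# At 120 Hz each 30 fps frame fills 4 slices instead of 2, but the
# total on-screen time per frame is 1/30 s either way.
for display_hz in (60, 120):
    slices_per_frame = display_hz // 30
    frame_seconds = slices_per_frame / display_hz
    print(display_hz, "Hz:", slices_per_frame, "slices,", frame_seconds, "s per frame")
```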
The reason some people say 120 Hz looked blurred or slow is that early 120 Hz sets had panels that were really too slow for what they were being asked to do. They were actually redrawing the screen every 1/120th slice of time, and if the panel can't keep up, the frames appear blurred.
The next generation of displays is coming with HDMI 1.5. The hertz concept is finally going away. Displays, controllers, and memory have gotten fast enough that sets are in design that completely ignore hertz. Instead, the set connects to the source and captures a full second of video, all 24, 30, 18, 39, or however many frames. Along with the frames come bits that tell the display how many frames per second they are to be displayed.
The display then uses onboard hardware to divide 1 by the fps using floating-point math. So it doesn't matter that 60/24 doesn't divide evenly; with fp math it simply divides the second into slices of 1/24 = 0.04167 seconds each, then updates the display at that rate. Current displays cannot do this because they lack the processing power and do not have enough memory to store the frames. These new displays also do not update the screen every slice, only when a frame changes. Because the display redraw rate is so high, there is no flicker even though the image only changes 24 times a second.
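The slice calculation is just one division, so any frame rate works equally well (a hypothetical sketch of the idea, not any real display's firmware):

```python
# A frame-rate-agnostic display: the slice length is simply 1 / fps,
# computed in floating point, so 24, 30, 45, or 120 fps all divide
# the second cleanly from the display's point of view.
def slice_seconds(fps):
    return 1.0 / fps

for fps in (24, 30, 45, 120):
    print(fps, "fps ->", round(slice_seconds(fps), 5), "s per slice")
# 24 fps -> 0.04167 s per slice, matching the figure above
```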
There are studios using these now to do some film work, and I really liked what I saw. They can do things you can't do with film or current displays, like shoot a movie so that some of it is 24 fps, some 120 fps, and then back down to 45 fps, all depending on the action on screen and what they want to show. It's sort of like filming with high-speed cameras to show things like glass breaking in slow motion. I can't wait to see what films using techniques like bullet time will look like in the future.