Why does high frame rate video look odd?

Status
Not open for further replies.

mike5757

Member
Apr 18, 2011
It seems like a lot of people don't like the way film/video looks when shot at more than 24 (or 25, 30) fps. Is it because the shutter speed needs to be quicker, which reduces or eliminates the motion blur, similar to what naturally happens when our eyes shift their gaze? Is it because we're accustomed to watching films shot at 24 fps? Something else? What do you all think?
 

LoveMachine

Senior member
May 8, 2012
Here's my 2 bits: motion blur hides flaws. At 24fps, the "shutter", be it a mechanical shutter or a CCD's exposure, is open long enough for moving objects to smear across the frame. So rather than a high-resolution image for each frame, there is a slight blur along the path of motion. This hides flaws in computer-generated rendering, either in the details or in artificial movements. Think of old stop-motion animation (Clash of the Titans, circa 1980ish): each frame stuttered because it was a series of sharp stills. CG uses the same frame rate, but the software smears each still in the direction of motion. At normal speed, our brains perceive normal motion, but frame by frame it looks blurry.

I would think normal live-action filming at 48fps (without any sort of CG effects) should look pretty natural, and be more detailed than 24fps. But in most of the material shot at 48fps, like The Hobbit, the majority of the image is still generated digitally. My guess is that the motion algorithms used by the renderers are still coming of age, so the motion still doesn't look totally natural: the software is just translating what the animator or motion-capture data tells it to do, rather than reproducing real-life motion with the millions of cues your brain uses.
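To make the "smearing along the path of motion" concrete, here's a toy sketch of my own (not any renderer's actual code): render a bright square at several sub-frame time samples and average them, which is the basic idea behind CG motion blur.

```python
# Toy illustration of CG motion blur: average several sub-frame samples
# of a moving object so it smears along its direction of motion.
# All names and numbers here are made up for illustration.

WIDTH = 256  # one "scanline" of pixels

def render_square(x_pos, size=32):
    """One sharp frame: 1.0 inside a square starting at x_pos, else 0.0."""
    return [1.0 if int(x_pos) <= x < int(x_pos) + size else 0.0
            for x in range(WIDTH)]

def motion_blurred(x_start, velocity, shutter_open=0.5, samples=16):
    """Average sub-frame samples over the fraction of the frame interval
    the shutter is open (0.5 is roughly the classic 180-degree shutter)."""
    frames = [render_square(x_start + velocity * i * shutter_open / (samples - 1))
              for i in range(samples)]
    return [sum(col) / samples for col in zip(*frames)]

sharp = render_square(100)
blurred = motion_blurred(100, velocity=40)
# The sharp frame has hard 0/1 edges; the blurred one covers more pixels
# at partial intensity, ramping up and down along the motion path.
```

Frame by frame the blurred version looks smudged, exactly as described above, but played back at speed the smear reads as smooth motion.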

As far as the whole 240Hz "motion engine" type features of LCD TVs go, it gets even worse, since the DSP is being fed 24 or 30fps and has to guess at what goes in between. Any sort of CG effect looks like a cartoon to me.
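To show what "guessing at what goes in between" means, here's the crudest possible in-between a TV could make: a linear blend of the two real frames. (Actual motion-compensation engines estimate per-block motion vectors; this hypothetical sketch just illustrates that the intermediate frame is invented, not captured.)

```python
# Naive frame interpolation: blend two captured frames.
# Real "motion engine" DSPs do motion estimation instead; this is the
# simplest guess, shown only to illustrate why interpolation can look off.

def interpolate(frame_a, frame_b, t):
    """Linear blend between two frames, t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame_a = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]  # object at pixels 2-3
frame_b = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0]  # object moved to pixels 4-5
midway = interpolate(frame_a, frame_b, 0.5)
# A real camera at the halfway point would see the object at pixels 3-4;
# the blend instead shows two half-brightness ghosts at both positions.
print(midway)  # [0.0, 0.0, 0.5, 0.5, 0.5, 0.5]
```

That double-exposure ghosting is why fabricated in-between frames can read as artificial, even before any CG enters the picture.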
 

BrightCandle

Diamond Member
Mar 15, 2007
My 2 cents: I agree with LoveMachine, it's all about the blur.

Having watched The Hobbit in 3D and high frame rate, I would say the quality of the CGI felt really bad. But when I went back to The Lord of the Rings movies, I noticed the CGI had similar problems when I really looked for them; it was just quite blurry.

What I want is an HFR version of The Hobbit in 2D so I can compare against the prior movies frame by frame, but I suspect the missing motion blur is what made it look like an episode from a local TV network and not a high-budget movie.
 

Midwayman

Diamond Member
Jan 28, 2000
I think it's primarily that we're used to 24fps and people are resistant to change. In 10 or 20 years, when 24fps is a distant memory, people will be commenting, "How did they ever live with such a crappy frame rate?"

The other side of it, which might be semi-legit, is that a low frame rate makes the image feel less real, making it easier to stay in a semi-detached, dream-like state while watching.

It's kind of funny, but the whole transition we went through to HDTV caused a big issue with FX quality, as noted above. They had to change the type of make-up used on set from cake to cream, since you could see the difference. The props and sets had to be updated because you could really see the details. I don't specifically remember VFX being mentioned, but I know it took a long time for all the real-time compositing stuff, like weather alerts during a show, to be updated. Remember broadcasts dropping to SDTV any time a weather alert came on a few years back? Any time you raise the bar in quality, like SDTV to HDTV or 24fps to 48fps, you have to make adjustments, and it'll take a while to catch up.
 

Dranoche

Senior member
Jul 6, 2009
We're conditioned to 24; it's all we know. The motion artifacts present as a result of 24fps are mostly fixed by HFR, but then a new set of different motion artifacts (or what we perceive as artifacts) is introduced that we aren't used to, so they stand out. If we had always been at HFR speeds and somebody came along and threw 24fps at us, things like judder would jump out at us the way some motion does in HFR now. Not that judder isn't really noticeable for some people now, but it would be a bigger shock, along with the slower frame rate itself standing out.
 

Anteaus

Platinum Member
Oct 28, 2010
The interesting part is what this means for making films in general. At high frame rates, the attention to detail has to be much higher with regard to CG, costumes, sets, etc., because suspension of disbelief is harder to maintain.

I was fortunate enough to see The Hobbit at 48 FPS, and I noticed it is far easier to see where live action and CG meet, similar to how you can make out matte paintings in older films. At higher frame rates, if the CG doesn't match camera movement exactly, it is obvious. On a technical note, I'm quite surprised at how well Peter Jackson worked through some of these difficulties. While the film itself must stand on its own, the 48 FPS portion was well executed, giving people a good example to judge on its merits.
 

mike5757

Member
Apr 18, 2011
I've been reading up on film, and it seems that 24 fps with a 1/48 second shutter speed is preferred for its look, including the amount of motion blur. At 48 fps it would still be theoretically possible to use a 1/48 second exposure, but probably not on film, since the shutter would essentially always be open. You could even go with 60 fps and a 1/60 second exposure. Has anyone experimented with this on a digital camera?
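The relationship here is usually expressed as shutter angle: 360° times the fraction of each frame interval the shutter is open, i.e. 360 × fps × exposure time. A quick sketch (helper name is my own) shows the cases above:

```python
def shutter_angle(fps, exposure_seconds):
    """Shutter angle in degrees: 360 times the fraction of the frame
    interval the shutter is open. 180 degrees is the classic film look."""
    return 360.0 * fps * exposure_seconds

print(shutter_angle(24, 1 / 48))  # 180.0 -- the classic film look
print(shutter_angle(48, 1 / 48))  # 360.0 -- shutter open the entire frame
print(shutter_angle(60, 1 / 60))  # 360.0 -- same situation at 60 fps
```

A 360° shutter is exactly the "shutter always open" case: impossible with a rotary film shutter, but achievable electronically on a digital sensor.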
 