Whoa - Hobbit at 48 fps

Page 3

destrekor

Lifer
Nov 18, 2005
28,799
359
126
I find it hilarious how people find high frame rate stuff "cheap" looking because it's too smooth. Oh no! The image looks too realistic! Let me throw in some jerkiness so people think it looks classy!

Ultra-sharp moving pictures cannot in any way look realistic.

Some people may be fond of it, but it should be an option. To me, it looks artificial. It makes motion look sharper and more unnatural than the way our eyes and brain normally process a scene.
At least, that's how it looks to me after my eyes and brain compute the scene.

Processing the visual world into frames is the wrong answer to begin with, so simply increasing framerate isn't going to change the fundamental flaw in the concept.
Our visual system doesn't work frame after frame. There's no closing shutter; it's a variable-contrast, variable-exposure, constant stream of data into a vast network that computes on that stream.
Capturing motion onto frames means the video is nothing more than high-speed playback of photographs in succession (hence, "motion picture").

Why do I mention this? Our vision is ultimately a fluid "video stream", and there is no true blur in the eye itself: because light is effectively instantaneous, nature (the ultimate projector of light and reflections) feeds our "lenses" and "sensors" at the absolute natural speed limit, and at that kind of "refresh rate" there is nothing to perceive but razor-sharp clarity.
HOWEVER, this is the important part: our brain - the electrical conduction and the biological means of connectivity/communication - is much slower than what nature provides. There is no way our combined visual system can deliver a perfectly sharp, crisp image of motion.
Can you focus on motion itself? No? Images captured as frames strangely can, in a round-about way that is just odd; I can't even explain what I mean, it's that perplexing a thought. But trust me on that. :)
In short, we have natural motion blur because we aren't capturing in frames. Motion blur in frames is inherent to the method: each frame captures roughly 1/24th of a second of motion, and because the subject moves during that exposure, the image is blurred. It's like taking a photo of something moving with the shutter set to 1/24th of a second: the subject is captured, but the motion smears it.
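That shutter arithmetic can be sketched numerically. The frame width and crossing speed below are made-up illustration values, not specs from any real camera:

```python
# Rough sketch of motion-blur extent: a moving subject smears across the
# sensor for as long as the shutter stays open.
# All numbers below are illustrative assumptions, not real camera specs.

def blur_extent_px(speed_px_per_s: float, exposure_s: float) -> float:
    """Pixels of smear = how far the subject moves during one exposure."""
    return speed_px_per_s * exposure_s

# A subject crossing a 1920-px-wide frame in 2 seconds:
speed = 1920 / 2.0  # 960 px/s

print(blur_extent_px(speed, 1 / 24))    # 40.0 px of smear at a 1/24 s shutter
print(blur_extent_px(speed, 1 / 1000))  # under 1 px: essentially frozen
```

The same subject goes from a 40-pixel smear to effectively frozen just by shortening the exposure; the frame rate never enters the calculation.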

You can increase the framerate all you want; eventually, at 1000 fps, the video might appear more natural. It would probably introduce an odd effect on overall clarity, but it might appear more natural. Or it might look far less natural. I don't know.

In short, the clamor for "more frames!" is the wrong answer. As it stands, in the near term, I think it looks less natural as we get faster. There might be a threshold where far more fps starts to look better again, but that's not a worthwhile expenditure.
Personally, I think we should invest in some kind of video-stream capture system for cameras, where the video is simply a raw stream of visual data, without a system of progressive frames.
That would also complicate display systems - we'd need some kind of display technology that blasts that constant stream of visual data without a physical refresh rate, just constantly firing in all directions on all cylinders. Possible? Right now, I couldn't even begin to imagine how we could engineer such awesomeness.
Projectors can throw a constant visual stream, but the assembly the light shines through operates on a refresh rate. Hell, our entire computing world only understands cycles/refresh rates/clocks. I think we're far off from my world of amazing, mind-blowing technology. Dammit.
 

IcePickFreak

Platinum Member
Jul 12, 2007
2,428
9
81
Curious to see how this actually looks. I'm not a fan of the 120hz sets either, just personal preference. Not sure if this will give the same effect but willing to give it a watch before forming an opinion.
 

Born2bwire

Diamond Member
Oct 28, 2005
9,840
6
71
Well, more to the point, until recently they were using sh*tty cameras that couldn't do the job and looked like video. Films like Collateral and Public Enemies with Johnny Depp looked like crap because of their sub-par digital cameras... Public Enemies looked downright like a high school production.

Yeah, Public Enemies looked wrong, especially the lighting in the night scenes. But I had no idea that Knowing was shot digitally, it never entered my mind when I saw it in the theater. I think a lot of this really has to do with the fact that we are just used to watching film. Give it a decade and I think that most people wouldn't think twice about watching a digital film. I would also love to see a higher framerate but I would want them to choose one that would be compatible with the current standards of TV and video media. Could we get 48 fps to work without interpolation?
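One rough way to frame that compatibility question: playback without interpolation works only when one rate is a whole-number multiple of the other, so each frame can be cleanly repeated or dropped. A back-of-envelope sketch (my own illustration, not any broadcast standard):

```python
# Sketch: can a 48 fps master be shown at a given display rate using only
# whole-frame repeats or drops, i.e. without interpolation?
# My own back-of-envelope check, not any broadcast standard.

def needs_interpolation(source_fps: int, display_hz: int) -> bool:
    """Clean playback needs one rate to be an integer multiple of the other:
    repeat every frame N times, or keep only every Nth frame."""
    return display_hz % source_fps != 0 and source_fps % display_hz != 0

print(needs_interpolation(48, 24))  # False: drop every other frame
print(needs_interpolation(48, 96))  # False: show each frame twice
print(needs_interpolation(48, 60))  # True: needs uneven cadence or interpolation
```

By this test a 48 fps master drops cleanly to 24 fps or doubles to a 96 Hz display, but hitting 60 Hz requires an uneven cadence (in the spirit of 3:2 pulldown) or motion interpolation.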
 

Mr. Lennon

Diamond Member
Jul 2, 2004
3,492
1
81
A lot of misinformation in this thread about 120 Hz+ TVs. The "soap opera effect" you are seeing is due to a setting called Auto Motion Plus (for Samsung TVs; other manufacturers may use different names). Every noob who buys a 120 Hz TV assumes that's how it's supposed to look. If you turn this feature off, the set will look like every other TV. 120 Hz is always on and DOESN'T change the picture.

I like the feature for animation, but I turn it off when I watch anything else. I will say The Dark Knight looked almost lifelike with the feature on.
 

Jeffg010

Diamond Member
Feb 22, 2008
3,435
1
0
Some of the people here sound like my grandpa when I tried to show him how to do his taxes on the computer. He started to get frustrated and said doing it on paper was more lifelike.

I'm very excited for 48 fps.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Ultra-sharp moving pictures cannot in any way look realistic.

In short, we have natural motion blur because we aren't capturing in frames. Motion blur in frames is inherent to the method: each frame captures roughly 1/24th of a second of motion, and because the subject moves during that exposure, the image is blurred. It's like taking a photo of something moving with the shutter set to 1/24th of a second: the subject is captured, but the motion smears it.

Except, that isn't what you have in motion film.

There may be 24 images captured per second, but the exposure time for each frame is almost always significantly less than 1/24 s. What this means is that you *don't* get natural motion blur. Instead, you get lightly blurred images projected with 'gaps' between them. This leads to the typical 'juddering' effect of conventional movie filming.

By contrast, TV shows look different because they are typically filmed at a field rate of 50/60 Hz. You get the same discrepancy between exposure time and frame rate; however, its visibility is reduced, giving more fluid-looking motion. You can, of course, manipulate this, e.g. by choosing extremely fast exposure times to eliminate motion blur - this is often done for sporting events so that slow-motion replays are crisp. It gives a 'jerky' appearance to the motion, similar to that seen in movies.
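The exposure-vs-frame-interval gap described here is usually parameterized as a shutter angle: the shutter is open for (angle/360) of each frame interval. A quick sketch of that arithmetic (180° is the common film convention; the other values are just illustrative):

```python
# Exposure time from frame rate and shutter angle:
# the shutter is open for (angle / 360) of each frame interval,
# so the remainder of the interval is the 'gap' between exposures.

def exposure_s(fps: float, shutter_angle_deg: float) -> float:
    """Seconds the shutter is open per frame."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_s(24, 180))  # 1/48 s open, 1/48 s gap: the classic film look
print(exposure_s(60, 180))  # 1/120 s: same duty cycle, smaller visible gap
print(exposure_s(60, 30))   # 1/720 s: very fast shutter for crisp slow-mo
```

Note that at the same 180° angle, raising the frame rate shrinks both the exposure and the gap proportionally, which is why higher rates read as smoother even with an identical duty cycle.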

There are a lot of preconceptions and experiential preferences about frame rates. Some people like the 'movie look', even though it is unrealistic and distracting - they say the look of video is 'too real', or give some other reason.

This can be likened to changes in other technologies - e.g. some Hi-Fi enthusiasts prefer the 'sound' of vacuum tube amplifiers over transistor/IC amplifiers. It's not because the sound from the tubes is purer - it's not - it's a personal preference for the characteristic, easily audible distortion that tubes produce. This unnatural distortion is absent in transistor amplifiers.