Here's my 2 bits: motion blur hides flaws. At 24fps, the shutter, whether mechanical or electronic, is open long enough for moving objects to smear across the frame. So instead of a perfectly sharp image for each frame, there's a slight blur along the path of motion, and that hides flaws in computer-generated rendering, both in fine detail and in artificial-looking movement. Think of old stop-motion animation (Clash of the Titans, 1981): each frame stuttered because it was a series of sharp stills. CG runs at the same frame rate, but the software smears each still in the direction of motion. At normal speed our brains perceive normal motion, but frame by frame it looks blurry.

I'd expect normal live-action filming at 48fps (without any CG effects) to look pretty natural, and more detailed than 24fps. But in most of the stuff shot at 48fps, like The Hobbit, the majority of the image is still generated digitally. My guess is that the motion algorithms used by the renderers are still coming of age, so the motion still doesn't look totally natural: the software is just translating what the animator or the motion-capture data tells it to do, rather than reproducing real-life motion with the millions of cues your brain picks up on.
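To make the smearing idea concrete, here's a rough sketch of one common way a renderer can fake an open shutter: average several sharp sub-frame samples taken across the shutter interval. Everything here (the render_sharp function, the 180-degree shutter, the toy moving square) is hypothetical, just to illustrate the technique, not any particular renderer's actual method.

```python
import numpy as np

WIDTH, HEIGHT = 64, 48
FPS = 24
SHUTTER_ANGLE = 180  # classic film shutter: open for half the frame interval

def render_sharp(t):
    """Render one perfectly sharp still: a bright square moving right at 200 px/s."""
    frame = np.zeros((HEIGHT, WIDTH))
    x = int(t * 200) % WIDTH              # object position at time t (seconds)
    frame[20:28, x:min(x + 8, WIDTH)] = 1.0
    return frame

def render_motion_blurred(frame_index, samples=16):
    """Average sub-frame samples across the shutter-open interval."""
    frame_start = frame_index / FPS
    shutter_open = (SHUTTER_ANGLE / 360.0) / FPS   # seconds the shutter stays open
    acc = np.zeros((HEIGHT, WIDTH))
    for i in range(samples):
        t = frame_start + shutter_open * i / (samples - 1)
        acc += render_sharp(t)
    return acc / samples   # the square smears a few pixels along its path

blurred = render_motion_blurred(0)
```

Frozen, that averaged frame looks soft, but played back at speed the smear reads as motion, which is exactly the trade-off described above.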
As far as the 240Hz "motion engine" type features on LCD TVs go, it gets even worse: the DSP is being fed 24 or 30fps and has to guess at what goes in between. Any sort of CG effect looks like a cartoon to me.
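For a sense of what the DSP is up against, here's a deliberately dumb stand-in for that guessing step. Real motion engines estimate per-block motion vectors and warp pixels along them; the naive cross-fade below (all names hypothetical, not any vendor's algorithm) shows why invented in-between frames can go wrong.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, steps):
    """Generate `steps` in-between frames by linear cross-fade."""
    return [
        (1 - w) * frame_a + w * frame_b
        for w in np.linspace(0, 1, steps + 2)[1:-1]  # exclude the two originals
    ]

# Turning 24fps into 240fps means 9 guessed frames per real pair:
a = np.zeros((48, 64)); a[20:28, 0:8] = 1.0     # square at the left
b = np.zeros((48, 64)); b[20:28, 40:48] = 1.0   # square jumped to the right
guessed = interpolate_frames(a, b, 9)
# A cross-fade yields two ghost squares fading in and out instead of one
# square moving -- a hint of why interpolated motion can look so unnatural.
```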