Of course you can tell the difference between 30fps and 60fps. Watch your local news on TV, then change the channel to a movie. It's kind of hard to explain, but media running at 60fps is much more fluid. Folks in the PC gaming community know exactly what I'm talking about.
Film is 24fps, and it actually looks "more natural" on screen. This is why the 480i "mini DV" camcorder footage of yesteryear (c. 2003) looks like crap on screen--fluid and direct...but crap.
Simply put, your eye doesn't interpret 2D images (film, a screen) the same way it interprets 3D (the real world). The human eye's max "shutter speed" is roughly 26-27fps, iirc; anything above that is simply indistinguishable. Period.
Film is shot at 24fps because, well...after roughly a century of testing, that seems to be "what we like." It simply looks real. This is difficult to quantify, yes, but it tends to be more comfortable than what we see on the local news at 30fps.
I'm talking about film here--recording the image as it is. Synthetic images are another story...it's possible that our visual system starts having problems when you try to render cartoons or graphics, where each frame suffers from aliasing or other anomalies that aren't exactly perfect, so the extra fps "may indeed" compensate as a stop-gap for the overall lower quality of each individual frame.
Maybe this sounds confusing, but you have to accept that with graphics, 20fps vs 60fps vs 120fps is essentially incomparable to 24fps vs 30fps of real video footage.
Your brain is one sensitive animal. Your eyes have limits and specific ranges where they prefer to operate. Images and video--we know and understand those limits. Graphics and animation remain a "cheat," because there is nothing realistic about that process.
I hope that makes sense...
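If you want one concrete reason the rendered case needs more frames, here's a quick toy sketch (Python, with made-up numbers and a hypothetical motion_stats helper; my own illustration, not anything from the posts above): a film camera with a roughly 180-degree shutter integrates motion over half the frame interval, so fast movement leaves a blur streak inside each frame, while a naive game render is an instantaneous sample with zero blur, so the same movement turns into hard jumps between frames unless you raise the frame rate.

```python
# Toy comparison: per-frame motion step vs. motion blur captured in the frame.
# Assumed scenario (made up for illustration): an object crossing the screen at 1200 px/s.
# shutter_fraction=0.5 approximates a film camera's 180-degree shutter;
# a naive real-time render is an instantaneous sample, i.e. effectively 0 px of blur.

def motion_stats(speed_px_per_s, fps, shutter_fraction=0.5):
    frame_time = 1.0 / fps
    step = speed_px_per_s * frame_time                      # jump between consecutive frames
    blur = speed_px_per_s * frame_time * shutter_fraction   # streak recorded while the shutter is open
    return step, blur

for fps in (24, 30, 60):
    step, blur = motion_stats(1200, fps)
    print(f"{fps:>3} fps: {step:5.1f} px jump per frame, ~{blur:5.1f} px blur on film, 0 px blur point-sampled")
```

At 24fps the filmed frame carries a ~25px streak that bridges most of the 50px gap to the next frame, while the point-sampled render shows a crisp edge jumping 50px at a time. That strobing is part of why rendered 24fps feels so much worse than filmed 24fps, and why pushing to 60fps (smaller jumps) or faking motion blur helps.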