What so often gets overlooked when people compare the 30 fps of television video or the 24 fps of film against 60 fps in a videogame is that, by the very nature of their design, videotape and film capture motion blur on moving objects in each frame. That blur helps your brain "smooth out" the motion, making even 24 fps (for film) seem smooth.
Direct3D games, on the other hand, don't add motion blur to objects as a rule. (I suppose someone may have written an engine that does it at some point, but to my knowledge there is nothing built into the HAL for computing motion blur automatically, so it would have to be done in software, and that would be rather processor-intensive to say the least.) Without any motion blur, the change in a moving object's position from one frame to the next is much easier to pick up, even at higher framerates. This is especially true when the camera is yawing, since then every object on the screen is effectively moving, and rather quickly at that.
I don't think one can put a hard number on the framerate that is "smooth" without motion blur, because you can always move things across the screen a little faster to re-introduce the appearance of "jumping". There is probably a practical limit, though, above which jumpiness could only be forced through abnormally exaggerated movement.
I've personally never found any game with an fps of 30 or better to be at all bothersome, so I don't expect to have any complaints about a limit of 60 in Doom 3 (once I finally get a look at the hopefully forthcoming demo). Heck, my brand new A64 3000+ and Radeon 9800 Pro probably won't be able to reach anywhere near 60 in D3 at 1024x768 anyway.
If someone can offer info to contradict my statement that motion blur isn't available in Direct3D, BY ALL MEANS tell me about it, because that would be information I might be able to use someday. 🙂