Hmm, where to start...
When computing framerates, it really depends on the source. As another poster has said, a normal NTSC TV has 60 interlaced fields per second (59.94, strictly speaking). It will first draw line 1, then 3, then 5, etc., and then it will go back and draw 2, 4, 6, etc. Two fields make one full frame, which works out to 29.97 frames per second. The odd rate has nothing to do with how the video is filmed; it dates from when color was added to NTSC. The original 30 fps was nudged down by a factor of 1.001 so the new color subcarrier wouldn't interfere with the audio carrier. PAL has 50 interlaced fields per second, for 25 fps.
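If it helps, here's a quick sketch of the arithmetic (the exact NTSC field rate really is 60000/1001, which is where the ugly decimals come from; the 8-line display is just a toy to show the field order):

```python
# Back-of-the-envelope field/frame math:
NTSC_FIELDS_PER_SEC = 60000 / 1001   # ~59.94 interlaced fields/s
PAL_FIELDS_PER_SEC = 50              # 50 interlaced fields/s

# Two fields (odd lines, then even lines) make one full frame:
print(NTSC_FIELDS_PER_SEC / 2)       # ~29.97 fps
print(PAL_FIELDS_PER_SEC / 2)        # 25.0 fps

# Field line order for a toy 8-line interlaced display:
lines = range(1, 9)
odd_field = [n for n in lines if n % 2 == 1]    # drawn first: 1, 3, 5, 7
even_field = [n for n in lines if n % 2 == 0]   # drawn second: 2, 4, 6, 8
print(odd_field, even_field)
```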
Computer monitors (and some TVs) are progressive scan, which means the image is drawn from top to bottom without interlacing. This looks very sharp since there are no interlacing artifacts (most apparent in fast-motion scenes). Since the picture is still being drawn top to bottom, the phosphors begin to fade ever so slightly before the image is redrawn, hence the slight flickering you might see - especially out of the corner of your eye, since peripheral vision is more sensitive to flicker.
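Here's a toy model of why higher refresh rates flicker less. The exponential decay and the 5 ms persistence constant are just assumptions for illustration; real phosphor decay curves vary by phosphor type:

```python
import math

# Toy assumption: phosphor brightness decays exponentially after each
# redraw, with a made-up persistence of ~5 ms.
PERSISTENCE_S = 0.005

def brightness_before_redraw(refresh_hz: float) -> float:
    """Relative brightness of a spot just before it gets redrawn."""
    period_s = 1.0 / refresh_hz
    return math.exp(-period_s / PERSISTENCE_S)

for hz in (60, 75, 100):
    print(f"{hz} Hz: {brightness_before_redraw(hz):.2f}")
# The dimmer a spot gets between redraws, the more visible the flicker,
# which is why bumping a CRT from 60 Hz to 85+ Hz helps so much.
```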
An LCD works differently. As another poster mentioned, the backlight is on constantly. There are no frames in the CRT sense; instead, each pixel simply holds its last value until the screen is "updated" and it's told to change. Nothing ever goes dark between updates, so there is no flickering, regardless of the refresh rate. This is why LCDs are much easier on your eyes than CRTs.
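To put the difference in the same toy terms as above (again, made-up numbers, just to show the shape of it):

```python
import math

# An LCD pixel is "sample and hold": it keeps its last value until the
# next update, so brightness stays flat between refreshes. A CRT spot is
# a brief impulse that decays (same toy 5 ms persistence as above).

def lcd_brightness(t_since_update_s: float) -> float:
    return 1.0  # constant; the backlight is always on

def crt_brightness(t_since_redraw_s: float, persistence_s: float = 0.005) -> float:
    return math.exp(-t_since_redraw_s / persistence_s)

# Halfway through a 60 Hz refresh period (~8.3 ms in):
print(lcd_brightness(0.0083))   # 1.0   -> nothing for the eye to notice
print(crt_brightness(0.0083))   # ~0.19 -> the dip you perceive as flicker
```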