Ugh... so much disinformation in this thread.
First of all, all LCDs have motion blur - it's inherent to how LCDs work. The liquid crystals can only rotate so fast, and the longer that transition takes, the more blur you perceive. That said, many "gaming" LCDs have cut this switching time down to the point where the end user can barely perceive any blurring.

A different factor has taken its place, though: refresh rate. Many people will say that LCDs don't have a refresh rate, and while that's true in the CRT sense of the term, in more general terms it's still there. By refresh rate I mean the number of distinct images shown per second. The higher that number, the smoother the motion (just like frames per second in games). Even if you could theoretically reduce an LCD's response time to zero (you can't), the first thing you'd notice is that motion still looks choppy and jittery - that's because LCD refresh rates rarely exceed 70 Hz. I can't stand playing fast-paced games at 60 Hz regardless of what monitor I'm using, but that's just me.
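To put some rough numbers on that (nothing monitor-specific, just the arithmetic of refresh intervals):

```python
# How long each frame stays on screen at a given refresh rate.
# A new image only arrives once per refresh interval, so a lower refresh rate
# means fewer distinct images per second and choppier-looking motion,
# no matter how fast the pixels themselves can switch.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between successive images, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 70, 85, 100, 120):
    print(f"{hz:>3} Hz -> a new image every {frame_time_ms(hz):.1f} ms")

# 60 Hz  -> a new image every 16.7 ms
# 120 Hz -> a new image every 8.3 ms
```

Even with a hypothetical 0 ms response time, you still only get one new image per interval, which is where the jitter comes from.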
By comparison, the refresh rate of a CRT is resolution-dependent. I run my fast-paced games at 1024x768 @ 120 Hz, my slow-paced games at 1280x960 @ 100 Hz, and my desktop at 1600x1200 @ 85 Hz on a Sony G420 19" Trinitron. In none of these cases are the flickering and eye-strain usually associated with CRTs present. That flickering comes from too low a refresh rate: the phosphors on the face of the tube only stay bright for a limited time after the electron beam hits them, so the more often you refresh them, the less flicker you see.
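If anyone's curious why the maximum refresh drops as the resolution goes up, it boils down to the horizontal scan rate: the electron gun can only draw so many scanlines per second, so more lines per frame means fewer frames per second. Quick sketch below - the 100 kHz scan limit and 5% blanking overhead are made-up round numbers for illustration, not the spec of my monitor or any other:

```python
# Why a CRT's max refresh rate falls as vertical resolution rises:
# max refresh = horizontal scan rate / total scanlines per frame.
# Both constants below are illustrative assumptions, not real specs.

H_SCAN_KHZ = 100.0         # assumed maximum horizontal scan rate, in kHz
BLANKING_OVERHEAD = 1.05   # rough allowance for vertical blanking lines

def max_refresh_hz(visible_lines: int) -> float:
    total_lines = visible_lines * BLANKING_OVERHEAD
    return (H_SCAN_KHZ * 1000.0) / total_lines

for lines in (768, 960, 1200):
    print(f"{lines:>4} visible lines -> roughly {max_refresh_hz(lines):.0f} Hz max")

#  768 visible lines -> roughly 124 Hz max
#  960 visible lines -> roughly  99 Hz max
# 1200 visible lines -> roughly  79 Hz max
```

The exact numbers depend on the tube, but the trend is the point: fewer scanlines per frame, more frames per second.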
An LCD monitor, on the other hand, runs at essentially the same refresh rate regardless of the resolution you feed it. That's down to the inherently digital nature of LCDs versus the analog nature of CRTs. The input lag someone mentioned earlier in this thread is a real issue, but it's not big enough to make a fuss over. Like the refresh rate issue, input lag exists because a lot more work has to happen between your video card and your eyes on an LCD than on a CRT - the signal gets decoded, scaled, and processed before it ever reaches the panel - and that work takes circuitry and time.
Why can't they just build better circuits to increase the refresh rates? It's not that simple. Some effort is being made on 120 Hz TVs and the like, but a lot of the problem lies in things like DVI bandwidth and other constraints along the signal chain. This is why SEDs looked so promising, but unfortunately it seems they'll never see the light of day.
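For anyone wondering what "DVI bandwidth" actually means here: single-link DVI tops out around a 165 MHz pixel clock and dual-link around 330 MHz, and the pixel clock you need scales with resolution times refresh rate (plus blanking). Rough sketch - the 20% blanking overhead is just a ballpark, not a proper timing calculation:

```python
# Rough look at why link bandwidth caps refresh rates.
# Required pixel clock ~ width * height * refresh * blanking overhead.
# 165/330 MHz are the usual single/dual-link DVI pixel clock limits;
# the 20% overhead is a rough assumption, not an exact timing formula.

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0
BLANKING_OVERHEAD = 1.2

def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1600, 1200, 60), (1600, 1200, 120), (1920, 1200, 120)]:
    clock = pixel_clock_mhz(w, h, hz)
    if clock <= SINGLE_LINK_MHZ:
        link = "fits on single-link DVI"
    elif clock <= DUAL_LINK_MHZ:
        link = "needs dual-link DVI"
    else:
        link = "exceeds even dual-link DVI"
    print(f"{w}x{h} @ {hz} Hz -> ~{clock:.0f} MHz pixel clock ({link})")
```

Run that and you'll see 1600x1200 @ 60 Hz fits comfortably on a single link, while pushing the same resolution to 120 Hz roughly doubles the required clock - which is why "just refresh faster" isn't only a panel problem.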
I'm not a blind proponent of any technology, but it bugs me when people post random crap that they can't justify or sometimes even understand. LCD technology is absolutely fantastic for a lot of things (office work, movies, less weight/space, and often lower power consumption), but it also has a lot of problems (black levels, blurring at non-native resolutions, response time, low refresh rates, input lag). Does that mean it's an inferior technology? Definitely not. It all depends on the application. If you consider yourself a hardcore gamer, I would wholeheartedly recommend checking craigslist for a decent CRT, preferably a Trinitron or Diamondtron of some sort (read up on aperture grille vs. shadow mask for more on what makes certain CRT monitors better). If you do some gaming on the side while using your computer for a slew of other things, then an LCD is the better choice.