Originally posted by: kungfoo
One other specific question - what about the "drag" or "ghosting" effect that we see when using a television (LCD/CRT/plasma) vs. a nice expensive LCD/CRT computer monitor? Is the only factor in that the refresh rate/response time? Or are there more factors in this effect?
thank you!
To my knowledge the ghosting effect only occurs with LCD technology. Bigger LCD TVs can actually have a lower response time than monitors. Exactly why that is I'm not sure, but it has something to do with the bigger dot pitch making it easier for the crystals to twist. This is also the reason 20" TN (twisted nematic) TFTs are slower than 19" ones: the 20-inchers typically run a higher native resolution, so their dot pitch is actually smaller. The faster a crystal can twist, the lower its response time, but also the lower its color accuracy. With such a fast response time it has no time to fine-tune itself to the precise color (this is why 6-bit + dithering LCDs are the fastest). S-IPS screens are the next fastest, followed by P-MVA, PVA, and S-PVA, in that order.
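To put the dot pitch point in numbers, here's a quick back-of-the-envelope calculation in Python. The panel sizes and native resolutions below are typical examples I picked for illustration, not measurements of specific models:

    import math

    def dot_pitch_mm(diagonal_inches, res_w, res_h):
        # Approximate pitch: physical diagonal divided by the
        # number of pixels along that diagonal.
        diagonal_mm = diagonal_inches * 25.4
        diagonal_px = math.hypot(res_w, res_h)
        return diagonal_mm / diagonal_px

    print(round(dot_pitch_mm(19.0, 1280, 1024), 3))  # ~0.294 mm
    print(round(dot_pitch_mm(20.1, 1600, 1200), 3))  # ~0.255 mm (smaller pitch)
    print(round(dot_pitch_mm(32.0, 1366, 768), 3))   # ~0.519 mm, typical LCD TV

Note how the 20.1" panel's higher resolution gives it a *smaller* pitch than the 19", while the TV's pitch is far bigger than either.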
With LCDs, the fall time of the crystal is always faster than the rise time. Technologies like overdrive take advantage of this by surging the voltage past the target when a change is requested, then using the fast fall time to let the pixel drop back down to the needed color more quickly. This technique isn't perfect, and it requires a lookup table (LUT). Sometimes 75 Hz doesn't play well with the LUT (as with the Samsung 970P) and response times actually get slower.
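Here's a minimal sketch of the LUT idea in Python. The table values are completely made up for illustration; real panels use much larger, per-panel calibrated tables:

    # Hypothetical overdrive LUT: maps (current gray level, target
    # gray level) -> the value actually driven for one frame.
    # Driving past the target speeds the transition up; the fast
    # fall time then brings the pixel back down to the target.
    LEVELS = (0, 64, 128, 192, 255)

    OVERDRIVE_LUT = {
        0:   {0: 0, 64: 96,  128: 176, 192: 232, 255: 255},
        64:  {0: 0, 64: 64,  128: 168, 192: 224, 255: 255},
        128: {0: 0, 64: 48,  128: 128, 192: 216, 255: 255},
        192: {0: 0, 64: 52,  128: 112, 192: 192, 255: 255},
        255: {0: 0, 64: 56,  128: 116, 192: 184, 255: 255},
    }

    def drive_value(current, target):
        # Snap to the nearest LUT entry (real hardware interpolates).
        snap = lambda v: min(LEVELS, key=lambda lvl: abs(lvl - v))
        return OVERDRIVE_LUT[snap(current)][snap(target)]

    print(drive_value(64, 128))  # drives 168, not 128, to get there faster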
It also requires a buffer of at least two frames (33 ms on a 60 Hz display) so that the display knows how far to drive the pixel. I don't know how dithering factors into it all, or any more specifics; it gets complicated. Overdrive also has the potential to overshoot, causing artifacts. Manufacturers fix this by just letting the blur through for those transitions (or by not driving the pixel as high), which is far less annoying than the artifacts. The fastest LCDs at the time of this writing can transition between any two given colors within 7.0 ms. Measurements of how much later an LCD displays an image compared to a CRT are here:
http://www.behardware.com/articles/632-...ages-delayed-compared-to-crts-yes.html
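The latency cost of that buffering is simple arithmetic (assuming the two-frame figure above):

    def buffer_latency_ms(frames, refresh_hz):
        # Each buffered frame adds one refresh period of delay.
        return frames * 1000.0 / refresh_hz

    print(buffer_latency_ms(2, 60))  # ~33.3 ms at 60 Hz
    print(buffer_latency_ms(2, 75))  # ~26.7 ms at 75 Hz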
Manufacturers will also soon debut technologies that wipe your eyes of the previous image, further reducing the blur effect. A CRT flashes the picture briefly on every refresh (at 60 Hz, or up to 200 Hz), and in doing so clears your eyes of the last image. Not that the blur on a 120 Hz CRT is anywhere near that of an average LCD, but this black frame insertion (BFI) technique gives the LCD pixels some time to transition. More info:
http://www.behardware.com/news/8273/a-glimpse-of-the-first-lcd-with-bfi.html
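To picture what BFI buys you, here's a toy timing calculation. The 50/50 split between image and black is just an example duty cycle I chose, not a spec:

    def bfi_schedule(refresh_hz, visible_fraction=0.5):
        # Split each refresh period into "show the frame" time and
        # "blank the screen" time; pixels can finish transitioning
        # (and your eyes reset) during the black interval.
        period_ms = 1000.0 / refresh_hz
        visible_ms = period_ms * visible_fraction
        return visible_ms, period_ms - visible_ms

    visible, black = bfi_schedule(60)
    print(f"{visible:.1f} ms image, {black:.1f} ms black per refresh")  # 8.3 / 8.3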
Originally posted by: spidey07
Why does it "look better" on monitors?
Because monitors are monitors. They display exactly what you feed them.
Most of the time this is true, but some recent displays (like the Dell 2007/2407 and NEC 20WMGX2) have done some "postprocessing" even in desktop mode. A later revision of the Dells stopped doing this on the desktop, but it still occurs in video modes, and you can always set the NEC to its standard mode. This only affects the gamma, but I figured I'd mention it anyway. They don't apply blur/deinterlacing/denoising effects over VGA or DVI.
Many times you don't hook up a TV using DVI, especially if it's a CRT. It's usually VGA, component, maybe S-Video or composite, or even coax from an RF modulator in the case of consoles. That introduces a lot of signal degradation. If you hook up an LCD TV over DVI or HDMI you'll get a pixel-perfect signal, but the TV still has a bigger dot pitch than a monitor, making the image less seamless. And, as spidey07 pointed out, there is usually more postprocessing going on in TVs (Faroudja algorithms and such). Some TVs even scale input resolutions that monitors can display natively. Few TVs hook up to a PC at 1920x1080 flawlessly.