
Plasma/LCD/CRT Technical Questions

kungfoo

Member
Greetings!

Hopefully this is the right place to post this; I contemplated where to put it and would appreciate a "highly technical" response.

My question is:

Why, as technically as possible, do PC games (output from a PC, obviously) look/perform better on PC monitors (LCD/CRT) than they do on televisions (LCD TVs/CRT/plasma)?

I have a basic understanding of it but would love a technical answer.

Also, why is it that Xbox/PS2/etc. games look just fine on TVs, since they are games as well?

Again, I'm looking for answers that are as technical as possible 😀

Cheers!
 
PC monitors typically have higher resolutions on smaller screens; a lower dot pitch, as above. They're also often more comfortable at higher refresh rates, because they are built to be (while TVs, for most of their lives, are displaying images that don't need the fast refresh, like NTSC video). PC games can apply a lot of fancy effects to squeeze the most possible value out of this (AA, anisotropic filtering, etc.); their engines are built around a fundamental assumption that you will be close to a high-pixel-density display.

Console games come at it from the opposite direction. Their engines assume a farther-away, higher-dot-pitch display. They are optimized to take advantage of the particulars of such a setup; especially, that the distance makes small features harder to distinguish.

The difference is somewhat akin to the difference between making a car look good, and making a battleship look good. The car needs to look good up close, and so is given a highly-consistent paint job (which may be fiercely protected by the owner 🙂) and subtle lines. But the battleship has the benefit of distance: its paint can have all sorts of blemishes and they will escape notice; and finer details tend to fade from view, leaving only the more stark lines. Similarly, a car at long distance loses its panache; pretty much all modern sedans, for instance, begin to look indistinguishable from each other.
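
To put rough numbers on the viewing-distance point, here's a quick back-of-the-envelope calculation. The sizes, resolutions, and distances are just assumed examples, not measurements of any particular setup:

Code:

import math

def pixels_per_inch(diagonal_in, res_w, res_h):
    # Pixel density from the diagonal size and native resolution.
    return math.hypot(res_w, res_h) / diagonal_in

def arcmin_per_pixel(ppi, distance_in):
    # Angular size of one pixel, in arcminutes, at a given viewing distance.
    return math.degrees(math.atan2(1.0 / ppi, distance_in)) * 60

# Assumed setups: a 19" 1280x1024 monitor viewed from 2 ft,
# and a 32" 1366x768 TV viewed from 8 ft.
monitor_ppi = pixels_per_inch(19, 1280, 1024)   # ~86 ppi
tv_ppi = pixels_per_inch(32, 1366, 768)         # ~49 ppi

print(f"Monitor: {monitor_ppi:.0f} ppi, {arcmin_per_pixel(monitor_ppi, 24):.2f} arcmin/pixel at 2 ft")
print(f"TV:      {tv_ppi:.0f} ppi, {arcmin_per_pixel(tv_ppi, 96):.2f} arcmin/pixel at 8 ft")

With these made-up numbers, the TV's much coarser pixels actually subtend a smaller angle from the couch than the monitor's pixels do from the desk, which is exactly why the coarse dot pitch escapes notice at a distance.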
 

Thanks to you both!

One other specific question: what about the "drag" or "ghosting" effect that we see when using a television (LCD/CRT/plasma) vs. a nice expensive LCD/CRT computer monitor? Is the refresh rate/response time the only factor in that effect, or are there more factors involved?

thank you!
 
I've been wondering about stuff like this, too, but in a little more detail. I've looked at the wiki article, and it doesn't make much sense to me.

How does HDTV, say 1080i, stack up to a current LCD monitor? I know that the i is for interlaced and the p is for progressive, but, from what I've been able to gather, HDTVs still don't go upwards of 30 fps.

If someone wouldn't mind taking the time to explain the details, I'd really appreciate it.
 
Don't all cameras for shows/movies and such record at 24 fps? So I'd think there'd be no need for a higher refresh rate on a TV, only a denser screen.
 
To answer a few questions as briefly as possible:

HD runs at 24, 30, or 60 frames per second.
HD runs at varying resolutions, both interlaced (two fields are used to display a single frame; see the sketch below) and progressive.
HD cameras (depending on model) record at any one of these resolutions/frame rates.
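
To make the interlaced/progressive distinction concrete, here's a rough sketch (with made-up pixel data) of how two 1080i fields weave together into one frame:

Code:

import numpy as np

HEIGHT, WIDTH = 1080, 1920

# Stand-in fields; in reality these arrive from the video signal,
# sent roughly 1/60 s apart.
top_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
bottom_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)

frame = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
frame[0::2] = top_field     # even lines come from the top field
frame[1::2] = bottom_field  # odd lines come from the bottom field

# 60 fields/s of 1080i carries the same pixel rate as 30 full frames/s,
# which is why "1080i at 60 fields" is not the same as 60 progressive frames.

So a 1080i set receiving 60 fields per second is effectively showing 30 complete pictures per second, which is where the "HDTVs don't go upwards of 30 fps" impression comes from.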

why does it "look better" on monitors?

Because monitors are monitors. They display exactly what you feed them.

TVs are TVs. They do all kinds of "mucking with the input" in order to make it "look better" to the average consumer.

The "ghosting" mentioned is a fault of the display and not the input.

So, in the end the reason it "looks better" on a PC display is a combination of dot pitch/viewing distance and the quality of the display.
 
Originally posted by: kungfoo
One other specific question: what about the "drag" or "ghosting" effect that we see when using a television (LCD/CRT/plasma) vs. a nice expensive LCD/CRT computer monitor? Is the refresh rate/response time the only factor in that effect, or are there more factors involved?

thank you!

To my knowledge the ghosting effect only occurs with LCD technology. Bigger LCD TVs can actually have a lower response time than monitors. Exactly why that is I'm not sure, but it has something to do with the bigger dot pitch making it easier for the crystals to twist. This is also the reason 20" TN (twisted nematic) TFTs, which pack a higher native resolution into a finer dot pitch, are slower than 19" ones. The faster a crystal can twist, the lower its response time, but also the lower its color accuracy: with such a fast response time it has no time to fine-tune itself to the precise color (this is why 6-bit + dithering LCDs are the fastest). S-IPS screens are the next fastest, followed by P-MVA, PVA, and S-PVA panels, in that order.

With LCDs, the fall time of the crystal is always faster than the rise time. Technologies like overdrive take advantage of this by surging the voltage when a change is requested, then relying on the fast fall time to let the pixel drop back down to the needed color more quickly. This technique isn't perfect, and it requires a lookup table (LUT). Sometimes 75 Hz doesn't work properly with the LUT (as with the Samsung 970P) and response times slow down.

It also requires a buffer of at least two frames (33 ms on a 60 Hz display) so that the display knows how far to push the pixel. I don't know how dithering factors into it all, or any more specifics; it gets complicated. Also, with overdrive, there is a potential for overshoot, causing artifacts. Manufacturers fix this by just letting the blur through for those transitions (or not driving the pixel as high), which is far less annoying than the artifacts. The fastest LCDs at the time of this writing can transition between any two given colors within 7.0 ms. Measurements of how long an LCD takes to display an image compared to a CRT are here: http://www.behardware.com/articles/632-...ages-delayed-compared-to-crts-yes.html
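
A crude sketch of the overdrive idea (the boost factor and values below are invented for illustration; real panels use factory-calibrated per-transition LUTs):

Code:

# Hypothetical sketch of LCD overdrive: drive a rising pixel past its
# requested level so the slow rise completes within one refresh, then
# rely on the faster fall time to settle at the target.

OVERSHOOT = 1.25  # invented boost factor, not from any real panel

def overdriven_value(current, target):
    # Value actually driven to the panel for this transition.
    if target > current:
        # Rising transitions are slow: overshoot the request, clamped to max.
        return min(255, round(current + (target - current) * OVERSHOOT))
    # Falling transitions are already fast; driving them conservatively
    # trades a little blur for freedom from visible overshoot artifacts.
    return target

# The controller buffers frames so it knows both endpoints of each
# transition before deciding how hard to drive the pixel.
prev_frame = [40, 40, 200]
next_frame = [200, 40, 40]
print([overdriven_value(c, t) for c, t in zip(prev_frame, next_frame)])  # [240, 40, 40]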

Manufacturers will soon debut technologies that wipe your eyes of the last image, further helping reduce the blur effect. CRTs flash the picture at 60 Hz (or up to 200 Hz), and at the same time they clear your eyes of the last image. Not that the blur on a 120 Hz CRT is anywhere near that of an average LCD, but this black frame insertion (BFI) technique will give the LCD some time to transition. More info: http://www.behardware.com/news/8273/a-glimpse-of-the-first-lcd-with-bfi.html
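
The scheduling behind black frame insertion is simple to sketch (purely illustrative; real sets vary in how they time the dark interval):

Code:

# Hypothetical BFI schedule: a 60 Hz source shown on a 120 Hz panel,
# with every other refresh replaced by black. The dark interval breaks
# up eye tracking much like a CRT's decaying phosphor does.

def bfi_schedule(source_frames):
    # Yield (refresh_slot, content) pairs for a 120 Hz panel fed 60 Hz video.
    slot = 0
    for frame in source_frames:
        yield slot, frame      # real image shown for 1/120 s
        slot += 1
        yield slot, "BLACK"    # inserted black refresh
        slot += 1

for slot, content in bfi_schedule(["frame0", "frame1", "frame2"]):
    print(f"refresh {slot:2d}: {content}")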

Originally posted by: spidey07
why does it "look better" on monitors?

Because monitors are monitors. They display exactly what you feed them.

Most of the time this is true, but some recent displays (like the Dell 2007/2407 and NEC 20WMGX2) do some "postprocessing" even in desktop mode. A later revision of the Dells fixed this so it no longer occurs on the desktop, but it still occurs with video modes. You can always set the NEC to standard as well. This actually only affects the gamma, but I figured I'd mention it anyway. They don't apply blur/deinterlacing/denoising effects over VGA or DVI.

A lot of the time you don't hook up a TV using DVI, especially if it's a CRT. It's usually VGA, component, maybe S-Video or composite, or even coax from an RF modulator in the case of consoles. That introduces a lot of signal degradation. If you hook up an LCD TV with DVI or HDMI you'll get perfect results, but the TV still has a bigger dot pitch than a monitor, making the image less seamless. And, as spidey07 pointed out, there is usually more postprocessing going on with TVs (Faroudja algorithms and such). Some TVs even upscale resolutions that monitors can display natively. Few TVs hook up to a PC at 1920x1080 flawlessly.
 