And a little about the double conversion on a non-digital LCD:
In thin film transistor displays, up to four transistors control each pixel (or picture element) on the display. Each pixel represents digital data that should correlate perfectly with the digital video data generated by the computer, so the image has no opportunity to distort.
This is in marked contrast to the CRT, which was at its heart an analogue device, and required the video card to convert the digital information into an analogue signal in order to display it. As the thinking went, if you kept the signal digital all the way to the flat panel, you'd get a superior image.
But digital monitors have not won the day--at least not yet--for a few reasons. Because the analogue D-Sub connector was the common link between graphics cards and monitors, the move to a straight digital interface, which requires a different connector, didn't automatically happen when flat panels became available. Graphics card manufacturers weren't inclined to move to a straight digital card because it would limit their market to just digital displays. By the same rationale, the manufacturers of flat-panel monitors weren't inclined to go strictly digital because it would limit the number of graphics cards the monitors could be used with.
And so, for the most part, both groups stuck with the analogue D-Sub connector, and when a concession was made to digital, the result was a hybrid model with both analogue and digital inputs, which added to the cost.
Initially, the cost of adding DVI (Digital Visual Interface) to a monitor or graphics card was substantial, another reason manufacturers opted solely for the widely used D-Sub connector. Recently, the price differential has come down, which has had two effects. First, there's now more of an incentive to include both on a product, and we've seen more graphics cards with both connectors onboard. The flipside is that the price of adding both connectors isn't great enough to justify getting rid of the D-Sub connector at the monitor end, which means that for the short term, most flat-panel monitors that incorporate DVI will be hybrid models that also retain the D-Sub.
But what about image quality of analogue versus digital displays? While the image of early analogue flat panels was often marred by shimmy and shake (an artifact of the double conversion from digital to analogue, then back to digital), newer flat-panel monitors feature much better analogue conversion, to the point where many users can't even tell the difference.
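The double-conversion artifact can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from the article: the graphics card's DAC is modelled as linear interpolation of pixel values into a continuous waveform, and the panel's ADC as resampling that waveform, once with a perfectly phased pixel clock and once with a small phase error, which is the kind of clock misalignment that produced the shimmer on early analogue flat panels.

```python
import numpy as np

# A row of alternating black/white pixels -- the worst case for
# analogue sampling, since every pixel sits next to a sharp transition.
digital_row = np.tile([0.0, 1.0], 8)  # 16 pixels

# DAC stage: the card turns pixel values into a continuous voltage;
# model the waveform with linear interpolation on a fine time grid.
fine_t = np.linspace(0, len(digital_row) - 1, 1001)
analog = np.interp(fine_t, np.arange(len(digital_row)), digital_row)

# ADC stage, well-phased clock: sampling at the pixel centres recovers
# the original values almost exactly.
perfect = np.interp(np.arange(len(digital_row)), fine_t, analog)

# ADC stage, mis-phased clock: a 0.2-pixel phase error samples the
# voltage mid-transition, producing grey values where the source had
# pure black and white.
shifted = np.interp(np.arange(len(digital_row)) + 0.2, fine_t, analog)

print("well-phased error:", np.max(np.abs(perfect - digital_row)))
print("mis-phased error: ", np.max(np.abs(shifted - digital_row)))
```

With a steady, well-adjusted clock the recovered row matches the source almost exactly; with the phase error the recovered values are visibly wrong, and if the phase error wanders over time, those greys drift, which is the shimmy viewers saw.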
According to Mickey Mantiply, display products line manager with IBM, subjects in one test were asked to choose which image they liked better--a digital one or an analogue one displayed on the same monitor. Opinion was split right down the middle--certainly not a compelling argument for the superiority of either technology, and little justification for making the move solely to digital, just yet.