
DVI vs. Analog input on an LCD monitor - real difference?

mfavin

Member
LCD monitors with DVI (Digital Visual Interface) are a bit more expensive than those with just an analog input.
Is it worth the extra money?

Can you describe the real qualitative difference between each interface?
 
There's a significant difference. The digital connection will spit out the exact pixel values, but the analog will only spit out approximations. With DVI there are no blurry edges, and every pixel is perfectly distinguishable from the others.

Personally, I wouldn't bother with an LCD unless you're using DVI.
 


<< Is it worth the extra money for DVI over analog? >>

Yes. Also, the card will perform better: my MX200 with DVI looks fantastic, but on a CRT it looks crummy in 2D!
 


<< the analog will only spit out approximations. With DVI there are no blurry edges, and every pixel is perfectly distinguishable from the others. >>


I'm not certain what BigDee means by this; however, the interface used has no effect on edge quality or on how distinguishable the pixels are. The panel itself and the resolution you run it at have a bigger effect on these than the interface used.

In theory, digital (DVI), like BNC connectors on a CRT, will provide a better picture. However, with a good video card, in practice or in a blind test I find it very difficult to tell the difference.

The myth that DVI does not need to be converted is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the video card, or VC) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the video card and a DVI receiver chip in the monitor.

In order to transmit the data from the VC in true parallel digital form, the graphics chip would need DVI outputs and the video cable would need a separate wire for each bit; the cable would then need to contain more than 27 wires, and you can imagine how thick it would be. Instead, DVI converts the parallel data into a number of digital serial channels. Depending on the interface used (DVI-I or DVI-D), the number of serial channels varies. The serial bit stream is then converted back on the monitor side, so you can argue that DVI actually increases the number of times the signal is processed.
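Here's a toy sketch of that parallel-to-serial idea (an assumed illustration only; real DVI uses TMDS encoding, which this doesn't model): the 24 parallel colour bits of each pixel are split across three serial channels, sent bit by bit, and reassembled on the monitor side.

```python
def serialize(pixel_rgb):
    """Video-card side: split an (r, g, b) pixel into three 8-bit serial streams."""
    return [[(channel >> bit) & 1 for bit in range(8)] for channel in pixel_rgb]

def deserialize(streams):
    """Monitor side: rebuild the parallel pixel from the serial bits."""
    return tuple(sum(bit << i for i, bit in enumerate(stream)) for stream in streams)

pixel = (200, 120, 30)                          # one 24-bit pixel from the video card
assert deserialize(serialize(pixel)) == pixel   # round trip: serialized, then restored
```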

Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to an analog one in order to achieve the 16M colors. If LCDs were purely digital, only two colors, black and white, would be achievable. In order to generate the 16M colors, each red, green and blue cell must be capable of stepping through 256 shades, and this is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
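The 16M figure is just those three 8-bit channels multiplied out:

```python
shades_per_channel = 256          # 256 shades for each of red, green and blue
colors = shades_per_channel ** 3  # the three channels combine independently
print(colors)                     # 16777216 -- the "16M colors" above
```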

Most of today's DVI implementations are rather limited. DVI driver chips max out at 1600 x 1200 at a 60 Hz refresh rate. Keep this in mind if you plan on upgrading. LCDs do not suffer from flicker, so the 60 Hz is not such a big deal unless you are playing games and want higher FPS. Faster DVI transmitter and receiver chips are being developed if you want to upgrade, but that means replacing both your video card and monitor. With an analog connection you can upgrade either one without the need to replace the other.
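Rough arithmetic behind that ceiling (the blanking totals below are typical assumed values, not from any datasheet):

```python
# Pixel-clock math for 1600 x 1200 at 60 Hz. Total line/frame sizes include
# blanking intervals; the figures here are typical assumed values.
h_total, v_total, refresh = 2160, 1250, 60
pixel_clock = h_total * v_total * refresh
print(pixel_clock / 1e6, "MHz")   # 162.0 MHz -- right at the limit of
                                  # early single-link DVI transmitters (~165 MHz)
```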

Also, if you use DVI-I or DVI-D, I do not recommend hot-swapping (unplugging) the monitor. Turn the computer and monitor off prior to unplugging it.
 
Well, I suppose it depends a lot on the video card. We tried a GeForce2 with DVI on an LCD with both connections, and there was an immediate, noticeable difference. Text got a little fuzzy with the analog, but the digital was pixel-perfect.
 

Nvidia cards are known to have problems with analog video due to the filter circuits they use on the video outputs.
 


<< Nvidia cards are known to have problems with analog video due to the filter circuits they use on the video outputs. >>



Unless you purchase a "Leadtek" GeForce3 video card. It seems they use higher-quality filters, and their 2D not only rivals ATI or Matrox but surpasses them to boot.
 


<<

<< Nvidia cards are known to have problems with analog video due to the filter circuits they use on the video outputs. >>



Unless you purchase a "Leadtek" GeForce3 video card. It seems they use higher-quality filters, and their 2D not only rivals ATI or Matrox but surpasses them to boot.
>>



Actually, this is not exactly the case. I have a LeadTek WinFast GF3, and the DVI definitely still looks better than the VGA on both of my LCD monitors (18.1" SXGA and 20.1" UXGA).
 
I agree with gosharkss.
I can't tell the difference at all between the two inputs on my senergy750, which has both types. I have tried VGA and DVI with my Radeon AIW and there is no difference, so I wasted about $300 just to get the DVI, when gosharkss had said it was a waste in a previous post a couple of months back. As a side note, I will never go back to CRT. IMO the image quality and crispness can't be beat at the LCD's native res.
 
I've had a little experience with projectors and long cable runs; in that case the difference is startling.

Analogue signals look terrible: vertical edges shimmer by 1-2 pixels (an edge that should, say, look b-b-b-w-w-w may flicker between b-b-w-w-w-w, b-b-b-w-w-w and b-b-b-b-w-w). It is very visible on the highly magnified screen and exceptionally distracting.
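Here's a toy way to see why that happens (an assumed model of the monitor re-sampling the analog signal, not real ADC behaviour): a small drift in the sampling clock's phase moves which pixel the edge lands on.

```python
def sample_edge(phase_error):
    """Sample an analog edge (transition at t = 2.5 px) with a clock offset."""
    analog = lambda t: 'w' if t >= 2.5 else 'b'
    return '-'.join(analog(i + phase_error) for i in range(6))

for err in (-0.6, 0.0, 0.6):          # sample clock phase drifting frame to frame
    print(f"{err:+.1f}  {sample_edge(err)}")
# prints b-b-b-b-w-w, b-b-b-w-w-w, b-b-w-w-w-w -- the flicker described above
```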

This happened on a variety of different projectors and graphics cards, with none obviously worse than another. With DVI, this was gone, and the picture was perfect at all times.

With LCD screens the difference is certainly not as obvious, but it is definitely there. I don't own an LCD, so I don't have much experience, but I have seen a variety of screens at various prices while looking for a new monitor. To me the difference is significant enough not to buy a new LCD monitor unless it has DVI.
 