Analog vs Digital input for LCD monitors?

lektrix

Golden Member
Aug 9, 2003
1,174
0
76
Can anyone explain the difference? I see some LCDs have either or both. And it seems to be more expensive for DVI. For games + movies, would DVI be necessary?
 

YOyoYOhowsDAjello

Moderator / A/V & Home Theater / Elite member
Aug 6, 2001
31,204
45
91
Not necessary, but as a connection type DVI tends to give a better-quality image. If you search the Video forum, you'll find dozens of threads on the topic.
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81
Plus, no adjustment is needed for correct sizing around the edges and whatnot. If you can use DVI, there's no reason not to. Even if your monitor didn't come with a cable (pretty shady, I'd say), I'd still spend the $10-15 for one.
 

AMDBOY

Senior member
Mar 25, 2001
436
0
71
I am a bit in the dark (no pun) on this one. It's been quite a while since my last build, and I'm too exhausted from my 12 hr. work day to do this research ("If you search the Video forum, you'll find dozens of threads on the topic."). I remember reading that analog has to be converted to digital and back anyway, right? So it's better quality to go digital whenever possible, I think. But when I plug my LCD into the video card via DVI, the graphics lag so badly I cannot play games. What am I doing wrong?? TIA.
 

JimPhelpsMI

Golden Member
Oct 8, 2004
1,261
0
0
Hi. Running analog, your video card is converting digital signals to analog and feeding them to the LCD monitor, which then converts the analog back to digital to drive the LCD array. Running digital, the video card can send the digital signal straight to the monitor. No conversions = less distortion. Hope this helps a little, Jim
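To make that round trip concrete, here's a toy sketch of what the extra conversion does. It is purely illustrative, not a model of a real RAMDAC: the noise amplitude is made up, and real analog degradation depends on the cable and converters.

```python
# Toy model of the extra DAC/ADC round trip on an analog (VGA) link.
import random

def dac_adc_round_trip(level, noise=1.5):
    """Simulate one digital -> analog -> digital conversion.

    'noise' is an invented analog noise amplitude (in 8-bit steps);
    real cable/converter noise varies with quality and length.
    """
    analog = level + random.uniform(-noise, noise)  # DAC output plus analog noise
    return max(0, min(255, round(analog)))          # the LCD's ADC re-quantizes

random.seed(0)
pixels = list(range(0, 256, 32))                    # a few sample pixel levels
via_vga = [dac_adc_round_trip(p) for p in pixels]
print("original:", pixels)
print("via VGA :", via_vga)
print("errors  :", [abs(a - b) for a, b in zip(pixels, via_vga)])
# Over DVI the digital levels pass through unchanged, so every error is 0.
```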
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: duragezic
Plus, no adjustment is needed for correct sizing around the edges and whatnot. If you can use DVI, there's no reason not to. Even if your monitor didn't come with a cable (pretty shady, I'd say), I'd still spend the $10-15 for one.

My VGA-input LCD monitor auto-adjusts to make sure the screen is filled properly, so no adjustment is needed (by me) for correct sizing; it does it automatically, and gets it right at all resolutions.
 

AMDBOY

Senior member
Mar 25, 2001
436
0
71
Thanks, all. I'm still perplexed as to why my games lag when I connect digitally; no problem when I use VGA. I do have to reset some of the video settings in certain games: the resolution, and backing the highest settings off a bit.
 

Koing

Elite Member / Super Moderator / Health and F
Oct 11, 2000
16,843
2
0
VGA looks out of sync and kind of poor on my 23" monitor. Looks VERY sharp via DVI. I don't have dual DVI outputs :p

Koing
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: AMDBOY
Thanks, all. I'm still perplexed as to why my games lag when I connect digitally; no problem when I use VGA. I do have to reset some of the video settings in certain games: the resolution, and backing the highest settings off a bit.

The only explanation I can see for lower performance is that you're not actually running the games at the native resolution in VGA mode, but you are over DVI. They should perform identically if the resolution and settings are the same.

Your use of the term 'lag' is confusing, since some people have discussed real "lag" issues with some LCD monitors where the display seems to be several frames behind the video card's output. Do you mean this kind of "lag", or just lower framerates in your games?
 

NaOH

Diamond Member
Mar 2, 2006
5,015
0
0
Using analog with an LCD causes unnecessary digital-to-analog-to-digital conversions. Digital video out to digital video in! Looks great, text is sharp, and the resolution fits the monitor perfectly with ZERO adjustments (not even pressing the auto image adjust button my monitor has in analog mode).
 

karfus

Junior Member
May 12, 2006
2
0
0
I'll give an interesting counter-example. I recently purchased a Pioneer 50-inch HD plasma television (PDP-506XDE). The NATIVE resolution is 1280 x 768. This means the panel has exactly 1280x768 pixel elements, just like many LCD computer monitors. The media (connector) box has two HDMI inputs and a D-Sub (analog RGB) PC input.

So for the best quality, connect the PC via a DVI cable to the HDMI input, right? HDMI video is apparently signal-compatible with DVI; it just adds an audio channel and uses a different plug. So I hook up via a DVI-to-HDMI cable, only to find that my ATI X850 PE can't send a perfectly matching signal to the monitor. Either the refresh rate (frequency) is wrong or the resolution is: 1280x768 at any refresh rate spills over the edge of the screen and flickers like mad, apparently interpreted as an interlaced input, which of course it is not. The closest match is 1280x720 at 60 Hz, but even that extends over the edges of the monitor, and the fonts are not sharp--the way things look when you're not running at native resolution. Clearly the media connector box for the panel is unable to correctly interpret the DVI signal at native resolution. There's a cryptic note in the HDMI section of the instructions saying only "Note: PC signals are out of correspondence", and this must be what they mean. The media box is probably trying to interpret the HDMI input as some kind of "consumer video" signal, and subsequently using its inbuilt scaling ability (incorrectly), though why it would do this is beyond me.

So in desperation I hook up the analog RGB (D-Sub) connector, expecting the worst. Lo and behold, a perfect 1280x768 native resolution image. Looking closely I can see that each pixel is clearly defined; there's no apparent bleeding of colors or pixel information. And as far as I can see the color is perfect, no different than my LCD monitor attached via DVI to the same PC.

So is analog RGB really still significantly inferior to digital DVI? According to CNET the difference is now vanishingly subtle; see the following article: LCD Connections: Analog vs. Digital - CNET

I'd be curious if anyone has a technical perspective on this.
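For anyone who wants to poke at the timing side of this, here's a rough sketch of the arithmetic involved. The blanking totals are standard CVT/CEA figures, which is an assumption on my part; the Pioneer box may well expect only the CEA "consumer video" timings on HDMI, which would explain the behavior.

```python
# Rough sketch: pixel-clock arithmetic for a PC mode vs. a CEA video mode.
# Totals (active + blanking) are published CVT/CEA figures; the published
# CVT clock is rounded to 79.50 MHz, giving ~59.9 Hz actual refresh.

def pixel_clock(h_total, v_total, refresh_hz):
    """Pixel clock in Hz = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz

# 1280x768 @ 60 Hz with CVT blanking: totals 1664 x 798
pc_mode = pixel_clock(1664, 798, 60)
print(f"1280x768@60 (CVT): {pc_mode / 1e6:.2f} MHz")   # ~79.67 MHz

# CEA 1280x720 @ 60 Hz ("720p"): totals 1650 x 750, clock exactly 74.25 MHz
tv_mode = pixel_clock(1650, 750, 60)
print(f"1280x720@60 (CEA): {tv_mode / 1e6:.2f} MHz")   # 74.25 MHz

# A box that only recognizes CEA timings on its HDMI inputs would treat the
# CVT signal as an unknown video mode and scale it, which fits what I'm seeing.
```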
 

HGC

Senior member
Dec 22, 1999
605
0
0
I tried connecting my 19" Samsung by digital and by analog, and I could see a difference: sharper with digital, especially on text. The analog looked great, though.

I've seen two friends recently with new analog hookups to Dell boxes with cheap onboard video. One to a 19" Samsung 191T (same as mine), one to a Dell 19". I've got to say, the image quality is fantastic. I doubt anyone not into computers would care about any difference.
 

imported_Imp

Diamond Member
Dec 20, 2005
9,148
0
0
I have an analog LCD which I think looks great. However, theoretically and from the experiences of others, DVI is supposed to look better. Considering you're spending $300+ already, there's no reason to cheap out on maybe $50 (like I did :( ). Small issue, though; not sure if this happens to a lot of people, but someone in the house with a DVI LCD has a problem with the monitor detecting a DVI/analog connection: on cold boot, the monitor keeps looping DVI, analog, DVI, like it can't figure out the connection type. Probably isolated, but it's happening.
 

karfus

Junior Member
May 12, 2006
2
0
0
-"Why cant you set your graphics card to the exact resolution ? "

As I said, "1280x768 at any refresh rate spills over the edge of the screen and flickers like mad". That is the exact resolution of the monitor. The thing is, the media box is expecting a "video" (not PC) input on the HDMI ports, and therefore must be trying to scale it; otherwise it would fit dot-for-dot. There may be a workaround I'm not aware of, but unfortunately I doubt it, since the instructions cryptically indicate that "PC signals are out of correspondence" with the HDMI inputs. I would imagine this refers to something to do with refresh rate, though why that should matter on a natively progressive display I don't know.

I thought if nothing else I would bring this up for the benefit of anyone out there thinking of buying a plasma as a PC display device. It probably also applies to some very large LCD panels (those not targeted as computer monitors). It works, but not exactly the way you might expect. Naturally it's going to depend on the connections available. I would have thought a DVI input would have worked better, but then a DVI input and an HDMI input, according to everything I have read, are exactly the same thing, as far as the video part of the signal is concerned.
 

GrammatonJP

Golden Member
Feb 16, 2006
1,245
0
0
If you want the best picture for your TV, you get HDTV with DVI or HDMI; why would you settle for any less on your LCD monitor?

At 1280x1024 and below, analog is fine; I couldn't really tell the difference. But at 1600x1200 and 1920x1200, DVI is a must... I'm using 1920x1200 right now.
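There's a concrete reason the threshold falls around there: single-link DVI tops out at a 165 MHz pixel clock, and 1920x1200 at 60 Hz only fits under that with reduced-blanking timings. A rough sketch of the arithmetic (the timing totals are published VESA figures; published clocks are rounded slightly, so treat the outputs as approximate):

```python
# Why 1920x1200 pushes single-link DVI to its limit (165 MHz pixel clock).
DVI_SINGLE_LINK_MAX_HZ = 165e6

# name: (h_total, v_total, refresh) -- totals include blanking intervals
modes = {
    "1600x1200@60 (VESA DMT)": (2160, 1250, 60),
    "1920x1200@60 (CVT)": (2592, 1245, 60),
    "1920x1200@60 (CVT reduced blanking)": (2080, 1235, 60),
}

for name, (h_total, v_total, hz) in modes.items():
    clock = h_total * v_total * hz
    verdict = ("fits" if clock <= DVI_SINGLE_LINK_MAX_HZ
               else "needs dual link or reduced blanking")
    print(f"{name}: {clock / 1e6:.1f} MHz -> {verdict}")
```

Run it and 1600x1200@60 lands at 162.0 MHz (just under the limit), standard 1920x1200@60 at ~193.6 MHz (over), and the reduced-blanking variant at ~154.1 MHz (under), which is why 1920x1200 DVI monitors advertise reduced-blanking support.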