
DVI-D versus DVI-A

9nines

Senior member
What do you lose, as far as performance/picture quality, using an analog DVI (DVI-A) connection over a digital one (DVI-D)?

The DVI-D port on my LCD monitor might have gone bad. If I use a VGA cable with DVI adapters, I get a display on the DVI-A channel.


I noticed the screen does not automatically align at different resolutions (I have to adjust it with the vertical and horizontal position controls), but other than that I see no difference. I'm guessing the only difference would be that the analog signal is susceptible to interference, but for such a short run (3 meters), it's doubtful that will ever be a problem. Am I wrong? Am I missing any quality issues?

 
Thanks. I am trying to decide whether to get a new monitor or just use the DVI-A - leaning toward DVI-A and keeping the old one.
 
DVI-A is a little bit better than analog VGA, but it is far from digital... If it works fine for you, then stay with it. I for one notice a huge difference between an analog and a digital connection, and I can tell which one is being used. It gets worse as the resolution goes up.
 
Originally posted by: postmortemIA
DVI-A is a little bit better than analog VGA, but it is far from digital... If it works fine for you, then stay with it. I for one notice a huge difference between an analog and a digital connection, and I can tell which one is being used. It gets worse as the resolution goes up.


Does it get fuzzy at higher resolutions?
 
The signal on "DVI-A" is none other than a plain old VGA signal, just on a different plug. There is nothing "better" about that - in fact, passing through an adapter dongle has a tiny negative effect on signal quality.

As usual, the quality of the analog signal is very much in the hands of the individual card's designers.
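
To put rough numbers on why analog gets harder as the resolution climbs: the monitor's ADC has to sample each pixel inside its own time slot, and that slot shrinks fast. Here's a quick back-of-the-envelope sketch (Python; the totals are the standard VESA mode timings, and the figures are approximate):

```python
# Rough back-of-the-envelope: how long one pixel lasts on the analog wire.
# Total (active + blanking) timings are the standard VESA modes; treat the
# numbers as approximate.

MODES = {
    # name             h_total  v_total  refresh_hz
    "1024x768@60":  (1344,  806, 60),
    "1280x1024@60": (1688, 1066, 60),
    "1600x1200@60": (2160, 1250, 60),
}

for name, (h_total, v_total, hz) in MODES.items():
    pixel_clock = h_total * v_total * hz   # pixels per second
    pixel_ns = 1e9 / pixel_clock           # duration of one pixel slot
    print(f"{name}: pixel clock ~{pixel_clock / 1e6:.0f} MHz, "
          f"one pixel lasts ~{pixel_ns:.1f} ns")
```

At 1600x1200 a pixel lasts only about 6 ns on the wire, so small sampling-phase errors or cable bandwidth limits smear neighboring pixels together; a digital link carrying the same image either decodes cleanly or fails outright.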
 
My old Hitachi CM174 has a crap VGA decoder; my shiny newer Samsung 930 has a pretty decent one. With the former it's painful using VGA rather than digital from the same source; with the latter it's hard to tell them apart. The quality of the decoder on the monitor end deserves a mention too.
 
Shouldn't it be called DVI-I for analog transmission? I have never seen the DVI-A designation. Just curious.

C Snyder
 
Originally posted by: bigsnyder
Shouldn't it be called DVI-I for analog transmission? I have never seen the DVI-A designation. Just curious.

C Snyder

DVI-I = analog+digital
DVI-D = digital only
DVI-A = analog only

You can tell a connector carries analog if it has that little group of four pins around the flat blade on the side.
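
To spell out the practical consequence, here's a tiny illustrative sketch (the names are made up for clarity, not from any real API): a passive DVI-to-VGA adapter just reroutes the analog pins, so it only works where those pins are actually wired.

```python
# Illustrative only: which DVI variants carry the analog (VGA-compatible)
# signal, and therefore work with a cheap passive DVI->VGA adapter.
ANALOG_CAPABLE = {"DVI-I": True, "DVI-A": True, "DVI-D": False}

def passive_vga_adapter_works(connector: str) -> bool:
    """A passive adapter only reroutes pins; it cannot convert digital."""
    return ANALOG_CAPABLE[connector]

for c in ("DVI-I", "DVI-D", "DVI-A"):
    verdict = ("passive VGA adapter OK" if passive_vga_adapter_works(c)
               else "needs an active converter")
    print(f"{c}: {verdict}")
```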
 
Originally posted by: YOyoYOhowsDAjello
9nines, did the DVI-D just stop working or what? Are you sure there isn't an option to select input type or something?

Yes, it quit after two years. I tried another video card and still got no display, but it does work using a VGA cable on the DVI-A channel. ViewSonic tech support says it is a dead DVI-D port.
 