DVI vs. VGA: Image quality difference?

996GT2

Diamond Member
Jun 23, 2005
5,212
0
76
I'm using a ThinkPad T61 laptop (nVidia Quadro NVS 140M), which only has a VGA output port. I currently have it connected to a Dell UltraSharp U2311H (1920x1080) using a standard 6 ft VGA cable.

Since I do a lot of photo editing and movie watching on this computer, I was wondering: would there be an appreciable difference in image quality between VGA and DVI? I can get a DVI port by buying the optional ThinkPad dock, which is around $50 on eBay.
 

Bearach

Senior member
Dec 11, 2010
312
0
0
There is quite a difference, especially at higher resolutions. Photo editing and even watching videos will definitely be more of a pleasure.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
In my experience, resolutions over 1280x1024 do not look as good over VGA as they do over DVI. Also, on one of my monitors, color transitions look different in VGA mode than they do in DVI: on VGA they are less defined.
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
I'm using a ThinkPad T61 laptop (nVidia Quadro NVS 140M), which only has a VGA output port. I currently have it connected to a Dell UltraSharp U2311H (1920x1080) using a standard 6 ft VGA cable.

Since I do a lot of photo editing and movie watching on this computer, I was wondering: would there be an appreciable difference in image quality between VGA and DVI? I can get a DVI port by buying the optional ThinkPad dock, which is around $50 on eBay.

Unless you have a color probe to calibrate that monitor, it doesn't matter... and that's not a good Photoshop monitor to begin with.
 

Bass

Junior Member
Aug 4, 2010
24
0
0
I was in a similar boat about a month ago. I own a T400 and a U2211H, and for the first week or so I, too, was stuck with VGA output. At first it didn't seem so bad. Sure, text was a tad blurry and the screen flickered a little, but overall it still looked vastly superior to my T400's crappy 14.1" TN LCD right next to it. I then ordered an Advanced Mini Dock on eBay for about $30 (I happened to need the extra ports, but DVI was a nice bonus) and finally connected the monitor to the digital output. It was definitely an improvement.

So, to keep it short, if quality is a priority I would definitely consider getting the dock.

As VGA is an analog signal, there is always some degradation in the conversion from the PC's digital output to analog and back to digital inside the LCD. So if you don't want to spend extra on the dock, the best alternative is a better VGA cable (shorter and thicker), as the one Dell gives you is not the best.
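
If it helps to picture why, here's a toy Python sketch of that round trip (purely illustrative; the noise level is made up):

import random

def vga_round_trip(pixel, noise=1.5):
    # 8-bit value -> DAC -> analog cable (picks up noise) -> monitor's ADC
    analog = pixel + random.uniform(-noise, noise)
    return max(0, min(255, round(analog)))

row = [120, 121, 122, 123]               # a smooth one-step gradient
print([vga_round_trip(p) for p in row])  # adjacent levels can blur together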
 

rockyjohn

Member
Dec 4, 2009
104
0
0
Unless you have a color probe to calibrate that monitor, it doesn't matter... and that's not a good Photoshop monitor to begin with.

What have you been smoking?

That is a great Photoshop monitor. And don't take my word for it - read the review at TFT Central:

http://www.tftcentral.co.uk/reviews/dell_u2311h.htm

Which states:

"Colour accuracy, black depth and contrast ratio are all very strong, and in fact the U2311H offers some of the best performance we have seen from any monitor in these regards..... "
 

schmunk

Member
May 17, 2007
57
0
0
I use two ViewSonic VG930m 19" monitors in dual view at work, and my last work PC drove one over DVI and one over VGA from the GPU. I could definitely tell the difference in the colors of my desktop, and when I worked in spreadsheets I would notice a little fuzziness in the fonts on the VGA monitor that was not there on the DVI monitor.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
There is usually a slight amount of fuzziness over VGA at higher resolutions, but it all depends on the length of the VGA cable, the quality of the shielding, interference, and the quality of the analog-to-digital converter in the monitor. You can get a signal as perfect as DVI over VGA if everything is good. The only way to find out for your situation is to test the DVI output, though, unless you already notice some degradation in the picture quality.
 

schmunk

Member
May 17, 2007
57
0
0
You say you like watching movies; well, you won't ever watch any HDCP-protected content over VGA, so no Blu-ray drive :)

Really, though, a nice monitor like that limited to VGA, what a shame. There are more things that limit VGA than the cable, converter, and interference, like signal mapping, phase and clock corrections, etc. This might help:

http://www.thesmallest.com/lessonettes/dviandvga.html


Worst case scenario, if you don't like the docking station, sell it on eBay again.
 

996GT2

Diamond Member
Jun 23, 2005
5,212
0
76
Thanks, everyone! Bought a ThinkPad Advanced Dock today on eBay. Only $28 shipped :)
 

Ms. DICKINSON

Golden Member
May 17, 2010
1,221
1
81
It really depends on your monitor. Mine seems to get color bleed when using a DVI-to-HDMI connection. VGA turns out fine, but not as crisp as DVI.
 

996GT2

Diamond Member
Jun 23, 2005
5,212
0
76
It really depends on your monitor. Mine seems to get color bleed when using a DVI-to-HDMI connection. VGA turns out fine, but not as crisp as DVI.

I've got the Dell U2311, which is an IPS panel. Some people have reported issues with color consistency, but mine seems to be pretty good over VGA. I'll see how it does over DVI once I get the dock.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I've got the Dell U2311, which is an IPS panel. Some people have reported issues with color consistency, but mine seems to be pretty good over VGA. I'll see how it does over DVI once I get the dock.

Is that an e-IPS like the 2209WA?
 
May 29, 2010
174
0
71
You say you like watching movies; well, you won't ever watch any HDCP-protected content over VGA, so no Blu-ray drive :)



This is incorrect. HDCP doesn't give a darn about analog inputs/outputs; it only cares about "digital" ones, and Blu-ray drives work just fine with standard VGA. So, for example, if you have an older LCD monitor with a DVI input that doesn't support HDCP, it won't work with a PC Blu-ray player. However, if you have a monitor with a VGA input (LCD or otherwise), Blu-ray output will work just fine.

Case in point: I have an HDCP-enabled 40" LCD TV and an older non-HDCP 19" monitor that has both DVI and VGA inputs. If I attach the TV to the media PC via an HDMI cable, it will play Blu-ray just fine. However, as soon as I attach the older LCD monitor to the PC via DVI, Blu-ray output will stop, "because" the player notices that the older monitor is attached via a digital cable but does not support HDCP. If I attach that same older LCD monitor via a "VGA" cable, Blu-ray will play just fine to either the digital TV or the analog monitor. The same thing happens if both are hooked up at the same time, even if I'm outputting video only to the TV: as long as digital video output devices are connected, all of them must be HDCP compliant.

Basically, HDCP requires that any and all "digital" output connections be HDCP compliant, but analog connections are ignored. This allows older TVs with analog inputs to be used with Blu-ray players and such via VGA, component, composite, or other "analog" connections. There are a whole lot of customers with older TVs that the media industry isn't going to ignore by messing with the analog signal.
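
To put that rule in code form, here's a rough Python sketch of the logic as I understand it (my own illustration, not any real player API):

def hdcp_playback_allowed(outputs):
    # outputs: (connection, supports_hdcp) pairs, one per connected display
    for connection, supports_hdcp in outputs:
        # only digital links are checked; analog ones are simply ignored
        if connection in ("HDMI", "DVI", "DisplayPort") and not supports_hdcp:
            return False  # a single non-compliant digital sink blocks playback
    return True

# My dual-display case: a compliant HDMI TV plus the old non-HDCP monitor.
print(hdcp_playback_allowed([("HDMI", True), ("DVI", False)]))  # False: blocked
print(hdcp_playback_allowed([("HDMI", True), ("VGA", False)]))  # True: plays fine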


As to pure digital versus analog input signaling: after "proper" adjustment, most people can't tell the difference until you go up to either a larger physical size (like a 40-inch versus a 60-inch 1080p TV) or very high/fine resolutions. The biggest setup difference is that hooking up a monitor via a digital cable means the picture tends to be pretty good without any major adjustments, while hooking up with an analog VGA cable tends to require quite a bit more adjustment to look as good. As I said, though, once "adjusted", most people generally can't tell the difference, but when unadjusted and just swapping output cables, it is pretty easy to see a difference.
 

schmunk

Member
May 17, 2007
57
0
0
. . . Blu-ray drives work just fine with standard VGA.
Not trying to be inflammatory, but . . .
I would say that statement is incorrect. Are you getting full 1080p resolution on your 19" monitor, then? Yes, I believe you get a picture over VGA, but many LCDs do not allow full resolution through their VGA ports.

Many Blu-ray players reduce the resolution when HDCP-protected content is played back through a VGA connection (I believe it halves it to 960x540). You say your monitor is a 19"; well, I'll bet it isn't 1080p :biggrin:, so you wouldn't notice that you are outputting a reduced resolution, because it is being scaled anyway to something like 1280x1024.

As long as we are getting technical, are you familiar with AACS and the ICT flag, which was made official this year? http://www.blu-ray.com/news/?id=2849 There is also a DOT (Digital Only Token) flag that content owners (movie studios, on Blu-ray) have been able to activate on their discs since 2006.

The image constraint token is a digital flag built into Blu-ray and HD-DVD discs that determines how those discs output video signals through the player's output connectors. The goal of the image constraint token is to prevent unauthorized copies, or piracy, in high definition. An image constraint token works by instructing the player to downgrade Blu-ray and HD-DVD discs' native high-resolution 1080p video to a standard-resolution 540p for output through its analog video connectors. The video signal from the player's digital high-definition multimedia interface (HDMI) remains in full definition because HDMI output is copy-protected. The image constraint token is activated by the movie studio during the mastering process.
So, long story short, you cannot guarantee that Blu-ray will provide full 1080p over VGA; you might get a lower-res picture instead. Awesome!
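
Just to spell out the arithmetic from that quote, a quick Python illustration (only the numbers mentioned above):

native_w, native_h = 1920, 1080              # Blu-ray's native 1080p
ict_w, ict_h = native_w // 2, native_h // 2  # ICT halves each dimension on analog outputs
print(ict_w, ict_h)                          # 960 540
print((ict_w * ict_h) / (native_w * native_h))  # 0.25 -> a quarter of the pixels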
 
May 29, 2010
174
0
71
Not trying to be inflammatory, but . . .
I would say that statement is incorrect. Are you getting full 1080p resolution on your 19" monitor, then? Yes, I believe you get a picture over VGA, but many LCDs do not allow full resolution through their VGA ports.

Many Blu-ray players reduce the resolution when HDCP-protected content is played back through a VGA connection (I believe it halves it to 960x540). You say your monitor is a 19"; well, I'll bet it isn't 1080p :biggrin:, so you wouldn't notice that you are outputting a reduced resolution, because it is being scaled anyway to something like 1280x1024.

As long as we are getting technical, are you familiar with AACS and the ICT flag, which was made official this year? http://www.blu-ray.com/news/?id=2849 There is also a DOT (Digital Only Token) flag that content owners (movie studios, on Blu-ray) have been able to activate on their discs since 2006.

The image constraint token is a digital flag built into Blu-ray and HD-DVD discs that determines how those discs output video signals through the player's output connectors. The goal of the image constraint token is to prevent unauthorized copies, or piracy, in high definition. An image constraint token works by instructing the player to downgrade Blu-ray and HD-DVD discs' native high-resolution 1080p video to a standard-resolution 540p for output through its analog video connectors. The video signal from the player's digital high-definition multimedia interface (HDMI) remains in full definition because HDMI output is copy-protected. The image constraint token is activated by the movie studio during the mastering process.
So, long story short, you cannot guarantee that Blu-ray will provide full 1080p over VGA; you might get a lower-res picture instead. Awesome!


It's funny you mention the possibility of downscaling with VGA. What I didn't mention is that although my 40" 1080p TV supports digital HDMI, in actual usage I connect it to my media PC via an analog "VGA" cable rather than a digital HDMI cable (consider the media PC a dual-monitor setup, with a regular 19" LCD monitor for app monitoring and the 40" 1080p TV as the other desktop). Now, the first question you'd ask is "why" analog VGA versus digital HDMI for a 40" LCD TV. Simple: on a dual-monitor Win7 PC, once you set up an analog output, the analog output stays on even if you disconnect the device.

What this means is that if I use an HDMI cable to the TV on my dual-monitor PC and I switch the source on the "TV" away from the PC input, Win7 thinks the output device has been "removed" and will revert to single-desktop mode. That means any icons, applications, or whatever was on the TV side of the desktop will be moved to the other monitor when I switch sources on the TV. Basically, this can be a pain when I want to jump over to the OTA inputs on the TV (say I want to check the score of a game that's currently playing on OTA TV).

If I use the VGA cable, Win7 considers it always connected, even if I switch away on the TV side. It does not revert to single-desktop/monitor mode when the output is via analog VGA, even if the output device is disconnected. The secondary reason I attach the 40" TV via analog VGA rather than HDMI is that source switching on the TV is MUCH faster, simply because it does not have to establish the HDCP handshake (and despite the "standards", HDCP handshaking sometimes fails, even on the cable it was just working on a second ago, in which case you sometimes have to toggle it a few times, especially when you swap sources a lot).

As to the actual output via the VGA cable, I can assure you that my PC has the TV set up as a full 1080p monitor (it is even reflected in the TV's circuitry: when you change resolutions for any reason, the TV informs you via its OSD). Using my PC's HP Blu-ray drive, PowerDVD 10, and an HDCP-supporting video card, I can also assure you that output to the TV through the "VGA" cable is at full 1080p HD resolution. Obviously, when I play Blu-ray on my 19" 4:3 monitor I am not getting full 1080p playback, as that monitor does not have the resolution capability; however, on my 40" TV that "does" support 1080p, there is NO downscaling involved.

You can talk about standards all you want, but the reality is that it is fully possible to output full 1080p HD content over analog VGA, simply and easily. VGA is not limited to a paltry 1920x1080 @ 60 Hz signal; some of us were driving our old Trinitron CRTs at much higher resolutions and refresh rates than that 15+ years ago via VGA. In fact, it's just as easy, and without the hassle or drawbacks of the HDCP bullshit I mentioned. While VGA may require more fiddling to get as good an initial picture as digital, it is not exactly difficult. People have been color-matching analog monitor output for a damn long time now.

Anyone who has been around computers for a while and had the giant 21+ inch, 80-to-100-lb behemoth high-quality CRT monitors of yesteryear knows that VGA can output at far higher resolution and quality than people think today, because of the whole "digital is best" bullshit spewed by the industry. While consumer set-top players might have all sorts of industry-forced limitations, "PCs" are not artificially firmware-limited. As I stated, I use a computer Blu-ray drive and software.

"but many LCDs do not allow full resolution through the VGA ports" --> I don't know where that info came from as it is pure nonsense.. Go buy an LCD PC monitor from any store that has a VGA input port, attach it to a video card that support the native output resolution and frequency of the LCD monitor and the monitor will display that full resolution. If the TV or monitor you have artificially limits VGA input, you need to buy a different TV or monitor. You can argue about picture quality though analog versus digital input, but to say anything like you did about being VGA limited in resolution is foolish. The only difference between the digital output and analog output as far as the PC video card is concerned is that you might have to "manually" set the resolution and frequencies for the analog output to match the native LCD resolution as it is not always automatic like with digital.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
VGA might work well for analog displays, but not so well for LCDs. As others have mentioned, I have also seen LCDs do some processing of the VGA input, which makes colors look different than when driven by DVI.
Also, any kind of artifact is more visible on an LCD due to the technology itself. Remember that CRTs can show every resolution as native; there are no physical pixels, so they tolerate (mask) artifacts better.

I also use VGA input on my LCD TV and it does look great, but then I don't sit next to it.
 

schmunk

Member
May 17, 2007
57
0
0
You can argue about picture quality through analog versus digital input, but to say anything like you did about VGA being limited in resolution is foolish. The only difference between digital output and analog output, as far as the PC video card is concerned, is that you might have to "manually" set the resolution and frequencies for the analog output to match the native LCD resolution. As you don't seem to understand this, I question any other "knowledge" of video from you. I suppose Monster brand digital video cables work better too?

Your condescending tone makes it apparent that you are the type to forward "but I read it on the Internet" information as the truth when you have not actually tried to do it.


Ya, ya, I'm stupid and foolish and all, but I think you misunderstood me, or maybe I didn't explain my point. I was referring exclusively to Blu-ray playback resolution, not PC resolution. I also had a 21" CRT.

Didn't mean to sound condescending by any means; that is just what I understood from reading many articles about digital control, as I am very interested in the Electronic Frontier Foundation and their efforts to fight this type of control. I do think we will see more of these limits in the future, unfortunately.
I appreciate hearing your experiences, as I too am an HTPC fan, but no, you're right, I never hooked it up with VGA because I didn't have any reason to. I did use S-Video at one time on a MythTV HTPC build. I really thought it was not possible to play HDCP-protected Blu-ray through VGA. Guess I know now that you sometimes can, without any trouble. Too bad you didn't find anything new or interesting in my post.

You've really gone way out there, making comments about my knowledge, putting words in my mouth, even painting me as someone who likes Monster cables. How strange.

My original comment was only trying to make the point that sometimes VGA is not the best for Blu-ray playback. Lots of people on the AV forums complain about devices downgrading the resolution. That's the difference: your experience is what you've tried, while I am referring to reading posts from others that had problems.

I hope you don't need to think I know nothing at all in order to feel more knowledgeable yourself. We can discuss this and maybe both learn something and share experiences without the personal attacks. Don't you think?
 
May 29, 2010
174
0
71
Ya, ya, I'm stupid and foolish and all, but I think you misunderstood me, or maybe I didn't explain my point. I was referring exclusively to Blu-ray playback resolution, not PC resolution. I also had a 21" CRT.

Didn't mean to sound condescending by any means; that is just what I understood from reading many articles about digital control, as I am very interested in the Electronic Frontier Foundation and their efforts to fight this type of control. I do think we will see more of these limits in the future, unfortunately.
I appreciate hearing your experiences, as I too am an HTPC fan, but no, you're right, I never hooked it up with VGA because I didn't have any reason to. I did use S-Video at one time on a MythTV HTPC build. I really thought it was not possible to play HDCP-protected Blu-ray through VGA. Guess I know now that you sometimes can, without any trouble. Too bad you didn't find anything new or interesting in my post.

You've really gone way out there, making comments about my knowledge, putting words in my mouth, even painting me as someone who likes Monster cables. How strange.

My original comment was only trying to make the point that sometimes VGA is not the best for Blu-ray playback. Lots of people on the AV forums complain about devices downgrading the resolution. That's the difference: your experience is what you've tried, while I am referring to reading posts from others that had problems.

I hope you don't need to think I know nothing at all in order to feel more knowledgeable yourself. We can discuss this and maybe both learn something and share experiences without the personal attacks. Don't you think?


First, let me apologize. You must have read my reply right away, as I edited it immediately after hitting the post button to remove the poor comments; I realized I was in a bad mood over something unrelated and was being a jerk for no real reason. Apparently I wasn't fast enough. Either way, I did not mean to call you out on anything; it just came out that way because of my bad mood.

Again, apologies; I really didn't mean to be a jerk about it. I don't mind being an ass, but I like to reserve it for deserving people, and your response did not warrant it.
 

schmunk

Member
May 17, 2007
57
0
0
Cool, you're alright!

Hey, you'll laugh, but I got out my old VGA/DVI adapter and a VGA cable and tried a Blu-ray last night from my HTPC, Bram Stoker's Dracula, and lo and behold, Blu-ray played through the VGA port onto my Samsung. I then grabbed my old 22" Samsung LCD monitor and tried VGA there too (I was bored, I guess) and also got Blu-ray playback, though the weird thing about that is the ArcSoft player info said 1920x1080 even though my 22" is a 1680x1050 screen??
I guess I was confusing this with some of the set-top Blu-ray player issues that people were having with downscaling.
This is cool, because the Soyo 24" screen on my primary PC is not HDCP-compliant, which has stopped me from getting a Blu-ray drive there. Now I can, and I'll have a way to watch a Blu-ray when the kids are watching Scooby-Doo and such, so thanks for the tip.