2D quality - is it the same over a digital interface?

siki13

Junior Member
Oct 14, 2013
Here is my problem.
I have two LCDs with analog input only, and the only way to get crystal-clear, 100% 2D quality is to use old Matrox cards. With any other kind, the result is anything from slight to very heavy blurriness.
Since there are no Win 7 drivers for those Matrox graphics cards, I am forced to use other brands (Nvidia, ATI, Intel), or I'm even considering DisplayLink USB graphics. I am also buying an LCD that has DVI.
Has anyone noticed any difference between brands? Or, to put it differently: does a digital DVI/HDMI interface give 100% 2D quality no matter what graphics card I use?
 

BrightCandle

Diamond Member
Mar 15, 2007
As far as I know, the 2D image is completely unaffected by the vendor when it comes over DVI/HDMI. Digital has no blurriness or loss at all.
 

Zorander

Golden Member
Nov 3, 2010
siki13 said:
Does a digital DVI/HDMI interface give 100% 2D quality no matter what graphics card I use?
You are actually using the analog output of your cards when you hook up your monitor that way. The DVI-I port on modern cards can output both digital and analog signals.

In this case, there can indeed be differences. Today's cards do not come with top-quality DACs, since most users have moved on to digitally connected LCD displays (and an LCD fed an analog signal has to digitize it again with its own ADC). Matrox cards were known for quality analog output; it is hard to find anything better in that respect.
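As a quick way to see which physical connectors your card exposes and which one is live, here is a minimal sketch, assuming a Linux machine with the kernel's DRM sysfs interface (connector names such as card0-DVI-I-1 are examples; yours will differ):

    # Minimal sketch: list the card's physical connectors and their status.
    # Assumes Linux's DRM sysfs interface; paths and names vary per machine.
    import glob
    import os

    # Each directory is one connector, e.g. card0-VGA-1, card0-DVI-I-1,
    # card0-DVI-D-1. A DVI-I connector has the analog pins wired up;
    # DVI-D is digital only.
    for conn in sorted(glob.glob("/sys/class/drm/card[0-9]*-*")):
        status_file = os.path.join(conn, "status")
        if os.path.isfile(status_file):
            with open(status_file) as f:
                status = f.read().strip()  # "connected" or "disconnected"
            print(f"{os.path.basename(conn)}: {status}")

Note this only tells you what the port is, not whether the monitor is actually being driven digitally over it.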

Regards.
 

siki13

Junior Member
Oct 14, 2013
Zorander said:
You are actually using the analog output of your cards when you hook up your monitor that way. The DVI-I port on modern cards can output both digital and analog signals.

Forget analog; I used that as an example of what I was struggling with.
Now I am buying an LCD that will have a DVI input, and it'll be connected via the graphics card's DVI output, so completely digital.
My question (which BrightCandle already answered, but I would like some more opinions) is: when (edit: not if) connecting it the digital way, have you noticed any difference in 2D quality between different graphics cards, or even with this new DisplayLink technology (external USB graphics)?
 

blackened23

Diamond Member
Jul 26, 2011
All other things being equal, digital output will be better than VGA. Yes, I've seen the differences myself with cards and LCD panels that support both inputs. Or rather, your GPU can strongly affect the quality of VGA output; that is not the case with a digital output such as DVI or DisplayPort, which will be consistent and better than analog VGA.

With VGA output there were vast differences in 2D quality. Different vendors may have subtle nuances in color saturation settings and whatnot (between Intel iGPU, Nvidia, and AMD), but as far as actual 2D quality goes, digital has no variance; analog VGA does. So the bottom line is that you want digital DVI output; do not use VGA.
 

siki13

Junior Member
Oct 14, 2013
blackened23 said:
As far as actual 2D quality goes, digital has no variance; analog VGA does.

That is interesting and reassuring :)
 

Zorander

Golden Member
Nov 3, 2010
siki13 said:
When (edit: not if) connecting it the digital way, have you noticed any difference in 2D quality between different graphics cards, or even with this new DisplayLink technology (external USB graphics)?
Ah, in that case there should be no noticeable difference when using a digital output. I have used IPS and VA panels with cards from various vendors and saw no difference.
 

Insert_Nickname

Diamond Member
May 6, 2012
Well, if you're in the market for a new monitor anyway, I see no reason to bother with VGA. It's a 30-year-old analogue interface. The only reasons to use VGA today are legacy projectors or very cheap monitors.

Get a good-quality IPS monitor (1440p if your budget allows) with DVI/DisplayPort. If it's visual quality you're after, IPS is the way to go. Your eyes will thank you... :D
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
Yeah, keep in mind that digital doesn't suffer the same image-quality degradation that analogue does; it tends to either work perfectly or fail entirely.

Also keep in mind that DVI can carry analogue signals: DVI-A and DVI-I both have pinouts that support the analogue signal, so you're not getting a digital signal that way, simply an analogue one over a digital connector. If you're using the analogue D-Sub connector on the display, it's always an analogue signal, subject to the DAC quality of the card and presumably the ADC of the monitor, plus any interference in between, like signal loss caused by noise or long VGA cable runs.
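If you want to confirm what the display itself reports, its EDID block contains a "video input definition" byte (byte 20) whose top bit indicates a digital input. A minimal sketch, again assuming Linux's DRM sysfs interface, with an example connector path you would substitute for your own:

    # Minimal sketch: check whether the monitor's EDID declares a digital
    # input. Assumes Linux's DRM sysfs; the connector path is an example.
    EDID_PATH = "/sys/class/drm/card0-DVI-I-1/edid"

    with open(EDID_PATH, "rb") as f:
        edid = f.read()

    # A valid base EDID block is 128 bytes and starts with this fixed header.
    if len(edid) >= 128 and edid[:8] == bytes.fromhex("00ffffffffffff00"):
        # Byte 20 is the "video input definition"; bit 7 set means digital.
        print("digital input" if edid[20] & 0x80 else "analog input")
    else:
        print("no valid EDID read from this connector")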
 

blackened23

Diamond Member
Jul 26, 2011
What he said. Do not use any kind of D-Sub adapter; avoid adapters, period, if at all possible. If you want the benefits of a digital output, it has to be a DVI-D source and a DVI-D monitor. Using a D-Sub VGA adapter will again subject you to the image-quality variances that you do not want.