Here is a quote from DisplayMate regarding image quality and DVI video cards:
Quote:
In principle all DVI sources are equivalent. In practice some combinations of graphics boards and monitors don't work properly together.
Naturally, with my luck, I am one of those individuals who has experienced an incompatibility between a monitor (Eizo CG210) and a video card (ATI X1800 XT). The artifacts produced over a DVI connection between the two were essentially screen speckles, and this was confirmed by a technical note I found on the Eizo website that clearly documented the incompatibility.
After seeing that note, I went out and returned the ATI card in exchange for a BFG GeForce 7800 GTX. Sure enough, the GeForce 7800 GTX cleaned up the speckles, though not entirely: a few remnant speckles remain, which could be caused by the GTX or by damage done to the CG210 while it was connected to the ATI card. So out of complete frustration and paranoia, along with my dislike of the horizontal electrical surges that appear on the CG210 when the computer boots, I decided to let go of the GeForce 7800 GTX.
I then went out and got a BFG 6600 GT OC to see whether the boot-up screen surges would go away, and to my luck they did. The computer now boots cleanly, without any screen blips or surges of any sort. Of course, the tiny remnant speckles still remain, which has me questioning whether there really is no such thing as image quality when it comes to DVI.
What do you guys think? Would a midrange desktop card like the 6600 GT really have the exact same DVI signal quality as an ultra-high-end Quadro FX 4500? Or could a professional workstation card like a Quadro actually have better signal quality, and thus produce a better image on the same screen than a desktop card like the GeForce 6600 GT would?
I mean, technically, digital is digital: it should not have in-betweens like analog, just 100% or 0%. But after what I saw with the ATI card, I started thinking that maybe my card has some kind of incompatibility at a very minute scale that is causing these remaining speckles, and that a high-end professional card would have higher signal quality and therefore a lower likelihood of such incompatibilities.
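To illustrate why I'm no longer sure "digital" means all-or-nothing, here is a minimal Python sketch. It's my own toy model (not anything from DisplayMate or Eizo, and the bit-error rate is a made-up number): a marginal digital link can flip single bits, so one bad bit becomes one wrong pixel, i.e. a speckle, while the rest of the image arrives perfectly.
Code:
import random

random.seed(42)

def transmit(pixels, bit_error_rate):
    """Send 8-bit pixel values over a noisy digital link.

    Each of the 8 bits in every pixel is flipped independently with
    probability bit_error_rate (a made-up parameter for demonstration).
    """
    received = []
    for value in pixels:
        for bit in range(8):
            if random.random() < bit_error_rate:
                value ^= 1 << bit  # one flipped bit corrupts only this pixel
        received.append(value)
    return received

# A uniform mid-gray scanline: every pixel carries the value 128.
scanline = [128] * 20

# A clean link is truly "100%": the image arrives bit-perfect.
assert transmit(scanline, 0.0) == scanline

# A marginal link delivers mostly correct pixels plus scattered wrong ones;
# flipping the high bit, for instance, turns gray (128) into black (0).
noisy = transmit(scanline, 0.02)
speckles = sum(1 for a, b in zip(scanline, noisy) if a != b)
print(f"{speckles} of {len(scanline)} pixels corrupted: {noisy}")

If that model is right, then a DVI link is digital per bit rather than per image, and the sparse speckles I'm seeing would just be occasional bit errors rather than some analog-style in-between state.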
It's so confusing. Maybe I'm just paranoid. After all, Eizo didn't mention any incompatibilities between their monitors and NVIDIA cards.
P.S. According to my computer buddy, those electrical surges would not have damaged my monitor. They are apparently just horizontal scan lines of some sort that the graphics card produces while adjusting itself to the monitor, or something like that. I can't fully recall exactly what he said, other than that there would have been no damage to the monitor. My point is that the remaining speckles I see would not have been caused by the blips the 7800 GTX was producing with my monitor.