Originally posted by: CP5670
On a side note, is there any benefit (as far as signal quality goes) to using a cable like this instead of the adapter and a normal VGA cable?
Thank you for pointing that cable out. I would expect that if you are running a high-end CRT there would be a noticeable improvement in signal quality using that cable instead of the converters. I can tell you that there is a sizeable dropoff in signal integrity with the adapters when you start pushing the upper limits of high-end CRTs (16x12@100 to 20x15@85). I have a video card with DVI+VGA out and a monitor with dual inputs, and the supplied adapter clearly degrades signal quality to a level that is noticeable even to non-geeks.
The adapters just transfer from one pinout to the other. Unless you have one that is crappily made out of metal that isn't conductive enough, and/or one whose pins and holes don't connect well with the plugs on your other parts, you won't be losing any image quality by using an adapter.
Yeah, Ben, we have been through this a few times. However, if you understood the principles of electron-conduction you would understand that the degradation in image quality you saw was either the result of a sub-standard adapter or simply a case of seeing what you want to believe; there is nothing inherent to the use of an adapter that will cause issues any more than using an inch longer cable.
Originally posted by: BenSkywalker
The adapters just transfer from one pinout to the other. Unless you have one that is crappily made out of metal that isn't conductive enough, and/or one whose pins and holes don't connect well with the plugs on your other parts, you won't be losing any image quality by using an adapter.
I've heard that same thing from almost everyone who has never actually tried it at the settings I'm talking about. I have.
if you understood the principles of electron-conduction
there is nothing inherent to the use of an adapter that will cause issues any more than using an inch longer cable.
Heh, perfection isn't even close to necessary to avoid signal degradation in a low-amperage signal system like VGA; little bits of copper pressed firmly against each other do that just fine.
Exactly how much bandwidth do you think we are talking about, and how does that compare to the bandwidth of a gigabit LAN? Yeah.
I'm not sure what you are doing with your math there
So, assuming a system is designed to utilize low-cost gigabit-Ethernet components operating at 1250Mbps, it could support SXGA (1280×1024) at 75Hz refresh rate over three fibers or only SVGA (800×600) on one fiber.
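A quick sanity check on those quoted figures; this is only a rough sketch, assuming 24-bit color and ignoring blanking intervals, and treating each 1250 Mbps gigabit-Ethernet link as roughly 1 Gbps of usable payload after 8b/10b line coding:

# Rough check of the quoted gigabit-Ethernet video numbers.
# Assumes 24 bits per pixel and ignores blanking overhead.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

sxga = video_bitrate_gbps(1280, 1024, 75)  # ~2.36 Gbps -> needs about three gigabit links
svga = video_bitrate_gbps(800, 600, 75)    # ~0.86 Gbps -> fits on a single gigabit link

print(f"SXGA @ 75Hz: {sxga:.2f} Gbps")
print(f"SVGA @ 75Hz: {svga:.2f} Gbps")

That comes out to roughly 2.4 Gbps for SXGA and 0.9 Gbps for SVGA, which lines up with the article's "three fibers" and "one fiber" claim.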
The limitations of this technology come into play as the pixel rate and distance increase. The Cat 5 copper cabling has a certain amount of capacitance per foot, and that acts as a low-pass filter that reduces the signal-to-noise ratio of the differential analog video signal; electrical interference from motors and fluorescent lights also can be coupled into the line. Eventually the signal at the receiver end deteriorates to the point that random noise ("snow" effect) or bandwidth limitations become visible. Earth ground differences between transmitter and receiver can result in slow-moving hum bars on the display.
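To put a rough number on that low-pass behaviour, here is a sketch only, using ballpark attenuation figures for Cat 5 class cable of roughly 2 dB per 100 m at 1 MHz and roughly 22 dB per 100 m at 100 MHz; treat those exact values as assumptions, not spec numbers:

# Illustration of why frequency-dependent cable loss acts like a low-pass filter:
# attenuation in dB grows with frequency, so the high-frequency content
# (sharp pixel edges) fades much faster than the low-frequency content.
# The dB-per-100m figures below are ballpark assumptions.

def surviving_fraction(atten_db_per_100m, length_m):
    """Fraction of the original signal amplitude left after the cable run."""
    total_db = atten_db_per_100m * length_m / 100.0
    return 10 ** (-total_db / 20.0)

for freq_label, atten in [("1 MHz", 2.0), ("100 MHz", 22.0)]:
    print(f"{freq_label}: {surviving_fraction(atten, 100):.0%} of amplitude left after 100 m")
# -> roughly 79% survives at 1 MHz vs roughly 8% at 100 MHz

That widening gap between low and high frequencies is the low-pass effect the article is describing.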
Note that a good share of the high-end workstation graphics cards are dual DVI, and a good number of them are attached to high-end CRTs using DVI>VGA adapters at high resolutions and refresh rates.
Originally posted by: BenSkywalker
I'm not sure what you are doing with your math there
Need another reference? BTW, their figures are @75Hz, which is why mine are slightly higher. It isn't like the math is very complicated either; not sure why anyone would have problems with it.
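Run the same arithmetic at the 16x12/20x15 settings mentioned above (a rough sketch again: 24-bit color, blanking overhead ignored, so the real pixel clocks are somewhat higher) and compare it to a gigabit LAN:

# Back-of-the-envelope data rates at high-end CRT settings,
# compared against a 1 Gbps LAN link. Assumes 24 bits per pixel
# and ignores blanking overhead.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("1600x1200 @ 100Hz", 1600, 1200, 100),
    ("2048x1536 @ 85Hz", 2048, 1536, 85),
]

for label, w, h, hz in modes:
    rate = video_bitrate_gbps(w, h, hz)
    print(f"{label}: {rate:.1f} Gbps (~{rate:.0f}x a 1 Gbps LAN link)")

That works out to roughly 4.6 Gbps and 6.4 Gbps respectively, several times the bandwidth of a gigabit LAN.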