dual monitor adapter cables

substance12

Senior member
At my work Dell computer I have an adapter that splits a single DVI output into DVI and VGA, which then goes to two monitors. Is that any better in any way than just hooking up two monitors to the DVI and VGA ports that many higher-end video cards typically have?
 
Doing that requires the graphics card to be designed for it. Normally, a DVI-I port can drive EITHER the digital OR the analog copy of the signal, not both at once.

In other words, the port on that Dell card isn't a normal DVI port.
 
Originally posted by: substance12
At my work Dell computer I have an adapter that splits a single DVI output into DVI and VGA, which then goes to two monitors. Is that any better in any way than just hooking up two monitors to the DVI and VGA ports that many higher-end video cards typically have?

There are cables that will split the signal from a DVI-I connector into separate HD-15 (analog VGA) and DVI-D connectors. If the card doesn't support dual displays, you end up cloning the single image on two separate monitors. If the card (hardware) supports dual displays and the drivers enable this support, you can run two monitors and extend your desktop across the two monitors.

If I understand what you are asking, then I don't see any advantage/disadvantage to running two monitors from the HD-15/DVI-D on the card backplane, vs running two monitors from a signal splitting DVI-I cable - as long as the hardware/drivers support the signal splitting DVI-I cable.
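The clone-versus-extend distinction above can be pictured in terms of the virtual desktop the driver builds. A minimal sketch (the `Display` tuple and `desktop_size` helper are hypothetical illustrations, not a real driver API): in clone mode both monitors show the same image, so the usable desktop is bounded by the smaller panel; in extended mode the widths add up.

```python
# Sketch of how "clone" vs "extend" dual-monitor layouts differ.
# The Display type and desktop_size helper are hypothetical, for illustration only.
from typing import List, Tuple

Display = Tuple[int, int]  # (width, height) in pixels

def desktop_size(displays: List[Display], mode: str) -> Display:
    """Compute the virtual desktop size for a set of monitors.

    'clone'  -> every monitor shows the same image, so the desktop is
                bounded by the smallest monitor in each dimension.
    'extend' -> monitors sit side by side, so widths add up and the
                height is that of the tallest monitor.
    """
    if mode == "clone":
        return (min(w for w, _ in displays), min(h for _, h in displays))
    if mode == "extend":
        return (sum(w for w, _ in displays), max(h for _, h in displays))
    raise ValueError(f"unknown mode: {mode}")

# Example: a 1280x1024 LCD next to a 1024x768 LCD.
print(desktop_size([(1280, 1024), (1024, 768)], "clone"))   # (1024, 768)
print(desktop_size([(1280, 1024), (1024, 768)], "extend"))  # (2304, 1024)
```

Either way, whether the second head actually works depends on the card hardware and drivers, as noted above.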
 
As I said, cards typically can't serve individual content to the analog and digital halves of one single DVI-I port. The reason is that there is only one set of display detection (DDC) data lines on the DVI-I connector, so it's either digital OR analog use there.
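Those display detection lines are the DDC channel, over which a monitor returns a 128-byte EDID block describing itself; byte 20 of that block (the Video Input Definition) says whether the display is digital or analog, which is why one DVI-I connector normally identifies only one display. A hedged sketch of decoding that bit (the sample bytes below are fabricated for illustration):

```python
# Sketch: decode the input-type bit from an EDID block, i.e. the data a
# monitor returns over the DVI connector's DDC lines. Offsets follow EDID 1.x.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def input_type(edid: bytes) -> str:
    """Return 'digital' or 'analog' from byte 20 (Video Input Definition)."""
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID 1.x block")
    # Bit 7 of byte 20: 1 = digital input, 0 = analog input.
    return "digital" if edid[20] & 0x80 else "analog"

# Fabricated 128-byte example: valid header plus a "digital" input byte.
sample = bytearray(128)
sample[:8] = EDID_HEADER
sample[20] = 0x80  # bit 7 set -> digital (e.g. the DVI-D side of the port)
print(input_type(bytes(sample)))  # digital
```

Since both halves of a DVI-I port share this one DDC channel, the card only ever sees one EDID there.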
 
The only reason I ask is that the LCD on the VGA port doesn't work right sometimes. When the LCD turns off from mouse inactivity and I move my mouse again, the screen looks all messed up and I have to power the LCD off and on. I was wondering if this was possibly a video card problem.

I think I may have fixed it, though. I swapped the monitors that were on the VGA and DVI ports. It may simply be that one of the LCDs doesn't work well over VGA.
 