How to: Dual monitor with cloning left one (3 monitors total)

twinturbostang

Junior Member
Dec 2, 2005
11
0
0
I've got a machine tool at work that I am trying to set up with multiple monitors and multiple stations. The intent is to have one monitor in front of the machine which gives easy access to running it. Then on a desk about 8 feet away I would like to have a clone of that monitor, plus a second one set up in dual monitor configuration. This allows me to keep track of the tool software (which runs on just the first monitor) plus do analysis and run other programs in dual monitor mode. Since the machine tool software only runs on the first monitor, it's not necessary to have (and actually no space for) dual monitors in front of the tool. So only the first (or "Left") image is displayed there.

So, I tried setting up a dual monitor configuration first, which works. Then I took the primary monitor output and used a DVI splitter to run that image both to the desktop setup and to the monitor in front of the tool. No dice. Every time the third monitor is plugged in, it stops working. I have tried every possible combination of the connections (e.g., splitter on the top DVI output, then the bottom DVI output; I also tried the S-video output, etc.). I just can't get it to work the way I want it to. I can either run in "clone" mode where I have the same image on the desk and in front of the tool (but no dual monitor), OR I can have dual monitor on the desk, but no image at the tool. The computer seems to know there are three monitors attached and refuses to run them all (even though one is just a clone).

Is it possible to get this working? Do I need a different video card? I currently have the ATI Radeon HD 2600 Pro. BTW, if I do need a different video card, I have only ONE slot available. All of the other slots in the computer are taken by controller cards for the tool. :-(

If you guys have any suggestions, I would really appreciate it!

Thanks,
Brian

P.S. I have a simple diagram of what I'm trying to do, but I don't see an option of attaching images on this forum.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
The remote monitor and its local clone need to be on the same display card, running the vendor's clone mode (Windows doesn't do that for you, the card's driver does). The 3rd monitor runs on a 2nd card, preferably of the same type so you don't run into disagreements between two different driver types.
 

twinturbostang

Junior Member
Dec 2, 2005
11
0
0
So I guess that means I'm out of luck, if I have to use two video cards. As I mentioned, I have no space available for any more cards. All slots are filled completely. :(
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Perhaps a different splitter would work (powered rather than passive?), or it might help if both displays were configured as generic, or if both were actually identical models. Another option may be a USB to DVI adapter (it has its own integrated GPU).
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Splitting an analog VGA signal is much more feasible than "splitting" DVI. The latter is a high-speed point-to-point digital signal, not at all made to drive two clients. The failure rate in doing so anyway is expected to be high.

VGA, on the other hand, is analog - loss of signal quality just means it'll look washed out or have a bit of ghosting, but unlike a digital signal, you don't lose the display altogether.
 

twinturbostang

Junior Member
Dec 2, 2005
11
0
0
Thanks guys. It's worth a shot I guess, trying to run it in VGA. BTW, all three monitors are identical Dell 2007FP 20-inchers, except one has a silver bezel instead of black, but I doubt the electronics are different.