Hey Guys,
I recently bought a 2nd 30" monitor (Dell 3007WFP-HC) to go along with my 30" Apple. I also have a 17" Dell LCD that I use.
I had been running the Apple monitor + the 17" Dell on a 7800GTX. I purchased an 8600GT as a second card so I could run both 30" monitors plus the 17". When I have the 7800GTX and the 8600GT installed at the same time (with monitors attached to both), both cards are detected by Windows and show up, along with their monitors, in the nVidia and Windows display panels. But even when I enable the extra displays, only the monitors plugged into the 7800GTX get any output.
When I remove the 7800GTX and use only the 8600GT, both 30" monitors work fine, but then I can't plug in my 17" LCD as well, because the card only has two outputs.
Any idea what the problem might be? My mobo is a Gigabyte P35-DQ6.