Dual GTX 670 to use 2x Cinema Displays?

dsc106

Senior member
May 31, 2012
The GTX 670 only has one DisplayPort output, which means I'd need an $80 adapter to use my second Cinema Display.

Could I get a second GTX 670 and use the DisplayPort on that one to power two separate displays for video editing? Or in SLI, do both displays have to be plugged into one graphics card?

I wouldn't be using two displays for gaming, just desktop/graphics/etc., but I would have the cards SLI'd for gaming and switch to the main monitor for that.

Would this work?
 

dsc106

Senior member
May 31, 2012
I have an EVGA GTX 670 FTW and an EVGA GTX 285. Could I install both into my Asus Rampage IV Extreme motherboard, have them both plugged in and ready to go, and then just switch the monitor cable to the GTX 670 for Windows and to the older GTX 285 when I boot into the Hackintosh (still waiting on full GTX 670 drivers for Hackintosh)?

Would Windows know which GPU to use? When I launch a game or do anything else, would the GTX 285 pretty much sit there idle while it just uses the GTX 670? How does the OS know which card to use? Is it just based on which one has a monitor cable plugged in?
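For reference, a minimal CUDA sketch (not taken from this thread, and assuming the NVIDIA driver and a CUDA toolkit are installed) of how compute applications see the cards: they enumerate every installed GPU and pick one by index, independent of which card has a monitor cable attached, while the desktop itself is generally drawn by whichever card the monitor is cabled to.

```c
/* Hedged sketch: list every CUDA-capable GPU in the box, regardless of which
 * card has a monitor attached. Requires the NVIDIA driver and CUDA toolkit;
 * compile with nvcc. */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA devices found\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s (compute %d.%d)\n",
               i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```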

Merging two threads since they're virtually identical. Please try to stick to one thread, dsc106
-ViRGE
 
Last edited by a moderator:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
What you propose will work just fine under Windows. As for Mac OS X, the last time I checked you had to do some serious config file editing to get two cards working on a Hackintosh.
 

dsc106

Senior member
May 31, 2012
Sorry guys, these are two very different conversations that got merged together... :/ Please answer with a number so the two stay separate (based on post #1 and post #2):

1.) Using two GTX 670s in SLI: can I run a monitor from each card separately for non-gaming purposes? Because the Cinema Displays only take DisplayPort signals, this would give me two DisplayPort outputs across the two cards. But for gaming, I would just use the main monitor on the card in slot 1... how would this work out?

2.) Using two different GPUs (GTX 285, GTX 670) installed in the computer, I would want the GTX 670 to be "on" in Windows and the GTX 285 to basically be off/ignored. Could I just disable the GTX 285 in Device Manager? For the Hackintosh, could I drop the GTX 285 into slot 1, since the Hackintosh is picky, and let it ignore the GTX 670 in slot 2 with no monitors plugged in? But in Windows, have it use the GTX 670 and switch my monitor cable over? Would I have to use Device Manager for this, or how else could it work?

Sorry for the confusion. These were supposed to be two different threads.
 

dsc106

Senior member
May 31, 2012
There is no DVI-to-DisplayPort adapter?

No, there is not. There are active converters that cost $80, and they're sort of finicky and use scalers. The specs aren't compatible. It won't work with HDMI either.
 

lopri

Elite Member
Jul 27, 2002
No, I mean a DVI-to-DisplayPort "converter" (not DisplayPort-to-DVI). Do they exist? (I don't see why not, but I don't know whether they do.)
 

pcm81

Senior member
Mar 11, 2011
The GTX 670 only has one DisplayPort output, which means I'd need an $80 adapter to use my second Cinema Display.

Could I get a second GTX 670 and use the DisplayPort on that one to power two separate displays for video editing? Or in SLI, do both displays have to be plugged into one graphics card?

I wouldn't be using two displays for gaming, just desktop/graphics/etc., but I would have the cards SLI'd for gaming and switch to the main monitor for that.

Would this work?

Should have gone with the red team...
 

dsc106

Senior member
May 31, 2012
Should have gone with the red team...

Did you even read the posts? I already said I have to go with NVIDIA for the Hackintosh.

I also already stated you can't go DVI to DP without an $80 adapter.

Can someone please tell me whether I can run two monitors, one from each of two GTX 670 cards in SLI, for video/desktop work? And then, when I play a game, have it ignore the monitor plugged into the card in slot 2 and just use the output from the card in slot 1 to the main monitor (would it do this by default)?
 

lopri

Elite Member
Jul 27, 2002
Oh, you're set on SLI. I thought you were trying to get it done with a single card first. Well, in that case ViRGE has given you the answer: you can enable/disable SLI with two monitors connected. When SLI is enabled, the other monitor will simply go blank.

What you propose will work just fine under Windows. As for Mac OS X, the last time I checked you had to do some serious config file editing to get two cards working on a Hackintosh.
 

dsc106

Senior member
May 31, 2012
I wasn't sure whether ViRGE was referring to post 1 or post 2. Will SLI power still be available for CUDA tasks on video/desktop work when a monitor is plugged into each card separately? Or does SLI only "turn on" when I flip a switch and force everything to a single monitor?
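For reference, a hedged CUDA sketch of the compute side: work is dispatched to a specific card with cudaSetDevice(), independent of SLI state or of which card drives a monitor. The device index 0 below is an assumption; the enumeration sketch earlier in the thread shows which index actually belongs to the GTX 670.

```c
/* Hedged sketch: run a trivial kernel on one chosen GPU. The index passed to
 * cudaSetDevice() is an assumption; pick the GTX 670's index from the
 * enumeration output. SLI on or off does not change this selection. */
#include <stdio.h>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;
    float *d_buf = NULL;

    cudaSetDevice(0);                       /* assumed index of the GTX 670 */
    cudaMalloc((void **)&d_buf, n * sizeof(float));
    cudaMemset(d_buf, 0, n * sizeof(float));

    scale<<<(n + 255) / 256, 256>>>(d_buf, 2.0f, n);
    cudaDeviceSynchronize();

    printf("Kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(d_buf);
    return 0;
}
```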
 

lopri

Elite Member
Jul 27, 2002
Apparently you can -> http://www.geforce.com/hardware/technology/sli/faq#s1

I did not know that. In the past you either enabled SLI on one monitor or disabled it to use extra displays. According to NVIDIA, you can connect both displays to the "master" card and use two displays in SLI mode. There is also an SLI "Focus" mode where one display is prioritized, if you read the FAQ.

(Your Apple monitor and its DisplayPort input could be picky, though.)
 