DVI-D vs DVI-I output

96Firebird

Diamond Member
Nov 8, 2010
It doesn't matter which one I use for which monitor, right? I hooked up my new U2412M (1920x1200) to the DVI-D port on my GTX 770, and hooked up my old 22" Acer (1680x1050) to the DVI-I port. Problem is, the DVI-I port gets the signal while the BIOS is showing and while Windows loads, so everything is displaying on what I want to be my secondary monitor. I've read there is no way to change this, so I'd have to swap the cables at the outputs. But that shouldn't affect quality, correct? Both cables are DVI-D single-link.
 

Dahak

Diamond Member
Mar 2, 2000
Correct, there is really no difference between DVI-D and DVI-I other than DVI-I also carries the analog pins, so it lets you use a DVI-to-VGA adapter if you need a VGA display.

By the sound of it, you want your main to be the one on the DVI-D port? It could be one of two things:
One of the monitors is responding faster than the other,
or the card has a "primary" port, which would be why you see the 2nd monitor during the BIOS and Windows load. By the sound of it, that would be the DVI-I port.

You can change which one is the primary in Windows, but that only takes effect after Windows is loaded. If you want to script it instead of using the Display Settings dialog, something like the sketch below works.
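
For what it's worth, here is a minimal C sketch (an assumption-laden example, not anything GPU-specific) of setting the primary monitor programmatically with the Win32 API. The device name \\.\DISPLAY1 is a placeholder; enumerate with EnumDisplayDevices to find which output the U2412M is actually on. Windows treats the display whose desktop origin is (0,0) as the primary, and just like the control panel this only applies once Windows is running; it cannot change which port gets the BIOS/boot screen.

```c
// Minimal sketch: make one monitor the primary display via Win32.
// Assumes a simple two-monitor layout; in general, the other monitors'
// dmPosition values should also be shifted relative to the new origin.
#include <windows.h>
#include <stdio.h>

int main(void)
{
    // Placeholder device name (assumption) - enumerate with
    // EnumDisplayDevices to find the display on the DVI-D output.
    const char *target = "\\\\.\\DISPLAY1";

    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    // Read the device's current mode so we only change its position.
    if (!EnumDisplaySettingsA(target, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "Could not read settings for %s\n", target);
        return 1;
    }

    // The primary monitor is the one whose desktop origin is (0,0).
    dm.dmPosition.x = 0;
    dm.dmPosition.y = 0;
    dm.dmFields |= DM_POSITION;

    LONG rc = ChangeDisplaySettingsExA(
        target, &dm, NULL,
        CDS_SET_PRIMARY | CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
    if (rc != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "ChangeDisplaySettingsExA failed: %ld\n", rc);
        return 1;
    }

    // Apply all pending changes in one pass.
    ChangeDisplaySettingsA(NULL, 0);
    return 0;
}
```

Build with a Windows compiler and link user32 (e.g. cl primary.c user32.lib). Again, this only changes the desktop primary after Windows loads; the boot display stays on whichever port the card favors.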
 

96Firebird

Diamond Member
Nov 8, 2010
Yeah, I think the "primary port" is the DVI-I, so I guess I'll just switch my U2412M over to that, since there is no decrease in quality.

My Acer is old and starting to show its age by taking a little too long to come out of sleep. I don't want it to keep turning on and then off again after Windows has loaded.

Thanks for your input.