Forcing digital display out (DVI) in Nvidia cards under Windows

CarlosVR2

Junior Member
Aug 14, 2002
Hello. I'm trying to connect an Infocus LP130 projector to the DVI-I output of an Nvidia GeForce 3 Ti 500 video card, using a DVI-D cable I bought at the Infocus Online Store.

When the PC boots, the POST screen and the initial Windows 98 screen are shown on the projector, but as soon as the PC enters Windows, the screen goes blank, and I have to reconnect the analog monitor in order to see anything.

I tried to switch to "digital display" in Nvidia's display properties page, but the option is grayed out, and clicking on "detect displays" has no effect. I'm using Nvidia's 30.82 standard reference drivers.

By the way, the TV-Out works perfectly.

Because the DVI output works until Windows is loaded, I suspect this is a driver problem. I checked Nvidia's docs for Linux, and there is a way to force the card to enable the DVI output under Linux:

- - - Taken from Nvidia's docs - - -

Option "ConnectedMonitor" "string"
Allows you to override what the NVIDIA kernel module
detects is connected to your video card. This may
be useful, for example, if you use a KVM (keyboard,
video, mouse) switch and you are switched away when
X is started. In such a situation, the NVIDIA kernel
module can't detect what display devices are connected,
and the NVIDIA X driver assumes you have a single CRT
connected. If, however, you use a digital flat panel
instead of a CRT, use this option to explicitly tell the
NVIDIA X driver what is connected. Valid values for this
option are "CRT" (cathode ray tube), "DFP" (digital flat
panel), or "TV" (television); if using TwinView, this
option may be a comma-separated list of display devices;
e.g.: "CRT, CRT" or "CRT, DFP". Default: string is NULL.

- - - Taken from Nvidia's docs - - -
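For reference, under Linux that option would go in the "Device" section of the XF86Config file, something like this (a sketch only; the Identifier string is just an example, and I haven't tested this myself):

```
Section "Device"
    Identifier  "GeForce3 Ti 500"
    Driver      "nvidia"
    # Force the driver to treat the connected display as a
    # digital flat panel instead of auto-detecting a CRT
    Option      "ConnectedMonitor" "DFP"
EndSection
```

But that only helps on Linux, and I need the equivalent on Windows.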

Does anybody know a way to do this on a Windows-based system, or any other workaround for this problem?

Thanks in advance!