Question 1050 Ti Multiple Monitor issue

a212ww

Junior Member
Nov 20, 2016
14
0
36
Hi, I recently replaced my very old Core 2 era motherboard with a slightly newer one. I have an MSI GTX 1050 Ti and 3 monitors, and this setup worked well on the old motherboard. The issue now is that all three of the card's ports work, but seemingly only one at a time. If all three monitors are plugged in, only the DisplayPort monitor gets a signal. I tested each port individually by plugging the monitors in one by one, and all three ports work on their own. I've tried reinstalling the graphics drivers twice now with no luck.

Interestingly, I can get all three to work correctly by unplugging them while the computer is running and then plugging in HDMI first, DVI next, and DisplayPort last; it functions correctly until the next reboot, when it goes back to only working on DisplayPort. I have tried forcing detection in both Windows Settings and the nVidia Control Panel, and neither detects any other monitors. (The same thing happened on a GParted live CD I had.) These are the specifications of the computer:

Intel DX58SO Motherboard
Core i7 950
8GB RAM
MSI GTX 1050 Ti 4GB
500GB SSD
160GB HDD
Windows 10 Pro 64 bit

Thank you
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
OK, I'm puzzled.

The only thing that I can offer you is to try newer or older NVidia drivers, and possibly uninstall MSI Afterburner or EVGA PrecisionX. Also, do you have any 3rd-party shell applications, like OpenShell, installed?

Weird that the same problem persisted in Linux too, apparently?

Could it be a BIOS issue with the card? Maybe it needs an update? Is the mobo set to use CSM or UEFI booting? That might affect things too, though I cannot say exactly how without trying the options.
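
If you're not sure which mode it's booting in, a quick-and-dirty check from an elevated Command Prompt is to look at the boot loader path (just a rough sanity check on my part, not definitive):

bcdedit | find /i "winload"

A path ending in winload.efi means UEFI booting; winload.exe means legacy/CSM.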

Is this system overclocked at all? Maybe a PCI-E bus or RAM or CPU clock speed issue?

Tried a CLR_CMOS reset yet?

Of the monitors that are connected, which of them support HDCP on those inputs being used?

Could it be a resolution issue? Maybe it defaults the multi-monitor setup to "Cloned mode" and not all of the monitors have the same native resolution? Maybe try a lowest-common-denominator resolution, like 720p?

@UsandThem @aigomorla @BFG10K
 

a212ww

Junior Member
Nov 20, 2016
14
0
36
Hi, thanks for the advice. I tried uninstalling Classic Shell to no avail, and I didn't have Afterburner or anything similar installed in the first place. I tried clearing my CMOS and installed the newest BIOS update, which did not help either. As you asked, the BIOS is running in non-UEFI mode (it says "legacy" in the settings). The system is not overclocked in any way. The DVI monitor supports HDCP; the other two are VGA monitors on adapters. (The adapters are not the issue; I tested them both on another system, and they worked flawlessly on the old motherboard.) I did try changing the resolution to the lowest setting and the monitors still didn't show up. I updated the display driver to the latest version from NVIDIA with no effect. I also tried putting in my old GT 710, and the same thing is happening with it too.

I would just reinstall Windows, but that's not much of an option for me, as I have satellite internet with a 15GB monthly data cap and I'd lose the programs I had to install over several months. Hope I can solve the issue, and thanks for the reply.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
Tried putting in my old GT 710 and the same thing is happening with it too.
(!!!) That's a good data point; it suggests that the problem is some Windows / driver-related setting, and not the actual physical card.

OR that it's a problem with the adapters, which would explain why it happens with two different NVidia cards.

I would just reinstall windows but that's not much of an option for me as I have satellite internet with a 15GB monthly data cap
Well, let's see what we can do before that happens.

First of all, you said that you "Forced" the displays "Enabled" in Windows and in NV Control Panel.

Can you remove those settings, then go into Device Manager, under Monitors, and delete the monitors listed? (Also possibly try the "DEVMGR_SHOW_NONPRESENT_DEVICES=1" environment-variable setting, then "START DEVMGMT.MSC"; Google those terms for a guide.) Then delete any "Shadow" devices listed under Monitors.
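
From memory, the sequence is roughly this from an elevated Command Prompt (a sketch, so double-check it against a guide):

set devmgr_show_nonpresent_devices=1
start devmgmt.msc

Then in Device Manager use View -> Show hidden devices, and the greyed-out monitor entries should appear so you can uninstall them.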

Then reboot, with the monitors all connected. See if Windows will re-detect the displays. Then enable them in Display Properties if not already enabled.
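
If they re-detect but Windows still only lights up one of them, you could also try forcing extended mode with the built-in DisplaySwitch tool (same thing the Win+P menu does; just a thought, I haven't confirmed it helps with this particular problem):

displayswitch.exe /extend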

Edit: I'm thinking that "forcing" the display enabled, in both Windows and NV Control Panel, created a non-PnP entry under Monitors in Device Manager, and that entry is blocking the presence detection for the monitor, so the display never gets enabled. Could be out in left field, but it seemed plausible. Something along those lines, anyway.

The other option: if you forced the display enabled, did you also set a fixed resolution for that display under Display Properties, and is that resolution within the range of options the monitor supports? Don't forget the refresh rate settings too; you may have to set those manually under Advanced Display Properties, Monitor tab, for each display, so that they are all the same, IF you've force-enabled the displays.
 

a212ww

Junior Member
Nov 20, 2016
14
0
36
Hi,
I've been messing around with this for hours and I've tried every suggestion. I even tried swapping the hard drive for one from an old Windows 7 laptop, installed the driver, and it didn't work. I swapped my graphics card into another computer and it didn't work. I took the known-good graphics card out of that other computer, and it didn't work in the one I've been working on either. I can't find a single discernible pattern in any of this. At this point I'm too frustrated to care about my downloaded programs, so I'm just going to plug everything in how I want it and reinstall Windows. I've verified all my monitor cables and adapters are working perfectly. If I can't get it running, I'll either take it to a shop or just dump it. At this point I've completely given up.

Thanks for the help
 

a212ww

Junior Member
Nov 20, 2016
14
0
36
I did finally end up figuring this out today after much messing around. I ended up just reinstalling Windows (I plan on taking the PC to a friend's house to reinstall the programs) and it still didn't fix it. I think something on the graphics card is shorting out, because HDMI and DisplayPort work perfectly with multiple monitors on boot if DVI is disconnected. My solution is going to be using the two adapters and only having dual monitors instead of three, which I can certainly live with. Thank you, VirtualLarry, for the help on this issue.