Pulling my hair out over a GTX 1080 DisplayPort to HDMI adapter

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Recently I bought a Samsung Odyssey VR headset, which plugs into the HDMI port on my MSI GTX 1080, so I have to use the DisplayPort output to connect my 4K TV, which only has HDMI inputs. I want to run the TV at 4K@60Hz, so I bought an active DisplayPort-to-HDMI adapter. It is not working, and I can't figure out why. Here is what I have tested so far:

- GTX 1080 (DP) -> active adapter -> HDMI cable -> TV: does not work
- GTX 1080 (HDMI) -> HDMI cable -> TV: works
- GTX 1080 (DP) -> DP cable -> LG 2560x1080 monitor (DP): works
- RX 480 (DP) (on another PC) -> active adapter -> HDMI cable -> LG 2560x1080 monitor (HDMI): works

In summary, the GTX 1080's DisplayPort can drive my LG monitor just fine, so the DP output on the 1080 does not seem to be the problem. The TV displays 4K@60Hz perfectly when connected directly to the GTX 1080's HDMI port, so the TV is not likely the problem either. I thought the adapter might be defective, but it drives my LG monitor perfectly at 2560x1080.

It's only when I use the adapter from the GTX 1080 to the TV that it stops working. It can display the BIOS startup screen, but when it gets to the Windows login screen, it flashes between the login screen and a black screen a few times, then stays black. If I log in to the desktop, it displays the desktop briefly, then flashes between the desktop and a black screen a few times before going totally black. If I then press the power button to shut down, it displays the logoff screen. Very strange, and it behaves like this every single time. Any idea what else I can try?
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Just an update:
I decided to try the DVI port, since I have an old DVI-to-HDMI adapter lying around. I thought it would work, but only at 1080p. What a pleasant surprise: it actually booted in 4K! The TV's display info shows 4K, and I double-checked in the Nvidia Control Panel: it is 3840x2160 @ 60Hz. I thought the DVI output didn't support 4K, but whatever, it works!
 

Tweak155

Lifer
Sep 23, 2003
11,448
262
126
Just an update:
I decided to try the DVI port, since I have an old DVI-to-HDMI adapter lying around. I thought it would work, but only at 1080p. What a pleasant surprise: it actually booted in 4K! The TV's display info shows 4K, and I double-checked in the Nvidia Control Panel: it is 3840x2160 @ 60Hz. I thought the DVI output didn't support 4K, but whatever, it works!
It's somewhat surprising to me that it supports that at 60Hz... as long as it works, though!
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Dual-link DVI supports 4K at 30Hz.
But it's not using that. The whole reason behind dual-link DVI was that the overall bandwidth of the individual TMDS transmitters was limited. HDMI came later and used the same scheme, but clocked them higher.

What is happening here is HDMI pass-through over the DVI port. It's not using dual-link DVI; it's using a single link to pass HDMI through at a higher clock rate, with HDMI 1.4 signalling, which will support 4K60 at reduced color depth. (At least, NV seems to support that, and Intel's iGPUs with HDMI 1.4 will support 4K30 over a DVI port using an HDMI adapter. Yes, I've tried it recently too, with a G4560 CPU in a Biostar B150 board with VGA and DVI-D outputs only.)
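The clock-rate argument above can be sketched with rough numbers. The sketch below uses nominal published figures (CTA-861 pixel clocks of 297/594 MHz for 4K30/4K60, a 165 MHz per-link DVI limit, a 340 MHz HDMI 1.4 TMDS limit); note that 4K60 with 4:2:0 subsampling formally arrived with HDMI 2.0, but it runs at HDMI 1.4-era clock rates, which is why some "1.4" hardware can carry it. This is an illustration, not a spec-accurate link-budget calculation.

```python
# Rough TMDS clock check: why dual-link DVI tops out at 4K30, while
# HDMI-style signalling over the same connector can reach 4K60 at
# reduced color depth. Figures are nominal spec values, not measurements.

MODES = {
    # name: (pixel clock in MHz, effective clock factor)
    "4K30 RGB/4:4:4":   (297.0, 1.0),
    "4K60 RGB/4:4:4":   (594.0, 1.0),
    "4K60 YCbCr 4:2:0": (594.0, 0.5),  # 4:2:0 halves the required TMDS clock
}

LINKS = {
    # name: max pixel clock in MHz
    "Single-link DVI": 165.0,
    "Dual-link DVI":   330.0,  # two 165 MHz links in parallel
    "HDMI 1.4 TMDS":   340.0,  # one link, clocked higher
}

def fits(link_mhz: float, pixel_mhz: float, factor: float) -> bool:
    """A mode fits when its effective TMDS clock is within the link limit."""
    return pixel_mhz * factor <= link_mhz

for mode, (clk, factor) in MODES.items():
    for link, limit in LINKS.items():
        verdict = "yes" if fits(limit, clk, factor) else "no"
        print(f"{mode:16s} over {link:15s}: {verdict}")
```

Running it shows 4K30 fitting dual-link DVI and HDMI 1.4, 4K60 full-color fitting neither, and 4K60 with 4:2:0 squeezing under the HDMI 1.4 limit, which matches the behaviour described in the post.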
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
But it's not using that. The whole reason behind dual-link DVI was that the overall bandwidth of the individual TMDS transmitters was limited. HDMI came later and used the same scheme, but clocked them higher.

What is happening here is HDMI pass-through over the DVI port. It's not using dual-link DVI; it's using a single link to pass HDMI through at a higher clock rate, with HDMI 1.4 signalling, which will support 4K60 at reduced color depth. (At least, NV seems to support that, and Intel's iGPUs with HDMI 1.4 will support 4K30 over a DVI port using an HDMI adapter. Yes, I've tried it recently too, with a G4560 CPU in a Biostar B150 board with VGA and DVI-D outputs only.)

Thanks for explaining. Not sure how much color depth I lose; at least it's not noticeable to my naked eye.

Anyway, after some more testing, it appears to be an Nvidia problem. I lugged my desktop with the RX 480 video card over next to the TV and connected it with the same DP-to-HDMI adapter that failed on the Nvidia card. It worked without any drama. I did some googling on the topic and found a lot of complaints about DisplayPort issues on Nvidia cards. It is weird, though, since the card can drive my WQHD monitor DP-to-DP but failed to drive my TV with the adapter.