DVI -> HDMI Adapter Troubles

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Not a major issue, but I have a generic DVI -> HDMI adapter like this; the monitor is connected via an HDMI cable and the adapter to the DVI-D port on the GPU.

[Attached image: hdmi-to-dvi-24pin-adapter.jpg]

I wonder why, with the adapter, no DDC/CI communication with the monitor is possible (i.e. the protocol used to "talk" to the monitor with tools like MagicTune or SoftMCCS). In addition, with the adapter the GPU (GTX 970) also cannot send the monitor into standby the way it can via HDMI or a "proper" DVI -> HDMI cable. With a DVI-HDMI cable, on the other hand, all of this works, including the monitor "knowing" when the computer is off.

I am wondering about this since the pins for DDC data communication exist on DVI as well as HDMI, so why wouldn't this work? (Unless this adapter is incorrectly wired??)

I am also puzzled that the adapter behaves this way... which would mean that my "proper" DVI -> HDMI cable must be wired differently... but that doesn't make sense to me.

By the way, the *cable* is a single-link cable (with the 6 pins in the middle of the DVI side missing), while the adapter has all pins. But that still doesn't explain to me why the two act differently.
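For what it's worth, one way I could check whether the DDC clock/data lines are even live through the adapter (rather than the OS just using a cached EDID) would be to read the EDID header straight off the I2C bus. This is only a minimal sketch, assuming Linux with the i2c-dev module loaded and the smbus2 Python package; the bus number 4 is a placeholder, the real one has to be found under /dev/i2c-*:

```
# Try to read the EDID header at I2C address 0x50 (the standard EDID
# address on the DDC lines shared by DVI and HDMI).
from smbus2 import SMBus

BUS = 4       # assumption: the I2C bus the GPU exposes for this output
EDID = 0x50   # standard EDID EEPROM address

with SMBus(BUS) as bus:
    header = [bus.read_byte_data(EDID, i) for i in range(8)]

# A valid EDID starts with 00 FF FF FF FF FF FF 00; an I/O error or
# garbage here would suggest the adapter isn't passing the DDC pins through.
print(" ".join(f"{b:02X}" for b in header))
```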

ty!
 

worfred

Junior Member
Feb 25, 2019
1
0
6
Hi. I think you are right; it looks like a wiring issue. It may be that, due to a bad cable design, some pins are not connected the way they should be.



Following an HDMI pinout diagram, I found an NC pin on the cable that should have been GND but was not connected. After connecting this pin to ground with a thin wire, the monitor began to turn off when the signal disappeared.

But DDC/CI is still not working as expected.
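For reference, this is roughly what a raw DDC/CI probe looks like, to see whether the monitor answers at all on address 0x37 once the wiring is sorted. A minimal sketch, assuming Linux with i2c-dev and the smbus2 package; the bus number is a placeholder, and VCP code 0x10 is the standard brightness control from MCCS:

```
# Minimal DDC/CI "Get VCP Feature" probe (VCP 0x10 = brightness).
import time
from smbus2 import SMBus, i2c_msg

BUS = 4          # assumption: I2C bus behind the DVI/HDMI output
DDC_CI = 0x37    # standard DDC/CI slave address
VCP_BRIGHTNESS = 0x10

# Request: source 0x51, length 0x82 (0x80 | 2), opcode 0x01 (Get VCP),
# the VCP code, then an XOR checksum seeded with the destination byte 0x6E.
payload = [0x51, 0x82, 0x01, VCP_BRIGHTNESS]
chk = 0x6E
for b in payload:
    chk ^= b
payload.append(chk)

with SMBus(BUS) as bus:
    bus.i2c_rdwr(i2c_msg.write(DDC_CI, payload))
    time.sleep(0.05)                      # monitors need ~40 ms to answer
    reply = i2c_msg.read(DDC_CI, 12)
    bus.i2c_rdwr(reply)
    print(" ".join(f"{b:02X}" for b in reply))
```

If the monitor never acknowledges address 0x37, the DDC/CI side of the problem is still in the wiring (or the monitor only enables DDC/CI on its native HDMI input).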
 

aigomorla

CPU, Cases & Cooling Mod | PC Gaming Mod | Elite Member
Super Moderator
Sep 28, 2005
21,034
3,516
126
I had so many troubles with this on a crappy AMD system years ago that I gave up and got a cheap video card with a proper HDMI output.
All the adapters were the same: they'd work for a while, then fail without warning.