HDMI to DVI?

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
Hello all,

Previously I had my gaming desktop in the same room as my QNIX 27" monitor, connected by a DVI cable from my GPU's DVI output, as the monitor only has a DVI input. The resolution is 1440p. I also had a rather long HDMI cable running from the PC to my living room television. That worked fine, but I've since moved the PC into the living room for VR reasons and now want to connect it to the monitor in my bedroom via the long HDMI cable, and I'm having trouble. Here's what I've tried and the results:

1) Connecting with an HDMI to DVI adapter = the monitor doesn't show anything
2) Connecting with an HDMI to DVI adapter + the pixel clock patcher = the monitor shows color/noise
3) Connecting with an active DisplayPort to HDMI adapter + HDMI to DVI adapter = the monitor shows a picture, but the colors are completely messed up

Could someone explain what I'm seeing and possibly guide me on how to make this work?

Thanks!
 

renz20003

Platinum Member
Mar 14, 2011
2,684
606
136

What are your system specs?
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
How long is this HDMI cable? And how good is its "quality"?

I'd guess your monitor is more affected by the attenuation in a long cable run, and since it's uncompressed digital data, if the monitor doesn't recognize the signal it simply won't work (as opposed to the digital artifacts we're used to). Notice that the setup with the "active" converter performs best; maybe it's putting more power into the signal (I don't know why the colors are funky, though).

Perhaps an HDMI booster or amplifying doobie whacker will work, and/or a shorter (if possible) and/or better "quality" cable.

*** All above is pure speculation
 
Last edited:

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
The GPU is a GTX 980 Ti and has an HDMI port, two DVI ports, and three DisplayPorts. The HDMI cable is this one (https://www.amazon.com/219312-Cable-Category-1080P-Capable/dp/B01BY04QK4?th=1). It's 50 ft long. Why does the cable work fine going from my PC to the TV but not from my PC to the monitor? If it's the protocol, isn't the adapter supposed to correct that? I think I'll try to borrow another monitor and see if I can get it to work. Unfortunately, I don't think a shorter cable will work, since that's the distance I need to cover.

Also, I don't think the "active" one is putting any more power into the signal, as it doesn't have an external power supply. It's just plugged into a DisplayPort.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
For starters I'm guessing the TV is FHD whilst the monitor is QHD? I'd assume this makes a difference.

Also the quality (or existence) of any amplifying or filtering circuitry inside the monitor may be one of the reasons "name brands" cost more. I've had vastly different experiences with receiving OTA signals with different tuners plugged into the same antenna, which I guess could be analogous...

Anyway, I'm still thinking a HDMI booster might help. A quick search will show people saying either 25 or 50 feet as max lengths for HDMI (and they also specify FHD).

Edit: Or Bacon might be correct \/
Does it work if you crank the resolution down low?
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91

The adapter is probably the issue. I'm guessing your original cable was dual-link DVI, but the adapter might only be doing single link. It's the HDMI->DVI adapter that's the problem, I'd guess.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
It should have been obvious, but please take any advice here with the usual "just some random advice on the internet, your mileage may vary" disclaimer.

In theory it should help though... Best wishes.

And for the record, QHD, or quad high definition, is the equivalent of four high definition "HD" (1280x720) images combined into one larger 2x2 image (to give 2560x1440). The term "1440p" is commonly used but ambiguous; like "1080p", it only gives the vertical resolution plus a redundant letter of the alphabet. When most people say "1440p" or even "2K" they should really write "QHD".
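
In numbers (just my restatement of the same arithmetic):

$$
4 \times (1280 \times 720) \;=\; (2 \times 1280) \times (2 \times 720) \;=\; 2560 \times 1440
$$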

Edit: Fixed the resolution typo, and I'll add that the DVI-HDMI adapter you already have seems fine to me.
 
Last edited:

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
Ah ok, well the actual resolution is 2560x1440p. Here's a picture of what happens over HDMI-->DVI: https://www.imgur.com/a/02xXW
And here's what happens with the DisplayPort adapter: https://www.imgur.com/a/7L3zm
 
Last edited:

richaron

Golden Member
Mar 27, 2012
1,357
329
136
*Yes, typo: I meant 2560x1440 (whoops) (and stop saying "p").

I hope someone can diagnose those pics. I have only seen something similar to the HDMI-DVI pic with a bad monitor or bad GPU, so if you're sure both are working at close range with a shorter cable (plugged into the same ports), then I can only guess it's an anomaly from a bad signal caused by the long cable.
 
Last edited:

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
OK, so I went to Fry's and bought several things to try out. I tried different adapters to no avail. I did buy a 50 ft HDMI cable that came with a signal booster; it also didn't work at all without the DisplayPort to HDMI adapter. With the DisplayPort to HDMI adapter it looked way worse at first, but then I noticed the cable has a "0-7 equalization" dial that lets me adjust the signal. Setting it to "1" gave me the best picture, almost perfect but still with some artifacts: https://www.imgur.com/a/baEV3. So it seems like the problem might be a mix of a crappy monitor and signal loss over the long cable. It's still odd to me that the cable worked fine connected to my TV/media receiver over that long distance but not to the monitor; perhaps the monitor is just more sensitive to a poor signal?
 

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
Ok the plot thickens. With my original HDMI cable I can get perfect signal at 24Hz. At 60Hz, I can't. As I lower the refresh rate the signal gets better.
 

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
I tried taking the distance out of the equation and I'm still seeing artifacts. It seems like this QNIX monitor only wants a DVI input, and trying anything else just messes it up. The DisplayPort adapter seems to help, but only over short distances/low refresh rates.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Ok the plot thickens. With my original HDMI cable I can get perfect signal at 24Hz. At 60Hz, I can't. As I lower the refresh rate the signal gets better.

Sounds like a bandwidth issue then. What video card is it? HDMI 1.2, 1.3 or 2.0?
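
As a rough sanity check on the bandwidth idea (my own back-of-the-envelope numbers, assuming reduced-blanking totals of about 2720 x 1481 for a 2560x1440 mode; exact timings vary):

$$
2720 \times 1481 \times 60\,\mathrm{Hz} \approx 242\,\mathrm{MHz}
\qquad \text{vs.} \qquad
2720 \times 1481 \times 24\,\mathrm{Hz} \approx 97\,\mathrm{MHz}
$$

Single-link DVI, which is all a passive HDMI-to-DVI adapter carries, is capped at a 165 MHz pixel clock, so 24 Hz squeezes under the limit while 60 Hz is well over it.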
 

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
I've been looking at new 1440p monitors to buy and I'm enticed by some of the G-Sync offerings; however, I don't see many of them that support HDMI 2.0. I do see 4K ones that do. Did HDMI 2.0 just skip the 1440p generation?
 

cbm80

Junior Member
Feb 27, 2017
3
0
1
It's really simple. Your monitor takes a DL-DVI input. You can't passively convert HDMI to DL-DVI.
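
To put rough numbers on that, here's a quick sketch (my own approximation, assuming CVT-reduced-blanking-style overhead of about 160 extra pixels per line and 41 extra lines per frame; real timings differ slightly):

```python
# Rough pixel-clock estimates for the modes in this thread, compared against
# the single-link DVI limit (what a passive HDMI-to-DVI adapter carries) and
# the dual-link DVI limit (what the QNIX actually expects).
SINGLE_LINK_MHZ = 165.0  # one TMDS link
DUAL_LINK_MHZ = 330.0    # two TMDS links

def approx_pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float) -> float:
    """Estimate pixel clock in MHz with assumed reduced-blanking overhead."""
    h_total = h_active + 160  # assumed horizontal blanking
    v_total = v_active + 41   # assumed vertical blanking
    return h_total * v_total * refresh_hz / 1e6

modes = [
    ("TV    1920x1080 @ 60 Hz", 1920, 1080, 60),
    ("QNIX  2560x1440 @ 60 Hz", 2560, 1440, 60),
    ("QNIX  2560x1440 @ 24 Hz", 2560, 1440, 24),
]

for name, h, v, hz in modes:
    clk = approx_pixel_clock_mhz(h, v, hz)
    single = "fits" if clk <= SINGLE_LINK_MHZ else "too fast"
    dual = "fits" if clk <= DUAL_LINK_MHZ else "too fast"
    print(f"{name}: ~{clk:3.0f} MHz | single-link: {single} | dual-link: {dual}")
```

That lines up with what's been reported in this thread: 1080p60 to the TV fits under the single-link ceiling over the same cable, 2560x1440 at 60 Hz doesn't, and dropping to 24 Hz brings it back under. The QNIX has a dual-link input precisely because 2560x1440@60 needs roughly 240 MHz.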
 

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
What about an active converter? Do those exist?

I was also looking at possibly getting a new monitor anyway; however, most are DisplayPort monitors, and I don't see a lot of long DisplayPort cables, whereas long HDMI cables are more common.
 

cbm80

Junior Member
Feb 27, 2017
3
0
1

Found this (first google hit):
http://superuser.com/questions/3320...ink-adapter-exist-i-dont-care-about-the-price

The guy used an HDMI to DP converter and a DP to DL-DVI converter, and then another box to fix up the EDID. Rather ugly solution!
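
If you're curious what the monitor itself advertises, the EDID that extra box "fixes up" is just a small binary block you can dump (utilities like Monitor Asset Manager on Windows can save it) and inspect. A minimal sketch, assuming you've saved a raw dump as edid.bin (the filename and this particular check are my own example, not from that superuser thread):

```python
# Peek at an EDID dump: print the preferred detailed timing and its pixel clock.
import struct

with open("edid.bin", "rb") as f:  # hypothetical raw EDID dump
    edid = f.read()

# The first detailed timing descriptor (the preferred mode) starts at byte 54.
dtd = edid[54:72]

# Bytes 0-1: pixel clock in units of 10 kHz, little-endian.
pixel_clock_mhz = struct.unpack("<H", dtd[0:2])[0] / 100.0

# Active pixels: low 8 bits in bytes 2/5, upper 4 bits packed into bytes 4/7.
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)

print(f"Preferred mode: {h_active}x{v_active} at {pixel_clock_mhz:.1f} MHz pixel clock")
# Anything above ~165 MHz here needs dual-link DVI (or DisplayPort/HDMI 2.0);
# a passive HDMI-to-DVI adapter can't deliver it.
```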
 

stuckinasquare3

Senior member
Feb 8, 2008
397
0
76
Yeah, I saw that too, but it seems prohibitively expensive: HDMI 2.0 --> long cable --> female-to-female HDMI adapter --> StarTech HDMI to DisplayPort --> DisplayPort to DVI. My Amazon magic has this coming out to $70ish. I might be better off selling my monitor and buying one that supports HDMI.