Can't Output Video from PC to TV

clamum

Lifer
Feb 13, 2003
26,255
403
126
I wasn't sure where to post this but decided on OT since there's more traffic here, but if the mods think it should be moved then that's cool.

So I got a Samsung TV around Christmas time last year (UN55KU6270FXZA) and it has 3 HDMI inputs. One is hooked up to my cable box, one to my Blu-ray player, and I hooked the other up to my PC because my Gigabyte R9 390 video card has a DVI output (I use a DVI-HDMI adapter to output to TV), an HDMI output (to desktop monitor), and several DisplayPort outputs.

This worked fine and I could duplicate/extend my desktop to my TV. Then this stopped working (PC no longer recognizes a second display) for some reason, perhaps after I updated video card drivers. Recently I did a fresh install of Dec. 2016 drivers and no dice. And I did notice that on the TV the "HDMI 2" input is available when my PC is on, and when I unplug the cable from the vid card it grays out on the TV. So it seems like there's a signal there but the PC just isn't recognizing something. I've also tried a different DVI-HDMI adapter and HDMI cables and neither made a difference.

So before I order a DisplayPort cable or adapter I figured I'd see if anyone here had any ideas or had experienced this before. I really don't wanna re-install Windows just to fix this issue but I have a feeling that for some reason it would work.
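In case it helps with diagnosing, here's a quick Python sketch (plain ctypes against the Win32 EnumDisplayDevicesW call; I haven't run this exact script, so treat it as a rough diagnostic, not gospel) that dumps what Windows currently sees on each video-card output and whether anything is attached to it:

```python
# Rough diagnostic sketch: lists each video-card output Windows knows about,
# whether it is attached to the desktop, and any monitor detected on it.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
user32 = ctypes.windll.user32

def _enum(parent, index):
    """Enumerate one adapter (parent=None) or one monitor on an adapter."""
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if user32.EnumDisplayDevicesW(parent, index, ctypes.byref(dev), 0):
        return dev
    return None

i = 0
while (adapter := _enum(None, i)) is not None:
    attached = bool(adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print(f"{adapter.DeviceName}: {adapter.DeviceString} "
          f"[{'attached to desktop' if attached else 'not attached'}]")
    j = 0
    while (mon := _enum(adapter.DeviceName, j)) is not None:
        print(f"    monitor: {mon.DeviceString}")
        j += 1
    i += 1
```

If the TV never shows up there even while wiggling the cable, then Windows isn't getting the hot-plug event at all.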
 

Paperdoc

Platinum Member
Aug 17, 2006
2,307
278
126
With most video cards I've used, you CAN set them up to send signals to two display devices. You can even set whether those two devices show the same thing, or different "frames" of your computer's output. BUT the key phrase is "set up". Apparently your system WAS set that way at one point, because you had it working. But then you updated the video card drivers. I would bet that this reset the card's output options, and it is no longer configured to drive two display devices. You need to review the video card manual and use its configuration utility to add the second display device (the TV) back to its output set.
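If you don't feel like digging through the card's own utility, Windows itself can force the projection mode. A minimal Python sketch, assuming the stock DisplaySwitch.exe that ships with Windows 7 and later (the same thing the Win+P menu drives):

```python
# Sketch only: wraps DisplaySwitch.exe (Windows 7+). A driver update that wipes
# the multi-display setup can sometimes be undone by forcing the projection
# mode back to "extend" or "clone".
import subprocess

def set_projection(mode: str) -> None:
    """mode: 'internal', 'clone', 'extend', or 'external'."""
    if mode not in {"internal", "clone", "extend", "external"}:
        raise ValueError(f"unknown projection mode: {mode}")
    subprocess.run(["DisplaySwitch.exe", f"/{mode}"], check=True)

if __name__ == "__main__":
    set_projection("extend")   # equivalent to pressing Win+P and picking Extend
```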
 

sdifox

No Lifer
Sep 30, 2005
95,030
15,141
126
Try unhooking the cable, power up both the TV and the computer, turn both off, reconnect, then boot.
 

clamum

Lifer
Feb 13, 2003
26,255
403
126
Well I figured it out. I tried your suggestion, sdifox, but that didn't work. For some reason I decided to wiggle the adapter while the Display Properties were up on the PC, and I saw the TV pop up as a second display while doing this. It seems the DVI connector on my video card is kinda borked. Hmmmm. Well I suppose I'll try getting a DisplayPort -> HDMI connector (since I couldn't find any thin DP cables on Amazon and I frickin hate the big thick ones).

Thanks for the assistance boys. I <3 ATOT
 

BonzaiDuck

Lifer
Jun 30, 2004
15,727
1,456
126
clamum said:
"Well I figured it out. I tried your suggestion, sdifox, but that didn't work. For some reason I decided to wiggle the adapter while the Display Properties were up on the PC, and I saw the TV pop up as a second display while doing this. It seems the DVI connector on my video card is kinda borked. Hmmmm. Well I suppose I'll try getting a DisplayPort -> HDMI connector (since I couldn't find any thin DP cables on Amazon and I frickin hate the big thick ones).

Thanks for the assistance boys. I <3 ATOT"

Just buying the cable is something I would do anyway after further tinkering.

Go down to the Shack or find the local electronics jobber's warehouse store and get electronics contact cleaner, either a bottle or the spray/brush aerosol kind. And also, as my dentist might say, use a "soft" toothbrush. Examine the pins on the cable to see that they're all straight.

I won't comment on the R9, because I've become a long-term part of NVidia's customer-base just out of habit. I haven't had a board of the other flavor since maybe 2003.

My preference for cables and ports has followed a desire for the highest refresh rate on my desktop monitor (144Hz) and 60Hz for the HDTV across the room. DVI (dual-link) assuredly supports 144Hz, but going DVI-to-HDMI means I get only 60Hz on that connection. The cable was not the least expensive; it's "thick" and between 25' and 35' long. I've got an HDMI-to-HDMI and a DVI-to-HDMI, and both seem to work flawlessly into my ONKYO AVR -> HDTV.

Supposedly a later HDMI version standard allows for refresh rates higher than 60Hz. I think DisplayPort allows for higher still, but some DP gear is not HDCP compliant. My KVM -- which handles 2560x1440 at 144Hz -- is not HDCP compliant.

Is there any sort of sticky for this, or would someone summarize the wisdom about HDMI versions and the refresh rates they support? I have a system currently feeding the HDTV from a GTX 970, and another ready to be configured that uses a GTX 1070. Does an older cable defeat the full features of the latest HDMI standard? I know my GTX 1070 does not provide HDCP-compliant DP. Or -- should I have purchased a different cable? I cannot see on my own how the cable would be a relevant factor. Correct me, if you please . . .
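To make my question concrete, here's the rough arithmetic I'm doing, spelled out as a little Python sketch. The 10.2 Gbps (HDMI 1.4) and 18 Gbps (HDMI 2.0) link rates are the published maximums; the 24-bit color, 8b/10b encoding, and roughly 8% reduced-blanking overhead are my own assumptions, so treat the output as an estimate rather than a spec sheet:

```python
# Back-of-the-envelope: how much pixel data a mode needs vs. what each HDMI
# version can actually carry after 8b/10b encoding overhead.

HDMI_EFFECTIVE_GBPS = {
    "HDMI 1.4": 10.2 * 8 / 10,   # ~8.16 Gbps of actual pixel data
    "HDMI 2.0": 18.0 * 8 / 10,   # ~14.4 Gbps of actual pixel data
}

def required_gbps(width, height, hz, bpp=24, blanking=1.08):
    """Approximate pixel-data rate in Gbps for a given video mode."""
    return width * height * hz * bpp * blanking / 1e9

for name, w, h, hz in [("1080p @ 60Hz", 1920, 1080, 60),
                       ("1440p @ 144Hz", 2560, 1440, 144),
                       ("4K @ 60Hz", 3840, 2160, 60)]:
    need = required_gbps(w, h, hz)
    fits = [ver for ver, cap in HDMI_EFFECTIVE_GBPS.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbps needed -> {', '.join(fits) or 'neither'}")
```

If that's roughly right, both 1440p at 144Hz and 4K at 60Hz need the 18 Gbps tier, so I'd guess the cable's rating matters as much as the port's HDMI version -- but correct me if that's off.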