HDCP: how does it work? (or: oh, the joys of trying to watch media legitimately)

plonk420

Senior member
Feb 6, 2004
a) so do videocards not engage HDCP if you also have a VGA monitor hooked up?

so my May 2014 Aaxa P450 projector seems to not support a recent version of HDCP (maybe the 2.2 IIA or 2.2 for HDMI versions). it DOES support "HDMI," so i'm assuming it supports an older version of HDCP.

the moment i disconnect my VGA monitor, the projector says "Invalid format." it even happens when i boot Windows (7) in 640x480 mode with the VGA monitor disconnected (although i see the desktop briefly before it goes to that message). i only get video when i boot into Safe Mode.

b) i'm assuming there are older versions of HDCP used in older AMD drivers? is there a driver version that supports the first 2.2 implementation so that i can maybe watch Amazon Prime Video in HD? or do i have to pirate this legitimate content i'm trying to watch...?

ninja edit: buying an HDCP stripper or even an Amazon Fire stick (which i don't think will work on this projector) is not in the budget at the moment
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
a) Typically HDCP will not engage if there's a non-compliant monitor also connected and active. Obviously a VGA monitor would be non-compliant.

b) HDCP 2.x is for next-generation 4K monitors and such. Your pico projector likely does not support it, nor does it need to. For a lower res device like that, it would all be HDCP 1.x.
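
Since the thread title literally asks "how does it work?", a rough sketch of what HDCP 1.x negotiates might be useful here. The TypeScript below only illustrates the authentication step: real device keys and KSVs are secrets issued by Digital Content Protection LLC, so the function names and placeholder key arrays here are hypothetical, and the stream cipher that follows authentication is left out entirely.

```typescript
// Rough sketch of HDCP 1.x link authentication (illustrative only).
// Each device holds a 40-bit Key Selection Vector (KSV) with exactly
// 20 bits set, plus 40 secret 56-bit device keys. Any keys you feed
// into this would be made-up placeholders, never real licensed keys.

const KEY_MASK = (1n << 56n) - 1n; // device keys and Km are 56-bit values

// A well-formed KSV has exactly 20 of its 40 bits set.
function isValidKsv(ksv: bigint): boolean {
  let ones = 0;
  for (let bit = 0; bit < 40; bit++) {
    if ((ksv >> BigInt(bit)) & 1n) ones++;
  }
  return ones === 20;
}

// Each side sums its own secret device keys at the bit positions set in
// the *other* side's KSV, mod 2^56. Because every device's keys come from
// the same master key matrix, transmitter and receiver arrive at the same
// shared key Km without ever sending it over the wire.
function sharedKeyKm(myDeviceKeys: bigint[], otherSidesKsv: bigint): bigint {
  let km = 0n;
  for (let bit = 0; bit < 40; bit++) {
    if ((otherSidesKsv >> BigInt(bit)) & 1n) {
      km = (km + myDeviceKeys[bit]) & KEY_MASK;
    }
  }
  return km;
}

// Km then seeds the HDCP stream cipher; each end derives a short check
// value (R0 / R0') and the two are compared. If they differ (for example,
// a sink with no valid keys), the source refuses to send protected video,
// which from the couch just looks like a blank screen or an error message.
```

HDCP 2.x drops this scheme in favor of RSA certificates and AES session keys, which is part of why a 1.x-only sink and a 2.x requirement simply don't mix.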
 

plonk420

Senior member
Feb 6, 2004
well, it's a problem that i'm only getting movies in SD, and the sad (and annoying) thing is that the Amazon Prime Video SD picture quality is worse than pirated SD rips. and they won't give you 1080p or even 720p without HDCP 2.2.

the other problem is that if i have Silverlight installed and only the projector hooked up, i get no video. without it installed, either the videocard/drivers or the projector derps out while a monitor is hooked up on VGA and drops HDCP (when i initially boot into Windows, i get Invalid Format).

to get rid of Silverlight, i had to hook up the monitor on VGA, as it wouldn't uninstall in Safe Mode.

i've never really had issues with copy protection on legit media/games that i didn't already know how to deal with (i usually knew about fixes ahead of time) until now, and this was amazingly frustrating and off-putting.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
well, it's a problem that i'm only getting movies in SD, and the sad (and annoying) thing is that the Amazon Prime Video SD picture quality is worse than pirated SD rips. and they won't give you 1080p or even 720p without HDCP 2.2.
To be clear, you don't need HDCP 2.2 for 720p/1080p. I get it just fine on my HDCP 1.x monitor.

the other problem is that if i have Silverlight installed and only the projector hooked up, i get no video. without it installed, either the videocard/drivers or the projector derps out while a monitor is hooked up on VGA and drops HDCP (when i initially boot into Windows, i get Invalid Format).

to get rid of Silverlight, i had to hook up the monitor on VGA, as it wouldn't uninstall in Safe Mode.
I'm not sure what Silverlight has to do with anything, but Amazon now has an HTML5 player. Try it with Google Chrome and see what you get.
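
If anyone wants to confirm the HTML5 path is even available before chasing HDCP further, a quick probe of the Encrypted Media Extensions API will do it. This is only a sketch assuming Chrome's bundled Widevine key system; a success here says nothing about whether HDCP will negotiate on the HDMI output, only that the browser-side DRM is present. (Dropping the type annotation makes it paste straight into the dev console.)

```typescript
// Probe the Encrypted Media Extensions (EME) API for the Widevine key
// system that Chrome ships. A resolved promise means the browser-side
// DRM is usable; HDCP on the actual display link is negotiated separately.
const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ['cenc'],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

navigator.requestMediaKeySystemAccess('com.widevine.alpha', config)
  .then(access => console.log('Widevine available via', access.keySystem))
  .catch(err => console.log('Widevine not available:', err));
```

If the promise rejects, the problem is in the browser/DRM stack rather than the display chain; if it resolves but Amazon still caps you at SD, suspect HDCP on the output.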
 

Deders

Platinum Member
Oct 14, 2012
It may be that HDCP isn't working correctly on your machine. I went through a similar process to yours after I found that my Hi-Def codecs weren't showing up in my HDMI Audio properties after doing the Win7-to-Win10 upgrade. I also saw that for some reason the HDMI Status window in the Nvidia CPL said my receiver didn't support HDCP. The specs for my receiver say it supports 2.1, and Windows 7 had no problems.

After several months of troubleshooting and importing a device that strips HDCP (that I've never used), I did a fresh install of the latest version of Windows 10 and everything worked fine.

For the record, my receiver and monitor are running on separate cables in parallel, so if one of them doesn't support HDCP, the other isn't affected like it would be if they were set up in series. So as long as the monitor you want to view your content on has HDCP support all the way up the chain, other monitors in parallel shouldn't affect it.

Other things that affected my HDCP status:

I was able to somehow get the HD-Audio codecs back by enabling my iGPU and installing a particular version of Intel display drivers (which for some reason didn't show the audio drivers during the install process), even though the iGPU wasn't connected to anything. Installing a later version of the Intel driver broke it again. I was unable to repeat this anomaly.

Switching the Primary Display setting in the UEFI from either PEG or PCIe (can't remember which) to Auto also re-broke my HD-Audio codecs. Switching it back fixed it. For my most recent install I had it set to Auto from the start (most of Asus' UEFI settings seem to work best on Auto), and I've had no problems.
 

xorbe

Senior member
Sep 7, 2011
I bought a Blu-ray disc, couldn't ever get it to play, and wound up downloading an HD rip. Probably because I had 2 high-end NEC displays that didn't support HDCP at the time, which is ironic, since they had awesome image quality.

In the end, none of this stuff stops the pirates. It just drives up the cost when it glitches and doesn't work.
 

therealnickdanger

Senior member
Oct 26, 2005
I never had issues watching protected content at full resolution on my VGA-connected FW900 CRT. Perhaps the DVI-VGA converter I used was enough to satisfy the HDCP chain? I also specifically recall the NVIDIA Control Panel saying "Your graphics card and display are HDCP-capable," despite the display being an analog device. I always thought it strange...
 
Oct 16, 1999
I've seen it mentioned a few times that HDCP might not work on unsigned/beta Radeon drivers. No idea from first-hand experience, though, or whether that could be your issue.
 

garagisti

Senior member
Aug 7, 2007
Ok, the devices on the HDCP chain must meet a minimum spec for certain playback. Maybe the pico projector isn't even HDCP certified. I remember a friend who had problems; I asked him to connect the receiver separately, with the projector (an Epson) on a different output. That receiver was HDCP certified, but when the projector was being fed its output from the receiver, HD audio was disabled on the receiver.

Start with clean connections, and once you have done so, restart Windows. If your projector is HDCP compliant, that should fix things.
 

TheRyuu

Diamond Member
Dec 3, 2005
I'm not sure what Silverlight has to do with anything, but Amazon now has an HTML5 player. Try it with Google Chrome and see what you get.

I've had no issues with Amazon's HTML5 player. I only started watching after they already had it, and it's all I ever use (mainly through Chrome, but Edge works fine as well).

It once warned me that I wasn't compliant and could only watch SD video, but that was because of a Chrome flag that was broken for that specific Chrome dev version and has worked fine since (it was win32k lockdown[1] for PPAPI plugins set to "all"; they fixed the problem, so it now works with it set to "all").

[1] http://peter.sh/experiments/chromium-command-line-switches/#enable-win32k-lockdown-mimetypes
 

plonk420

Senior member
Feb 6, 2004
To be clear, you don't need HDCP 2.2 for 720p/1080p. I get it just fine on my HDCP 1.x monitor.

I'm not sure what Silverlight has to do with anything, but Amazon now has an HTML5 player. Try it with Google Chrome and see what you get.

interesting @ HDCP 1.x

the help files (and i think even the overlay help window) said i needed 2.2 and Silverlight at the time. sadly, i don't have access to Prime at the moment and forgot to come back (apologies!) due to dealing with personal drama. i probably won't renew it until Top Gear comes out

Ok, the devices on the HDCP chain must meet a minimum spec for certain playback. Maybe the pico projector isn't even HDCP certified. I remember a friend who had problems; I asked him to connect the receiver separately, with the projector (an Epson) on a different output. That receiver was HDCP certified, but when the projector was being fed its output from the receiver, HD audio was disabled on the receiver.

Start with clean connections, and once you have done so, restart Windows. If your projector is HDCP compliant, that should fix things.

i'm currently hooked up straight from PC to projector (when the whole endeavor started, i was running thru a passively powered switch, but that's been out of the equation for a majority of the troubleshooting)

I've seen it mentioned a few times that HDCP might not work on unsigned/beta Radeon drivers. No idea from first-hand experience, though, or whether that could be your issue.

they're Omega WHQL

It may be that HDCP isn't working correctly on your machine. I went through a similar process to yours after I found that my Hi-Def codecs weren't showing up in my HDMI Audio properties after doing the Win7-to-Win10 upgrade. I also saw that for some reason the HDMI Status window in the Nvidia CPL said my receiver didn't support HDCP. The specs for my receiver say it supports 2.1, and Windows 7 had no problems.

...but it wouldn't completely surprise me if HDCP were somehow invisibly broken

i'll probably try a few more things now that one of the two computers i'm messing around with has its OS on an SSD and doesn't take 5+ minutes to reboot every time drivers fail to update.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Re-installing the most recent version of Windows 10 fresh from USB, without going through the upgrade from Windows 7, solved many issues I was having, including HDCP recognition.