Sapphire Radeon R9 285 vs. onboard graphics

liquidsense

Member
Aug 23, 2006
Hi, I have an extreme noob question. And, I have a feeling the answer is obvious, but I have no idea how to figure it out short of asking you fine folks.

I'm still using my first build, which contains the following relevant parts:

  • GPU: SAPPHIRE 100374OCL Radeon R9 285 2GB 256-Bit GDDR5
  • CPU: Intel i7-4790k
  • Motherboard: ASRock ATX FATAL1TY Z97 KILLER
I recently decided to cut the cable-TV cord and try various streaming services, including YouTube TV. When watching YouTube TV on my computer, I couldn't get HD to play, and after researching it, I learned it's because my GPU doesn't have HDCP support. So I plugged my monitor (via HDMI cable) into the onboard HDMI port on the motherboard, and everything has been dandy.

My question: If the most graphics-intensive things I will be doing on my computer are basic video editing (Adobe Premiere Elements 15) and watching/streaming 4K video, is there any reason I even need to use my GPU? In other words, if I'm not gaming or streaming myself, is there a use case for the GPU?

Follow-up question: Can I plug my monitor into both the onboard graphics (via HDMI) and the GPU (via DisplayPort) and then switch on the fly based on my needs?
 

XavierMace

Diamond Member
Apr 20, 2013
I'm not aware of any modern video card that doesn't support HDCP. How was it determined your card doesn't support it? Every R9 285 I've seen does. That said, if you're not gaming, the onboard will probably suit you fine, unless you're doing any GPU-accelerated stuff in Premiere.
 

VirtualLarry

No Lifer
Aug 25, 2001
Desktop PCs don't support Optimus / Switchable Graphics the way laptops do, which can actually power down the dGPU.

If you don't mind the dGPU running and drawing power all of the time, then yes, you can plug both the dGPU's output and the onboard's output into different inputs on your monitor and switch between them as you wish.

Note that which input is selected on the monitor may determine which GPU is used for the POST display and the initial power-on output.

If the monitor is off when the PC is booted, you may not get any POST display, and it may not enable both outputs, depending on how things are set up.

But generally, yes, that works fine.

Edit: That said, your dGPU is new enough that it should certainly support HDCP. Perhaps you are having HDCP issues because you are using beta drivers and not WHQL drivers?

The last AMD cards that didn't always support HDCP were the HD 4850 cards, I think. And those required an adapter dongle (reference cards, at least) to get HDMI output, including audio. (Native outputs were two dual-link DVI-I ports and a TV-out / component port.)
 

liquidsense

Member
Aug 23, 2006
I'm not aware of any modern video card that doesn't support HDCP. How was it determined your card doesn't support it? Every R9 285 I've seen does.

That said, your dGPU is new enough that it should certainly support HDCP. Perhaps you are having HDCP issues because you are using beta drivers and not WHQL drivers?

I came to that conclusion because:
  1. YouTube TV doesn't play in HD when plugged into the card (but it does when plugged into the mobo).
  2. The Sapphire website doesn't say anything about HDCP.
  3. Newegg and Amazon listings say nothing about HDCP in the specifications.
  4. I found no webpage on the entire internet that suggested the card has HDCP (but I also didn't see any that affirmatively said it didn't).
In terms of drivers, I downloaded the AMD Settings app, which manages drivers, so I presume I'm on the latest? Yeah, I was shocked this card didn't support HDCP either. It would be awesome if it did and I was wrong. Someone else mentioned that it could be because the card only supports HDMI 1.3. Does that have anything to do with it?
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
did you install Crimson ReLive Edition 17.10.1 drivers?

is your OS win 10?

I use an R9 280 in my HTPC and it plays HD no problem.
 

liquidsense

Member
Aug 23, 2006
did you install Crimson ReLive Edition 17.10.1 drivers?

is your OS win 10?

I use an R9 280 in my HTPC and it plays HD no problem.

Edit: Someone elsewhere tells me this card only has HDMI 1.3 and HDCP is only supported on HDMI 2.0.

Yes, Win 10. In terms of drivers, I think I have the latest. Here's what my AMD Settings app says:

[screenshot of the AMD Settings app showing the installed driver version]
 

VirtualLarry

No Lifer
Aug 25, 2001
There are different versions of HDCP, one for HDMI 1.4 and one for HDMI 2.0. I think it's HDCP 2.2 for HDMI 2.0. Your card might only support the older HDCP standard.

Is your problem being able to watch videos at all, or 1080p Blu-ray, or 4K YouTube, or what? Simple 1080p YouTube surely doesn't require HDCP, does it?

Edit: Is "YouTube TV" different than "YouTube", for purposes of this discussion? Because Netflix, only authorizes certain hardware to work with their 4K streams, Intel 7th-Gen Core (Kaby Lake) or newer CPU's with iGPU, or more recently, NVidia cards with 4MB or more of VRAM, of recent generation.

These requirements are independent of whether the hardware in question actually supports the HDCP standard, or even PlayReady 3.0; it comes down to whether the content producers have authorized that hardware "platform" for viewing their content.

And AMD hasn't shown much commitment to producing PlayReady 3.0-compliant drivers and hardware, etc. (Supposedly the hardware IS capable, though.)

Edit: IOW, this may indeed be a DRM-related issue, but not specifically HDCP capability, although that's part of it.
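
To make the "authorized platform" point concrete: streaming sites typically probe the browser's DRM stack through the Encrypted Media Extensions API and gate their resolution tiers on what it reports, separate from the HDCP handshake on the cable. Below is a minimal sketch of such a probe, assuming a Widevine-capable browser; the key-system name, robustness strings, and codec string are illustrative assumptions, not anything specific to YouTube TV.

```typescript
// Minimal sketch: ask the browser which DRM robustness levels it supports.
// Services generally require a hardware-backed level for their top
// resolution tiers; a software-only level is often capped at SD/HD.
const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ['cenc'],
  videoCapabilities: [
    // Hardware-backed decode path (assumed requirement for the highest tiers).
    { contentType: 'video/mp4; codecs="avc1.640028"', robustness: 'HW_SECURE_ALL' },
    // Software-backed path, commonly limited to lower resolutions.
    { contentType: 'video/mp4; codecs="avc1.640028"', robustness: 'SW_SECURE_DECODE' },
  ],
}];

navigator.requestMediaKeySystemAccess('com.widevine.alpha', config)
  .then((access) => {
    // The granted configuration lists only the capabilities the platform accepted.
    console.log(access.getConfiguration().videoCapabilities);
  })
  .catch(() => console.log('No usable DRM key system on this platform'));
```

On an older GPU or driver stack, the hardware-backed entry tends to drop out of the granted configuration, which lines up with the "authorized platform" behavior described above.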
 

liquidsense

Member
Aug 23, 2006
Is your problem being able to watch videos at all, or 1080p Blu-ray, or 4K YouTube, or what? Simple 1080p YouTube surely doesn't require HDCP, does it?

Edit: Is "YouTube TV" different than "YouTube", for purposes of this discussion? Because Netflix, only authorizes certain hardware to work with their 4K streams, Intel 7th-Gen Core (Kaby Lake) or newer CPU's with iGPU, or more recently, NVidia cards with 4MB or more of VRAM, of recent generation.

YouTube TV is, as I understand it, a new streaming service in select regions that allows users to watch live TV (35 channels or so) via the familiar YouTube interface. As noted, I was limited to 480p when watching on my PC (up to 1080p is available on mobile and when streaming through Chromecast).

did you install Crimson ReLive Edition 17.10.1 drivers?

This actually worked! For reasons I can't explain, the AMD Settings application said I was "up to date," but I found a more current version of the driver on AMD's website. After installing it, I can watch HD! EDIT: But it's flickering (only on YouTube TV)! Will try another cable.
 

Elixer

Lifer
May 7, 2002
For what it is worth, you need WHQL drivers, since those are the ones that flip the DRM bits on so you can see DRM content.
Running BETA drivers won't work.
All cards made after 2008 support HDCP, so there is no need to specifically state it.

Sad they make legit customers put up with all these DRM hassles, while pirates just laugh.
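
As a rough way to sanity-check what driver is actually installed and whether it is signed, Windows' built-in driverquery tool can dump signing information for every installed driver. The sketch below is an illustration only; it assumes Node.js is available, and note that "digitally signed" is not strictly the same thing as WHQL-certified. It just filters that output down to the AMD/Radeon entries.

```typescript
// Minimal sketch: list signing info for installed Windows drivers and
// filter to AMD/Radeon entries using the built-in driverquery tool.
// A digital signature is not strictly the same as WHQL certification,
// so treat this only as a quick sanity check.
import { execSync } from 'node:child_process';

const csv = execSync('driverquery /si /fo csv', { encoding: 'utf8' });

for (const line of csv.split(/\r?\n/)) {
  if (/amd|radeon/i.test(line)) {
    console.log(line);
  }
}
```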
 

liquidsense

Member
Aug 23, 2006
For what it is worth, you need WHQL drivers, since those are the ones that flip the DRM bits on so you can see DRM content.
Running BETA drivers won't work.
All cards made after 2008 support HDCP, so there is no need to specifically state it.

Sad they make legit customers put up with all these DRM hassles, while pirates just laugh.

Interestingly, the last driver for my card is a WHQL driver, which will not allow me to play HD videos through YouTube TV. But the latest beta is a "non-WHQL" driver, which works for YouTube TV. The problem is that, whenever I play HD content on YouTube TV, my screen blinks every 30 seconds or so.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
... The problem is that, whenever I play HD content on YouTube TV, my screen blinks every 30 seconds or so.

What is your Internet connection speed? Could it be inadequate for HD video streaming?
 

liquidsense

Member
Aug 23, 2006
What is your Internet connection speed? Could it be inadequate for HD video streaming?

It's not that; I'm hard-wired into an 80 Mbps connection. It's something wonky between YouTube TV and this beta driver that's causing issues. Just to be sure, I ran like eight 4K streams on Netflix, YouTube (regular), and Vimeo simultaneously and had no issues. It's just YouTube TV (and now Xfinity Live TV) with this particular beta driver. Unfortunately, if I get rid of the beta driver, I lose YouTube TV.
 