D3D9 Overlay Not Supported by my new Radeon 7790

brocks

Member
Nov 3, 2009
I just got a new Radeon 7790 card for my PC, and after downloading and installing the latest drivers from AMD, I ran dxdiag on it. I was surprised to see this:
D3D9 Overlay: Not Supported
DXVA-HD Not Supported

For comparison, the dxdiag report I ran on my Intel I3 CPU's integrated graphics before I installed the 7790 shows:
D3D9 Overlay: Supported
DXVA-HD Supported

I googled around, and there are all kinds of questions from Radeon owners about this, going back several years, but I couldn't find a good explanation of why those features aren't supported, or what the consequences are.

It seems to work fine with the couple of movies and low-end games I've tested, and it seems like ATI couldn't sell many cards if whatever isn't supported is something vital, but if anybody could tell me why a fairly high end card doesn't support as much as Intel's integrated graphics, or what those features are used for, or whether I can turn them on somehow, I'd be grateful.

If it matters, I'm running Win8.1 Pro 64, Gigabyte 7790 1GB card, and Gigabyte Z87M-D3H mb.

Thanks for any help.
 


mindbomb

Senior member
May 30, 2013
DXVA-HD is hardware acceleration for video decoding and processing. The card should have it. I'm guessing it's reported as disabled due to a workaround for 4K videos. That's just speculation, though.
 

ShintaiDK

Lifer
Apr 22, 2012
mindbomb said:
DXVA-HD is hardware acceleration for video decoding and processing. The card should have it. I'm guessing it's reported as disabled due to a workaround for 4K videos. That's just speculation, though.

DXVA 1.0 is supported on AMD as far as I recall. But DXVA-HD is not.

DXVA-HD came with Windows 7, so it's not exactly new. It's for 720/1080 video, etc.

I added DXVA-HD 3.0 checker here:
http://shintai.ambition.cz/DXVAChecker64_3.0.0.zip

It can also be found here:
http://bluesky23.yu-nagi.com/en/DXVAChecker.html
 

brocks

Member
Nov 3, 2009

Yeah, I found that post before I posted here. The guy asked about it way back in April, for a different AMD card, and no replies.

How is this possible? How can AMD be one of the biggest, maybe THE biggest, name in video cards, and not implement features that came out years ago, and are necessary to play some games? Wouldn't there be thousands of people who spent 50 bucks on a game that wouldn't play, burning down AMD headquarters?
 

ShintaiDK

Lifer
Apr 22, 2012
brocks said:
Yeah, I found that post before I posted here. The guy asked about it way back in April, for a different AMD card, and no replies.

How is this possible? How can AMD be one of the biggest, maybe THE biggest, name in video cards, and not implement features that came out years ago, and are necessary to play some games? Wouldn't there be thousands of people who spent 50 bucks on a game that wouldn't play, burning down AMD headquarters?

In most cases the fix seems to be that developers write a workaround for AMD cards, or simply avoid using the feature altogether. You can find cases going back to 2009.
 

brandonb

Diamond Member
Oct 17, 2006
The question you should be asking is:

"What is an overlay?"

Now I'm going to answer that question, and you can tell me if this is something AMD should have:

An overlay allows a programmer to write directly to the frame buffer (the front buffer). Not just the front buffer of their program, but the entire frame buffer of the video card. Now think about the uses of that: can you honestly tell me a good reason for this functionality?

It goes back to the good ole days of the Commodore 64 and hardware sprites.

Today each program has a front buffer specific to its own window. Programs are unaware of any other apps running, or of where that frame buffer is located within the Windows environment. (Think of a game in windowed mode that you can drag around the screen.)

The overlay also lets you control things like gamma for the desktop (much like the controls found in ATI's CCC). Do you want a game's gamma to affect your Windows desktop?

Overlays don't exactly work great in a multi-app, multitasking operating system. They are designed for single-application use and OSes. There is no need to create hardware sprites or to have that type of functionality today.

Should they support it? Maybe, maybe not. But the use case for it is really small. To be honest, I'd rather not have it. I don't need certain apps changing the gamma and whatnot of my OS screen.

The game EverQuest uses this: it changes the gamma on load, and I have to reset it whenever the game exits and I get back to the OS desktop. Quite frustrating.
 

Techhog

Platinum Member
Sep 11, 2013
brandonb said:
The question you should be asking is:

"What is an overlay?"

Now I'm going to answer that question, and you can tell me if this is something AMD should have:

An overlay allows a programmer to write directly to the frame buffer (the front buffer). Not just the front buffer of their program, but the entire frame buffer of the video card. Now think about the uses of that: can you honestly tell me a good reason for this functionality?

It goes back to the good ole days of the Commodore 64 and hardware sprites.

Today each program has a front buffer specific to its own window. Programs are unaware of any other apps running, or of where that frame buffer is located within the Windows environment. (Think of a game in windowed mode that you can drag around the screen.)

The overlay also lets you control things like gamma for the desktop (much like the controls found in ATI's CCC). Do you want a game's gamma to affect your Windows desktop?

Overlays don't exactly work great in a multi-app, multitasking operating system. They are designed for single-application use and OSes. There is no need to create hardware sprites or to have that type of functionality today.

Should they support it? Maybe, maybe not. But the use case for it is really small. To be honest, I'd rather not have it. I don't need certain apps changing the gamma and whatnot of my OS screen.

The game EverQuest uses this: it changes the gamma on load, and I have to reset it whenever the game exits and I get back to the OS desktop. Quite frustrating.

This seems to be more a question of whether this feature should exist in modern games (probably not) than of whether AMD should support it (probably yes, for legacy support).