XFX R7 250X 2GB DDR3 @ 4K30 over HDMI, no good?

VirtualLarry

No Lifer
Aug 25, 2001
I recently built a B150 rig with a G4560 CPU, and added one of my R7 250X 2GB DDR3 cards.

I think the spec is supposed to be DDR3-1600, but GPU-Z says 800. I'm not sure which is correct.

Anyways, I have a couple of window-caps of the CrystalDiskMark screens. I opened two different ones, and at 4K res there is a default black area in the Windows Picture Viewer, with the CDM screen in the middle.

Standard stuff.

Well, if I arrange both windows horizontally, so that they take up most of the display, I get weird black horizontal lines running through my Windows 10 background wallpaper, then my entire screen blacks out and the audio stops playing.

If I arrange them vertically, no such problem happens.

And I've tried using the KBL iGPU (HD610), and ... NO PROBLEMS.

So, is there something wonky with my R7 250X? Why can't it handle a 2D desktop with a bunch of black squares that have light-colored squares inside?

Something is weird here.

Could it have something to do with HDR info? The Radeon GCN drivers support HDR, so maybe that extra data being sent causes the HDMI link to de-sync.

Edit: I'm using the same HDMI cable that I've always been using, that hasn't had any issues with 4K thus far. I've used it at 4K60 using my RX 460 4GB Nitro card in my SKL i5 rig.
 

Bacon1

Diamond Member
Feb 14, 2016
I think the spec is supposed to be DDR3-1600, but GPU-Z says 800. I'm not sure which is correct.

I haven't used it in a while, but usually VRAM numbers are doubled (Double Data Rate), so 800 would be 1600.
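
Quick sanity check of that doubling, if it helps:

```python
# GPU-Z shows the base memory clock; DDR transfers twice per clock,
# so the effective rate is double the reported figure.
reported_clock_mhz = 800
effective_rate_mts = reported_clock_mhz * 2
print(f"DDR3-{effective_rate_mts}")   # DDR3-1600, matching the card's spec
```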

Edit: I'm using the same HDMI cable that I've always been using, that hasn't had any issues with 4K thus far. I've used it at 4K60 using my RX 460 4GB Nitro card in my SKL i5 rig.

The 250X only has HDMI 1.4, not 2.0 like your iGPU (Kaby Lake) and the 460, which AFAIK both do 4K@60Hz.

So it should support 4K@30Hz, but maybe there is something else going on with the setup.
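
Rough pixel-clock math for why HDMI 1.4 tops out around 4K30 (I'm assuming the display negotiates the standard CTA-861 4K timing):

```python
# Pixel clock needed for 8-bit RGB, where one pixel is sent per TMDS clock.
def tmds_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 with blanking is 4400x2250 total in the standard timing.
print(tmds_clock_mhz(4400, 2250, 30))   # ~297 MHz -> within HDMI 1.4's 340 MHz limit
print(tmds_clock_mhz(4400, 2250, 60))   # ~594 MHz -> needs HDMI 2.0 (600 MHz limit)
```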
 

VirtualLarry

No Lifer
Aug 25, 2001
Well, Kaby Lake is only HDMI 1.4 too. But there's something about large horizontal areas of either black or high-contrast regions that makes the 250X DDR3 card freak out.
 

VirtualLarry

No Lifer
Aug 25, 2001
No, Kaby Lake, like Skylake, only supports HDMI 1.4 natively. Some boards use an onboard DP-to-HDMI 2.0 converter chip, fed from the CPU's DP output, to pipe HDMI 2.0 out the rear panel.

So the HDMI bandwidth between the G4560 and the R7 250X should, in theory, be the same.

I'm thinking that the issue is more the DRAM speed on the card not being able to keep up with compositing the windows during vblank at 4K (rough numbers below).

Btw, I was incorrect: the newest GPU-Z reports the DRAM clock as 667, so the card has DDR3-1333 RAM. Spec is DDR3-1600, so these are gimped cards; more like double-gimped, since the DDR3 version is already a gimp.
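
For what it's worth, some rough numbers on that theory (the 128-bit bus is my assumption for the DDR3 250X, and the compositing traffic is only a ballpark):

```python
# VRAM bandwidth for the DDR3 250X: DDR transfers twice per clock,
# and I'm assuming a 128-bit memory bus.
def vram_bandwidth_gbs(mem_clock_mhz, bus_bits=128):
    return mem_clock_mhz * 2 * 1e6 * (bus_bits / 8) / 1e9

print(vram_bandwidth_gbs(667))   # ~21.3 GB/s as shipped (DDR3-1333)
print(vram_bandwidth_gbs(800))   # ~25.6 GB/s if it ran at spec (DDR3-1600)

# One 32-bit 4K surface is ~33 MB; DWM reads/writes several full-screen
# surfaces per refresh, and all of that competes with scanout on that bus.
print(3840 * 2160 * 4 / 1e6)     # ~33.2 MB per surface
```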

I'm running the newest AMD drivers, 17.2.1.