Anyone playing RDR2 on an AMD 7000-series CPU? Couldn't start it.

mikeymikec

Lifer
May 19, 2011
18,448
11,056
136
I built a PC for my brother using my old graphics card:

AMD 7800X3D
32GB DDR5
R9 380X 4GB
1TB SSD
Win11

He installed RDR2 and tried to run it, but it came up with an error message saying his graphics hardware didn't meet the game's requirements. The 380X sits between the minimum and recommended specs, so it should work. Since this processor has an odd quirk in that its on-board graphics keep running even with a dGPU installed, I wondered whether RDR2 was detecting the on-board graphics instead, so I disabled it in Device Manager. After a restart, the game now apparently starts (I've been connecting to his computer remotely).
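For anyone hitting the same thing, here's a quick way to see exactly which display adapters Windows is reporting, which is presumably what the game enumerates. Just a sketch in Python, assuming the third-party wmi package (pip install wmi) on Windows:

import wmi  # third-party package: pip install wmi (Windows only)

# List every display adapter Windows knows about. If the Ryzen iGPU
# shows up alongside the R9 380X, the game may be picking it up first.
c = wmi.WMI()
for gpu in c.Win32_VideoController():
    print(gpu.Name)
    print("  driver:", gpu.DriverVersion)
    print("  instance ID:", gpu.PNPDeviceID)  # useful if you want to disable it by ID later

On this build it should list both the on-board Radeon Graphics and the R9 380X.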

I don't much like the idea of disabling the on-board graphics in Device Manager; is there a better approach? I suppose it could be disabled in the BIOS, but I'd prefer the on-board graphics to kick in automatically if the dGPU were ever removed (e.g. if it died).
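If I do stick with the Device Manager route, one option might be to script the toggle so it's trivial to flip back. A rough sketch below shells out to pnputil (its /disable-device and /enable-device switches exist on recent Windows 10/11, as I understand it); the instance ID is a placeholder you'd replace with the real one from pnputil /enum-devices /class Display:

import subprocess

# Placeholder: replace with the iGPU's actual instance ID, e.g. from
#   pnputil /enum-devices /class Display
IGPU_INSTANCE_ID = r"PCI\VEN_1002&DEV_XXXX\X&XXXXXXXX&X&XXXX"

def set_igpu(enabled: bool) -> None:
    # Requires an elevated (administrator) prompt.
    switch = "/enable-device" if enabled else "/disable-device"
    subprocess.run(["pnputil", switch, IGPU_INSTANCE_ID], check=True)

if __name__ == "__main__":
    set_igpu(False)  # disable the iGPU before launching RDR2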

DAPUNISHER

Super Moderator, CPU Forum Mod and Elite Member
Aug 22, 2001
29,464
24,162
146
Disable it in the BIOS. If you ever need the iGPU to work again, just clear the CMOS. If it still doesn't reset, pull the CMOS battery for a bit with the power unplugged.

If that is above your bro's pay grade, get over there and walk him through it/train him for the task.