I built a PC for my brother using my old graphics card:
AMD 7800X3D
32GB DDR5
R9 380X 4GB
1TB SSD
Win11
He installed RDR2 and tried to run it, but it initially came up with an error message saying his graphics card didn't meet the game's requirements. The 380X sits between the minimum and recommended specs, so it should work. Since this processor has an odd quirk in that its on-board graphics keep running even with a dGPU installed, I wondered if RDR2 was detecting the on-board graphics instead, so I disabled it in Device Manager. After a restart, the game apparently now starts (I've been connecting to his computer remotely).
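To confirm what Windows actually enumerates, I checked the list of display adapters remotely. A quick sketch of how to do that from Python (it just shells out to PowerShell's Get-CimInstance, which ships with Windows 10/11):

import subprocess

# List every display adapter Windows currently reports (iGPU and dGPU alike).
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name"],
    capture_output=True, text=True, check=True,
)
for name in result.stdout.splitlines():
    if name.strip():
        print(name.strip())

With the iGPU enabled, both the Raphael on-board graphics and the 380X show up, which is consistent with the game grabbing the wrong one.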
I don't much like the idea of disabling the on-board graphics in Device Manager; is there a more recommended approach? I suppose it could be disabled in the BIOS, but I'd prefer the on-board graphics to come back automatically if the dGPU were ever removed (e.g. if it died).
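One alternative I've been looking at is Windows' per-app GPU preference (Settings > System > Display > Graphics), which forces a specific exe onto the high-performance GPU without disabling any drivers. A sketch of setting the same thing from Python via the registry key that the Settings page writes to; the install path here is just a guess at the default, adjust to wherever RDR2.exe actually lives:

import winreg

# Hypothetical install path -- change to the real location of RDR2.exe.
exe_path = r"C:\Program Files\Rockstar Games\Red Dead Redemption 2\RDR2.exe"

# Windows stores per-app GPU choices under this key; "GpuPreference=2;"
# means "high performance", i.e. the dGPU. "1" would be power saving.
key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)

I'm not sure whether RDR2's launcher respects this hint during its hardware check, though, so I'd appreciate confirmation before I rely on it.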