I don't think that's how it works.
I'm not sure what you meant, but I did make a mistake. UE4's ray tracing is built on DXR, which Radeons still don't support, and that's why an Nvidia GPU is required.
Anyway, even though it's already known that Radeon support is coming soon (because of the Xbox), I haven't seen any info on which older generations will be supported. So Nvidia remains the only safe bet at the moment if one wants to try RTRT in UE.
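To make the practical implication concrete, here's a rough sketch of how a project could check for DXR support at runtime before relying on RT features, assuming UE 4.22 or later. The exact names (GRHISupportsRayTracing, IsRayTracingEnabled) and headers are from memory, so treat it as a sketch rather than the official recipe:

```cpp
#include "CoreMinimal.h"
#include "RHI.h"          // GRHISupportsRayTracing (assumed available since ~4.22)
#include "RenderUtils.h"  // IsRayTracingEnabled() (assumed to live in RenderCore)

void LogRayTracingSupport()
{
    // GRHISupportsRayTracing: does the current RHI/GPU expose DXR at all?
    // IsRayTracingEnabled(): is ray tracing actually turned on for this project?
    if (GRHISupportsRayTracing && IsRayTracingEnabled())
    {
        UE_LOG(LogTemp, Log, TEXT("DXR ray tracing available - RT features can run."));
    }
    else
    {
        // On a Radeon today (or a GTX 1650, for that matter) this is the branch
        // you land in, so anything RT-based needs a rasterized fallback.
        UE_LOG(LogTemp, Log, TEXT("No DXR support - using the rasterized path."));
    }
}
```

Point being: whatever card he buys, the project still has to handle the "no DXR" branch anyway.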
My point is that if he's going to work through everything UE can do and develop & implement code samples, it seems like he should have a GPU capable of running & validating all of them.
The point is that we're talking about a single person learning game development and then creating some showcase project. How complex is that really going to be? It could easily end up as a simple 3D minigolf game with a lot of cash wasted on hardware.
In fact, I don't understand the push to upgrade hardware so early, especially for someone who is still learning. He could start on whatever he has. Buying hardware should answer an actual performance need, and nothing in the OP's post says that's the case.
If performance really is of no concern, I don't see why a 1650 wouldn't be sufficient.
Or an MX250/integrated GPU if he has a laptop.
In fact, wouldn't it be great if developers actually focused on slow PCs and learned to optimize, instead of just piling on effects indiscriminately?
The end result is that many AAA titles run on a wide range of modern hardware, while indie games tend to choke flagship GPUs, precisely because individual game developers are often avid gamers who build on (and for) $2000 PCs.
Also, there's really no need to make a complex, badly optimized game if learning Unreal Engine is the actual goal. People use it for animations and simulations, and even for generating training data for image recognition.