
Question: AMD vs nVidia for professional Unreal Engine 4?

Sphiral

Member
Jun 24, 2020
Henlouuu there!!

I've a friend who is currently programming a game using Unreal 4, and in the future Unreal 5. Since it's an unknown world for me, I came here for all your knowledge! :3
Apparently there is not much difference in CPUs for this between Intel and AMD... but what about GPUs?

Thanks in advance! =)
 

ThatBuzzkiller

Senior member
Nov 14, 2014
Depends on what you mean. There's no difference in accessible feature sets between the two vendors if you're using the main branch of UE4.

Your friend might end up being more productive since the top vendor does have an absolute performance advantage.
 

Sphiral

Member
Jun 24, 2020
Depends on what you mean. There's no difference in accessible feature sets between the two vendors if you're using the main branch of UE4.

Your friend might end up being more productive since the top vendor does have an absolute performance advantage.
What I mean is raw performance in general. Most games perform better with nVidia, but those are also more expensive. But from the creative/programming perspective of a game... I've no clue.
 

AtenRa

Lifer
Feb 2, 2009
Since the Sony PS5 and Microsoft Xbox Series X both use AMD RDNA 2 GPUs, I would go with an AMD RDNA 2 desktop GPU when they release later this year, in order to learn the architecture.
 

Stuka87

Diamond Member
Dec 10, 2010
Technically he will need both once he gets farther down the road (assuming he is making the game by himself). Historically Epic has favored nVidia and has done a lot more optimization for nVidia cards, but recently that seems to be less the case. If he is one of many people working on it, then it's less important what card he has, provided there are many configurations across the team.
 

Sphiral

Member
Jun 24, 2020
Technically he will need both once he gets farther down the road (assuming he is making the game by himself). Historically Epic has favored nVidia and has done a lot more optimization for nVidia cards, but recently that seems to be less the case. If he is one of many people working on it, then it's less important what card he has, provided there are many configurations across the team.
He's on a programming course learning Unreal to create basic games; then he wants to apply that to create "basic games" as architecture demos, to show clients the final result of a project.
 

tamz_msc

Platinum Member
Jan 5, 2017
NVIDIA has historically been more forthcoming when it comes to providing developers with hardware to optimize for. That's something to keep in mind. I also think that NVIDIA and Epic have programs catered to game developers which provide documentation and support during development.
 

Sphiral

Member
Jun 24, 2020
The total budget they have right now is 1.000€ for a complete tower. Assuming that the CPU is going to be around 320€, there is not much budget left for a bigass GPU.
 

tamz_msc

Platinum Member
Jan 5, 2017
The total budget they have right now is 1.000€ for a complete tower. Assuming that the CPU is going to be around 320€, there is not much budget left for a bigass GPU.
Why does he need 320€ for the CPU? He can get a Ryzen 5 3600 for almost half of that.
 

Dribble

Golden Member
Aug 9, 2005
Why does he need 320€ for the CPU? He can get a Ryzen 5 3600 for almost half of that.
If you are developing software you spend most of your time compiling and debugging, which needs a fast CPU and lots of memory. GPU-wise, I guess it needs to be as modern as possible and fast enough, and preferably nVidia, because it's the most reliable and has better debugging tools.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
Hmm, well if buying now I would suggest a Turing-based GPU at minimum, for full DX12 Ultimate support. I agree with others that an RDNA2-based GPU would be better in the longer term, given the likely 100+ million units of that technology that will go into the next console generation.

I mean, that still seems like a lot of CPU $$$ to get a solid build; I would say storage performance is also pretty darn important. An RTX 2060 is ~$300 as entry-level Turing.

Leaving ~400+ euros for *everything* else seems really tight.
 

tamz_msc

Platinum Member
Jan 5, 2017
If you are developing software you spend most of your time compiling and debugging, which needs a fast CPU and lots of memory. GPU-wise, I guess it needs to be as modern as possible and fast enough, and preferably nVidia, because it's the most reliable and has better debugging tools.
Blowing 1/3rd of a €1000 budget on a CPU seems like a bad idea to me. If you're gonna spend €300 on a CPU, then you'll need at least €100 for a decent motherboard and another €150 for 32GB of RAM (for development work you'll need more than 16GB), which leaves €450 for the rest of the system. A Ryzen 5 3600 is going to be fast enough for this budget, and it will leave room for buying a beefier GPU if need be.
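On the compile-time point: UnrealBuildTool reads a per-user BuildConfiguration.xml (on Windows, under Documents\Unreal Engine\UnrealBuildTool\), and capping parallel compile actions there is one way to keep C++ builds from eating all the RAM on a modest machine. A rough sketch, with element names as documented for UE4; the count of 6 is just an example for a 6-core/16GB box:

```xml
<?xml version="1.0" encoding="utf-8" ?>
<Configuration xmlns="https://www.unrealengine.com/BuildConfiguration">
  <ParallelExecutor>
    <!-- Limit concurrent compile actions; each can use well over 1GB of RAM -->
    <MaxProcessorCount>6</MaxProcessorCount>
  </ParallelExecutor>
</Configuration>
```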
 

Stuka87

Diamond Member
Dec 10, 2010
A 3600 would be adequate; a 3700X would really speed up compile times. But with that budget, I would advise a 3600. For a GPU, find a decently priced 1660 Super or RX 5500. Down the road he can always upgrade to a card with ray tracing, which UE5 supports natively.
 

Dribble

Golden Member
Aug 9, 2005
Yes, really: there are fewer driver bugs, the dev support actually replies and fixes stuff if you report things, the tools work better because everyone else is using nVidia, and they have things like Nsight, which is really useful for GPU debugging.

It might not be politically correct on this forum, which does love the underdog, but developing is already hard enough, so make life easy and just buy the stuff that works best.
 

Elfear

Diamond Member
May 30, 2004
Yes, really: there are fewer driver bugs, the dev support actually replies and fixes stuff if you report things, the tools work better because everyone else is using nVidia, and they have things like Nsight, which is really useful for GPU debugging.

It might not be politically correct on this forum, which does love the underdog, but developing is already hard enough, so make life easy and just buy the stuff that works best.
Are you talking about developer software and support, or hardware? Your post seemed to imply the latter when you said "reliability", which is very debatable.
 

Sphiral

Member
Jun 24, 2020
Blowing 1/3rd of a €1000 budget on a CPU seems like a bad idea to me. If you're gonna spend €300 on a CPU, then you'll need at least €100 for a decent motherboard and another €150 for 32GB of RAM (for development work you'll need more than 16GB), which leaves €450 for the rest of the system. A Ryzen 5 3600 is going to be fast enough for this budget, and it will leave room for buying a beefier GPU if need be.
The point atm would be getting a good CPU and a decent GPU, with maybe an upgrade in a year.
 

Sphiral

Member
Jun 24, 2020
If you are developing software you spend most of your time compiling and debugging, which needs a fast CPU and lots of memory. GPU-wise, I guess it needs to be as modern as possible and fast enough, and preferably nVidia, because it's the most reliable and has better debugging tools.
That's what I heard... because of that we are prioritizing the CPU and getting some decent GPU, to be able to upgrade in the future.
 

sandorski

No Lifer
Oct 10, 1999
Not sure it matters much for development. Possible exceptions if it also includes using Maya, Photoshop, and other similar tools.
 

piokos

Senior member
Nov 2, 2018
What I mean is raw performance in general. Most games perform better with nVidia, but those are also more expensive. But from the creative/programming perspective of a game... I've no clue.
I'm not sure why you're so focused on GPU performance here.
Compiling happens on the CPU, much like most (if not all) rendering. I'm not sure what use a GPU has outside of testing; you'd have to look into the UE4 documentation.
If 3D modelling will be done in third-party tools, they may or may not utilize a GPU (with a higher chance of them supporting Nvidia).

Depending on the size of the project, I'd focus on CPU and RAM. And since this is for learning and could be a one-off, I'd use the cloud or a swarm setup from whatever was available (as pro game developers would :)).

In fact, if you look at the UE4 hardware requirements, a GPU isn't even mentioned, other than having to support DX11/12.
At the bottom of that page they show an example of a workstation used by Epic developers, with a GTX 970.

Nvidia probably offers more support and it will be easier to find help in the dev community, so that's clearly the safe choice. But since developing on UE4 isn't very GPU-dependent, the buyer should be fine with an AMD card as well.

The only exception being...

There's no difference in accessible feature sets between the two vendors if you're using the main branch of UE4.
This is incorrect. UE4 supports RTRT and it requires an Nvidia card, specifically:
"NVIDIA RTX and some GTX series cards with DXR support using the latest device drivers."

In other words: RTRT is built on CUDA, and it may use RTX hardware or be emulated on a non-RTX card.
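For reference, trying DXR in UE4 (4.23 and later) means enabling it per project and switching to the DX12 RHI. Roughly these DefaultEngine.ini entries do it (setting names as of ~4.25, so verify against the current docs; the same toggles exist in Project Settings):

```ini
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; Ray tracing requires the D3D12 RHI
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.RayTracing=True
; Needed so skinned meshes can participate in ray tracing
r.SkinCache.CompileShaders=True
```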
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
@piokos

My point is that if he is going to be going through all of the things UE can do, developing and implementing code samples, it seems like he should have a GPU capable of running and validating all of those code samples.

I guess maybe I am just fuzzy on whether it really takes a full 2060 or better to see all of the features (DX12 Ultimate), or whether the lesser Turing derivatives can do them all, just at low performance levels.

If it is really of no concern, I don't see why a 1650 wouldn't be sufficient.
 

piokos

Senior member
Nov 2, 2018
I don't think that's how it works.
I'm not sure what you meant, but I made a mistake: UE4 uses DXR, which still doesn't support Radeons, so that's why an Nvidia GPU is required.
Anyway, even if it's already known that Radeons will be added soon (because of the Xbox), I haven't seen any info on which older generations will be supported. So Nvidia remains the only safe bet at the moment if one wants to try RTRT in UE.
My point is that if he is going to be going through all of the things UE can do, developing and implementing code samples, it seems like he should have a GPU capable of running and validating all of those code samples.
The point is that we're talking about a single person learning game development and then creating some showcase project. So how complex is it going to be? Because this could end up as a simple 3D minigolf game and a lot of cash totally wasted on hardware. :)

In fact, I don't understand the push to upgrade hardware so early, especially if someone is learning. He could start on whatever he has. Buying hardware should answer actual performance needs, and nothing in the OP's post says that is the case.
If it is really of no concern, I don't see why a 1650 wouldn't be sufficient.
Or an MX250/IGP if he has a laptop.
In fact, wouldn't it be great if developers actually focused on slow PCs and learned to optimize, not just mindlessly added effects? :D

The end result is that many AAA titles work on a wide range of modern hardware, while indie games tend to choke flagship GPUs. Because, precisely, individual game developers are often avid gamers and they build on (and for) $2000 PCs :D

Also, there's really no need to make a complex, badly optimized game if learning Unreal Engine is the actual goal. People use it for animations and simulations, even for creating data for image recognition.
 

Paul98

Diamond Member
Jan 31, 2010
Unreal likes to have more cores/threads, especially when compiling shaders or building lighting.

As for the GPU, right now doesn't feel like the best time to buy, especially with new stuff from both AMD and nVidia right around the corner. I would look at something cheaper for now; then, not only will you have a bit more of a budget, but when the new stuff comes out you can get something good with all the new features.

If you must get something now, I would say the 2060, as it should at least allow most things to be tested.
 
