Tup3x
Golden Member
- Dec 31, 2016
Also the reason why you see loading screens constantly. The basics haven't really changed since Morrowind and it shows.
Yep, exactly. You may as well say Call of Duty MW3 from 2023 uses the Quake 3 engine because the original COD did. Generally speaking we expect designers of any sort to take what was good from an earlier design and improve on it, right? DirectX was released in 1995; no doubt DX12 is "based on" it, yet no one is complaining about that, are they?
The basics haven't really changed since Morrowind and it shows.

That's... not true. Morrowind couldn't come close to handling the draw distance of later games. Even OpenMW has trouble if you push cells out too far, especially with shadows. This is still true even after recent improvements which specifically target the problem with batching optimizations.
That's... not true. Morrowind couldn't come close to handling the draw distance of later games. Even OpenMW has trouble if you push cells out too far, especially with shadows. This is still true even after recent improvements which specifically target the problem with batching optimizations.

It's very true. Of course they have made improvements, but it is very obvious to see what the foundation really is.
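The batching optimizations mentioned above can be illustrated with a toy model (hypothetical names and data; not OpenMW's actual code). The CPU cost of rendering distant cells is dominated by the number of draw calls, so engines merge objects that share a material into a single call:

```python
from collections import defaultdict

def batch_draw_calls(objects):
    """Group visible objects by material so each material is
    submitted as one draw call instead of one call per object."""
    batches = defaultdict(list)
    for obj in objects:
        batches[obj["material"]].append(obj["mesh"])
    # One draw call per material, carrying all meshes that use it.
    return [{"material": mat, "meshes": meshes}
            for mat, meshes in batches.items()]

# 6 objects but only 2 materials -> 2 draw calls instead of 6.
scene = [
    {"mesh": "rock_a", "material": "stone"},
    {"mesh": "rock_b", "material": "stone"},
    {"mesh": "tree_a", "material": "bark"},
    {"mesh": "tree_b", "material": "bark"},
    {"mesh": "rock_c", "material": "stone"},
    {"mesh": "tree_c", "material": "bark"},
]
print(len(batch_draw_calls(scene)))  # 2
```

Pushing the view distance out multiplies the object count, so without this kind of grouping the draw-call count (and CPU frame time) grows with it, which is why shadows, which add extra rendering passes, make the problem worse.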
It appears Bethesda asked for AMD's help with Starfield in June of 2023: the Vulkan code was problematic and they needed help getting DX12 working. Starfield is a bloated mess, and it's no wonder it has performance issues.
A few years ago in this very forum, people were telling us Mantle would allow AMD to dominate the Android platform (LMAO), and that low-level APIs would provide automatic performance gains to even unknown Indie developers just by flipping a simple switch to DX12, with zero effort (LOL).
Instead what we got is upscalers are now mandatory to make games playable, and this is a terrible place to be. This is the result of the collective failure of low-level APIs, along with the push to ray tracing (though the latter doesn't apply to Starfield).
Almost none of these games look significantly better than the best rasterized DX11 titles. And aside from a few outliers like Doom Eternal and Shadow of the Tomb Raider, the former perform vastly worse than the latter.
We were also told in this very forum "oh, once game engines are rebuilt from the ground up, we'll see the true performance of low-level APIs!" Yet DX12 is now nine years old, and even the brand new Baldur's Gate 3 can recommend switching to DX11 because it often runs better than Vulkan.
I said back then, anyone who thought the average game developer can optimize code better than GPU driver programmers backed by hardware engineers (ya' know, the people that actually build the things) was delusional, and I was right.
This is also exactly why VRAM requirements are ballooning. Because once again, it's lunacy to expect game developers to manage GPU memory better than AMD/NV engineers. In virtually every game that allows DX11 alongside Vulkan/DX12, DX11 uses far less VRAM.
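The VRAM point can be made concrete with a toy sketch (hypothetical class and numbers, not any engine's or driver's actual code). Under DX11 the driver decides what stays resident in VRAM; under DX12/Vulkan the application must budget memory itself, and a common fallback strategy is least-recently-used eviction:

```python
from collections import OrderedDict

class ResidencyManager:
    """Toy model of explicit VRAM budgeting, the job DX12/Vulkan
    hands to the game engine (and that the DX11 driver used to do):
    evict the least-recently-used resource when over budget."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # already resident: mark as used
            return
        # Evict least-recently-used resources until the new one fits.
        while sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb

mgr = ResidencyManager(budget_mb=100)
mgr.request("terrain", 60)
mgr.request("characters", 30)
mgr.request("ui", 20)        # over budget -> "terrain" (LRU) evicted
print(list(mgr.resident))    # ['characters', 'ui']
```

A driver team can tune this per GPU with full knowledge of the hardware; a game team writing it once for all GPUs tends to over-allocate to be safe, which is one plausible reason DX12/Vulkan paths often report higher VRAM use than DX11 paths in the same game.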
I think it is a bad idea to focus on Starfield as representative of a DX12/Vulkan title.

I'm not. Starfield is just the latest example in a long list of games that fail to use low-level APIs properly.
Interesting article from Chips and Cheese about Starfield GPU utilisation on NV vs. AMD.
Link.
It seems the game just utilises AMD hardware really well, rather than under-utilising NV hardware.
People were discussing this at the Beyond3D forum, and I hoped to learn something reading the discussions.
No one can agree on anything: the article itself, its conclusions, or what is actually happening on NV GPUs.
I've been running this game on a "temporary" 1st gen Ryzen machine with a 1700 and oof, some of the areas like Akila and NA are super choppy. Smaller indoor areas and space battles are like butter, though.
3440x1440 still doesn't max out my GPU, so definitely looking forward to moving the game to one of the better systems.
EDIT: Just realized this old board has a beta BIOS for the Ryzen 5000 series... the hell am I even doing with my life? 5700X ordered lol
I don't know how much of a cost difference it is these days, but I would have tried to get a 5800X3D to drop in since those do so well for gaming.

$190-200 vs. ~$320.
We're talking ~60% higher price for ~15% higher perf.

The performance is not just the fps numbers in reviews. I swapped my 5600X for a 5800X3D a couple of months ago (just for the sake of science) and have noticed almost no fps increase in games, except for maybe two or three titles, which is not surprising given that I play at 1440p.