Discussion [H]: Battlefield Raytracing 2070

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/1

A 55% performance hit makes for an unplayable slideshow on a heavily overclocked 2070, even at 1080p. Who buys a $600 card to slideshow @ 1080p?

Also, previous comparisons of DXR are flawed. The correct way is to compare it against DX11:
[chart: DX11 vs. DX12 vs. DX12+DXR frame rates]
DX12 drops performance by 18% all by itself, so a DX12-only comparison falsely shows a smaller DXR hit.

Yet another example (of many) of the failure of low-level APIs. Johan Andersson was cheerleading Mantle, yet his AAA engine still flops in DX12 even after repeated iterations and millions of dollars from EA. If even he can't get it right, it's time to move on.

I still remember certain individuals on this forum telling us "DX11 would be dead within 18 months and all indie developers would get automatic performance gains from DX12" (LOL).

Epic fail.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
There is no going back on low-level APIs - development of OpenGL has basically stopped, and it's been deprecated on Apple devices. Metal, DX12 and Vulkan are what we will have to develop with. They are horrendous to code; it's like going back to the 1960s. You have to write your own memory manager for Vulkan, you have to do everything yourself, and it all has to be efficient across a range of different GPUs with different bottlenecks and performance characteristics.
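To give a flavour of what "write your own memory manager" means in practice, here's a minimal sketch of the classic approach: allocate one big VkDeviceMemory block up front and suballocate it yourself. The structure and names are mine, not from any shipping engine, and a real allocator also has to handle multiple memory types, freeing, aliasing and defragmentation:

```cpp
// Minimal linear suballocator over one VkDeviceMemory block.
#include <vulkan/vulkan.h>
#include <cassert>

struct LinearGpuAllocator {
    VkDevice       device = VK_NULL_HANDLE;
    VkDeviceMemory block  = VK_NULL_HANDLE;
    VkDeviceSize   size   = 0;
    VkDeviceSize   offset = 0;   // next free byte in the block

    // memoryTypeIndex must be chosen via vkGetPhysicalDeviceMemoryProperties.
    void init(VkDevice dev, VkDeviceSize blockSize, uint32_t memoryTypeIndex) {
        device = dev;
        size = blockSize;
        VkMemoryAllocateInfo info{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO};
        info.allocationSize = blockSize;
        info.memoryTypeIndex = memoryTypeIndex;
        VkResult r = vkAllocateMemory(device, &info, nullptr, &block);
        assert(r == VK_SUCCESS);
    }

    // Suballocate space for a buffer and bind it at an aligned offset.
    void bindBuffer(VkBuffer buffer) {
        VkMemoryRequirements req{};
        vkGetBufferMemoryRequirements(device, buffer, &req);
        offset = (offset + req.alignment - 1) & ~(req.alignment - 1); // align up
        assert(offset + req.size <= size);
        vkBindBufferMemory(device, buffer, block, offset);
        offset += req.size;
    }
};
```

Multiply that by every memory type on every vendor's GPUs, plus freeing and defragmentation, and you can see where the development effort goes.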

I do agree it basically started as a way for GPU makers to get out of coding a lot of the driver complexity - they had/have incredibly complex drivers with per-game tweaks in them, which took a lot of time and money to maintain - something AMD in particular didn't have. Once they realised that at least some software writers (e.g. the BF bunch) actually wanted a low-level API, all of that could go. It saves the GPU driver writers a lot of time and money, as it pushes all the complex coding onto the game/software devs.

You can see how hard it is to do well: after all that time, starting with Mantle and continuing on to DX12, the BF devs are 20% behind what the Nvidia devs can effectively do - given that DX12 performance is mostly down to the BF devs, while DX11 gives the Nvidia driver writers a lot more control.

I suspect we'll eventually end up at some mid level, with standard libraries doing most of the basic stuff (e.g. memory managers) that it's really inefficient for each company to write its own variant of.
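For what it's worth, that's already happening for memory: AMD's open-source Vulkan Memory Allocator (VMA) is exactly such a standard library. A rough usage sketch, assuming the VMA 2.x-era API (field names and usage flags differ between releases, so check the headers of your version):

```cpp
// Sketch: letting AMD's Vulkan Memory Allocator handle suballocation
// instead of hand-rolling it. Assumes the VMA 2.x-era API.
#define VMA_IMPLEMENTATION
#include <vk_mem_alloc.h>

VkBuffer makeGpuBuffer(VkPhysicalDevice phys, VkDevice dev,
                       VkDeviceSize bytes, VmaAllocation* outAlloc) {
    VmaAllocatorCreateInfo allocatorInfo{};
    allocatorInfo.physicalDevice = phys;
    allocatorInfo.device = dev;
    VmaAllocator allocator = nullptr;
    vmaCreateAllocator(&allocatorInfo, &allocator);

    VkBufferCreateInfo bufferInfo{VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO};
    bufferInfo.size = bytes;
    bufferInfo.usage = VK_BUFFER_USAGE_VERTEX_BUFFER_BIT;

    VmaAllocationCreateInfo allocInfo{};
    allocInfo.usage = VMA_MEMORY_USAGE_GPU_ONLY;  // device-local memory

    VkBuffer buffer = VK_NULL_HANDLE;
    vmaCreateBuffer(allocator, &bufferInfo, &allocInfo,
                    &buffer, outAlloc, nullptr);
    return buffer;  // in real code the allocator outlives this call
}
```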
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
the "funny" thing is that I can't remember the Mantle version of BF4 having this kind of performance drop that DX12 is showing.
That's because Mantle at the time was a low-level API for one design (GCN) of AMD GPUs, and AMD probably gave DICE most of the driver code. Even then it was buggy for quite a while, and it only worked well for the exact GPUs it was optimised for; the ones that came later didn't work half as well.

DX12 and DICE's implementation of it have to cover lots of GPU designs from Intel, AMD and Nvidia, and can't just be optimised for a single one. In addition, Nvidia's driver team is significantly ahead of AMD's (due to much greater manpower), so they optimise better - which is part of the reason AMD released Mantle in the first place: it got them out of a driver-writing competition they weren't going to win.

Fundamentally, these low-level APIs move some of the driver coding from the GPU manufacturers' driver writers to the game devs, and the driver writers, particularly at Nvidia, are just better at it for their own cards. Not surprising, as that's all they do. Hence DX11, despite the overheads, still beats DX12, particularly on Nvidia GPUs.
 
  • Like
Reactions: prtskg

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/1

A 55% performance hit makes for an unplayable slideshow on a heavily overclocked 2070, even at 1080p. Who buys a $600 card to slideshow @ 1080p?

Also, previous comparisons of DXR are flawed. The correct way is to compare it against DX11:
[chart: DX11 vs. DX12 vs. DX12+DXR frame rates]
DX12 drops performance by 18% all by itself, so a DX12-only comparison falsely shows a smaller DXR hit.

Yet another example (of many) of the failure of low-level APIs. Johan Andersson was cheerleading Mantle, yet his AAA engine still flops in DX12 even after repeated iterations and millions of dollars from EA. If even he can't get it right, it's time to move on.

I still remember certain individuals on this forum telling us "DX11 would be dead within 18 months and all indie developers would get automatic performance gains from DX12" (LOL).

Epic fail.
Agree with the sentiment about DICE failing here, but just an FYI: Johan has been gone from EA for a while now.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
But when DX12 or Vulkan is put in the hands of masters like id Software, you get brilliance like the Wolfenstein II and Doom Vulkan performance.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Battlefield is such a terrible game for DXR. I run almost all low settings on my 2080 so that I can feed 1440p @ 144Hz buttery smooth, and it still looks decent. I see DXR as being a fantastic technology, initially for single-player games, once the 3000 series is out. In a single-player game I can take a little more time, and I care about immersion more than in a competitive multiplayer game.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Shocking: low-level, bare-metal APIs that require intimate knowledge of the hardware fail across a spectrum of hardware. I said this years ago when Mantle was being touted. The industry moved away from low-level access to hardware because of compatibility and complexity. We have a hard enough time getting games working at a higher level. Now we expect a render path per piece of hardware, with any associated bugs also fixed? Good luck. Instead, feed it into a high-level API and let the driver sort it out. Everything is moving away from direct access to hardware. Did people really believe video cards would be different?

The only place these low-level APIs belong is in an ecosystem like the PS4/Xbox One, where developers have a single piece of hardware with predictable performance.
 
  • Like
Reactions: DooKey and ozzy702

GodisanAtheist

Diamond Member
Nov 16, 2006
6,717
7,014
136
No expert in the APIs, but: is there any room for a sort of "fusion" API that handles most of the basic stuff in a generic fallback mode, with perhaps the more complicated elements being GPU-specific (physics, AI, ray tracing)?

Or is that what DX12 already is?

Low-level APIs always seemed shady because they appeared to break the core tenets of forward compatibility and cross-compatibility that make PC gaming PC gaming.

**Pours out a 40 for the bliss that was UT in Glide**
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
No expert in the APIs, but: is there any room for a sort of "fusion" API that handles most of the basic stuff in a generic fallback mode, with perhaps the more complicated elements being GPU-specific (physics, AI, ray tracing)?

Or is that what DX12 already is?

Low-level APIs always seemed shady because they appeared to break the core tenets of forward compatibility and cross-compatibility that make PC gaming PC gaming.

**Pours out a 40 for the bliss that was UT in Glide**
DX12 isn't to the metal. It still has a thin layer to handle card-specific stuff; it's just not as high-level as DX11.
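To illustrate that thin layer (my own sketch, not anything from a shipping engine): even in D3D12 you only ask for a heap *type*; the driver still decides where the resource physically lives, and you never see raw GPU addresses. Error handling omitted:

```cpp
// Even "low level" D3D12 still abstracts physical memory: you pick a
// heap type and the driver chooses actual placement.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> makeUploadBuffer(ID3D12Device* device, UINT64 bytes) {
    D3D12_HEAP_PROPERTIES heap{};
    heap.Type = D3D12_HEAP_TYPE_UPLOAD;       // a category, not an address

    D3D12_RESOURCE_DESC desc{};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = bytes;
    desc.Height = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR; // required for buffers

    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ,
                                    nullptr, IID_PPV_ARGS(&buffer));
    return buffer;
}
```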
 
  • Like
Reactions: Headfoot

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
I suspect we'll eventually end up at some mid level, with standard libraries doing most of the basic stuff (e.g. memory managers) that it's really inefficient for each company to write its own variant of.

Isn't that the point of game engines? That stuff only needs to be coded once per engine.

But those engines are adapted to work with DX12, not built for it.

Which is the real issue. If you have a motorcycle as a base, yes, with clever tricks you can build a car around it. But it won't be a good car, and it will probably be slower than the bike. What you want is your own platform, completely free of bike heritage.

Clearly, if this is ever going to work, DX12-only engines are needed.
 
  • Like
Reactions: coercitiv

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Isn't that the point of game engines? That stuff only needs to be coded once per engine.
Kind of, but that game engine has to do much more of the performance optimisation that used to be done by the drivers, so the game engine devs now need to do a lot of the driver devs' work, which is hard - the driver devs are the experts at getting the most out of a given class of GPU, and better at it than the game engine devs. The drivers are also updated monthly, picking up support for new GPUs and performance optimisations, whereas the game gets updated less - many games never get updated again a few months after release, so they're stuck on some out-of-date version of the engine.

In addition, individual games still write many of their own shaders; they don't just use something pre-packaged with the game engine. It's shaders that tend to be badly written for a particular GPU, and in DX11 they were being replaced by the drivers for big performance boosts. In DX11 you'd get a game, Nvidia/AMD would take a look at it, and they'd essentially fix some of the performance bottlenecks by hacking it via the drivers. In DX12 the game engine devs won't do that - they'll sell a generic DX12 engine to some game maker. The game maker has no time to do it, as they're too busy trying to release a game to a silly deadline, and the GPU makers' driver writers can't do it, as the drivers are now too low-level.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
In addition, individual games still write many of their own shaders; they don't just use something pre-packaged with the game engine. It's shaders that tend to be badly written for a particular GPU, and in DX11 they were being replaced by the drivers for big performance boosts. In DX11 you'd get a game, Nvidia/AMD would take a look at it, and they'd essentially fix some of the performance bottlenecks by hacking it via the drivers. In DX12 the game engine devs won't do that - they'll sell a generic DX12 engine to some game maker. The game maker has no time to do it, as they're too busy trying to release a game to a silly deadline, and the GPU makers' driver writers can't do it, as the drivers are now too low-level.

Yeah, I read that post back then as well. It simply means the game coders need to step up and learn, but given that programming in that industry is underpaid with constant overtime, I can clearly see why the good ones leave sooner rather than later. I mean, in other performance-critical industries you don't have Intel fixing your REDACTED code because you're lazy and bean-counting. It explains the situation, but it's no excuse for it.

We have a zero tolerance policy for profanity in the tech sub-forums.
Don't do it again.

Iron Woode

Super Moderator
 
Last edited by a moderator:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It's a question of priorities and what's going to get you a bonus at the end of the year. If the Nvidia driver guy tweaks popular game X so it runs 5% faster - meaning it's now faster than the competition and Nvidia sells more cards - he gets a bonus.
If the game dev spends his time doing the same thing, no one much cares; that's not going to sell more copies of the game. If instead he gets the game out the door faster, or with the new battle royale mode done, his boss loves him, because that's what makes the game a success.
 
  • Like
Reactions: ozzy702

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's not just AMD pissing on nVidia.
There was a need for low(er) level API.

Shitty developers need to step up their game.
There was a desire for low-level APIs, but a lot of us knew the reality was that they were not needed. Even a great developer is going to need to spend a lot more time on a low-level API, plus constant tweaks for newly released GPUs - time they simply don't have.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
It's not just AMD pissing on nVidia.
There was a need for low(er) level API.

Shitty developers need to step up their game.

Hope in one hand, crap in the other, and tell us which one fills up first. Developers have timelines, money to make - pick any reason you want - not to sit down and build a custom low-level path for every card on the market, and then troubleshoot any potential issues on each of those paths.

Technology as a whole is moving away from bare metal in order to eliminate that source of incompatibility. Thinking graphics cards would buck this trend is naive, imo. Closed ecosystems aside.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Realistically, only one company needs to develop a really good DX12/Vulkan engine and maintain it. There are already a lot of games that license an engine, and companies that reuse the same codebase across multiple games. That alone cracks things wide open.

However, there are still a lot of gamers who can't even use DX12, so of course companies aren't in a rush to adopt it. Also, as many have pointed out, it's a big shift from DX11 and requires a lot of new learning on the part of developers. That takes time and an openness to doing things differently.

There are other impediments as well. Both NVidia and AMD have an interest in making it harder for each other's platforms to succeed, so there isn't a lot of interest at the hardware level in compatibility. I suspect neither really minds if you need to upgrade cards more often than you'd like due to driver rot, either.
 

Geegeeoh

Member
Oct 16, 2011
145
126
116
DX12 isn't that low-level.
Game developers can and should do better... just see what id Software can do.

Also, AMD and nVidia will send their engineers to help whoever needs it.

But until DX11 is the priority... DX12 won't shine.
 
  • Like
Reactions: happy medium

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
It's not just AMD pissing on nVidia.
There was a need for low(er) level API.

Shitty developers need to step up their game.
Nonsense. What's next, asking game developers to write their own C++ compilers? How about their own OS?

Mantle started because AMD wanted to shift their optimization burden to developers. Because let's face it, AMD can't optimize their driver performance as well as nVidia can - that's objectively proven, especially for DX11 multithreading and OpenGL.

It's not the job of the developer to optimize for GPU architectures down to the hardware like that. The driver is responsible for that heavy lifting. No developer is going to know the hardware as well as a GPU engineer.

Also, games aren't perpetually patched, so when new hardware arrives it's once again the job of the driver to optimize for it.

Witness the utter failure of Mantle on the 285, which was only a minor change over previous hardware, because the games stopped getting patched:
[chart: Mantle vs. DX11 performance on the R9 285]
Low-level APIs on an open platform are a terrible idea. It's why we need DOSBox emulation to play DOS games today: the mechanisms those games ran on no longer exist, and patches for 30+ year old games aren't going to happen.
 
  • Like
Reactions: Genx87

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Nonsense. What's next, asking game developers to write their own C++ compilers? How about their own OS?

Mantle started because AMD wanted to shift their optimization burden to developers. Because let's face it, AMD can't optimize their driver performance as well as nVidia can - that's objectively proven, especially for DX11 multithreading and OpenGL.

It's not the job of the developer to optimize for GPU architectures down to the hardware like that. The driver is responsible for that heavy lifting. No developer is going to know the hardware as well as a GPU engineer.

Also, games aren't perpetually patched, so when new hardware arrives it's once again the job of the driver to optimize for it.

Witness the utter failure of Mantle on the 285, which was only a minor change over previous hardware, because the games stopped getting patched:
[chart: Mantle vs. DX11 performance on the R9 285]
Low-level APIs on an open platform are a terrible idea. It's why we need DOSBox emulation to play DOS games today: the mechanisms those games ran on no longer exist, and patches for 30+ year old games aren't going to happen.

While I mostly agree with you, DX12 and Mantle aren't to the metal either. There is room for a middle ground.
 
  • Like
Reactions: Headfoot

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
As far as high-end graphics technology development is concerned, D3D11 is a dead-end gfx API that will only receive maintenance fixes from now on, which means the prospects of it ever exposing new hardware features are extremely grim ...

D3D12/Vulkan are the only options capable of supporting high-performance graphics applications, since in D3D11 developers can't get access to killer features like bindless resources, programmable sample positions, depth bounds testing, multiview rendering, barycentric coordinates in pixel shaders, 64-bit shader atomics, cross-lane operations, separate compute/copy queues, resource barriers, and multi-rate shading ...
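To make one of those concrete (my own illustration, not from any particular engine): a dedicated compute queue in D3D12 takes a handful of lines, and D3D11 has no equivalent. A minimal sketch, error handling omitted:

```cpp
// Creating a dedicated compute queue plus command list in D3D12 -
// async compute work can then overlap work on the graphics queue.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void makeComputeQueue(ID3D12Device* device,
                      ComPtr<ID3D12CommandQueue>& queue,
                      ComPtr<ID3D12CommandAllocator>& alloc,
                      ComPtr<ID3D12GraphicsCommandList>& list) {
    D3D12_COMMAND_QUEUE_DESC desc{};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE,
                                   IID_PPV_ARGS(&alloc));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                              alloc.Get(), nullptr, IID_PPV_ARGS(&list));
}
```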

Aside from ray tracing, Microsoft could decide in the future to standardize primitive/mesh shaders in D3D12, and no amount of programmer concern about its verbosity will change the fact that D3D11 is dated by comparison, since the hardware capabilities it exposes and its abstraction are not a good fit for current hardware ...

Maintaining parallelism with D3D11 is seriously starting to become unsustainable in the long term once we consider how much frame time is wasted on outright stalls in the driver/hardware, due either to resource transitions or cache flushes - a significant loss of GPU execution resources, with hundreds of thousands of in-flight threads being blocked ...
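This is exactly what explicit barriers in D3D12 let the developer control: instead of the driver conservatively stalling at each state change, several transitions can be batched into a single ResourceBarrier call. A minimal sketch (the resource names are illustrative, not from any real codebase):

```cpp
// Batching resource transitions in D3D12: one ResourceBarrier call
// covering several resources gives the GPU one sync point instead of
// the conservative per-resource stalls a D3D11 driver must assume.
#include <d3d12.h>

D3D12_RESOURCE_BARRIER transition(ID3D12Resource* res,
                                  D3D12_RESOURCE_STATES before,
                                  D3D12_RESOURCE_STATES after) {
    D3D12_RESOURCE_BARRIER b{};
    b.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    b.Transition.pResource   = res;
    b.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    b.Transition.StateBefore = before;
    b.Transition.StateAfter  = after;
    return b;
}

// gbuffer0/gbuffer1/depth are hypothetical targets from earlier passes.
void toReadable(ID3D12GraphicsCommandList* cmd, ID3D12Resource* gbuffer0,
                ID3D12Resource* gbuffer1, ID3D12Resource* depth) {
    D3D12_RESOURCE_BARRIER barriers[] = {
        transition(gbuffer0, D3D12_RESOURCE_STATE_RENDER_TARGET,
                   D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE),
        transition(gbuffer1, D3D12_RESOURCE_STATE_RENDER_TARGET,
                   D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE),
        transition(depth, D3D12_RESOURCE_STATE_DEPTH_WRITE,
                   D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE),
    };
    cmd->ResourceBarrier(3, barriers);  // a single sync point
}
```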
 
  • Like
Reactions: Headfoot