This really doesn't make much sense at all to me. DX12 is a low-level API; it's supposed to expose more of the GPU, not less. Anything available in DX11 should be available in DX12, and then some. Plus, there are games where NVidia definitely gains in DX12, like Ashes of the Singularity, a very highly optimized DX12 title. So to me it seems like it's much more a matter of optimization than anything else. It took MONTHS before Oxide Games could patch the game and bring NVidia's DX12 performance up to par, as the game was heavily AMD-biased from its inception (plus NVidia definitely tweaked their DX12 drivers), but now NVidia is very competitive in that game.
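To put "low level" in concrete terms: in DX11 the driver does the scheduling and hazard tracking behind an immediate context, while in DX12 the engine records and submits command lists itself. A rough sketch of the difference (device and pipeline setup elided, the function names are mine):

```cpp
#include <d3d11.h>
#include <d3d12.h>

// D3D11: the driver hides work submission, hazard tracking, and memory
// residency behind the immediate context -- lots of room for driver-side
// optimization (which is where NVidia's DX11 driver traditionally shines).
void DrawD3D11(ID3D11DeviceContext* ctx)
{
    ctx->Draw(36, 0); // the driver decides when this actually reaches the GPU
}

// D3D12: the application records command lists and synchronizes with fences
// explicitly. The same hardware is exposed, but the burden of feeding it
// efficiently shifts from the driver to the game engine -- i.e. optimization.
void DrawD3D12(ID3D12GraphicsCommandList* cmdList,
               ID3D12CommandQueue* queue,
               ID3D12Fence* fence, UINT64& fenceValue)
{
    cmdList->DrawInstanced(36, 1, 0, 0);
    cmdList->Close();

    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists); // explicit submission
    queue->Signal(fence, ++fenceValue);   // explicit CPU/GPU sync point
}
```

Which is exactly why a title like Ashes can swing so much with patches: most of the performance-critical decisions live in the engine, not the driver.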
You know, there are times when I really want to believe that you are an unbiased source, but when you make comments like these, it's hard for me to drink the Kool-Aid. The insinuation being, of course, that NVidia's architecture is inherently DX12-unfriendly. No matter how many times we see NVidia outperform AMD in DX12 titles, this myth still persists, and it seems to be driven by the same people.
Could it not be that their architecture was designed around DX11, and simply isn't as well optimized for DX12 by comparison? I would imagine that DX12 is inherently different from DX11 in terms of how work is submitted and executed. Thus, it would seem reasonable that on the hardware side, their architecture was not set up to take advantage of DX12 the way it was with DX11. That is where drivers come in: they make up for the hardware not being designed around DX12 as it was around DX11.
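The usual concrete example here is async compute: DX12 exposes separate hardware queue types that DX11 never did, and whether graphics and compute work actually overlap is down to the GPU's schedulers, which is precisely where the architectures differ (GCN's hardware ACEs vs. NVidia leaning more on the driver). A minimal sketch of what the API exposes, assuming an already-created device and with error handling omitted:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// DX12 lets the app create independent graphics and compute queues and
// submit to both at once ("async compute"). The API guarantees the queues
// exist; it does NOT guarantee the hardware runs them concurrently.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;  // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only queue
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // On hardware with independent schedulers the two queues' work can
    // overlap; elsewhere the driver may end up serializing it.
}
```

If the hardware handles that overlap natively, the driver has little to do; if it doesn't, the driver has to paper over it, which matches what you're describing.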