This is a terrible assumption. Just because a title is old doesn't mean it runs better than current titles. Go look at Kingdom Come: Deliverance or Watch Dogs 2 as examples of DX11 titles.
So what? There are always modern titles that run better on one architecture or another due to sponsorship or optimizations being done for those games. Normally we tell people who really care about performance in one specific title to go with the company that gets the best performance in that particular title even if it's not the best choice overall.
I can give examples of even DX9 titles which bring modern GPUs to their knees but that'll be a digression.
So what? There aren't very many of those games being made, so they're not worth optimizing for. And it isn't as though Black Mesa looks particularly good (outside of comparing it to the older games in the franchise), so bumping up the resolution doesn't add a lot.
How do you know that those who play older games are insignificant enough to not matter?
How do you know they are? You're the person asserting that it's important, so the burden to prove it rests on your shoulders. If I were to claim that unicorns are real, I wouldn't get to demand that you prove they aren't when you say you don't believe me.
Wow, you must love AMD GPUs so much that you're willing to gloss over their flaws.
No, I have no problem admitting their cards don't perform as well in older titles, which is obvious from benchmarks. I just don't think it's worth AMD's limited time to address that problem, for all the reasons I previously stated. There are plenty of other reasons to prefer an Nvidia GPU, such as RT performance, but if you don't think that's important either, then it doesn't matter and shouldn't factor into purchasing decisions.
There is a word for that behaviour, too bad it isn't allowed to be mentioned in these forums.
You can't say "sane" on these forums anymore?