You're assuming that the hardware decoders are properly implemented and software decode is a mere emulation. This is not the case. Onboard hardware decoders are designed to be as cheap as possible, not to properly process every feature.
I am not assuming that "software decode is a mere emulation". These codecs have mathematical algorithms for how they process the bit-stream; it's in the codec standard. Sure, hardware may take some shortcuts, but so might software decoders. I don't believe, a priori, that hardware is automatically "less accurate" than software decode. If anything, I would assume the hardware decoders might be more accurate, to a point, because software decoders don't have unlimited CPU cycles to work with; they have a "latency deadline", and missing it means frame-skips.
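To make that "latency deadline" point concrete, here's a minimal back-of-the-envelope sketch (the frame rates are just illustrative assumptions): the time budget a software decoder has per frame before playback starts skipping.

```python
# Rough per-frame decode budget: a software decoder that can't finish a
# frame within this window risks a visible frame-skip.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to decode one frame at the given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    # 24 fps -> ~41.67 ms, 30 fps -> ~33.33 ms, 60 fps -> ~16.67 ms
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 60 fps the decoder has under 17 ms per frame, shared with everything else the CPU is doing, which is exactly why a fixed-function hardware decoder doesn't have to cut corners the way a deadline-pressed software path might.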
And I disagree with your assertion that hardware decoders "don't properly process every feature". You know this how? Have you spoken to hardware codec designers?
It's true that hardware codecs are perhaps "brittle", in the sense that the incoming video stream needs to be standard-compliant and within the defined parameters of the codec's specified profile and level. They aren't as flexible as a software codec in handling non-compliant streams. Perhaps that's what you really meant to say? That hardware decoders can't process some streams that more forgiving software decoders can work around? Sure, I'll give you that.
But these things, codecs and profile levels, are written in standards documents. And if a CPU supports the "HEVC Main12 profile" in hardware, then sure, I expect it to be supported as well as a software codec would, in terms of features.
Edit: I mean, if you're in the Anime community, I know they use some advanced encoding settings that are often incompatible with mainstream hardware video decoders. So I can see where you might get the idea that hardware decoding is "missing features", or is otherwise a "lesser creature" than certain software decoders or codecs. But I don't really believe that to be true. Take a good, well-mastered video stream, say a raw Blu-Ray or HD-DVD rip (into MKV), and play it back using hardware decoding: to my eyes it looks amazing, beautiful, and, thanks to the hardware decoding support, perfectly fluid.
I don't pretend to understand people who think video isn't encoded "correctly" unless they run it through 20-50 passes of the CineCraft encoder, or something.
But hey, I still love 'em for their releases.