Can anyone elaborate on this?
Is this some kind of new extension to the H264 codec standard?
Why isn't it hardware-accelerated like normal H264 material is, on current video cards?
What kind of CPU horsepower is necessary to decode this material, if your video card HW doesn't support HW decoding?
You could have just PM'ed me or Googled it:
http://haruhichan.com/wpblog/?p=205
You can actually get DXVA working on such material, but the decoder can't decode the colors properly, resulting in artifacts akin to viewing your pictures/videos in 256-color mode.
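A loose illustration of why forcing an 8-bit hardware path onto 10-bit samples mangles the colors (this is not actual DXVA code, just a sketch assuming the pipeline treats each sample as a single byte):

```python
# Hi10P stores luma/chroma as 10-bit values (0-1023). A correct
# downconversion to 8-bit drops the two least-significant bits; a
# pipeline that assumes 8-bit samples effectively keeps only the
# low byte, producing garbage values -> visible banding/artifacts.

def to_8bit_correct(sample_10bit):
    """Proper 10-bit -> 8-bit downconversion."""
    return sample_10bit >> 2

def to_8bit_naive(sample_10bit):
    """What an 8-bit-only path effectively does: keep the low byte."""
    return sample_10bit & 0xFF

grey = 512  # mid-grey on the 10-bit scale
print(to_8bit_correct(grey))  # 128 - correct mid-grey on the 0-255 scale
print(to_8bit_naive(grey))    # 0   - black instead of grey
```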
You can use FFDShow Tryout or the LAV/MadVR combo, but that forgoes GPU acceleration. Most modern CPUs will have no trouble with software decoding. The problem lies with standalone media players and low-powered CPUs that depend on GPU acceleration to get their work done; Hi10P material will render them useless.
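A rough back-of-the-envelope sketch of why 10-bit costs more to decode in software: 10-bit samples don't fit in a byte, so decoders store them in 16-bit words, roughly doubling the frame-buffer size and memory bandwidth (the numbers are illustrative; actual decoder overhead varies):

```python
# Raw 4:2:0 frame size at 1080p, 8-bit vs 10-bit-in-16-bit storage.

def frame_bytes(width, height, bytes_per_sample):
    """4:2:0 frame: one full-size luma plane + two quarter-size chroma planes."""
    luma = width * height
    chroma = 2 * (width // 2) * (height // 2)
    return (luma + chroma) * bytes_per_sample

w, h = 1920, 1080
print(frame_bytes(w, h, 1))  # 8-bit:  3110400 bytes (~3 MB per frame)
print(frame_bytes(w, h, 2))  # 10-bit: 6220800 bytes (~6 MB per frame)
```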
Edit: I've never paid close attention to how much extra CPU power 10-bit requires, but with FFDShow Tryout there is probably very little difference (a few % over the usual 8-10% on my Lynnfield).
It's the LAV/MadVR combo that can bring lesser GPUs to their knees. The MadVR renderer leans heavily on the shader processors for these files. I see up to 60% GPU usage on my HD5850, and I have read elsewhere that people with weaker video cards (e.g. HD4200, HD45xx, etc.) are experiencing stuttering.