Sure, since it is available through DXVA. OpenCL solutions never match a true hardware implementation in terms of performance or power efficiency, so a true decoding capability is highly preferred. VP9 is an inferior format and it needs to die, the sooner the better. Unfortunately it was chosen by Google, and their muscle guarantees that its departure won't be as swift as one could hope. If your CPU cannot decode 1080p VP9 regardless of the bitrate, it is pretty much useless for anything else too.
VP9 is a more advanced format than H.264. You can roughly view it as a middle ground between H.264 and H.265/HEVC. This says nothing about the quality of the encoders being used; I just want to put it into perspective, since some people here may have the wrong idea about VP9.

I'd hope so. VP9 is a competitor to H.264; it's not really meant to go head to head with HEVC. The competitor to H.265 (HEVC) is AV1, which is a collaboration of Mozilla's Daala, Cisco's Thor, and Google's VP10, all rolled into one royalty-free codec. It should get a finished standard at some point in the first half of 2017.
Superior technology always comes at a cost. Anyone who has ever tried to encode VP9 using the reference encoder/library soon realizes what kind of a joke it is. Not to mention that the codec itself is inferior to HEVC in terms of compression efficiency. It is pretty hard for a new contender to put up a serious challenge to HEVC, since HEVC is largely based on the current industry standard (AVC, H.264).

As for the next generation of video codecs, there's one thing that people may not have realized about HEVC/H.265: the licensing costs are much, much higher. This may push people to alternative codecs such as AV1.
Considering the prices that HEVC is charging, it's not difficult for an alternative to look very attractive. Furthermore, at least on YouTube, VP9 offers superior quality to H.264. I never said that VP9 was direct competition to HEVC; it's somewhere in the middle between H.264 and HEVC.
Considering AV1 also has hardware partners out the ass, they should be. If it's anywhere close to HEVC I don't see why anyone would bother paying the royalties.

I wonder if the HEVC patent pool holders are starting to panic. If not, perhaps they should be.
You say that, but then HDMI has been the dominant home theater connector for a decade despite its royalties and poor performance (relative to VESA DisplayPort or DVI, for example). Planned obsolescence is a huge part of what makes certain standards "better" than others. In this particular case, AV1 won't even be released until 2017, while HEVC was already complete and ready in 2015, so timing was the real reason.
Adobe
Amazon
AMD
ARM
Ateme
Cisco
Intel
Ittiam
Microsoft
Mozilla
Netflix
NVIDIA
Vidyo
VeriSilicon
Except, every major digital content provider, every major browser maker, plus AMD, ARM, Intel, MS, and Nvidia are all firmly behind AV1. Within 12 months of AV1 being published, every GPU will support it in hardware. Every browser will support it. Android will default to it. YouTube will default to it. Amazon will default to it. Netflix will default to it.
No one is saying that HEVC isn't more advanced currently. The issue is the licensing costs. The licensing costs for HEVC are prohibitively expensive, which is why you have all of these other companies/groups backing AV1. Encoders can be improved.

Had a look at the much-hyped AV1. Based on what I saw, I cannot really see what the fuss is all about. AV1 can match or even exceed HEVC in terms of quality, but outside of being encoded by a dedicated, fixed-function encoder I cannot see any serious use for it (for consumers). It shares the exact same issues as VP9: it is extremely slow and clumsy. Anyone who has tried encoding HEVC knows that it is extremely slow compared to x264, for example. Based on my tests using the most recent encoder versions (AOM 0.10 & HEVC 2.1), HEVC is 11.3x - 13.6x faster at the same bitrate and at settings resulting in comparable (within 0.005) SSIM. Since HEVC is already extremely slow to encode, it means that AV1 is practically impossible to encode on a CPU.
I used a raw 4:2:0 1080p sample video ("Cobra" by Harmonic) with the following settings:
AV1 = --passes=2 --good --i420 --threads=18 --width=1920 --height=1080 --target-bitrate=* (*1000 / 5000) --end-usage=cbr --cpu-used=2
X265 = --preset medium --input-res 1920x1080 --fps 30 --bitrate * (*1000 / 5000) --pools "18" --pass * (* 1 / 2)
AV1 - 1000kbps = 3.71fps (2nd pass), SSIM ~ 0.967, 5000kbps = 1.92fps (2nd pass), SSIM ~ 0.975
HEVC - 1000kbps = 41.71fps (2nd pass), SSIM ~ 0.9675, 5000kbps = 26.15fps (2nd pass), SSIM ~ 0.973
Tested on Haswell-EP HCC (18C/36T), both binaries compiled with x86-64 GCC 6.20 & YASM. AOM compiled with cpu runtime detection / "fat binary" (dispatcher) enabled.
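To sanity-check the speed claim, here is a minimal Python sketch that simply recomputes the ratios from the second-pass frame rates posted above; the only inputs are the numbers already quoted, nothing else is assumed.

# Recompute the encoder speed ratios from the posted 2nd-pass frame rates.
results = {
    1000: {"av1_fps": 3.71, "hevc_fps": 41.71},  # kbps: figures quoted above
    5000: {"av1_fps": 1.92, "hevc_fps": 26.15},
}

for bitrate, fps in results.items():
    ratio = fps["hevc_fps"] / fps["av1_fps"]
    print(f"{bitrate} kbps: x265 is {ratio:.1f}x faster than aomenc")

# Prints roughly 11.2x at 1000 kbps and 13.6x at 5000 kbps, which lines up
# with the 11.3x - 13.6x range claimed above.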
If it's a smart TV, then pushing an updated FFmpeg build for the Android OS running on it shouldn't be too difficult, unless the SoC in that smart TV is old rubbish, which it shouldn't be for 2015-era smart TVs.

I get the impression that Netflix will continue to use HEVC for a long time even after AV1 lands. The problem is that HEVC will be too firmly entrenched in TVs and the like, and those can't be updated to support AV1. At best, Netflix will start supporting both AV1 and HEVC, the former for computer browsers.
Perhaps that installed base in TVs, etc. is what the patent pool is counting on, but in my undereducated opinion that is very risky for them, since some manufacturers could easily switch to supporting AV1 in the future. Sony won't, but as time goes on they are becoming less important as a driver of these types of technologies.
It's often not about the software but the capability of the hardware decoders present.
Sweet, but I see that page is for Windows 10. What about macOS? (I'm guessing they can't really say because Kaby Lake Macs don't actually exist yet.)

https://bugzilla.mozilla.org/show_bug.cgi?id=1292374
Firefox 52.0, when released, will have VP9 hardware decoding support on Intel Kaby Lake and on Nvidia Maxwell GM206 & Pascal family GPUs.
Both have hardware encoding and decoding, but Intel ==> 4K VP9 while the APU ==> 1080p VP9. So APUs are out of the question for 4K VP9. Also, 1080p at what fps? Is 1080p@60 possible?

Like Intel's recently released Kaby Lake CPUs, Bristol Ridge offers hardware acceleration for the HEVC and VP9 video codecs. It's worth noting that these APUs can only perform hardware encoding and decoding of 4K content for HEVC, however; VP9 support is limited to 1080p content. Kaby Lake, on the other hand, can decode 4K VP9 content in hardware. Given that the next-generation codec wars still appear to be in full swing, Bristol's limited VP9 support may not be that important in the grand scheme of things.
It's not real two-pass.

I guess I don't quite see how hardware video encoders can be both two-pass and real-time.
QuickSync (Haswell/Broadwell/Kaby Lake; not 100% sure about Skylake), NVENC (Maxwell and newer?), and VCE 3.x (BriRid, StoRid, Pol12) support this fake 2-pass.

NVENC does have a "vbr_2pass" in its options, but what it means is that it will look ahead a number of frames for the rate control to make better decisions. True two-pass encoding means doing two full passes over the data. Although what NVENC does improves the quality, it is far from the real thing. The naming of the options is confusing and misleading...
"Two-pass" isn't a very good way to describe it anymore considering that even 2-pass with x264 is really just to get a crf value which will hit the target bitrate (1-pass crf and 2-pass will output the exact same thing). The difference here is that x264 actually has excellent rate control whereas the hardware encoders probably are not as good in that category.It's not real two-pass.