You'll need to be connected to an HDR display to actually benefit from it; otherwise YouTube will automatically down-convert the stream to whatever your display chain can handle.
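For context on what that down-conversion involves, here is a minimal, purely illustrative sketch of HDR-to-SDR tone mapping using a simple Reinhard curve. YouTube's actual conversion pipeline is not public, and the peak-brightness figures below are assumptions, not its real parameters.

```cpp
#include <cstdio>
#include <initializer_list>

// Illustrative only: squeeze an HDR luminance value (in nits) into the
// ~100-nit range an SDR display expects, using a simple Reinhard curve.
// YouTube's real HDR-to-SDR conversion is certainly more sophisticated
// (tone mapping plus gamut mapping); this just shows why highlight
// detail gets compressed smoothly rather than hard-clipped.
double toneMapToSdr(double hdrNits,
                    double peakHdrNits = 1000.0,  // assumed mastering peak
                    double peakSdrNits = 100.0) { // nominal SDR peak
    double x = hdrNits / peakHdrNits;        // normalize to [0, 1]
    double mapped = x / (1.0 + x);           // Reinhard: compresses highlights
    double ceiling = 1.0 / 2.0;              // curve value at x = 1
    return mapped / ceiling * peakSdrNits;   // rescale so peak maps to 100 nits
}

int main() {
    for (double nits : {10.0, 100.0, 400.0, 1000.0})
        std::printf("%6.0f nits HDR -> %5.1f nits SDR\n", nits, toneMapToSdr(nits));
}
```

The point is only that values above the SDR ceiling get rolled off instead of clipped, which is also why a well-mastered HDR upload can still look different on an SDR screen.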
"Have they upped the 1080p's bitrate however?"
Don't think so; I still see a lot of artifacting at 1080p. It's most obvious with the dirt kicked up by that bike at 0:45.
"I have a Lumia 1520, and that video was the best picture I have ever seen on my phone at 1080p 60 fps, for some reason."
Apparently only a few devices get served the HDR version; the rest still get the SDR version. They just turned the color saturation up to 11 in that video. That, and the Lumias have well-calibrated displays that benefit more from properly mastered content.
Have they upped the 1080p's bitrate however?
I upload 20 Mbit/s videos only to see them reduced to 6-8 Mbit/s at best (a rough way to check this is sketched below).
Since most of the world has 1080p/1440p screens, they should focus on that, methinks.
Progress is welcome in all its forms, however.
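If you want to verify what YouTube hands back, divide the downloaded stream's size by its duration. A trivial sketch, with made-up stand-in numbers (in practice you'd read both from the actual file):

```cpp
#include <cstdio>

// Average bitrate = size * 8 / duration. Container overhead is ignored,
// so this slightly overstates the video bitrate, but it is close enough
// to confirm a 20 Mbit/s upload coming back at 6-8 Mbit/s.
int main() {
    const double fileSizeBytes = 512.0 * 1024 * 1024; // hypothetical 512 MiB stream
    const double durationSec   = 10.0 * 60.0;         // hypothetical 10-minute video
    const double mbps = fileSizeBytes * 8.0 / durationSec / 1e6;
    std::printf("average bitrate: %.1f Mbit/s\n", mbps); // ~7.2 Mbit/s here
}
```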
Well, my PC, despite having a dual-core, four-thread Haswell i5, can't handle a 1080p 60 fps video without dropping frames, let alone an 8K video, which would kill it.
Strangely enough, my smartphone is perfectly capable of playing 1440p 60 fps smoothly...
Would an i7-6700K play 8K video at 60 fps without dropping frames?
I think something is wrong with your machine. My Ivy Bridge 3570K (4.2 GHz) and GTX 970 combo can play 4K60 YouTube seamlessly, and so can my i3-2100 with a Radeon HD 7750. I'd have to double-check, but I'm pretty sure my 3570K can do 8K without dropping frames, too.
"Have they upped the 1080p's bitrate however?"
Not until they move to VP10 or whatever next-gen codec they end up going with.
https://en.wikipedia.org/wiki/AOMedia_Video_1
HDR videos will only be relevant once Intel and NVIDIA ship fixed-function AV1 hardware decoding. At the moment nothing except Intel's Kaby Lake has VP9 Profile 2 10-bit hardware decoding, which limits the potential audience, because 4K software decoding is a heavy workload on the CPU.
Yes, and there's no VP9 support in 16.10.1.
Ellesmere and Baffin implement UVD 6.3, which has no VP9 support whatsoever.
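For anyone who would rather check their own machine than take anyone's word for it, here is a minimal sketch (Windows, D3D11) that asks the driver which VP9 decoder profiles it exposes. The GUIDs are DXVA_ModeVP9_VLD_Profile0 and DXVA_ModeVP9_VLD_10bit_Profile2 as published in dxva.h (verify against your own SDK headers), and error handling is kept to a bare minimum.

```cpp
#include <cstdio>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// VP9 decoder profile GUIDs as published in dxva.h (Windows 10 SDK):
// DXVA_ModeVP9_VLD_Profile0 and DXVA_ModeVP9_VLD_10bit_Profile2.
static const GUID kVp9Profile0 =
    {0x463707f8, 0xa1d0, 0x4585, {0x87, 0x6d, 0x83, 0xaa, 0x6d, 0x60, 0xb8, 0x9e}};
static const GUID kVp9Profile2_10bit =
    {0xa4c749ef, 0x6ecf, 0x48aa, {0x84, 0x48, 0x50, 0xa7, 0xa1, 0x16, 0x5f, 0xf7}};

// Walk the driver's decoder profile list looking for a specific GUID.
static bool DriverExposes(ID3D11VideoDevice* vd, const GUID& profile) {
    const UINT count = vd->GetVideoDecoderProfileCount();
    for (UINT i = 0; i < count; ++i) {
        GUID g;
        if (SUCCEEDED(vd->GetVideoDecoderProfile(i, &g)) && g == profile)
            return true;
    }
    return false;
}

int main() {
    ID3D11Device* dev = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev,
                                 nullptr, nullptr)))
        return 1;

    ID3D11VideoDevice* vd = nullptr;
    if (FAILED(dev->QueryInterface(__uuidof(ID3D11VideoDevice), (void**)&vd))) {
        dev->Release();
        return 1; // driver exposes no video decode interface at all
    }

    std::printf("VP9 Profile 0 (8-bit)  : %s\n",
                DriverExposes(vd, kVp9Profile0) ? "yes" : "no");
    std::printf("VP9 Profile 2 (10-bit) : %s\n",
                DriverExposes(vd, kVp9Profile2_10bit) ? "yes" : "no");

    vd->Release();
    dev->Release();
}
```

Per the posts above, only Kaby Lake should answer "yes" to the second check right now.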
Anyone else find it funny that YouTube supports all these great things like 4K and HDR, but at the same time compresses the hell out of the videos, negating most of the point of having 4K and HDR in the first place?
I do. I only watch streaming video as a fallback; I much prefer my offline 28 Mbps AVC 1080p Blu-ray discs, along with the accompanying Dolby TrueHD / DTS-HD Master Audio or uncompressed multi-channel PCM!
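Bits per pixel per frame puts a number on that gap. The YouTube bitrate below is an assumed ballpark (YouTube doesn't publish exact figures), and the comparison ignores VP9 being more efficient per bit than AVC:

```cpp
#include <cstdio>

// Bits per pixel per frame: bitrate / (width * height * fps).
// A crude density metric that ignores codec efficiency differences,
// but it still shows how much thinner YouTube spreads its bits.
static double bitsPerPixel(double mbps, int w, int h, double fps) {
    return (mbps * 1e6) / (double(w) * h * fps);
}

int main() {
    // 1080p24 Blu-ray at 28 Mbps (the figure from the post above).
    std::printf("Blu-ray 1080p24 @ 28 Mbps : %.3f bits/pixel\n",
                bitsPerPixel(28.0, 1920, 1080, 24.0));
    // Assumed ~18 Mbps for YouTube 4K60 VP9; a ballpark, not an official number.
    std::printf("YouTube 4K60   @ ~18 Mbps : %.3f bits/pixel\n",
                bitsPerPixel(18.0, 3840, 2160, 60.0));
}
```

Even granting VP9 a generous 2x efficiency edge over AVC, the Blu-ray still has several times more bits to spend on every pixel.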
http://forum.doom9.org/showthread.php?p=1782414#post1782414
AMD STILL can't even enable VP9 hardware decoding on Polaris, as mentioned by an RX 480 owner; what makes you think they're even relevant?
https://forums.anandtech.com/threads/4k60-fps-vp9-decoding-performance.2477963/page-5#post-38412120
DXVA Checker pretty much exposes AMD's lies about VP9 support.
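What DXVA Checker reports is essentially the driver's own decoder device list, which you can dump yourself with a few lines of D3D11 (Windows only; error handling omitted for brevity). If no VP9 GUID shows up in the output, there is no fixed-function VP9 decode path, whatever the marketing slides say:

```cpp
#include <cstdio>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Print every decoder profile GUID the driver reports, which is roughly
// the "Decoder Device" list DXVA Checker displays. Compare the output
// against the DXVA_ModeVP9_* GUIDs in dxva.h.
int main() {
    ID3D11Device* dev = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev,
                                 nullptr, nullptr)))
        return 1;

    ID3D11VideoDevice* vd = nullptr;
    if (FAILED(dev->QueryInterface(__uuidof(ID3D11VideoDevice), (void**)&vd))) {
        dev->Release();
        return 1; // no video decode interface at all
    }

    const UINT count = vd->GetVideoDecoderProfileCount();
    for (UINT i = 0; i < count; ++i) {
        GUID g;
        if (SUCCEEDED(vd->GetVideoDecoderProfile(i, &g)))
            std::printf("{%08lX-%04hX-%04hX-%02X%02X-%02X%02X%02X%02X%02X%02X}\n",
                        g.Data1, g.Data2, g.Data3,
                        g.Data4[0], g.Data4[1], g.Data4[2], g.Data4[3],
                        g.Data4[4], g.Data4[5], g.Data4[6], g.Data4[7]);
    }

    vd->Release();
    dev->Release();
}
```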
Pascal already has VP9 decoding. I just watched the video in the OP at 4K 60 fps and my CPU usage was only about 2-3% in the Edge browser. Does HDR support require different hardware, or is it just a driver update?