
4K60 FPS VP9 decoding performance


VirtualLarry

No Lifer
Aug 25, 2001
50,427
6,027
126
My CHT Acer Cloudbook 1.6GHz dual-core couldn't play back the 4K60 VP9 video. Going at like 1 FPS. (Shouldn't be an internet issue; this laptop has 802.11ac wireless.)

Interestingly, I think these CHT "TV boxes" with similar specs are specced to be able to play back 4K videos, and most can do so successfully, but those videos are H.264 or H.265. I don't think CHT plays back VP9, or 4K VP9, in HW.
 

2is

Diamond Member
Apr 8, 2012
4,294
129
106
802.11ac will dictate the link speed between your device and the router; it doesn't mean your internet is fast. That's determined by the plan you have with your ISP.
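The distinction above can be sketched in a few lines: the usable streaming rate is the minimum of the local wireless link rate and the ISP plan rate. (The numbers below are illustrative, not taken from anyone's setup in this thread.)

```python
def effective_throughput_mbps(link_rate_mbps: float, isp_plan_mbps: float) -> float:
    """The slower hop (local link vs. ISP plan) caps usable streaming throughput."""
    return min(link_rate_mbps, isp_plan_mbps)

# A fast 802.11ac link doesn't speed up a 50 Mbps plan...
print(effective_throughput_mbps(433, 50))  # 50
# ...but slow N150-class wireless (~25 Mbps real-world) can become the bottleneck.
print(effective_throughput_mbps(25, 50))   # 25
```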
 

VirtualLarry

No Lifer
Aug 25, 2001
50,427
6,027
126
802.11ac will dictate the link speed between your device and the router, it doesn't mean your internet is fast. That's determined by the plan you have with your ISP
Yes, I realize that. I mentioned elsewhere in this thread that I have 50/50 internet.

Wireless N150 gets 20-30Mbit/sec nominally, which might limit my connection speed. I was pointing out that this laptop has AC wireless, so the local connection wasn't the bottleneck.
 

fastamdman

Golden Member
Nov 18, 2011
1,327
68
91
I wish I had a 50 upload. I'm on 150/25, but I actually get 150/30 on speed tests, give or take; on downloads I see 175-200 down because my service provider apparently has this "boost" thing.
 

imported_bman

Senior member
Jul 29, 2007
262
54
101
Played around with MPC-HC with the latest LAV Filters, which have hardware support for VP9 decode, and found the performance to be about the same as software decoding on my Skylake Core M. Guess one will have to wait for Kaby Lake to decode UHD VP9 on Core M. I hope Intel is ahead of the game on the upcoming AV1 codec and includes full hardware decode of AV1 in Cannonlake.
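For anyone poking at their decoder chain by hand, one low-tech check is to scan the decoder list an ffmpeg build reports (`ffmpeg -decoders`) for VP9 entries with a hardware-vendor suffix. A minimal sketch; the sample output text and the suffix heuristic are my own illustration, not a complete list:

```python
# Sketch: scan `ffmpeg -decoders`-style output for VP9 entries, flagging likely
# hardware decoders by vendor suffix. SAMPLE text is invented for illustration.
SAMPLE_DECODERS_OUTPUT = """\
 V..... vp8                  On2 VP8
 V..... vp9                  Google VP9
 V..... vp9_cuvid            Nvidia CUVID VP9 decoder (codec vp9)
"""

def find_vp9_decoders(decoders_text: str):
    hits = []
    for line in decoders_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[1].startswith("vp9"):
            # Heuristic: suffixed names indicate hardware decode paths.
            is_hw = any(s in parts[1] for s in ("_cuvid", "_qsv", "_v4l2m2m"))
            hits.append((parts[1], is_hw))
    return hits

print(find_vp9_decoders(SAMPLE_DECODERS_OUTPUT))
# [('vp9', False), ('vp9_cuvid', True)]
```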
 

Carfax83

Diamond Member
Nov 1, 2010
6,050
850
126
That same video when played using mpv only uses ~30-35%. This is on an i7-4771 with me just looking at task manager CPU usage.
One has to wonder: if ffVP9 is that much faster than libVPX, why hasn't Google integrated it into Chrome?

I mean, it's not like there are any royalties involved, since VP9 is open source. And one of the developers of ffVP9 used to work for Google, if I'm not mistaken..

My 4790k is still using like 75% on all cores on the 4k video with the gtx 1080. Is there something I need to enable for it to work with vp9? This is also a fresh windows install so I may be missing something as well.
What drivers are you using? Unless NVIDIA hasn't yet released drivers that enable hardware-accelerated decoding for VP9, you should be seeing much lower CPU utilization.

I would use NVInspector and look at your VPU usage when streaming 4K/8K videos. The VPU may not be working..
 

IGemini

Platinum Member
Nov 5, 2010
2,473
2
81
I read that ASUS worked HEVC decoding into the GTX 980 Poseidon's core, not sure if it extends to VP9. The 4K60 video consistently runs my GPU at 33% and my 4670K uses 50-75% (all stock), perfect playback with no dropped frames. My internet is around 110/10.

The 8K video is a different story. My setup drops a lot of frames at a couple points early on, but most of it plays fine. GPU uses 18% and CPU is fully taxed.
 

lyssword

Diamond Member
Dec 15, 2005
5,763
25
91
Using my rig: 0 dropped frames on 4K, with 60-70% load on all cores; 70 frames dropped on 8K (I believe those were due to the connection), with 90-100% of all cores loaded. 125Mbps connection.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
One has to wonder: if ffVP9 is that much faster than libVPX, why hasn't Google integrated it into Chrome?

I mean, it's not like there are any royalties involved, since VP9 is open source. And one of the developers of ffVP9 used to work for Google, if I'm not mistaken..
Because bikeshedding and politics. Your guess is as good as mine.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,265
75
91
unless you're one of the lucky few to have a Pascal GPU, it's decoded in software by the CPU.
Would be interesting to compare power consumption versus software decoding. Surprised no review has mentioned that in depth.

unless they changed it you need Chrome to play VP9, if you use another browser it will play using h264 and not display all the resolution options
Yeah, it's limited to 1080p@60.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,716
6
76
Core i5-750 @ stock stays at 91% up to 98% for the 20 seconds I tried it, and it was choppy as hell. I have 16GB RAM, which should be enough, but as mentioned above it's probably more to do with CPU/decoding power.
 

mnewsham

Lifer
Oct 2, 2010
14,364
344
136
Would be interesting to compare power consumption versus software. Surprised, no review has mentioned that in-depth.
With a 5820k @ 4.4GHz I idle around 170w power draw; while playing 4K YouTube in Chrome (CPU only) I was pulling between 215w and 255w, with 10-50% utilization per core.

When using the M$ Edge browser and my GTX 960 for hardware decoding, I'm pulling 190-210w with 0-20% utilization per core on the 5820k and 25-55% utilization on the GPU video engine.

So it appears to save a decent bit of power, a good 30w+ on average.
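As a quick sanity check on those figures, using midpoints of the quoted wattage ranges (my own rough reading of the numbers, not exact measurements):

```python
# Midpoints of the wattage ranges quoted above (rough reading, not exact data).
sw_draw_w = (215 + 255) / 2   # Chrome, CPU-only software decode
hw_draw_w = (190 + 210) / 2   # Edge + GTX 960 hardware decode
saving_w = sw_draw_w - hw_draw_w
print(saving_w)               # 35.0 -- consistent with the "30w+" estimate

# Over a two-hour 4K video, that works out to a small but real energy saving:
print(saving_w * 2 / 1000)    # 0.07 kWh
```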
 

mnewsham

Lifer
Oct 2, 2010
14,364
344
136
@mnewsham

Thanks. Did you use Edge with the VP9 flag enabled, though?
It wasn't, but I'm getting identical results using Firefox (with the VP9 flag enabled): 200-210w vs 220-250w. The power draw is much steadier as well; with Chrome it fluctuates a lot more.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,265
75
91
I see. In any case, Maxwell v2+ seems to support VP9 decoding well enough. In my own findings, Edge seems to be the most efficient as well.
 

SPBHM

Diamond Member
Sep 12, 2012
4,978
340
126
Yeah, it's limited to 1080p@60.
my info was outdated; back when I first checked the 4K60 videos last year, Chrome was the only browser with VP9 support

now testing the updated Firefox, it also uses VP9 and plays the 4K60. Both Chrome and Firefox had 99% CPU usage on my i5 (Sandy Bridge, 3GHz), but Chrome had a lot more dropped frames... Firefox basically kept dropped frames at 0 while I tested, though it didn't feel smooth like it does at lower res. In any case it was a clear win for Firefox; Chrome stuttered a lot more.
(I'm running 64-bit Chrome and 32-bit Firefox, the latest version of both)

but I guess it's a good thing my internet is so slow, so I can't watch this stuff anyway; I have to pause and let it buffer for a bit to watch a few seconds without the connection being a factor...

but as I said before, judging by what I see here, I would think a stock i7 2600 will run this stuff fine; higher-clocked old i5s, and new i5s in general, should also be fine...
 

ressonantia

Junior Member
Jun 16, 2016
10
0
11
i5-6500 @ stock speeds
Internet speeds around 12Mbps (definitely lots of buffering especially for the 8k video)

4k = CPU 60~65% usage, 0.6% dropped frames (although I think this has more to do with my connection than the CPU)
8k = CPU 60~80% usage, 6% dropped frames
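Percentages like the ones above come straight from the dropped/total frame counts that YouTube's "stats for nerds" overlay exposes; a trivial helper (the sample counts below are illustrative):

```python
def dropped_pct(dropped: int, total: int) -> float:
    """Dropped-frame percentage, as 'stats for nerds' style counters report it."""
    return 100.0 * dropped / total if total else 0.0

# A 60 fps clip played for 60 s delivers 3600 frames; dropping 22 of them
# is ~0.6%, in the same ballpark as the 4K figure above.
print(round(dropped_pct(22, 3600), 1))  # 0.6
```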
 

fastamdman

Golden Member
Nov 18, 2011
1,327
68
91
I'm using NVIDIA's newest hotfix drivers, aka 368.51.

I downloaded the program you recommended, and my GPU usage isn't going above 10% while the VPU stays at 0. Any suggestions or ideas as to why this would happen? The videos play just fine, and checking my CPU it looks like this time around it's sitting around 40% instead of the 70%-ish I was seeing prior, but I'm assuming it should be lower with the GTX 1080.

Let me know if there is another way I can test whether the VPU (no idea what that stands for or is) is actually working on this card.

 

Batmeat

Senior member
Feb 1, 2011
773
35
91
4K60 FPS on YouTube uses the VP9 codec, and is CPU intensive because unless you're one of the lucky few to have a Pascal GPU, it's decoded in software by the CPU.

I'm curious to see how various CPUs (especially quad cores with and without SMT, and dual cores with SMT) handle this test video of BF1 @ 4k60 in browser. (note that the actual resolution isn't 4K, but it shouldn't matter)

Check your CPU usage, and look for stuttering or lag. On my 5930K @ 4.4ghz I'm getting mid 20s to low 30s as far as CPU usage goes, and very smooth playback.

It doesn't default to 4k60 so you'll have to manually set it. Also, to get 4K you'll need either Chrome, Firefox or the latest Windows 10 insider's build of Edge.
I'm confused. Do we need to set it to 4K or not, and how do we do this? I got 32% CPU usage with no frame drops or stuttering using a Core 2 Duo E7500 at stock 2.9GHz. See the image link below.

 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I'm confused. Do we need to set it to 4K or not, and how do we do this? I got 32% CPU usage with no frame drops or stuttering using a Core 2 Duo E7500 at stock 2.9GHz. See the image link below.

You'll notice in that image in the "stats for nerds" that you were only playing the 480p stream. You need to manually select the 2160p stream (or 4320p) from the YouTube interface. So you've got 32% CPU usage with 480p. What GPU are you using?
 

Batmeat

Senior member
Feb 1, 2011
773
35
91
You'll notice in that image in the "stats for nerds" that you were only playing the 480p stream. You need to manually select the 2160p stream (or 4320p) from the YouTube interface. So you've got 32% CPU usage with 480p. What GPU are you using?
Now I see. Did it and saw 100% CPU usage and roughly 50% dropped frames with huge stutters.
 

escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
I'm confused. Do we need to set it to 4K or not, and how do we do this? I got 32% CPU usage with no frame drops or stuttering using a Core 2 Duo E7500 at stock 2.9GHz. See the image link below.

An E7500 hits 30% usage at DVD res? Surprising.
 

IGemini

Platinum Member
Nov 5, 2010
2,473
2
81
Would be interesting to compare power consumption versus software.
Interestingly, I run a pair of VE248H monitors separately on the 980 and 4600. It seems to allow for better framerates in gaming... when I tried this in Witcher 3 I got a ~10% bump. These numbers have been adjusted down 15W to account for the second monitor:

GTX 980:
4K: 100W
8K: 140W w/<5% dropped frames

HD 4600 (512MB VRAM):
4K: 135W
8K: 155W w/25% dropped frames

I'm mildly surprised the integrated didn't drop any frames at 4K, though it still ran the CPU at max.
 
