H.265 encode / decode, and Intel CPUs and QuickSync?


dazelord

Member
Apr 21, 2012
46
2
71
Don't know why you would use H.265 right now, even if you're willing to put up with the encode times, since hardly anything supports hardware playback yet.

For anyone who wants to check it out and has Chrome, you can download the libde265 player: go to apps and search for HEVC; it should be the first one listed.

I downloaded the 720p and 4K versions of "Tears of Steel" from here: http://www.libde265.org/downloads-videos/

The 4K video had my i7-4790 running 35-45% total CPU usage across 8 threads (4 physical cores + HT), and I could barely watch it. In VLC it was even worse: a slideshow with massive artifacting.

Only those with GTX 960s can play these back with hardware decode support right now. With 4-16x the processing needs for decode vs. H.264, using just the CPU is pretty much not an option unless you're at 720p. I suspect a lot of lower-end CPUs would struggle with 1080p H.265.

Tried the 4K "Tears of Steel" clip on my 4770Ks (@ 42, 44) using the standalone 4K HEVC player. Both systems saw CPU usage of around 35%. The first computer has a 750 Ti (which in theory should offer partial hardware decoding), while the other has an R9 280. Guess the hardware capabilities of the 750 Ti didn't kick in, or it does very little at the moment?
 

liahos1

Senior member
Aug 28, 2013
573
45
91
Has anybody heard of V-Nova and their Perseus codec?

http://www.v-nova.com/en/index.html

The company just came out of stealth mode on April 1st, and they claim:

• PERSEUS® is highly efficient and less CPU-intensive than all legacy and current state-of-the-art codecs (e.g., systems based on H.264/AVC, H.265/HEVC, VP9, JPEG-2000). Third party testing shows that PERSEUS® uses 15-35% less processor power than H.264 and offers even greater power savings with respect to H.265/HEVC

• 2x – 3x average compression gains compared with legacy video codecs. V-Nova promises to make 4K transmission commercially viable, while enabling HD on 3G or 4G mobile networks by using less power.

They have been developing the codec for 5 years and have some major support behind them:

• The consortium members publicly listed in V-Nova’s press release include Broadcom, European Broadcasting Union (EBU), Hitachi Data Systems (HDS), Intel, and Sky Italia.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Tried the 4K "Tears of Steel" clip on my 4770Ks (@ 42, 44) using the standalone 4K HEVC player. Both systems saw CPU usage of around 35%. The first computer has a 750 Ti (which in theory should offer partial hardware decoding), while the other has an R9 280. Guess the hardware capabilities of the 750 Ti didn't kick in, or it does very little at the moment?

The 750 Ti has no HW decoding for H.265, nor do the 970/980. They have a lot of advanced H.264 support though o_O

One thing that makes the subject confusing is Google's VPx, currently at VP9. It can do 4K video and is what YouTube uses, so when you are streaming a 4K YouTube video it is not using HEVC. Since VPx has been around since 2010 (vs. H.265, which came out in 2013), most current GPUs already have some kind of HW decode support for it.

So I said earlier only the GTX 960 supports H.265; turns out that isn't quite true. The 960 is still the only discrete GPU to support it, but:

As of a new driver patch set from Intel on Jan 15, 2015, the Haswell Iris / Iris Pro supports HW decode, as do the new Broadwell GPUs.

Which is great I guess, though I had planned to eventually put my 750 Ti into an HTPC :(
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
@shady28

So what happens when you play back said video format? Does your CPU go to 99% usage?

Why is it such a big deal? GM204 doesn't even support it. I'd rather get a faster CPU and eliminate any reliance on discrete graphics altogether.
 

Abwx

Lifer
Apr 2, 2011
11,783
4,691
136
As of a new driver patch set from Intel on Jan 15, 2015, the Haswell Iris / Iris Pro supports HW decode, as do the new Broadwell GPUs.

There's no H.265 hardware encoding/decoding in those chips; this feature will only be available with next-gen CPUs...
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
There's no H.265 hardware encoding/decoding in those chips; this feature will only be available with next-gen CPUs...

It's partial decode from what I understand, which is worthless; we all want what the 960 has, but in Intel GPUs. I just hope they get it right in Skylake.
 

Abwx

Lifer
Apr 2, 2011
11,783
4,691
136
It's partial decode from what I understand, which is worthless; we all want what the 960 has, but in Intel GPUs. I just hope they get it right in Skylake.

Yes, I have heard about the solution they cooked up, but using GPU + CPU for such a task drives utilisation up and heats the chip needlessly. I mean, this will get the fan spinning hard during video playback, which is quite annoying. Beyond that, it will be supported by the next iterations:


http://wccftech.com/intels-6th-gene...detailed-95w-enthusiast-quad-cores-confirmed/
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
@shady28

So what happens when you play back said video format? Does your CPU go to 99% usage?

Why is it such a big deal? GM204 doesn't even support it. I'd rather get a faster CPU and eliminate any reliance on discrete graphics altogether.

Below is what I get on an i7-4790 (not overclocked) with a 750 Ti.

If someone with a non-hyperthreaded quad core could run this and snapshot Task Manager, it might be more useful.

If you have a quad-core i7 running at 50%, that's like having all 4 physical cores pegged, which is why it gets fuzzy as to what hyperthreading is doing.

I think this is really pegging my cores though: I saw overall CPU spike to ~60% a couple of times, which means it's using more than 4 cores' worth, so HT is kicking in to give it a small boost.

Even so, I get a lot of stuttering and some artifacting, and a message comes up saying my computer is too slow to decode this without stuttering.
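As a back-of-the-envelope check on those numbers, here's a minimal sketch (assuming 4 cores / 8 threads like the i7-4790; the figures are just the arithmetic, not measured data):

```python
# Rough arithmetic for reading Task Manager totals on a hyperthreaded quad core.
# Assumes 4 physical cores / 8 logical threads (e.g. i7-4790).
LOGICAL_THREADS = 8
PHYSICAL_CORES = 4

def threads_worth(total_cpu_percent: float) -> float:
    """Convert an overall CPU % into fully-loaded logical threads' worth of work."""
    return total_cpu_percent / 100 * LOGICAL_THREADS

for pct in (35, 45, 50, 60):
    busy = threads_worth(pct)
    over = " (more than 4 cores' worth, so HT is contributing)" if busy > PHYSICAL_CORES else ""
    print(f"{pct}% total = {busy:.1f} threads' worth{over}")
```

In other words, 50% on an 8-thread chip already equals four pegged cores, and the ~60% spikes mean hyperthreading is soaking up the overflow.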

[screenshot: Task Manager during 4K HEVC software playback]
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Don't know why you would use H.265 right now, even if you're willing to put up with the encode times, since hardly anything supports hardware playback yet.

For anyone who wants to check it out and has Chrome, you can download the libde265 player: go to apps and search for HEVC; it should be the first one listed.

I downloaded the 720p and 4K versions of "Tears of Steel" from here: http://www.libde265.org/downloads-videos/

The 4K video had my i7-4790 running 35-45% total CPU usage across 8 threads (4 physical cores + HT), and I could barely watch it. In VLC it was even worse: a slideshow with massive artifacting.

Only those with GTX 960s can play these back with hardware decode support right now. With 4-16x the processing needs for decode vs. H.264, using just the CPU is pretty much not an option unless you're at 720p. I suspect a lot of lower-end CPUs would struggle with 1080p H.265.

I just played the video using the Google player; it wasn't very smooth. Then I downloaded the latest K-Lite codec pack and played it in Windows Media Player, and it played back perfectly. CPU usage ranged from 33-67% depending on the scene.

ETA: This is using an i7-3770K @ 4.2 GHz.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
I just played the video using the Google player; it wasn't very smooth. Then I downloaded the latest K-Lite codec pack and played it in Windows Media Player, and it played back perfectly. CPU usage ranged from 33-67% depending on the scene.

ETA: This is using an i7-3770K @ 4.2 GHz.

Going to try that too. According to the changelog, limited hardware decode assist was introduced in the codec pack for Intel and Nvidia GPUs.

That 33-67% tracks with what I've been seeing, but since you've got hyperthreading (like me), the percentages are a bit misleading.

Anyone with an i5 want to give it a go and see what the CPU load looks like?


This is the part of the changelog about H.265 (limited) support in hardware:

Changelog 10.7.1 to 10.7.5 ~ 2014-09-23
...
Added support for hardware accelerated decoding of HEVC/H.265 video. Because all currently available hardware does not yet contain a fixed-function decoder for HEVC, the current implementation utilizes the general shader and video processing units of the GPU. As a result the performance is lower and power consumption higher than you might expect. Future hardware will contain a fixed-function decoder and provide better acceleration. At this moment the decoder only works with high-end Intel (HD 4600/5x00/6x00) and NVIDIA (GTX 680/780/870/880/Titan) GPUs. Latest graphics driver is also required.
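If you want to sanity-check whether GPU-assisted decode is actually engaging, outside of any particular player, one option is to benchmark ffmpeg's DXVA2 path against pure software decode. This is just a sketch: the clip name is a placeholder, and whether DXVA2 helps for HEVC depends on your GPU and driver, since at this point it's the same shader-assisted hybrid approach the changelog describes.

```python
# Compare software vs. DXVA2 (GPU-assisted) HEVC decode throughput with ffmpeg.
# Decodes the clip and discards the output; wall-clock time is the metric.
# Clip name is a placeholder; run on Windows with ffmpeg on the PATH.
import subprocess
import time

CLIP = "tears_of_steel_4k.mkv"

def bench(extra_args):
    start = time.time()
    subprocess.run(
        ["ffmpeg", *extra_args, "-i", CLIP, "-f", "null", "-"],
        check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return time.time() - start

sw = bench([])                      # pure CPU decode
hw = bench(["-hwaccel", "dxva2"])   # GPU-assisted decode (Windows DXVA2)
print(f"software: {sw:.1f}s, dxva2: {hw:.1f}s")
```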
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
That worked like a charm using their included player.

Down to 7-15% CPU, a solid 24 FPS (which is what it's encoded at).

Freakin aye, good enough.

[screenshot: Task Manager showing 7-15% CPU with hardware-assisted decode]
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
That worked like a charm using their included player.

Down to 7-15% CPU, a solid 24 FPS (which is what it's encoded at).

Freakin aye, good enough.

[screenshot: Task Manager showing 7-15% CPU with hardware-assisted decode]

I just reconfigured the DirectShow filters to use Nvidia hardware decoding (they were set to software before). Still using WMP, I'm averaging about 10% on the CPU and about 25% on my 680.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
^ This. How much encoding do you plan to do, VirtualLarry: rip your entire 1,000-DVD collection, or just a few files? If you don't want to upgrade the CPU, then the obvious common-sense solution is to make the most of it when you're not using it (batch encode overnight, or while you're out at work, shopping, eating dinner, etc.). People used to do that anyway when single-core CPUs were the only option.

You don't need to do this anymore; just make sure the priority of the video conversion is set to low or idle and you can keep using your machine for whatever you want. You can even play your games without any slowdowns (for the game, that is; the conversion will slow down), and if you want, you can adjust (lower) your game FPS to get a better conversion time.
(video showing this being done on a Celeron)
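For anyone who wants to script that, here's a minimal sketch of kicking off an encode at idle priority on Windows; the HandBrakeCLI command and file names are placeholders, so substitute whatever converter you actually use:

```python
# Launch a video conversion at idle priority on Windows so any foreground
# app (games included) wins the scheduler fight; the encode only gets
# leftover cycles. Command and paths are illustrative.
import subprocess

cmd = ["HandBrakeCLI", "-i", "input.mkv", "-o", "output.mkv", "-e", "x265"]

# IDLE_PRIORITY_CLASS / BELOW_NORMAL_PRIORITY_CLASS are Windows-only
# constants exposed by subprocess (Python 3.7+).
proc = subprocess.Popen(cmd, creationflags=subprocess.IDLE_PRIORITY_CLASS)
proc.wait()
```

Idle priority is the aggressive option; BELOW_NORMAL_PRIORITY_CLASS is the gentler middle ground if the encode stalls too hard.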
 

dazelord

Member
Apr 21, 2012
46
2
71
The 750 Ti has no HW decoding for H.265, nor do the 970/980. They have a lot of advanced H.264 support though o_O

One thing that makes the subject confusing is Google's VPx, currently at VP9. It can do 4K video and is what YouTube uses, so when you are streaming a 4K YouTube video it is not using HEVC. Since VPx has been around since 2010 (vs. H.265, which came out in 2013), most current GPUs already have some kind of HW decode support for it.

So I said earlier only the GTX 960 supports H.265; turns out that isn't quite true. The 960 is still the only discrete GPU to support it, but:

As of a new driver patch set from Intel on Jan 15, 2015, the Haswell Iris / Iris Pro supports HW decode, as do the new Broadwell GPUs.

Which is great I guess, though I had planned to eventually put my 750 Ti into an HTPC :(

Too bad about the 750 Ti. I guess we will see plenty of hardware becoming obsolete due to lack of proper H.265 support.

Last year AnandTech said Maxwell had at least partial hardware support, whatever that means.
http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/2
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
You don't need to do this anymore; just make sure the priority of the video conversion is set to low or idle and you can keep using your machine for whatever you want. You can even play your games without any slowdowns (for the game, that is; the conversion will slow down), and if you want, you can adjust (lower) your game FPS to get a better conversion time.
(video showing this being done on a Celeron)

I don't know about that. HandBrake, using QSV, was also chewing up nearly 100% of my overclocked G3258 CPU time while encoding.
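Worth noting: in HandBrake, QSV offloads the encode stage; source decode and any filters still run on the CPU, which can easily keep a dual core like the G3258 near 100%. A minimal sketch of the CLI invocation (paths and quality value are placeholders):

```python
# HandBrakeCLI with Intel QuickSync: "-e qsv_h264" hands the encode stage to
# the iGPU, but source decode and any filters stay on the CPU, so a dual
# core can still sit near 100% even with QSV active. Paths are placeholders.
import subprocess

subprocess.run([
    "HandBrakeCLI",
    "-i", "input.mkv",
    "-o", "output.mp4",
    "-e", "qsv_h264",   # QuickSync H.264 encoder
    "-q", "22",         # constant quality target
], check=True)
```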
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
Anyone with an i5 want to give it a go and see what the CPU load looks like?
Don't have any i5s here. But I suppose you could disable HT on your rig; I don't think the extra cache would matter much.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Too bad about the 750 Ti. I guess we will see plenty of hardware becoming obsolete due to lack of proper H.265 support.

Last year AnandTech said Maxwell had at least partial hardware support, whatever that means.
http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/2

The last few tests shady and I did show that partial support is actually pretty darn good. We both got our CPU usage down to an average of about 10%. I'm using a 680; he's on a 750 Ti.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I don't know about that. HandBrake, using QSV, was also chewing up nearly 100% of my overclocked G3258 CPU time while encoding.

If you get 100% CPU usage with QSV then you are doing something wrong. It doesn't matter though; I know that conversion will get you to 100%. That's why I said "make sure that the priority of the video conversion is set to low or idle": this way, as soon as you run anything else, the conversion will slow down to allow it to run at full speed.
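And if the conversion is already running flat out, you can drop its priority after the fact, e.g. with psutil (the process name is a placeholder for whatever encoder you actually run):

```python
# Drop an already-running encode to idle priority on Windows via psutil,
# so it only consumes cycles nothing else wants. Process name is illustrative.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "HandBrakeCLI.exe":
        proc.nice(psutil.IDLE_PRIORITY_CLASS)  # Windows priority class
        print(f"lowered priority of PID {proc.pid}")
```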
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
http://www.anandtech.com/show/9219/the-surface-3-review/4

In addition to the GPU update, the ISP and hardware decode capabilities get a bump as well. There is full hardware acceleration for decode of H.263, MPEG4, H.264, H.265 (HEVC), VP8, VP9, MVC, MPEG2, VC1, and JPEG, as well as hardware encode for H.264, H.263, VP8, MVC, and JPEG. This marks the first Intel product to ship with the company's full, fixed-function HEVC decoder, making Atom the company's most advanced media processor, at least for this short moment.

Cherry Trail/Braswell has HEVC hardware decoding, Braswell NUC will be nice for HTPC usage.
 

mikk

Diamond Member
May 15, 2012
4,287
2,370
136
I thought Gen8 used a hybrid approach for HEVC decoding. There is no proof of this yet, and I'm not sure they are correct about a full fixed-function HEVC decoder.
 
Dec 30, 2004
12,553
2
76
I don't know about that. HandBrake, using QSV, was also chewing up nearly 100% of my overclocked G3258 CPU time while encoding.
The Windows 7/Vista scheduler is terrible compared to XP's, but in XP you could set the encode to lowest priority and do everything else just fine, gaming included; if the game used 75%, the encode job just got the remaining 25%.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
The Windows 7/Vista scheduler is terrible compared to XP's, but in XP you could set the encode to lowest priority and do everything else just fine, gaming included; if the game used 75%, the encode job just got the remaining 25%.

What are you basing that on? This is the exact opposite of everything I've ever read or experienced using multi-core CPUs with these operating systems. VirtualLarry's problem lies elsewhere, considering my CPU is nearly idle when using Intel QSV.