Sony Vegas GPU-accelerated rendering

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
I'm doing video editing with Sony Vegas, rendering out to MP4, and trying to decrease render times by using hardware acceleration. I was under the impression that GPUs ought to be able to render a lot faster?

I have a GTX 980 and CUDA is enabled, but I'm getting really long render times that are a bit slower than real time; the footage is 1920x1080 at 30fps. I've followed all the guides to enable CUDA, but it doesn't seem to make any difference compared to CPU-only rendering.

Looking at GPU usage, it's fluctuating between about 3% and 5%, which is tiny, while the CPU is pegged at 100% on all 8 threads (it's an i7 2600K @ 4.9GHz).

Is this just a case of rendering not being something you can parallelize well? In which case is this low GPU usage expected, or should it be higher? I find it hard to believe that Vegas can only use a few percent of the GPU.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Is this just a case of rendering not being something you can parallelize well? In which case is this low GPU usage expected, or should it be higher? I find it hard to believe that Vegas can only use a few percent of the GPU.

Rendering is a textbook case of parallelization, but GPU cores are far less accurate than CPU cores, so maybe (just maybe) Vegas is not using the GPU for the actual rendering but only for secondary stuff so you still end up with a quality transcode, thus the small utilization.
Maybe you could try a lower quality setting, or try a different program.

Also read this; you might have to find a codec with CUDA support.
https://forums.creativecow.net/thread/24/984273
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I have a GTX 980 and CUDA is enabled, but I'm getting really long render times that are a bit slower than real time; the footage is 1920x1080 at 30fps. I've followed all the guides to enable CUDA, but it doesn't seem to make any difference compared to CPU-only rendering.

NV isn't good for this particular application.



You would probably be better off with a 5820K @ 4.5GHz and an AMD graphics card for this program. Even a $120 used HD 7970 beats a 980 Ti in this program.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Vegas uses OpenCL, and AMD is FAR superior in OpenCL applications. But it sounds like you are not actually using the GPU at all, so do verify your settings using the link provided above.
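
If you want to double-check that the card is even visible to OpenCL outside of Vegas, here's a minimal sketch (assuming Python with the pyopencl package installed) that just lists whatever platforms and devices the driver exposes:

    # Minimal sketch: list the OpenCL platforms/devices the driver exposes.
    # Assumes Python with the pyopencl package installed (pip install pyopencl).
    import pyopencl as cl

    for platform in cl.get_platforms():
        print("Platform:", platform.name)
        for device in platform.get_devices():
            print("  Device:", device.name,
                  "-", cl.device_type.to_string(device.type))

If the GTX 980 doesn't show up here, Vegas won't see it either.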
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
NV isn't good for this particular application.


You would probably be better off with a 5820K @ 4.5GHz and an AMD graphics card for this program. Even a $120 used HD 7970 beats a 980 Ti in this program.

There's really not much difference there in the latest cards, though.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
There's really not much difference there in the latest cards, though.

In % terms of hours of work on a weekly basis over a year? A $120 7970 beats the $470 980 by 21%. My statement stands: if rendering speed is the priority for this program, a 6-8 core Intel i7 + AMD GPU is the way to go. Alternatively, the OP can re-assess his future GPU upgrade path in 2016-2017 based on the performance of future cards in Sony Vegas Pro and their compatibility/performance with Oculus Rift. Right now, everything we hear online is that NV also sucks for VR because its latency is way too high.
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
There's really not much difference there in the latest cards, though.
Probably because their test is too short and setup time dominates at that speed. This test could use an update to H.265 and 4K anyway.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
In % terms of hours of work on a weekly basis over a year? A $120 7970 beats the $470 980 by 21%. My statement stands: if rendering speed is the priority for this program, a 6-8 core Intel i7 + AMD GPU is the way to go. Alternatively, the OP can re-assess his future GPU upgrade path in 2016-2017 based on the performance of future cards in Sony Vegas Pro and their compatibility/performance with Oculus Rift. Right now, everything we hear online is that NV also sucks for VR because its latency is way too high.

I didn't know the OP was running a video rendering business...

For home use, there's really no significant difference, right? It's a few seconds among the latest cards.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Are you sure you're not encoding-bound? Vegas can use the GPU to accelerate the rendering process; however, most of the encoders make little to no use of the GPU.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I agree w/ ViRGE; sounds like it's either not turned on or not applicable for the task.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I didn't know the OP was running a video rendering business...

For home use, there's really no significant difference, right? It's a few seconds among the latest cards.

It's 20%-30% faster. Since when is that not significant? Just because that test runs only 20+ seconds doesn't mean that most real world tasks will.
 

VR Enthusiast

Member
Jul 5, 2015
133
1
0
Vegas is a great editor but a bad choice for rendering. GPU acceleration only applies to graphical effects, transitions, etc.; the rest is done on the CPU.

I'll assume you just want to make a video of gameplay, so if you're a masochist you probably want to check out a frameserving option like DebugMode FrameServer to fake the output as an .avi that can then be encoded with x264 in VirtualDub or something else (HandBrake doesn't work).

Be forewarned that the deeper you go into this stuff the more ridiculous it gets, but it kills Vegas and Premiere for speed and quality.
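
For the encode step itself, the idea is just to hand that intermediate AVI to a proper software x264 encode outside of Vegas. A rough sketch of what that looks like, assuming Python and an ffmpeg build with libx264 on the PATH (the file names are placeholders):

    # Rough sketch: re-encode a frameserved/intermediate AVI with x264 via ffmpeg.
    # Assumes ffmpeg built with libx264 is on the PATH; both paths are placeholders.
    import subprocess

    src = "frameserved_output.avi"   # hypothetical AVI exposed by the frameserver
    dst = "final_1080p30.mp4"        # hypothetical output file

    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-preset", "slow",           # slower preset = better quality per bitrate
        "-crf", "18",                # visually near-transparent for gameplay footage
        "-c:a", "aac", "-b:a", "192k",
        dst,
    ], check=True)

Same principle as the VirtualDub route, just scripted.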
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
LOL at the salesmen trying to sell OP a new card...

What version of Sony Vegas are you using, and what format is your video source?
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
LOL at the salesmen trying to sell OP a new card...

What version of Sony Vegas are you using, and what format is your video source?

Yeah, I'm not looking at buying a new card; I'm just trying to determine whether that's an expected render time or not. The fact that it's only using a few percent of the GPU could be a bug.

I'm doing two encodes: first from FRAPS uncompressed AVIs of roughly 250GB down to a ~10-15GB file using either MainConcept AVC/AAC or Sony AVC/MVC, saved to an MP4. This is with Sony Vegas 13.

After that I'll normally chop/edit the 10-15GB file into smaller 15-minute files, which can take about 40 minutes to re-render into the final videos.

Sony AVC/MVC has a GPU option if one is available, but its "Check GPU" button says "no GPU available". MainConcept AVC/AAC lists both OpenCL and CUDA as options, and its "Check GPU" button reports "CUDA detected". I also grabbed the CUDA installer from Nvidia's site and ran that to make sure it's the latest and up to date.

The most telling thing is the GPU usage: for such a straightforward GPGPU task I'd expect it to shoot to damn near 99% and stay close to that, and at that point I'd expect it to render something like 10x faster than the CPU. The CPU usage during GPU rendering is also really high, damn near maxed, so maybe some of the work is done on the CPU and it's simply bottlenecked. Still, a 2600K @ 4.9GHz is really fast.
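
For reference, this is roughly how the utilization numbers can be logged during a render rather than eyeballing a graph (a quick sketch, assuming Python and nvidia-smi on the PATH):

    # Quick sketch: log GPU utilization once a second while a render runs.
    # Assumes an NVIDIA driver with the nvidia-smi tool on the PATH.
    import subprocess
    import time

    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu,utilization.memory",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(time.strftime("%H:%M:%S"), out)
        time.sleep(1)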
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Just curious, why not use ShadowPlay instead of FRAPS and get an easy-to-use MP4 to start with?

FWIW, I see the same results with Sony Movie Studio: the same render times with GPU acceleration enabled and barely any GPU use. I gave up on it since some renders took longer and decided to let my 4790K do its thing.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It's 20%-30% faster. Since when is that not significant? Just because that test runs only 20+ seconds doesn't mean that most real world tasks will.

It's apparently not using the GPU, so there's probably still little difference there for the home user, even at 25% faster.

It's using the CPU.
 

Makaveli

Diamond Member
Feb 8, 2002
4,915
1,503
136
Huh?

GPU 3-5%
CPU totally pegged.

Depending on the codec being used, I can get GPU usage to 45-50% on either of my GPUs in Vegas 13.

Note my system has both a Radeon and an NV card, so I can use either one in Vegas.



If you choose automatic it will use both CPU and GPU.

 
Last edited:

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
So I ended up playing around with the settings in Movie Studio 13 (which look identical to what is posted above). I used a test clip of some 1080p video captures, rendered in the best quality. With CPU+GPU CUDA and my 980 Ti, I saw 15-20% GPU usage. Render time was 8:15. I went back and did it again with CPU only, all other settings the same. GPU usage was at 1% the whole time and the render finished in 7:30.

So, 45 seconds faster without. Sounds broken in Sony's implementation. The files were exactly the same when finished; I even MD5-hashed them and they are identical.
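
For anyone who wants to repeat the check, comparing the two renders is as simple as hashing both files; a small sketch (the file names are placeholders):

    # Small sketch: confirm two renders are byte-identical by comparing MD5 hashes.
    # The file names are placeholders for the CPU-only and CPU+GPU outputs.
    import hashlib

    def md5sum(path, chunk_size=1 << 20):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    print(md5sum("render_cpu_only.mp4") == md5sum("render_cpu_plus_gpu.mp4"))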
 

Makaveli

Diamond Member
Feb 8, 2002
4,915
1,503
136
So I ended up playing around with the settings in Movie Studio 13 (which look identical to what is posted above). I used a test clip of some 1080p video captures, rendered in the best quality. With CPU+GPU CUDA and my 980 Ti, I saw 15-20% GPU usage. Render time was 8:15. I went back and did it again with CPU only, all other settings the same. GPU usage was at 1% the whole time and the render finished in 7:30.

So, 45 seconds faster without. Sounds broken in Sony's implementation. The files were exactly the same when finished; I even MD5-hashed them and they are identical.

Are you able to set it to GPU only then test?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
So I ended up playing around with the settings in Movie Studio 13 (which look identical to what is posted above). I used a test clip of some 1080p video captures, rendered in the best quality. With CPU+GPU CUDA and my 980 Ti, I saw 15-20% GPU usage. Render time was 8:15. I went back and did it again with CPU only, all other settings the same. GPU usage was at 1% the whole time and the render finished in 7:30.

So, 45 seconds faster without. Sounds broken in Sony's implementation. The files were exactly the same when finished; I even MD5-hashed them and they are identical.
There's nothing "broken" about Sony's implementation. You are straight-up encoding bound, and a GPU is not going to help you here.

GPUs are pretty awful on the whole for H.264 encoding; it's a fairly linear task that doesn't mesh well with the thousands-of-threads model that GPUs use. Context-adaptive binary arithmetic coding (CABAC) in particular really hammers GPUs. As a result, the only time GPUs can do well at H.264 encoding is when they use a fixed-function block such as VCE/NVENC/QuickSync.

Sony's implementation works correctly. Using a GPU in a programmatic fashion (i.e. without a fixed-function encoder) is in fact slower than a modern CPU. You either need to use QuickSync or invest in a different CPU, as the GPU is only useful for rendering, not encoding.
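
To make the distinction concrete, here's the kind of comparison you can run outside of Vegas: software x264 versus the fixed-function encoders. A sketch, assuming Python and an ffmpeg build that includes libx264 plus h264_nvenc (NVENC) and/or h264_qsv (QuickSync); the source path is a placeholder, and whether each encoder is present depends on your hardware and build:

    # Sketch: time a software x264 encode against fixed-function hardware encoders.
    # Assumes an ffmpeg build with libx264, h264_nvenc and/or h264_qsv available;
    # the input clip is a placeholder.
    import subprocess
    import time

    src = "capture_1080p30.avi"  # hypothetical source clip

    for codec in ["libx264", "h264_nvenc", "h264_qsv"]:
        start = time.perf_counter()
        result = subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", "10M",
             "-an", "out_%s.mp4" % codec],
            capture_output=True,
        )
        elapsed = time.perf_counter() - start
        status = "ok" if result.returncode == 0 else "encoder not available"
        print("%s: %.1fs (%s)" % (codec, elapsed, status))

The fixed-function encoders will typically finish far faster, at the cost of some quality per bitrate compared to software x264.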