GPUs: the next frontier in film

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
More PR fluff. 2012 is one of the most CG- and VFX-heavy films ever produced, and we rendered it all on CPU, not GPU. Maybe in 5 years the GPU will take over, but for now it's not going to happen.

And when they say that Cloudy with a Chance of Meatballs used the GPU, yeah it did, the same way we used it on 2012: to render artist previews. The output for film was still CPU render farms. That is where the real processing-power shortage is.
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Modelworks said:
More PR fluff. 2012 is one of the most CG- and VFX-heavy films ever produced, and we rendered it all on CPU, not GPU. Maybe in 5 years the GPU will take over, but for now it's not going to happen.

And when they say that Cloudy with a Chance of Meatballs used the GPU, yeah it did, the same way we used it on 2012: to render artist previews. The output for film was still CPU render farms. That is where the real processing-power shortage is.

I don't think anyone is really saying it's going to instantly take over right now; 5 years is a fairly good estimate. It takes time for widespread adoption to happen and standards to be set. But the transition is definitely starting to gain momentum now that people are becoming more aware of the potential power. And if Fermi's whitepaper is true to its word, then it should be a real boost to that sector.

The more industry software gets fully functioning GPU acceleration, the faster adoption will become.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I can say that the GPU is making a dent in some things that could not have been done before in CG work. Two programs released this year really make use of the GPU in ways that could not be done on a CPU.

Mudbox 2010 - does real-time display of textures, lighting, and bump maps at very high poly counts.
http://usa.autodesk.com/adsk/servlet/pc/index?id=13565063&siteID=123112

3D-Coat - uses the GPU for similar things as Mudbox does, but also supports voxel sculpting via CUDA.

3D-Coat is a great program by a very talented programmer. One guy wrote it, and he apparently is a math whiz, because some of the features he codes are done days after someone requests them.
http://www.3d-coat.com/
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Kakkoii said:
I don't think anyone is really saying it's going to instantly take over right now; 5 years is a fairly good estimate. It takes time for widespread adoption to happen and standards to be set. But the transition is definitely starting to gain momentum now that people are becoming more aware of the potential power. And if Fermi's whitepaper is true to its word, then it should be a real boost to that sector.

The more industry software gets fully functioning GPU acceleration, the faster adoption will become.

You're correct, it has started already. DreamWorks went all Intel CPU and Larrabee for the future in film. ATI has some really great stuff for film also.

I know about the Fermi whitepaper, an interesting read. But NV did just recently hire engineers from Transmeta. So either Fermi was going to emulate x86 and it's not working and NV needed help, or Fermi isn't working out and NV needs to hurry with emulating x86.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Modelworks said:
And when they say that Cloudy with a Chance of Meatballs used the GPU, yeah it did, the same way we used it on 2012: to render artist previews. The output for film was still CPU render farms. That is where the real processing-power shortage is.

You used CUDA / GPU farms for the previews but CPU farms for the end product? That wouldn't appear to make sense from a business standpoint, but I'm not in the industry. Can you elaborate a little?
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Concillian said:
You used CUDA / GPU farms for the previews but CPU farms for the end product? That wouldn't appear to make sense from a business standpoint, but I'm not in the industry. Can you elaborate a little?


Not really farms, just lots of cards to accelerate previews.

Example, something like smoke in a scene.

Smoke is created by generating particles: basically flat planes with 3 or 4 sides. Each plane is mapped with a transparency map and a texture map. These planes are tiny, on the order of 2x2 pixels, and you use millions of them to make up something like smoke from a burning car. To get them to move like smoke, you have to run a simulation that takes into account wind, gravity, particle velocity, birth rate and death rate, as well as how long each particle will be visible in the scene.

Where something like a GPU helps is in crunching the numbers of that simulation. It is almost all math, and the GPU excels at that. So I can tweak the simulation while displaying it at 3-4 fps and see the results, versus using the CPU alone and waiting 30 seconds after every change to see the outcome. Realize that even with the GPU assist I cannot view what the final render will look like; the resolutions and the amount of other calculation required for the final scene render are too much for current setups.
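For the curious, the particle loop described above (wind, gravity, velocity, birth rate, death rate, lifetime) can be sketched in a few lines. This is an illustrative toy on the CPU, not the studio's actual simulation code; every name and constant here is made up, and a production system would run the per-particle math on the GPU across millions of particles:

```python
import random

class Particle:
    """One tiny textured quad: position, velocity, age, and a lifespan."""
    def __init__(self, lifetime):
        self.pos = [0.0, 0.0, 0.0]
        self.vel = [0.0, 0.0, 0.0]
        self.age = 0.0
        self.lifetime = lifetime

def step(particles, dt, wind=(0.5, 0.0, 0.0), gravity=(0.0, -9.8, 0.0),
         birth_rate=10, max_lifetime=2.0):
    # Advance every live particle: accumulate wind + gravity, integrate position.
    for p in particles:
        for i in range(3):
            p.vel[i] += (wind[i] + gravity[i]) * dt
            p.pos[i] += p.vel[i] * dt
        p.age += dt
    # Death: cull particles that have outlived their lifespan.
    particles = [p for p in particles if p.age < p.lifetime]
    # Birth: emit new particles at the source with a random lifespan.
    for _ in range(birth_rate):
        particles.append(Particle(random.uniform(0.5, max_lifetime)))
    return particles

particles = []
for _ in range(60):  # one second of simulation at 60 steps per second
    particles = step(particles, dt=1.0 / 60.0)
print(len(particles))
```

The inner per-particle update is embarrassingly parallel, which is exactly why it maps so well onto a GPU: each particle's math is independent of every other particle's.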

The final render is done on CPU because right now the rendering code only runs on CPUs. It is very complex, and porting it to the GPU will take a lot of time; the cost in time has to be worth the gains. Current GPUs also are not set up to handle the massive amounts of memory and interconnect that would be required. You would need cards with 8 GB+ of RAM each, and several hundred GPUs, to match what render farms can do now. It will get there, but not for several years.

The future for things like smoke creation is morphing particles: particles that can merge into other particles and break free from existing ones, similar to what you might have seen in fluid demonstrations using PhysX, but a lot more advanced.
 