Rendering - CPU intensive or GPU intensive?

Cogman

Lifer
Sep 19, 2000
10,277
125
106
It depends.

You can do almost all rendering on the CPU if you like; likewise, you can do almost all of it on the GPU if you like.

Generally, people think of rendering as a GPU-intensive process because the GPU usually does the lion's share of it. However, there is no hard "this is always the case" rule.

Check out raytracing for a good example of modern CPU rendering.
[Image: Glasses_800_edit.png]

This pic always astounds me; it was made entirely on the CPU.
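For anyone curious what "rendering on the CPU" actually looks like, here is a hypothetical bare-bones sketch in Python: one sphere, one light, one primary ray per pixel, every intersection worked out in an ordinary loop. Real ray tracers pile bounces, materials, and acceleration structures on top of exactly this kind of loop.

```python
# Minimal CPU ray tracer sketch: one sphere, one directional light,
# one primary ray per pixel. Illustrative only.
import math

WIDTH, HEIGHT = 160, 120
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (-0.577, 0.577, 0.577)   # roughly normalized direction toward the light

def hit_sphere(origin, direction):
    """Return distance t of the nearest hit, or None if the ray misses."""
    oc = [origin[i] - SPHERE_CENTER[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c           # a == 1 because direction is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

rows = []
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Build a ray through this pixel from a camera at the origin.
        dx = (x / WIDTH - 0.5) * 2.0
        dy = (0.5 - y / HEIGHT) * 2.0
        length = math.sqrt(dx * dx + dy * dy + 1.0)
        d = (dx / length, dy / length, -1.0 / length)
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row.append(0)            # background
        else:
            # Lambertian shading: brightness = max(0, N . L)
            p = tuple(t * d[i] for i in range(3))
            n = tuple((p[i] - SPHERE_CENTER[i]) / SPHERE_RADIUS for i in range(3))
            shade = max(0.0, sum(n[i] * LIGHT_DIR[i] for i in range(3)))
            row.append(int(255 * shade))
    rows.append(row)

# Write the result as a plain-text PGM so it can be opened in an image viewer.
with open("sphere.pgm", "w") as f:
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for row in rows:
        f.write(" ".join(map(str, row)) + "\n")
```

Run it and open sphere.pgm in an image viewer; every pixel is pure CPU work, which is why real scenes take hours.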
 

senseamp

Lifer
Feb 5, 2006
35,787
6,195
126
Depends on whether your renderer is written for the CPU or the GPU. A high-end GPU is far more parallel, so it will generally have higher throughput, but it's doing the same types of calculations. You can do ray tracing on either now.
 

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
Well, some of my 3D models rendered in 3D Studio Max have taken 72 hours (back on an AMD 3200+), and they only used the CPU.
 

MagnusTheBrewer

IN MEMORIAM
Jun 19, 2004
24,135
1,594
126
3ds Max, Maya, Lightwave, and Blender are all heavily CPU dependent. They are all written to take advantage of as many cores as you can throw at them. The GPU is used for effects and applying textures, but rendering is all about the CPU. While there may be some rendering software out there that primarily uses the GPU, I have never heard of one.
 

abhaybhegde

Member
Jun 13, 2007
26
0
0
Thanks for the response. Is there any freeware that lets me benchmark ray-traced images on two different CPUs?
 

Murloc

Diamond Member
Jun 24, 2008
5,382
65
91
It looks weird even if you remove the ice cube.
It's too perfect and glossy.
 

silverpig

Lifer
Jul 29, 2001
27,709
11
81
Real-time - GPU
Pre-rendered - CPU

The image is definitely good, but the glass looks too clean to be real. Only the champagne flute should be that polished :)
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The CPU/GPU debate is about to get really heated in the professional 3d market.
The problem with GPU rendering in the past has been that it just didn't support the software. If I wanted to render a scene that used ambient occlusion, I couldn't do it on the GPU; the renderer's interface to the GPU didn't support it. There were lots of other things the GPU-based renderers did not support, and this had nothing to do with what features the GPU chip supported, just the features the software implemented on the GPU. When it came time to render out work, it was either split the workflow between the CPU and GPU and somehow re-assemble the end product (which rarely worked), or just leave it all on the CPU, which supported everything.
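(Aside for anyone who hasn't run into the term: ambient occlusion is, roughly, "for each shading point, fire a bunch of sample rays over the hemisphere and darken the point by the fraction that hit nearby geometry." A hypothetical, heavily stripped-down Python sketch, with a single sphere standing in for the whole scene:)

```python
# Hypothetical ambient-occlusion estimate for a single shading point.
# The "scene" here is just one occluding sphere; a real renderer would
# query its full acceleration structure instead.
import math
import random

OCCLUDER_CENTER = (0.5, 1.0, 0.0)
OCCLUDER_RADIUS = 0.4

def ray_hits_occluder(origin, direction, max_dist):
    """True if the ray hits the occluding sphere within max_dist."""
    oc = [origin[i] - OCCLUDER_CENTER[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - OCCLUDER_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.0 < t < max_dist

def ambient_occlusion(point, normal, samples=64, max_dist=2.0):
    """Fraction of hemisphere sample rays that escape (1.0 = fully open)."""
    unoccluded = 0
    for _ in range(samples):
        # Pick a random direction, then flip it into the hemisphere around the normal.
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(x * x for x in d)) or 1.0
        d = [x / length for x in d]
        if sum(d[i] * normal[i] for i in range(3)) < 0.0:
            d = [-x for x in d]
        if not ray_hits_occluder(point, d, max_dist):
            unoccluded += 1
    return unoccluded / samples

# A point on a floor at y = 0, facing straight up, partly shadowed by the sphere.
print(ambient_occlusion((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```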

The rendering software used now in the industry is extremely complex in its settings, and most people in studios know how to tweak those settings to get the results they want. Try to replace that with some 'click to render' button and you had better run for cover from the backlash that will result. For the GPU to become a viable alternative, someone had to sit down and implement ALL the features, or people would not even consider it.

Some companies have claimed they have done it:
http://www.randomcontrol.com/arion
http://www.refractivesoftware.com/

Autodesk has the Quicksilver hardware renderer coming out for 3ds Max 2011.
CPU rendering is the dominant method right now, but GPU rendering is something people are now starting to play with in 3D apps.

When the GPU can render like this, then people will switch.
[Image: vicki.jpg]

http://forums.cgsociety.org/showthread.php?f=121&t=713053
 

abhaybhegde

Member
Jun 13, 2007
26
0
0
Modelworks said:
The CPU/GPU debate is about to get really heated in the professional 3d market. [...] When the GPU can render like this, then people will switch.

Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach that can accommodate larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?

I want to benchmark a certain class of CPUs with regard to rendering. Could you point out any freeware/open-source rendering benchmarking tool (like Cinebench) that can be used? What I mainly intend to do is study the time taken to render a frame on two different classes of processor.

Thanks
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
Although a slight derail: I can often spot a rendered image versus a real one because of how perfect the rendered image looks (no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and calculation needed to map out all the random imperfections could technically be astronomical.

I wonder if there is a point where an image looks "too good."
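A lot of that look gets broken up just by multiplying a cheap noise/grime map into the surface color or roughness. Here is a hypothetical little Python sketch of the idea; it writes a flat "perfect" surface and a noised-up version as PGM files so you can compare them side by side:

```python
# Hypothetical sketch: break up a "too perfect" flat surface by multiplying
# in a cheap value-noise grime map. Writes before/after as grayscale PGMs.
import random

W, H, CELL = 256, 256, 32
random.seed(7)

# Random brightness at coarse grid corners; bilinear interpolation between them.
grid = [[random.uniform(0.75, 1.0) for _ in range(W // CELL + 2)]
        for _ in range(H // CELL + 2)]

def smooth(t):
    return t * t * (3.0 - 2.0 * t)      # smoothstep for softer blending

def grime(x, y):
    gx, gy = x / CELL, y / CELL
    ix, iy = int(gx), int(gy)
    fx, fy = smooth(gx - ix), smooth(gy - iy)
    top = grid[iy][ix] * (1 - fx) + grid[iy][ix + 1] * fx
    bot = grid[iy + 1][ix] * (1 - fx) + grid[iy + 1][ix + 1] * fx
    return top * (1 - fy) + bot * fy

def write_pgm(name, pixels):
    with open(name, "w") as f:
        f.write(f"P2\n{W} {H}\n255\n")
        for row in pixels:
            f.write(" ".join(str(v) for v in row) + "\n")

clean = [[220 for _ in range(W)] for _ in range(H)]             # the "perfect" surface
dirty = [[int(220 * grime(x, y)) for x in range(W)] for y in range(H)]

write_pgm("clean.pgm", clean)
write_pgm("dirty.pgm", dirty)
```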
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach that can accommodate larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?

A lot of it comes from how 3D rendering evolved. When I first started in this about 15 years ago, the best we had were simple ray tracers like POV-Ray. Then people started adding features along the way: caustics, subsurface scattering, bump mapping, etc. All those functions were designed with a CPU as the target, and the whole method of how the calculations are done is based on that platform. When you switch to the GPU, you have to look at what you want the end result to be and figure out how to adapt it to the GPU while still getting the exact same visual result. A lot of these features depend on registers that are unique to the CPU.

I want to benchmark a certain class of CPUs with regard to rendering. Could you point out any freeware/open-source rendering benchmarking tool (like Cinebench) that can be used? What I mainly intend to do is study the time taken to render a frame on two different classes of processor.

There isn't anything out there yet that has the same CPU and GPU features in the renderer, except maybe Octane; I haven't checked out the latest beta. You would need to load the same scene, render it CPU-only and then GPU-only, and that would give a fair comparison. I never liked Cinebench; it really doesn't stress a system the way it would be used in the real world. The scenes are too simple.

You could download the trial of Maya 2010 and load up some scenes and compare render times that way.
http://usa.autodesk.com/adsk/servlet/pc/index?id=13578047&siteID=123112
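If all you want is wall-clock frame times on two machines, a small wrapper around any command-line renderer will do the job. A rough sketch: the Blender background-render command below is only an example, scene.blend is a placeholder, and you would substitute whichever renderer and scene you actually use.

```python
# Rough render-time harness: run the same render command N times and report timings.
# The example command renders frame 1 of a .blend file with Blender in
# background mode; swap in your own renderer's CLI.
import statistics
import subprocess
import time

RENDER_CMD = ["blender", "-b", "scene.blend", "-f", "1"]   # placeholder scene
RUNS = 3

times = []
for i in range(RUNS):
    start = time.perf_counter()
    subprocess.run(RENDER_CMD, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    elapsed = time.perf_counter() - start
    times.append(elapsed)
    print(f"run {i + 1}: {elapsed:.1f} s")

print(f"median frame time: {statistics.median(times):.1f} s over {RUNS} runs")
```

Run the same script with the same scene on each CPU you want to compare and the median times give you a straightforward apples-to-apples number.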
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Although a slight derail: I can often spot a rendered image versus a real one because of how perfect the rendered image looks (no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and calculation needed to map out all the random imperfections could technically be astronomical.

I wonder if there is a point where an image looks "too good."


I wrote an article about that about 10 years ago. Computers like to generate perfect images, and it takes time on the artist's part to cover that up. There are quite a few tutorials out there that discuss how to make things look more realistic. I have a hard time playing games anymore because the graphics I see just make me want to grab the artist who did them and shake them; a lot of it doesn't have to look that way. My current pet peeve is the bloom effect. It is WAY overused in game graphics. So many people have jumped into the industry because they think it is the cool thing to do that it is really hurting the overall look of the industry.

I was talking with an artist who did a movie, The Scorpion King 2. If you have seen it, the effects are horrendous. I told him what I thought of his work, and he replied that it was money and he worked as hard as he got paid to. I replied that you shouldn't take the job if that is the kind of work you are going to turn out; everyone sees it and uses it as a reference for what you can do. He didn't care, it was only money to him. Pride in your work seems to be losing ground.
 

HeXen

Diamond Member
Dec 13, 2009
7,831
37
91
Thanks for the response. Is there any freeware that lets me benchmark ray-traced images on two different CPUs?

There is some game-related RT stuff, like Quake 3 Ray Tracing, though I think it's a hybrid of rasterization and RT. This and several others use an open-source ray tracing engine; I think it's called OpenRT. I don't know much about it, but you can Google it.

There is a simple RT benchmark called C-Ray, and another called BART. There is also the Realstorm engine here:
http://www.realtimeraytrace.de/

None of these are like the CG stuff, though; they don't look any better than your typical 3D game. BART is one of the better looking ones, but I don't think there's an actual demo for it.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach that can accommodate larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?

Yikes, not to be too much of a stickler here, but that analogy isn't a good one. A better one would be the difference between multiplying via repeated addition versus a dedicated multiply instruction.

CPUs do lots of things decently well, whereas GPUs do very few things really well. (And it's a fairly recent development that we can even compare the two.)

Got huge arrays of data that need the same operation performed on them? That screams "use a GPU." Got tons of branching operations on relatively small data sets? That's got CPU written all over it.

It'll be a long time (if ever) before we see GPUs used more like CPUs.
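To make that concrete, here is a hypothetical little Python comparison: NumPy's vectorized style stands in for the GPU's "same operation over a huge array" model, and a branchy, state-dependent loop stands in for the CPU-shaped work. (Everything here runs on the CPU, so it only illustrates the shape of the two workloads, not actual GPU speedups.)

```python
# Two workload shapes: uniform math over a big array (the kind of thing a GPU
# eats for breakfast) vs. branchy, serially dependent logic (much more CPU-shaped).
import time
import numpy as np

N = 2_000_000
data = np.random.rand(N).astype(np.float32)

# 1) The same operation applied to every element -- maps cleanly onto GPU threads.
start = time.perf_counter()
shaded = np.sqrt(data) * 0.5 + 0.25
print(f"uniform array op:    {time.perf_counter() - start:.3f} s")

# 2) Branch-heavy, state-dependent logic -- hard to spread across GPU threads.
start = time.perf_counter()
state, total = 0.0, 0.0
for x in data[:200_000]:          # fewer elements; pure-Python loops are slow
    if x > state:
        state = 0.5 * (state + x)
    elif x < 0.1:
        state *= 0.9
    total += state
print(f"branchy serial loop: {time.perf_counter() - start:.3f} s")
```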
 
May 11, 2008
19,561
1,195
126
Modelworks said:
The CPU/GPU debate is about to get really heated in the professional 3d market. [...] When the GPU can render like this, then people will switch.

Is this perhaps the real reason why Intel developed Larrabee? To replace those CPU-only render farms?
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
In video, the technique was first applied by Pinnacle Systems; in fact, I believe they were first across the board. The last version of this (Avid just announced that Liquid is now dead) would play M2V (1080i/p) in real time with effects (CPU-type effects rendered, GPU effects rendered on the fly). It started as a project with ATI using DirectX on the ATI 8500 chipset, and it was the first practical application that showed the benefit of the PCIe bus vs. AGP; it was demoed at IDF when PCIe was being introduced.

But as noted above, what you want to do determines how fast it will be.

Note that Avid is now using an OpenGL renderer in Media Composer 4.x. I may take the $500 upgrade offer; now I just need a Quadro card...
 

SSJ4Gogeta

Junior Member
Mar 1, 2000
6
0
0
There's some aliasing which gives away that the image is rendered:
[Image: Glasses_800_edit.png]


(I can't upload attachments, so I had to upload the file elsewhere. What is the minimum number of posts you have to make before you can upload attachments?)