Rendering - CPU intensive or GPU intensive?

Discussion in 'Highly Technical' started by abhaybhegde, Mar 16, 2010.

  1. abhaybhegde

    abhaybhegde Member

    Joined:
    Jun 13, 2007
    Messages:
    26
    Likes Received:
    0
Hi, so what do you think? Is rendering CPU intensive or GPU intensive? Also, why?

    Cheers
     
  2. Cogman

    Cogman Lifer

    Joined:
    Sep 19, 2000
    Messages:
    10,080
    Likes Received:
    12
    It depends.

You can do almost all rendering on the CPU if you like; likewise, you can do almost all of it on the GPU if you like.

Generally, people think of rendering as a GPU-intensive process because the GPU usually does do the lion's share of it. However, there is no hard "this is always the case" sort of rule.

Check out ray tracing for a good example of modern CPU rendering. [IMG]
This pic always astounds me; it was made entirely on the CPU.
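To make that concrete, here is a toy sketch (not from any real renderer) of the per-pixel arithmetic a CPU ray tracer grinds through: one primary ray per pixel, a single sphere, and simple Lambert shading. All names and the scene setup are illustrative.

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the
    # nearest positive t. 'direction' is assumed unit length, so a == 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def render(width, height):
    # Pinhole camera at the origin looking down +z at a unit sphere.
    sphere_c, sphere_r = (0.0, 0.0, 3.0), 1.0
    light = (0.577, 0.577, -0.577)  # unit vector toward the light
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a ray through the image plane.
            u = (x + 0.5) / width * 2.0 - 1.0
            v = (y + 0.5) / height * 2.0 - 1.0
            d = (u, v, 1.0)
            norm = math.sqrt(sum(k * k for k in d))
            d = tuple(k / norm for k in d)
            t = ray_sphere((0.0, 0.0, 0.0), d, sphere_c, sphere_r)
            if t is None:
                row.append(0.0)  # background
            else:
                # Lambert shading: brightness = max(0, normal . light)
                hit = tuple(t * k for k in d)
                n = tuple((h - c) / sphere_r for h, c in zip(hit, sphere_c))
                row.append(max(0.0, sum(a * b for a, b in zip(n, light))))
        image.append(row)
    return image
```

Every pixel repeats this intersection-and-shading loop, which is why frame times scale so directly with CPU speed; a real renderer layers reflections, shadows, and sampling on top of the same core math.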
     
    #2 Cogman, Mar 16, 2010
    Last edited: Mar 16, 2010
  3. senseamp

    senseamp Lifer

    Joined:
    Feb 5, 2006
    Messages:
    27,327
    Likes Received:
    21
Depends on whether your renderer is written for the CPU or the GPU. A high-end GPU is far more parallel, so it will generally have higher throughput, but it's doing the same types of calculations. You can do ray tracing on either now.
     
  4. BassBomb

    BassBomb Diamond Member

    Joined:
    Nov 25, 2005
    Messages:
    8,385
    Likes Received:
    0
Well, some of my 3D models rendered in 3D Studio Max have taken 72 hours (back on an AMD 3200+), and they used only the CPU.
     
  5. I_Sinsear_I

    I_Sinsear_I Junior Member

    Joined:
    Mar 9, 2010
    Messages:
    9
    Likes Received:
    0
    Depends on the program.
     
  6. MagnusTheBrewer

    Joined:
    Jun 19, 2004
    Messages:
    20,010
    Likes Received:
    7
3ds Max, Maya, Lightwave, and Blender are all heavily CPU dependent; they are all written to take advantage of as many cores as you can throw at them. The GPU is used for effects and applying textures, but rendering is all about the CPU. While there may be some rendering software out there that primarily uses the GPU, I have never heard of one.
     
  7. wwswimming

    wwswimming Banned

    Joined:
    Jan 21, 2006
    Messages:
    3,712
    Likes Received:
    0
It would be hard to tell if it weren't for the dice. They look fake.
     
  8. The Boston Dangler

    Joined:
    Mar 10, 2005
    Messages:
    14,649
    Likes Received:
    0
No, they don't. If anything, the ice cube looks fake.
     
  9. abhaybhegde

    abhaybhegde Member

    Joined:
    Jun 13, 2007
    Messages:
    26
    Likes Received:
    0
Thanks for the response. Is there any freeware that lets me benchmark ray-traced images on two different CPUs?
     
  10. Murloc

    Murloc Diamond Member

    Joined:
    Jun 24, 2008
    Messages:
    4,776
    Likes Received:
    0
It looks weird even if you remove the ice cube.
It's too perfect and glossy.
     
  11. silverpig

    silverpig Lifer

    Joined:
    Jul 29, 2001
    Messages:
    27,710
    Likes Received:
    0
    Real-time - GPU
    Pre-rendered - CPU

    The image is definitely good, but the glass looks too clean to be real. Only the champagne flute should be that polished :)
     
  12. Matthiasa

    Matthiasa Diamond Member

    Joined:
    May 4, 2009
    Messages:
    5,671
    Likes Received:
    1
Nothing in that image looks right...
     
  13. DominionSeraph

    DominionSeraph Diamond Member

    Joined:
    Jul 22, 2009
    Messages:
    8,275
    Likes Received:
    1
    Your nerd cred just zeroed. A d6 doesn't have razor edges.

[IMG]
     
  14. Modelworks

    Modelworks Lifer

    Joined:
    Feb 22, 2007
    Messages:
    16,237
    Likes Received:
    0
    The CPU/GPU debate is about to get really heated in the professional 3d market.
The problem with GPU rendering in the past has been that the software just didn't support it. If I wanted to render a scene that used ambient occlusion, I couldn't do it on the GPU; the renderer's interface to the GPU didn't support it. There were lots of other things GPU-based renderers did not support, and this had nothing to do with what features the GPU chip supported, only with what features the software implemented on the GPU. When it came time to render out work, the choice was either to split the workflow between the CPU and GPU and somehow reassemble the end product (which rarely worked), or to just leave it all on the CPU, which supported everything.

The rendering software used in the industry today has extremely complex settings, and most people in studios know how to tweak those settings to get the results they want. Try to replace that with a "click to render" button and you had better run for cover from the backlash. For the GPU to become a viable alternative, someone had to sit down and implement ALL the features, or people would not even consider it.

    Some companies have claimed they have done it:
    http://www.randomcontrol.com/arion
    http://www.refractivesoftware.com/

Autodesk has its Quicksilver hardware renderer coming out for 3ds Max 2011.
CPU rendering is the dominant method right now, but GPU rendering is something people are starting to play with in 3D apps.

When the GPU can render like this, people will switch.
[IMG]
    http://forums.cgsociety.org/showthread.php?f=121&t=713053
     
  15. abhaybhegde

    abhaybhegde Member

    Joined:
    Jun 13, 2007
    Messages:
    26
    Likes Received:
    0
Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach that can accommodate larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?

I want to benchmark a certain class of CPUs with regard to rendering. Could you point out any freeware/open-source rendering benchmarking tool (like Cinebench)? What I mainly intend to do is study the time taken to render a frame on two different classes of processor.

Thanks
     
  16. KIAman

    KIAman Diamond Member

    Joined:
    Mar 7, 2001
    Messages:
    3,211
    Likes Received:
    1
Although this is a slight derail: I can often spot real vs. rendered because of how perfect a rendered image looks (no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and calculation needed to map out random imperfections could technically be astronomical.

I wonder if there is a point where an image looks "too good."
     
  17. Modelworks

    Modelworks Lifer

    Joined:
    Feb 22, 2007
    Messages:
    16,237
    Likes Received:
    0
A lot of it comes from how 3D rendering evolved. When I first started in this about 15 years ago, the best we had was simple ray tracers like POV-Ray. Then people added features along the way: caustics, subsurface scattering, bump mapping, etc. All those functions were designed with the CPU as the target; the whole method of calculation is built around that platform. When you switch to the GPU, you have to look at what you want the end result to be and figure out how to adapt the calculation to the GPU while keeping exactly the same visual result. A lot of these features depend on capabilities that are unique to the CPU.

There isn't anything out there yet that has the same feature set in both the CPU and GPU renderer, except maybe Octane; I haven't checked out the latest beta. You would need to load the same scene, render it CPU-only and then GPU-only, and that would give a fair comparison. I never liked Cinebench; it doesn't stress a system the way real-world work does. The scenes are too simple.

You could download the trial of Maya 2010, load up some scenes, and compare render times that way.
    http://usa.autodesk.com/adsk/servlet/pc/index?id=13578047&siteID=123112
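If all you need in the meantime is a repeatable CPU-to-CPU comparison, a tiny timing harness like the one below is enough. This is purely illustrative, not a real benchmark suite: it times a fixed floating-point kernel (repeated ray-sphere-style discriminant tests, the core arithmetic of a ray tracer) with `time.perf_counter` and keeps the best of several runs.

```python
import math
import time

def cpu_kernel(iterations):
    # Fixed, deterministic floating-point workload: repeated
    # discriminant tests of the kind a ray tracer performs per ray.
    hits = 0
    for i in range(iterations):
        b = math.sin(i * 0.001) * 4.0
        c = math.cos(i * 0.002) * 2.0
        if b * b - 4.0 * c >= 0.0:  # would this "ray" hit the "sphere"?
            hits += 1
    return hits

def benchmark(iterations=200_000, repeats=3):
    # Best-of-N timing reduces noise from the OS scheduler.
    best = float("inf")
    checksum = 0
    for _ in range(repeats):
        start = time.perf_counter()
        checksum = cpu_kernel(iterations)
        best = min(best, time.perf_counter() - start)
    # The checksum should be identical on both machines; if it differs,
    # the runs are not comparable.
    return best, checksum
```

Run the same script on both processors: matching checksums sanity-check the run, and the best-of-N time is the comparison figure. It won't exercise caches and memory the way a full scene render does, but the ratio between two CPUs is at least repeatable.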
     
  18. Modelworks

    Modelworks Lifer

    Joined:
    Feb 22, 2007
    Messages:
    16,237
    Likes Received:
    0

I wrote an article about that about ten years ago. Computers like to generate perfect images, and it takes time on the artist's part to cover that up. There are quite a few tutorials out there on making things look more realistic. I have a hard time playing games anymore because the graphics I see make me want to grab the artist responsible and shake them; a lot of it doesn't have to look that way. My current pet peeve is the bloom effect. It is WAY overused in game graphics. So many people have jumped into the industry because they think it is the cool thing to do that it is really hurting the overall look of the industry.

I was talking with an artist who worked on a movie, The Scorpion King 2; if you have seen it, the effects are horrendous. I told him what I thought of his work, and he replied, "Well, it was money, and I worked as hard as I got paid to." I replied that you shouldn't take the job if that is the kind of work you are going to turn out; everyone sees it and uses it as a reference for what you can do. He didn't care; it was only money to him. Pride in your work seems to be losing ground.
     
  19. HeXen

    HeXen Diamond Member

    Joined:
    Dec 13, 2009
    Messages:
    7,529
    Likes Received:
    1
There is some game-related RT stuff: Quake 3 Ray Tracing, though I think it's a hybrid of rasterization and RT. It, along with several others, uses an open-source ray-tracing engine; I think it's called OpenRT. I don't know much about it, but you can Google it.

There is a simple RT called the C-Ray benchmark, and another called BART. There is also the RealStorm engine here:
http://www.realtimeraytrace.de/

None of these are like the CG stuff, though; they don't look any better than your typical 3D game. BART is one of the better-looking ones, but I don't think there's an actual demo for it.
     
  20. Cogman

    Cogman Lifer

    Joined:
    Sep 19, 2000
    Messages:
    10,080
    Likes Received:
    12
Yikes, not to be too much of a stickler here, but the analogy isn't a good one. A better one would be the difference between multiplying via repeated addition versus a dedicated multiply instruction.

CPUs do lots of things decently well, whereas GPUs do very few things really well. (And it is only a semi-recent development that we can even compare the two.)

Got huge arrays of data that need the same operation performed on them? That screams "use a GPU." Got tons of finite/branching operations on relatively small data sets? That has CPU written all over it.

    It'll be a long time (if ever) before we see GPUs used more like CPUs.
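That split can be made concrete with a toy sketch (illustrative only, function names invented): a uniform elementwise operation that maps naturally onto a GPU's wide lanes, versus data-dependent branching whose unpredictable control flow favors a CPU.

```python
# Data-parallel: the identical instruction applied to every element,
# with no divergence between elements -- a natural GPU workload.
def scale_all(values, factor):
    return [v * factor for v in values]

# Branch-heavy: the control flow (and even the loop length) depends on
# the data itself -- a natural CPU workload.
def collatz_steps(n):
    steps = 0
    while n != 1:
        # Each iteration branches on the current value; neighbouring
        # inputs take wildly different paths.
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps
```

On a GPU, threads in a group execute in lockstep, so divergent branches like the Collatz loop serialize and waste lanes, while `scale_all`-style work keeps every lane busy; that asymmetry is the whole "which processor?" question in miniature.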
     
  21. PlasmaBomb

    PlasmaBomb Lifer

    Joined:
    Nov 19, 2004
    Messages:
    11,820
    Likes Received:
    1
Typically they don't (which is what I was going to call out)...

[IMG]

    But you can get them...

[IMG]
     
    #21 PlasmaBomb, Mar 19, 2010
    Last edited: Mar 19, 2010
  22. William Gaatjes

    Joined:
    May 11, 2008
    Messages:
    13,645
    Likes Received:
    10
Is this perhaps the real reason Intel developed Larrabee?
To replace those CPU-only render farms?
     
  23. gsellis

    gsellis Diamond Member

    Joined:
    Dec 4, 2003
    Messages:
    6,062
    Likes Received:
    0
In video, the technique was first applied by Pinnacle Systems; in fact, I believe they were first across the board. The last version of this (Avid just announced that Liquid is now dead) would play M2V (1080i/p) in real time with effects (CPU-type effects rendered on the fly by the GPU). It started as a project with ATI using DirectX on the ATI 8500 chipset. It was the first practical application that showed the benefit of the PCIe bus vs. AGP, and it was demoed at IDF when PCIe was being introduced.

But as noted above, what you want to do determines how fast it goes.

Note that Avid is now using an OpenGL renderer in Media Composer 4.x. I may take the $500 upgrade offer; now I just need a Quadro card...
     
  24. TuxDave

    TuxDave Lifer

    Joined:
    Oct 8, 2002
    Messages:
    10,576
    Likes Received:
    0
    PFFFT!!! You can't fool me! Those are rendered too! :D
     
  25. SSJ4Gogeta

    SSJ4Gogeta Junior Member

    Joined:
    Mar 1, 2000
    Messages:
    6
    Likes Received:
    0
    There's some aliasing which gives away that the image is rendered:
[IMG]

    (I can't upload attachments so I had to upload the file elsewhere. What is the minimum number of posts you have to make before you can upload attachments?)