- Jun 13, 2007
- 26
- 0
- 0
Check out raytracing for a good example of modern CPU rendering.
This pic always astounds me, it is CPU made.
It would be hard to tell if it weren't for the dice; they look fake.
no they don't.
The CPU/GPU debate is about to get really heated in the professional 3d market.
The problem with GPU rendering in the past was that it just didn't support the software's features. If I wanted to render a scene that used ambient occlusion, I couldn't do it on the GPU; the renderer's GPU interface didn't support it. There were lots of other things the GPU-based renderers did not support. This had nothing to do with what features the GPU chip supported, only with what the software implemented on the GPU. When it came time to render out work, it was either split the workflow between the CPU and GPU and somehow reassemble the end product (which rarely worked), or just leave it all on the CPU, which supported everything.
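To make concrete what a feature like ambient occlusion actually computes (and why it was long a CPU-only job), here is a minimal Monte Carlo sketch: for a surface point, shoot random rays over the hemisphere and count how many escape without hitting anything. The scene, sample count, and sphere-only occluders are arbitrary choices for illustration, not any particular renderer's implementation.

```python
import math
import random

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def ray_hits_sphere(origin, direction, center, radius):
    # Standard ray-sphere intersection test (direction must be unit length);
    # returns True only for hits in front of the origin.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-6

def ambient_occlusion(point, normal, occluders, samples=256,
                      rng=random.Random(42)):
    """Fraction of the hemisphere above `point` that is NOT blocked (1 = fully open)."""
    unoccluded = 0
    for _ in range(samples):
        # Uniform random direction, flipped into the hemisphere around the normal.
        d = normalize(tuple(rng.gauss(0.0, 1.0) for _ in range(3)))
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = tuple(-c for c in d)
        if not any(ray_hits_sphere(point, d, c, r) for c, r in occluders):
            unoccluded += 1
    return unoccluded / samples

# A sphere hovering above the origin darkens the point directly beneath it,
# while a faraway point on the same plane stays almost fully lit.
occluders = [((0.0, 2.0, 0.0), 1.0)]
ao_under = ambient_occlusion((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), occluders)
ao_far = ambient_occlusion((50.0, 0.0, 0.0), (0.0, 1.0, 0.0), occluders)
print(ao_under, ao_far)
```

The key point for the CPU/GPU debate: each sample is an arbitrary ray into the whole scene, which is trivial for a general-purpose CPU ray tracer but was awkward to express through the fixed rasterization interfaces early GPU renderers exposed.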
The rendering software used in the industry now has extremely complex settings, and most people in studios know how to tweak those settings to get the results they want. Try to replace that with some 'click to render' button and you had better run for cover from the backlash that will result. For the GPU to become a viable alternative, someone had to sit down and implement ALL the features, or people would not even consider it.
Some companies have claimed they have done it:
http://www.randomcontrol.com/arion
http://www.refractivesoftware.com/
Autodesk has a Quicksilver hardware renderer coming out for 3ds Max 2011.
CPU rendering is the dominant method right now, but GPU rendering is something people are now starting to play with in 3D apps.
When the GPU can render like this, then people will switch.
Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach where one can incorporate larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?
I wanted to benchmark a certain class of CPUs with regard to rendering. Could you point out any freeware/open-source rendering benchmarking tool (like Cinebench) that can be used? What I mainly intend to do is study the time taken to render a frame on two different classes of processor.
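Short of a dedicated tool, one rough way to compare two machines is to time a fixed software ray-trace yourself and record seconds per frame on each CPU. A minimal sketch (the one-sphere scene, resolution, and frame count are arbitrary; the point is only a repeatable, CPU-bound workload):

```python
import math
import time

def render(width, height):
    """Trace one primary ray per pixel against a single sphere.

    Returns a width x height image of 0/255 hit values. Deliberately tiny:
    this is a fixed, repeatable workload to time, not a useful renderer.
    """
    center, radius = (0.0, 0.0, -3.0), 1.0
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            # Pinhole-camera ray through the pixel, camera at the origin.
            dx = 2.0 * x / width - 1.0
            dy = 1.0 - 2.0 * y / height
            dz = -1.0
            inv_len = 1.0 / math.sqrt(dx * dx + dy * dy + dz * dz)
            d = (dx * inv_len, dy * inv_len, dz * inv_len)
            # Ray-sphere intersection: a hit exists iff the discriminant >= 0.
            oc = tuple(-c for c in center)
            b = 2.0 * sum(di * oi for di, oi in zip(d, oc))
            c = sum(o * o for o in oc) - radius * radius
            disc = b * b - 4.0 * c
            row.append(255 if disc >= 0 else 0)
        pixels.append(row)
    return pixels

def benchmark(frames=3, width=160, height=120):
    """Average wall-clock seconds per rendered frame."""
    start = time.perf_counter()
    for _ in range(frames):
        render(width, height)
    return (time.perf_counter() - start) / frames

print(f"{benchmark():.4f} s/frame")
```

Run the same script on each processor and compare the reported seconds per frame; since the scene and code are identical, the ratio gives a crude relative measure of single-threaded ray-tracing throughput.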
Although a slight derail: I can often spot a rendered image versus a real one because of how perfect the render looks (no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and computation needed to map out random imperfections could be astronomical.
I wonder if there is a point where an image looks "too good."
Thanks for the response. Is there any freeware that lets me benchmark ray-traced rendering on two different CPUs?
Your nerd cred just zeroed. A d6 doesn't have razor edges.
Typically they don't (which I was going to call it on)...
But you can get them...