AnandTech Forums > Hardware and Technology > Highly Technical
Old 03-16-2010, 07:11 AM   #1
abhaybhegde
Member
 
Join Date: Jun 2007
Posts: 26
Default Rendering - CPU intensive or GPU intensive?

Hi, so what do you think? Is rendering CPU intensive or GPU intensive? And why?

Cheers
abhaybhegde is offline   Reply With Quote
Old 03-16-2010, 10:42 AM   #2
Cogman
Diamond Member
 
Cogman's Avatar
 
Join Date: Sep 2000
Location: A nomadic herd of wild fainting goats
Posts: 9,806
Default

It depends.

You can do almost all rendering on the CPU if you like; equally, you can do almost all of it on the GPU if you like.

Generally, people associate rendering with the GPU because it usually does do the lion's share of the work. However, there is no hard "this is always the case" sort of rule.

Check out ray tracing for a good example of modern CPU rendering.
This pic always astounds me; it is CPU-made.
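The core of a CPU ray tracer is plain arithmetic: cast a ray per pixel, intersect it with geometry, shade the hit. A toy, pure-Python sketch of that idea (single sphere, orthographic camera; every name here is invented for illustration, not taken from any real renderer):

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t, or None.
    # Assumes `direction` is a unit vector, so the quadratic's a == 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    # Orthographic camera at z=2 looking down -z at a sphere at the origin.
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            x = 2.0 * (i + 0.5) / width - 1.0
            y = 2.0 * (j + 0.5) / height - 1.0
            t = ray_sphere((x, y, 2.0), (0.0, 0.0, -1.0),
                           (0.0, 0.0, 0.0), 0.8)
            if t is None:
                row.append(0.0)          # background
            else:
                hit_z = 2.0 - t
                # Lambertian-style shading with the light along +z:
                # brightness is just the z component of the surface normal.
                row.append(max(0.0, hit_z / 0.8))
        image.append(row)
    return image
```

`render(8, 8)` returns a grid of brightness values in [0, 1]; scale the resolution and scene complexity up and this per-pixel arithmetic is exactly what eats CPU time.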

Last edited by Cogman; 03-16-2010 at 10:45 AM.
Cogman is online now   Reply With Quote
Old 03-16-2010, 12:15 PM   #3
senseamp
Lifer
 
senseamp's Avatar
 
Join Date: Feb 2006
Posts: 23,198
Default

Depends on whether your rendering program is written for the CPU or the GPU. A high-end GPU is far more parallel, so it will generally have higher throughput, but it's doing the same types of calculations. You can do ray tracing on either now.
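The "same calculations, different execution model" point can be sketched like this: the per-pixel kernel is identical; a CPU walks it one pixel at a time, while a GPU would launch it over all pixels at once. An illustrative pure-Python sketch (`shade` is a made-up kernel, not any real shader):

```python
def shade(x, y):
    # The same per-pixel arithmetic runs on either processor; only the
    # execution model differs (one pixel at a time vs. thousands in flight).
    return min(1.0, x * x + y * y)

# "CPU style": an explicit serial loop over the framebuffer.
serial = [[shade(i / 4.0, j / 4.0) for i in range(4)] for j in range(4)]

# "GPU style" in spirit: express the frame as one data-parallel map over
# all pixels. On real hardware this would be a shader/kernel launch.
pixels = [(i / 4.0, j / 4.0) for j in range(4) for i in range(4)]
parallel = [shade(x, y) for (x, y) in pixels]
```

Both forms compute identical results; the GPU's advantage is purely in how many of those independent `shade` calls it can keep in flight.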
__________________
*Not speaking on behalf of any company*
senseamp is offline   Reply With Quote
Old 03-16-2010, 05:01 PM   #4
BassBomb
Diamond Member
 
BassBomb's Avatar
 
Join Date: Nov 2005
Location: Ontario
Posts: 8,370
Default

Well, some of my 3D models rendered in 3D Studio Max have taken 72 hours (back on an AMD 3200+), and they only use the CPU.
BassBomb is offline   Reply With Quote
Old 03-17-2010, 02:49 AM   #5
I_Sinsear_I
Junior Member
 
I_Sinsear_I's Avatar
 
Join Date: Mar 2010
Location: Colorado
Posts: 9
Default

Depends on the program.
I_Sinsear_I is offline   Reply With Quote
Old 03-17-2010, 02:57 AM   #6
MagnusTheBrewer
Lifer
 
MagnusTheBrewer's Avatar
 
Join Date: Jun 2004
Posts: 17,744
Default

3ds Max, Maya, LightWave, and Blender are all heavily CPU dependent. They are all written to take advantage of as many cores as you can throw at them. The GPU is used for viewport effects and applying textures, but rendering is all about the CPU. While there may be some rendering software out there that primarily uses the GPU, I have never heard of one.
__________________
"We love you children"
"Oh, yes we do"
"Boiled, Fried and, Barbequed"
MagnusTheBrewer is offline   Reply With Quote
Old 03-17-2010, 07:46 AM   #7
wwswimming
Banned
 
Join Date: Jan 2006
Posts: 3,712
Default

Quote:
Originally Posted by Cogman View Post
Check out raytracing for a good example of modern CPU rendering.
This pic always astounds me, it is CPU made.
It would be hard to tell if it weren't for the dice. They look fake.
wwswimming is offline   Reply With Quote
Old 03-17-2010, 01:13 PM   #8
The Boston Dangler
Lifer
 
The Boston Dangler's Avatar
 
Join Date: Mar 2005
Location: The Democratic People's Republic of Massachusetts
Posts: 13,806
Default

Quote:
Originally Posted by wwswimming View Post
it would be hard to tell if it wasn't for the dice. they look fake.
No, they don't. If anything, the ice cube looks fake.
__________________
If God had meant for us to walk, why did he give us feet that fit car pedals?

-Sir Stirling Moss
The Boston Dangler is online now   Reply With Quote
Old 03-17-2010, 01:44 PM   #9
abhaybhegde
Member
 
Join Date: Jun 2007
Posts: 26
Default

Thanks for the response. Is there any freeware that lets me benchmark ray-tracing renders on two different CPUs?
abhaybhegde is offline   Reply With Quote
Old 03-17-2010, 01:46 PM   #10
Murloc
Diamond Member
 
Murloc's Avatar
 
Join Date: Jun 2008
Location: Switzerland
Posts: 3,929
Default

It looks weird even if you remove the ice cube.
It's too perfect and glossy.
Murloc is offline   Reply With Quote
Old 03-17-2010, 07:51 PM   #11
silverpig
Lifer
 
silverpig's Avatar
 
Join Date: Jul 2001
Location: London, UK
Posts: 27,709
Default

Real-time - GPU
Pre-rendered - CPU

The image is definitely good, but the glass looks too clean to be real. Only the champagne flute should be that polished.
silverpig is offline   Reply With Quote
Old 03-17-2010, 09:40 PM   #12
Matthiasa
Diamond Member
 
Join Date: May 2009
Posts: 5,127
Default

Nothing in that image looked right...
Matthiasa is online now   Reply With Quote
Old 03-17-2010, 10:29 PM   #13
DominionSeraph
Diamond Member
 
DominionSeraph's Avatar
 
Join Date: Jul 2009
Location: Equestria
Posts: 8,263
Default

Quote:
Originally Posted by The Boston Dangler View Post
no they don't.
Your nerd cred just zeroed. A d6 doesn't have razor edges.

DominionSeraph is offline   Reply With Quote
Old 03-18-2010, 02:07 PM   #14
Modelworks
Lifer
 
Modelworks's Avatar
 
Join Date: Feb 2007
Location: North Carolina
Posts: 16,237
Default

The CPU/GPU debate is about to get really heated in the professional 3D market.
The problem with GPU rendering in the past has been that the software just didn't support it. If I wanted to render a scene that used ambient occlusion, I couldn't do it on the GPU; the renderer's interface to the GPU didn't support it. There were lots of other things the GPU-based renderers did not support. This has nothing to do with what features the GPU chip supported, just the features the software implemented on the GPU. When it came time to render out work, it was either split the workflow between the CPU and GPU and reassemble the end product somehow (which rarely worked), or just leave it all on the CPU, which supported everything.

The rendering software used in the industry now has extremely complex settings, and most people in studios know how to tweak those settings to get the results they want. Try to replace that with some 'click to render' button and you had better run for cover from the backlash that will result. For the GPU to become a viable alternative, someone had to sit down and implement ALL the features, or people would not even consider it.

Some companies have claimed they have done it:
http://www.randomcontrol.com/arion
http://www.refractivesoftware.com/

Autodesk has the Quicksilver hardware renderer coming out for 3ds Max 2011.
CPU rendering is the dominant method right now, but GPU rendering is something people are now starting to play with in 3D apps.

When the GPU can render like this, then people will switch.

http://forums.cgsociety.org/showthre...f=121&t=713053
Modelworks is offline   Reply With Quote
Old 03-18-2010, 03:29 PM   #15
abhaybhegde
Member
 
Join Date: Jun 2007
Posts: 26
Default

Quote:
Originally Posted by Modelworks View Post
The CPU/GPU debate is about to get really heated in the professional 3d market. [...]
Thanks for the detailed reply, Modelworks. So the bottom line is that the CPU provides a generic approach where one can implement larger rendering applications, whereas the GPU is highly specific. Something like coding in a high-level language versus assembly language, is it?

I wanted to benchmark a certain class of CPUs with regard to rendering. Could you point out any freeware/open-source rendering benchmarking tool (like Cinebench) that can be used? What I mainly intend to do is study the time taken to render a frame on two different classes of processor.
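For comparing two CPUs, a minimal timing harness is often enough: run the identical script on each machine and compare the per-frame averages. A hedged sketch (`toy_frame` is a made-up stand-in for a real scene render, not a real benchmark workload):

```python
import time

def benchmark(render_fn, frames=3):
    # Average wall-clock time per frame for the same workload.
    # Run this unchanged on each machine and compare the results.
    start = time.perf_counter()
    for _ in range(frames):
        render_fn()
    elapsed = time.perf_counter() - start
    return elapsed / frames

def toy_frame(size=64):
    # Stand-in workload; swap in a real scene render here.
    return [[(i * j) % 255 for i in range(size)] for j in range(size)]

per_frame = benchmark(toy_frame)
print(f"average seconds per frame: {per_frame:.6f}")
```

The key point is that the workload, frame count, and measurement code stay identical across machines, so only the processor varies.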

Thanks
abhaybhegde is offline   Reply With Quote
Old 03-18-2010, 06:14 PM   #16
KIAman
Diamond Member
 
KIAman's Avatar
 
Join Date: Mar 2001
Location: Sacramento, CA
Posts: 3,098
Default

Although a slight derail: I can often spot a rendered image versus a real photo because of how perfect the rendered image looks (no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and calculation needed to map out random imperfections could be astronomical.

I wonder if there is a point where an image looks "too good."
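One cheap way to attack the "too perfect" look is to jitter otherwise-smooth shading with bounded noise. A minimal sketch (the function and its parameters are invented for illustration; real renderers use texture maps and procedural noise rather than per-call randomness):

```python
import random

def imperfect_shade(base, amount=0.05, seed=None):
    # Perturb a perfectly smooth shade value with bounded noise to fake
    # dust and micro-scratches, then clamp back into [0, 1].
    rng = random.Random(seed)            # seed for reproducible "grime"
    v = base + rng.uniform(-amount, amount)
    return max(0.0, min(1.0, v))
```

Even this trivial jitter breaks up perfectly uniform highlights; the expensive part in production is making the imperfections structured (fingerprints, wear patterns) rather than uniform noise.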
__________________
Heatware
Paypal Verified
KIAman is offline   Reply With Quote
Old 03-18-2010, 07:20 PM   #17
Modelworks
Lifer
 
Modelworks's Avatar
 
Join Date: Feb 2007
Location: North Carolina
Posts: 16,237
Default

Quote:
Originally Posted by abhaybhegde View Post
Thanks for the detailed reply Modelworks. So bottom line is that CPU provides a generic approach where one can incorporate larger rendering based applications , whereas GPU is highly specific. Something like coding in High language and Assembly Language.. is it ?

A lot of it comes from how 3D rendering evolved. When I first started in this about 15 years ago, the best we had were simple ray tracers like POV-Ray. Then people started adding features along the way: caustics, subsurface scattering, bump mapping, etc. All those functions were designed with a CPU as the target; the whole method of how the calculations are done is based on that platform. When you switch to the GPU, you have to look at what you want the end result to be and figure out how to adapt it to the GPU while still getting the exact same visual result. A lot of these features depend on registers that are unique to the CPU.

Quote:
I wanted to benchmark certain class of CPUs with regard to Rendering. Could you point out any freeware/Open source Rendering Benchmaking tool (like Cinebench) which can be used. What i mainly intend to do is, to study the Time taken to render a frame on a Two different classes of Processor.
There isn't anything out there yet that has the same CPU and GPU features in the renderer, except maybe Octane; I haven't checked out the latest beta. You would need to load the same scene, render it CPU-only and then GPU-only, and that would give a fair comparison. I never liked Cinebench; it doesn't stress a system the way real-world use would. The scenes are too simple.

You could download the trial of Maya 2010 and load up some scenes and compare render times that way.
http://usa.autodesk.com/adsk/servlet...&siteID=123112
Modelworks is offline   Reply With Quote
Old 03-18-2010, 07:29 PM   #18
Modelworks
Lifer
 
Modelworks's Avatar
 
Join Date: Feb 2007
Location: North Carolina
Posts: 16,237
Default

Quote:
Originally Posted by KIAman View Post
Although a slight derail, I can often spot fake vs rendered because of how perfect a rendered image looks (aka, no blemishes, no dirt, nada). I call it "the shiny anime effect." The amount of extra information and calculations to map out the random imperfections could technically be astronomical.

I wonder if there is a point where an image looks "too good."

I wrote an article about that about 10 years ago. Computers like to generate perfect images, and it takes time on the artist's part to cover that up. There are quite a few tutorials out there that discuss how to make things look more realistic. I have a hard time playing games anymore because the graphics just make me want to grab the artist responsible and shake them; a lot of it doesn't have to look that way. My current pet peeve is the bloom effect; it is WAY overused in game graphics. So many people have jumped into the industry because they think it is the cool thing to do that it is really hurting the overall look of the industry.

I was talking with an artist who worked on a movie, The Scorpion King 2; if you have seen it, the effects are horrendous. I told him what I thought of his work, and he replied, "Well, it was money, and I worked as hard as I got paid to." I replied that you shouldn't take the job if that is the kind of work you are going to turn out; everyone sees it and uses it as a reference for what you can do. He didn't care; it was only money to him. Pride in your work seems to be losing ground.
Modelworks is offline   Reply With Quote
Old 03-18-2010, 07:53 PM   #19
HeXen
Diamond Member
 
HeXen's Avatar
 
Join Date: Dec 2009
Posts: 6,002
Default

Quote:
Originally Posted by abhaybhegde View Post
Thanks for the response , Are there any freeware which lets me benchmark ray tracing images on two different CPUs
There is some game-related RT stuff: Quake 3 Ray Tracing, though I think it's a hybrid of rasterization and ray tracing. It, along with several others, uses an open-source ray-tracing engine; I think it's called OpenRT. I don't know much about it, but you can Google it.

There is a simple RT benchmark called C-Ray, and another called BART. There is also the Realstorm engine here:
http://www.realtimeraytrace.de/

Although none of these are like the CG stuff; they don't look any better than your typical 3D game. BART is one of the better-looking ones, but I don't think there's an actual demo for it.
HeXen is offline   Reply With Quote
Old 03-18-2010, 11:12 PM   #20
Cogman
Diamond Member
 
Cogman's Avatar
 
Join Date: Sep 2000
Location: A nomadic herd of wild fainting goats
Posts: 9,806
Default

Quote:
Originally Posted by abhaybhegde View Post
Thanks for the detailed reply Modelworks. So bottom line is that CPU provides a generic approach where one can incorporate larger rendering based applications , whereas GPU is highly specific. Something like coding in High language and Assembly Language.. is it ?
Yikes, not to be too much of a stickler here, but the analogy isn't a good one. A better one would be the difference between multiplying by repeated addition versus using a dedicated multiplication instruction.

CPUs do lots of things decently well, whereas GPUs do very few things really well. (And it is only a semi-recent development that we can even compare the two.)

Got huge arrays of data that need the same operation performed on them? That screams "use a GPU." Got tons of finite/branching operations on relatively small data sets? That's got CPU written all over it.

It'll be a long time (if ever) before we see GPUs used more like CPUs.
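The two workload shapes described above can be sketched side by side (illustrative Python; on real hardware the first would be a shader or kernel launch, not a list comprehension):

```python
# GPU-shaped work: one uniform operation over a large array. No branches,
# every element independent -- this maps directly onto wide SIMD/shader cores.
data = list(range(1000))
gpu_shaped = [x * 0.5 + 1.0 for x in data]

# CPU-shaped work: heavy, data-dependent branching on a tiny working set --
# exactly what out-of-order cores with branch predictors are built for.
def collatz_steps(n):
    # Count steps of the 3n+1 iteration until reaching 1; each step's
    # branch depends on the previous result, so it cannot be parallelized.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

cpu_shaped = collatz_steps(27)  # a famously long, branchy trajectory
```

The first workload gets faster roughly in proportion to how many lanes you can throw at it; the second is limited by single-thread speed no matter how parallel the hardware is.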
Cogman is online now   Reply With Quote
Old 03-19-2010, 11:38 AM   #21
PlasmaBomb
Lifer
 
PlasmaBomb's Avatar
 
Join Date: Nov 2004
Location: In a pub... in Cumbria
Posts: 11,758
Default

Quote:
Originally Posted by DominionSeraph View Post
Your nerd cred just zeroed. A d6 doesn't have razor edges.

Typically they don't (which is what I was going to call him on)...



But you can get them...

__________________

Last edited by PlasmaBomb; 03-19-2010 at 11:41 AM.
PlasmaBomb is offline   Reply With Quote
Old 03-19-2010, 12:21 PM   #22
William Gaatjes
Diamond Member
 
William Gaatjes's Avatar
 
Join Date: May 2008
Location: s orbital
Posts: 8,353
Default

Quote:
Originally Posted by Modelworks View Post
The CPU/GPU debate is about to get really heated in the professional 3d market. [...]
Is this perhaps the real reason why Intel developed Larrabee?
To replace those CPU-only render farms?
__________________
To expand ones knowledge is to expand ones life.
<< Armchair Solomon >>
(\__/)
(='.'=)
(")_(")
William Gaatjes is offline   Reply With Quote
Old 03-24-2010, 12:32 PM   #23
gsellis
Diamond Member
 
gsellis's Avatar
 
Join Date: Dec 2003
Posts: 6,063
Default

In video, the technique was first applied by Pinnacle Systems; in fact, I believe they were first across the board. The last version of this (Avid just announced that Liquid is now dead) would play M2V (1080i/p) in real time with effects (CPU-type effects rendered, GPU effects rendered on the fly). It started as a project with ATI using DirectX on the ATI 8500 chipset. It was the first practical application to show the benefit of the PCIe bus versus AGP, and it was demoed at IDF when PCIe was being introduced.

But as noted above, what you want to do determines how fast it is.

Note that Avid is now using OpenGL rendering in Media Composer 4.x. I may take the $500 upgrade offer - now I just need a Quadro card...
__________________
''Expecting the world to treat you fairly because you are a good person is a little like expecting the bull not to attack you because you are a vegetarian.'' -- Dennis Wholey

"I'm Juan Pablo Montoya. You crashed my car. Prepare to die" - Farker yequalsy's 1st grade son

"You do not need a parachute to skydive. You only need a parachute to skydive twice." Stolen from ROM/Rick Martin's sig in GemologyOnline
gsellis is offline   Reply With Quote
Old 03-24-2010, 12:45 PM   #24
TuxDave
Lifer
 
TuxDave's Avatar
 
Join Date: Oct 2002
Posts: 10,464
Default

Quote:
Originally Posted by PlasmaBomb View Post
Typically they don't (which I was going to call it on)...



But you can get them...
PFFFT!!! You can't fool me! Those are rendered too!
__________________
post count = post count + 0.999.....
(\__/)
(='.'=)This is Bunny. Copy and paste bunny into your
(")_(")signature to help him gain world domination.
TuxDave is offline   Reply With Quote
Old 04-04-2010, 09:06 AM   #25
SSJ4Gogeta
Junior Member
 
Join Date: Mar 2000
Posts: 6
Default

There's some aliasing which gives away that the image is rendered:


(I can't upload attachments so I had to upload the file elsewhere. What is the minimum number of posts you have to make before you can upload attachments?)
SSJ4Gogeta is offline   Reply With Quote