
Animated Movies

Xenon14

Platinum Member
I know that these movies, like Shrek 2, require lots of rendering power (exactly how much, I don't know... anyone?). In any case, the movie's quality looks great, but considering the graphics my 2500+/9700 Pro renders at over 30 fps, why does a movie at, let's say, even 5 times the quality of Half-Life 2 take far more than 5 times the power to render, at a much, much slower speed?
 
well, "quality" is a pretty vague term is part of it. For a digitally rendered scene to look significantly better than another, it has to incorporate a lot more detail and calculate how the light bounces off a lot more faces. Basically, to make something look just a little bit better takes a LOT more calculation once you get really high up in quality.
 
They are ray traced. And any current film would have much more than five times the detail of HL2 (ignoring the fact that the game's graphics are crap 😛): 16x AA, volumetric soft shadowing, motion blur, and animation and textures at much higher resolution, etc.
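As a rough illustration of why ray tracing blows up the way it does, here's a toy Python sketch. All of the numbers and the cost model are assumptions for illustration, not anything a studio actually uses; the point is just that resolution, samples per pixel, bounces, and scene complexity all multiply together.

```python
# Toy illustration (not any studio's real pipeline): ray tracing cost grows
# roughly with resolution * samples per pixel * bounces * objects tested.

def primary_rays(width, height, samples_per_pixel):
    """Number of camera rays needed for one frame."""
    return width * height * samples_per_pixel

def total_ray_tests(width, height, samples_per_pixel, bounces, objects):
    """Very rough upper bound on ray/object intersection tests per frame,
    assuming every ray is checked against every object at every bounce."""
    rays = primary_rays(width, height, samples_per_pixel)
    return rays * (1 + bounces) * objects

# A game-like workload: 1024x768, 1 sample per pixel, no bounces, a few thousand primitives.
game = total_ray_tests(1024, 768, 1, 0, 5_000)

# A film-like workload (made-up numbers): 2K frame, 16x AA, a few bounces,
# millions of primitives.
film = total_ray_tests(2048, 1152, 16, 3, 2_000_000)

print(f"game-ish tests per frame: {game:,}")
print(f"film-ish tests per frame: {film:,}")
print(f"ratio: {film / game:,.0f}x")
```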


In addition, your 9700 Pro is optimized specifically for real-time 3D work, and further for speed. Render farms use general-purpose processors and optimize for quality.

But I suppose some people just don't notice the details.
 
I always thought the visuals were based on what your graphics card can handle, but the rendering itself is handled more at the processor level than at the graphics-card level.
 
I recall a movie whose graphics were astounding even for today: "Final Fantasy." I also recall reading that the "new" Nvidia graphics card at the time (I think it was either the GeForce 3 or GeForce 4) could render those scenes at something like 1.5 fps? Can anyone confirm this?

Edit: also, this is from memory so I'm not sure how accurate it is, but they said it took about 2 years of computing time (whatever that means in actual processing power, I don't know) to render the film.
 
Pixar is using a render farm made up of 1024 2.8 GHz Xeon CPUs to render "The Incredibles".

A single frame can take minutes to hours to render, and you have 30 frames per second.
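To put some back-of-the-envelope numbers on that, here's a quick Python sketch. The per-frame render time and runtime are assumptions for illustration, not official Pixar figures.

```python
# Back-of-the-envelope render-time math; the per-frame figure is a guess,
# not an official Pixar number.

frames_per_second = 24          # typical film frame rate
runtime_minutes = 100           # roughly feature length
hours_per_frame = 2             # assumed average render time per frame

total_frames = runtime_minutes * 60 * frames_per_second
cpu_hours = total_frames * hours_per_frame

# Spread across a render farm of 1024 CPUs working in parallel:
farm_cpus = 1024
wall_clock_days = cpu_hours / farm_cpus / 24

print(f"frames: {total_frames:,}")
print(f"CPU-hours: {cpu_hours:,}  (~{cpu_hours / 8760:.0f} CPU-years)")
print(f"days on a {farm_cpus}-CPU farm: {wall_clock_days:.0f}")
```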
 
Originally posted by: digitalsm
Pixar is using a render farm made up of 1024 2.8 GHz Xeon CPUs to render "The Incredibles".

A single frame can take minutes to hours to render, and you have 30 frames per second.

That is mind-boggling considering how far computers have come since the days of Jurassic Park or even Toy Story.

There was an interesting article on Slashdot.org about distributed computing for rendering movies. Shrek@Home
 
Video game designers do all they can to make their graphics look OK when rendered on video card hardware, but movie models have FAR higher polygon counts and texture resolutions than any real-time rendering hardware can handle. Also, film resolution is thousands of pixels by thousands of pixels, with full ray tracing (as mentioned earlier: if one person stands next to another, the light reflecting off one subtly affects the other, and vice versa). To get lighting and subtle effects that aren't possible with specialized rendering hardware, each frame is often rendered over and over as many separate layers (one layer for lighting, one for shadows, one for this detail, one for that detail) and then the layers are composited.
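Here's a crude sketch of that layered idea in Python with NumPy. The pass names and the combine formula are made up for illustration; real compositing pipelines are far more involved.

```python
import numpy as np

# Toy compositing of separate render passes into a final frame.
# Pass names and the combine formula are illustrative only.

h, w = 4, 4  # tiny "frame" so the example stays readable

beauty   = np.random.rand(h, w, 3)          # base color/lighting pass
shadow   = np.random.rand(h, w, 1) * 0.5    # shadow density pass
specular = np.random.rand(h, w, 3) * 0.2    # highlight pass

# Combine: darken the beauty pass by the shadows, then add highlights.
final = beauty * (1.0 - shadow) + specular
final = np.clip(final, 0.0, 1.0)

print(final.shape)  # (4, 4, 3) -- one RGB value per pixel
```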

Also, video cards are built with fixed limits and capabilities. If movie makers want to do a new hair or water effect, they need to be able to write a program that generates exactly the image they want, not work within the confines of predetermined hardware rendering abilities.
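For example, a software renderer can evaluate any function it likes at every pixel. Here's a toy procedural "water ripple" shader in Python, entirely made up for illustration, to show the kind of arbitrary per-sample code an offline renderer can run that fixed-function cards of that era could not.

```python
import math

def ripple_shade(x, y, time):
    """Toy procedural water shader: returns an RGB tuple for pixel (x, y).
    Purely illustrative; real film shaders are far more elaborate."""
    d = math.sqrt(x * x + y * y)
    wave = 0.5 + 0.5 * math.sin(d * 0.2 - time * 3.0)
    return (0.1 * wave, 0.3 + 0.4 * wave, 0.6 + 0.3 * wave)

# Evaluate the shader over a small grid, the way an offline renderer would
# call arbitrary shader code for every sample in the frame.
frame = [[ripple_shade(x, y, time=1.0) for x in range(8)] for y in range(8)]
print(frame[0][0])
```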

Games can do a lot to look danged good, but when it comes down to it, video cards can't begin to replicate what Hollywood render farms do. If you rendered Final Fantasy on an Nvidia card, you'd wind up with something that looks like ReBoot 😛
 
If you rendered Final Fantasy on an Nvidia card, you'd wind up with something that looks like ReBoot 😛
I liked that show. 😀

But that's a good example of how things have progressed. That was once ultra-high-quality animation; no machine could render ReBoot in real time. Now studio animation has progressed and it isn't so impressive anymore. How do we progress when game cards can render Shrek in real time?
Before, we were staring at monochrome displays; now we're firmly capped at 24 bits per pixel. Will the same happen to polygon counts?
 
Quick question for anyone in the know....

For a modern animated movie like Shrek 2, what resolution do they render at?

/frank
 