Ns1
No Lifer
- Jun 17, 2001
Adam Savage says the Engineers were done with practical effects (costumes and makeup). I don't see how that's possible.
"We ran some tests [on the Engineer] to basically convince Ridley that we could do better than prosthetics," suggests Martin Hill, Weta Digital's VFX supervisor. "He's like an Adonis, the perfect humanoid with white skin. They had a maquette built, which Ridley shot and lit. So we built the same bust from scratch and replicated the lighting and the skin quality and the translucency, but we made him move and made him articulate with blinks and expressions. On the basis of that Ridley decided to go digital.
"Ridley wanted to get as much in camera as possible, so it was very much the antithesis of a virtual studio in a way. On set, he had an actor completely made up with silicone over his whole body, which he shot for non-visual effects. And that presented a bit of a challenge for us because, if we want to make a visual creature, we add musculature and make it as physically correct as possible. But, of course, we have this slight dilemma here. We need to match the on-set Engineer as well as other creatures later on and make something convincing and compelling and obviously very real. And so we built this digital Engineer and there are some interesting compromises. What we're actually representing is an actor and what we found straight away was that we can make a digital humanoid with pretty convincing skin. We've advanced the technique since Avatar for our subsurface algorithms. But trying to replicate the human with the extra silicone on it was a completely different situation."
The Engineer presented new challenges for Weta involving new subsurface algorithms to overcome a waxy look.
They actually carved vein patterns into slabs of silicone to get it right. And that presented a whole new series of challenges involving new subsurface algorithms. "To represent a very translucent piece of silicone, you want to increase the depth of all your subsurface," Hill continues. "And the problem is that you lose any sense of internal structure. The light bleeding through the Engineer was so deep that he started to look waxy, so we had to advance all our technology to be able to put inner structures within our subsurface. This way we got a sense of the bone or cartilage inside the nose and the bones in the fingers that would actually block internal light. We added extensions to the quantized-diffusion model for rendering translucent materials that was presented last year at SIGGRAPH by Eugene d'Eon and Geoffrey Irving."
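The tradeoff Hill describes is easy to see in a toy model. This is not Weta's quantized-diffusion renderer, just a minimal 1-D sketch (my own illustration, with made-up names and numbers): subsurface scattering behaves like a blur whose width is the mean free path of light in the material, so the deeper the scattering, the more any dark internal structure (bone, cartilage) gets washed out, which is exactly the waxy look he mentions.

```python
import numpy as np

def diffusion_kernel(n, mean_free_path):
    # Toy 1-D subsurface profile: exponential falloff with distance
    # from the light's entry point, normalized to conserve energy.
    r = np.abs(np.arange(n) - n // 2)
    k = np.exp(-r / mean_free_path)
    return k / k.sum()

def scatter(signal, mean_free_path):
    # Light diffusing sideways under the skin acts like a blur
    # whose width is the mean free path.
    k = diffusion_kernel(len(signal), mean_free_path)
    return np.convolve(signal, k, mode="same")

# "Internal structure": a dark occluder (bone) under bright skin.
skin = np.ones(100)
skin[45:55] = 0.2  # bone blocking internal light

shallow = scatter(skin, mean_free_path=2.0)   # structure survives
deep    = scatter(skin, mean_free_path=30.0)  # structure washes out

# Deep scattering flattens the contrast between lit skin and the
# occluded region -- the "waxy" failure mode.
print(shallow.max() - shallow.min(), deep.max() - deep.min())
```

With the shallow kernel the dark band over the "bone" is still visible; with the deep kernel the profile is nearly flat. Weta's fix, per the quote, was to keep the deep scattering but insert occluding geometry inside the subsurface volume rather than shrinking the scattering depth.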
http://www.awn.com/articles/visual-effects/prometheus-bringing-alien-21st-century