Making 10,000 BC with The Moving Picture Company



10,000 BC, from director Roland Emmerich, takes audiences to a time when mighty mammoths shook the earth. In this in-depth article we examine just one part of this blockbuster film, drilling down on the complex task of producing photorealistic mammoths that, while not the heroes of the film, certainly gave it power, gravitas and some ripping good thrills.

Steven Strait as D'Leh


In a remote mountain tribe, the young hunter D'Leh (Steven Strait) is in love. But when a band of warlords raids his village and kidnaps his beloved Evolet, D'Leh leads a small group of hunters in pursuit to the end of the world to save her. From the opening frames to the final climax, the film required herds of mammoths seen both in breathtaking wide shots and close to camera: free and enslaved, being captured and harnessed, and all dramatically photoreal.

The story opens in a remote valley where the Yagahl tribe subsists by taking down one giant mammoth from among the massive herds that thunder across the land on their yearly migration. "The Yagahl are known as the mammoth hunters because they rely on these animals for their survival," comments director Roland Emmerich (Independence Day, The Day After Tomorrow) in the film's production notes.

"The mammoths represent what the buffalo was for the Native Americans. On the one hand the tribe hunts it, but they also honor it; they feel blessed by it. It's a very natural hunter/animal relationship." The Yagahl live on the edge of survival, just barely living on what they can find and cull from the herd. "Now they're coming to the end of the Ice Age, so the climate is changing. They realize the mammoths don't come as regularly anymore."


Director Roland Emmerich with camera operator Pete Taylor on set

Emmerich enlisted visual effects supervisor Karen Goulekas, with whom he has collaborated on past films including Godzilla and The Day After Tomorrow, to oversee the film's massive effects undertaking. "Karen is one of the most ingenious and visually inventive people I have ever worked with," the director states. "To her, nothing is impossible. I know I can count on her to bring even my most ambitious concepts to the screen—often more spectacularly than even I first envisioned them."

The most extensive work would involve the creation of the film's menagerie of mighty, ancient creatures – mammoths, saber-toothed tigers and terror birds. Emmerich wanted life-like movement for these creatures and so looked to their modern-day relatives. During production, Goulekas and her team joined the actors and on-set crew armed with measuring sticks, flags and other objects painted blue, to eventually be replaced by moving digital creatures.

Nicolas Aithadi, visual effects supervisor for The Moving Picture Company (MPC), has amassed numerous feature credits since joining MPC in 2002, including Elizabeth: The Golden Age (digital effects supervisor: MPC), X-Men: The Last Stand (visual effects supervisor: MPC), Harry Potter and the Goblet of Fire (supervisor: MPC) and Charlie and the Chocolate Factory (CG supervisor: MPC). Aithadi began his career as a draftsman for Cryo Interactive in Paris, where he learned to combine his technical drawing skills with basic 3D computer modeling.

10,000 BC required mighty woolly mammoths with digital fur metres in length. Not only did these hairy beasts need to charge in herds, but the film also required them to be trapped by nets made of rope spun from digital fur, all on fields populated with digital grass. Aithadi described some of the vfx frames as having "more (digital) hair than I had ever seen in my life... it was a nightmare of hair... there were hair (simulations) everywhere," he jokes.

MPC contributed over 150 shots to the film, but shot count does not really equal workload, as "every single shot was an extremely complex shot," states Aithadi. MPC handled two principal sequences: the 'Mammoth Hunt' and 'Giza'. To create the herds of mammoths the team carried out extensive R&D, developing 'Furtility', a fur creation and grooming tool, and enhancing their 'Alice' crowd simulation software.

Basic Hair


MPC first got involved in the project in 2005, beginning with a test based on a design by Patrick Tatopoulos, the famed creature designer (I Am Legend, I, Robot, Godzilla, Underworld). That test resulted in MPC being awarded the project, and Aithadi moved over to join it in June 2006. To achieve the vastly complex hair, MPC used its own hair program, 'Furtility'. The fur system was the first thing developed on the project; around December 2006 the tool became stable enough to use.

Prior to that, the team had been focused on refining the actual look of the mammoth. "Roland Emmerich is someone who has a very clear idea what the mammoth would look like, without ever having seen a mammoth," says Aithadi. The team spent a lot of time refining the wool or fur look in the early tests, as a wide variety of fur types was needed at different points over the beast. "We played with thickness of the hair -- fluffy, not fluffy, straight hair," Aithadi relates. "We spent a lot of time grooming the mammoth in just about every way possible... we worked on refining this over six months."

With the mammoths seen both far from the camera and in extreme close-up, the fur or hair had to look completely natural at different distances. It also needed to emulate the properties of the matted, tangled and dirty hair of an everyday outdoor mammoth, so the texture library numbered more than 660 different examples. With CG fur growing up to 2 metres (over six feet) in length, render times were long, with a plethora of vertices to calculate per frame.

Along the way, the team developed a set of small tools, since just about everything they needed to do could not be done with commercially available software. With hair over 2 metres long, the groom translates to a set of some 35,000 vertices or points to compute every frame, and it was simply not viable to use off-the-shelf applications to achieve realistic simulations.

The close-up and distance shots actually required different parameters to make the fur work visually. This, in and of itself, presented no problem, as the Furtility hair software was by then very flexible. The problem came when the camera moved within a take from a wide shot to a close-up. The final result looks seamless, but the actual hair thickness is dynamically changing during the shot: MPC had to carefully animate the parameters while making it appear as though the fur was not changing during the camera move. Having their own system was the only path forward for MPC; whenever something such as this or complex digital grooming came up, their own R&D facility could solve the problem directly.
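As a rough illustration of that distance-dependent trick (a hypothetical sketch only, not Furtility code; all names and constants here are invented), the render width of each strand can be driven by camera distance and eased so the change never reads as a pop:

```python
def hair_width_for_distance(distance_m,
                            near_d=2.0, far_d=50.0,
                            near_width=0.0005, far_width=0.004):
    """Interpolate per-strand render width between a close-up value and a
    wide-shot value, so strands stay thicker than a pixel at distance
    without looking coarse in close-up. Illustrative constants only."""
    # Normalise camera distance into [0, 1] across the supported range.
    t = (distance_m - near_d) / (far_d - near_d)
    t = max(0.0, min(1.0, t))
    # Smoothstep easing so the width change is invisible during a move.
    t = t * t * (3.0 - 2.0 * t)
    return near_width + t * (far_width - near_width)

# Example: a camera move from wide shot to close-up over five frames.
for frame, cam_dist in enumerate([40.0, 25.0, 10.0, 4.0, 2.0], start=1):
    width_mm = hair_width_for_distance(cam_dist) * 1000.0
    print(f"frame {frame}: distance {cam_dist:5.1f} m -> width {width_mm:.2f} mm")
```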

That in-house flexibility proved particularly important for the net scene. In one section of the film the digital mammoth had to be caught in a net made of knotted mammoth hair. There was a partial net on set that needed to be matched, but the digital net spanned approximately 20 by 10 metres and was made with "hundreds and hundreds of links," explains Aithadi.

Two shots show full-screen hero mammoths. For the sequence showing D'Leh almost trampled by a herd of rampaging mammoths, the team needed to integrate the main actor into the centre of the herd. To achieve this the team created a digital double of the actor, which was composited into the centre of the crowd. Much of the grass was removed and replaced with Furtility-fuelled CG versions, with grass seed pods adding extra realism.

Caught in the Net


Step 1:

By this stage the mammoth was already built and groomed with multiple styles of digital fur, all generated in-house by Furtility. The animation team had complex run cycles to solve: elephants always keep three feet on the ground, yet the director wanted the herd to gallop (see the sketch below).
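As a toy illustration of why that gait brief is hard (illustrative numbers, not MPC's animation data), the difference between an elephant-style walk and a gallop comes down to the duty factor and phase offsets of the footfalls; a true gallop demands an airborne phase that an elephant walk never has:

```python
# Footfall phasing sketch: a foot is "down" for `duty` of each stride cycle,
# starting at its phase offset. Offsets and duty factors are illustrative.
GAITS = {
    "elephant walk": (0.78, [0.00, 0.50, 0.25, 0.75]),  # 3+ feet always down
    "gallop":        (0.30, [0.00, 0.10, 0.50, 0.60]),  # has an airborne phase
}

def feet_down(t, duty, offsets):
    """Count feet in ground contact at normalised cycle time t."""
    return sum(1 for off in offsets if ((t - off) % 1.0) < duty)

for name, (duty, offsets) in GAITS.items():
    contacts = [feet_down(i / 40.0, duty, offsets) for i in range(40)]
    print(f"{name:14s} minimum feet in contact: {min(contacts)}")
```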

For the net, the team started with basic geometry and began painting textures on the ropes.


Step 2:

MPC added displacement and realized that nothing short of a full hair simulation would work. The Furtility team worked on producing render solutions, as the net would end up carrying more hair per area than the actual mammoth. This was compounded by the fact that the hair needed to be wrapped around the core of the 'rope'. Again, the Furtility team came up with a way of twisting the hair along the path of the ropes, rather than just letting it hang.
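One plausible way to picture that twisting idea (a hypothetical sketch; the actual Furtility solver is proprietary) is to spiral the hair roots around the rope's core curve, advancing the twist phase with arc length:

```python
import numpy as np

def twisted_hair_roots(curve_pts, strands_per_seg=8, radius=0.02,
                       twist_per_m=12.0):
    """Place hair roots in a spiral around a rope's core curve, so fur
    appears twisted along the rope rather than hanging straight down.
    All parameters are illustrative."""
    roots, phase = [], 0.0
    for a, b in zip(curve_pts[:-1], curve_pts[1:]):
        seg = b - a
        length = float(np.linalg.norm(seg))
        tangent = seg / length
        # Build a stable frame perpendicular to the rope's tangent.
        up = np.array([0.0, 1.0, 0.0])
        if abs(np.dot(up, tangent)) > 0.99:
            up = np.array([1.0, 0.0, 0.0])
        side = np.cross(tangent, up)
        side /= np.linalg.norm(side)
        binorm = np.cross(side, tangent)
        for i in range(strands_per_seg):
            angle = phase + 2.0 * np.pi * i / strands_per_seg
            offset = radius * (np.cos(angle) * side + np.sin(angle) * binorm)
            roots.append(a + offset)
        phase += twist_per_m * length  # advance the spiral with arc length
    return np.array(roots)

rope = np.array([[0.0, 0.0, 0.0], [0.5, 0.1, 0.0], [1.0, 0.15, 0.1]])
print(twisted_hair_roots(rope).shape)  # (16, 3): 8 roots on each segment
```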



Step 3:

At this point the net, as an isolated object, was looking accurate and valid for the scene, but the next problem was how to make the hairy rope simulation interact with the hairy mammoth simulation. Hair collision alone was not the full solution, as the net needed to break and stretch and cause the correct interaction with the mammoth hair. The team approached this by animating the mammoth hairless, so that it looked like it was hitting the net and being restrained by it. This naked mammoth then fed a Syflex cloth simulation, and Syflex was used to drape the net over the naked mammoth.
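Syflex is a commercial cloth solver and MPC's exact setup is not published, but the 'drape the net over the naked mammoth' step can be pictured as a toy mass-spring net relaxing over a collision shape, here a simple sphere standing in for the animal:

```python
import numpy as np

# Toy mass-spring drape: a grid of net knots falls under gravity and is
# pushed out of a collision sphere (the stand-in "naked mammoth").
N, rest = 12, 0.25                       # 12x12 knots, 0.25 m spacing
pos = np.zeros((N, N, 3))
for i in range(N):
    for j in range(N):
        pos[i, j] = (i * rest, 2.0, j * rest)   # start as a flat sheet
vel = np.zeros_like(pos)
sphere_c, sphere_r = np.array([1.4, 0.8, 1.4]), 0.9
dt, gravity = 1.0 / 60.0, np.array([0.0, -9.8, 0.0])
stiffness, damping = 40.0, 0.98

for step in range(600):
    force = np.tile(gravity, (N, N, 1))
    for axis in (0, 1):                  # structural springs, both directions
        d = np.diff(pos, axis=axis)
        ln = np.linalg.norm(d, axis=-1, keepdims=True)
        f = stiffness * (ln - rest) * d / np.maximum(ln, 1e-9)
        if axis == 0:
            force[:-1] += f; force[1:] -= f
        else:
            force[:, :-1] += f; force[:, 1:] -= f
    vel = (vel + force * dt) * damping
    pos = pos + vel * dt
    to_c = pos - sphere_c                # resolve sphere penetration
    dist = np.linalg.norm(to_c, axis=-1, keepdims=True)
    inside = dist < sphere_r
    pos = np.where(inside, sphere_c + to_c / np.maximum(dist, 1e-9) * sphere_r, pos)
    vel = np.where(inside, 0.0, vel)

print("lowest knot height after draping:", round(float(pos[..., 1].min()), 3))
```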


Step 4:

The MPC team then added the hair. At first the team tried using normal collision dynamics, but this proved computationally unacceptable. So the team once again thought laterally and came up with a mix of occlusion maps and motion vectors. Occlusion maps normally show where ambient light is not strongly present, ignoring any actual lighting of the scene. Since this tends to happen in creases and where objects rub together, the occlusion map provided a black and white matte of the areas of fur interaction, independent of scene lighting.

This black and white map, combined with motion vectors giving the direction and location of the net's movement, provided just the insight needed to know where to compress the fur on the mammoth and where to allow it to spring up between rope sections. If the rope was pressing at any point, the occlusion map would be dark and the rope motion vectors would indicate how the fur should be depressed. If there were no rope sections near a patch of fur, the ambient occlusion would be white and the fur could remain un-flattened. "The darker the occlusion, the closer the net was... and the direction vectors also told us how to affect the fur," says Aithadi. "It worked very well... the net moved up and down and the fur responded correctly." The same occlusion-map-fed hair system was used at the end of the film when the mammoths were harnessed.
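In code terms, a minimal sketch of the published idea might look like this (made-up names and numbers; MPC's actual implementation lived inside Furtility): the occlusion value controls how much each fur direction bends, and the net's motion vector controls which way:

```python
import numpy as np

def flatten_fur(rest_dirs, occlusion, net_motion, max_flatten=0.85):
    """Bend groomed fur directions toward the net's motion where it presses.

    rest_dirs  : (N, 3) groomed fur directions (unit vectors)
    occlusion  : (N,) ambient occlusion, 0 = net touching, 1 = open
    net_motion : (N, 3) motion vectors of the nearest rope section
    Hypothetical sketch: darker occlusion = stronger flattening."""
    press = (1.0 - occlusion)[:, None] * max_flatten
    bent = rest_dirs + press * net_motion      # push along the net's motion
    return bent / np.linalg.norm(bent, axis=1, keepdims=True)

dirs = np.array([[0.0, 1.0, 0.0]] * 3)         # fur standing straight up
occ  = np.array([1.0, 0.5, 0.05])              # open, grazing, pressed hard
mot  = np.array([[0.0, -1.0, 0.2]] * 3)        # net moving down and across
print(flatten_fur(dirs, occ, mot))             # third strand flattens most
```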


Step 5:

For scenes where the mammoths also had chains and harnesses, the team used rigid body dynamics to animate those particular pieces.



Step 6:

Depending on the environment and the state of the beasts, the team then needed either to apply sheen to the hair or to dust and matte it down.
The mammoths in early scenes needed dried mud and foreign objects caught in their matted hair. MPC R&D provided the solution, allowing the animators to attach geometry to the hair so that it would move with the hair. Thus, the mud is actually just a rigid body simulation riding along on the fur (a sketch of the idea follows).
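A minimal sketch of that attachment idea (hypothetical code, not MPC's tools): each mud chunk simply reads back a position and tangent from its parent strand every frame, so it rides the hair simulation for free:

```python
import numpy as np

def mud_transform(hair_pts, attach_t):
    """Return the position and tangent for a mud chunk glued to one strand.

    hair_pts : (P, 3) simulated points of the strand for this frame
    attach_t : parameter in [0, 1] along the strand where the chunk sits
    Illustrative sketch: the chunk inherits the strand's local motion."""
    seg = attach_t * (len(hair_pts) - 1)
    i = min(int(seg), len(hair_pts) - 2)
    frac = seg - i
    pos = (1.0 - frac) * hair_pts[i] + frac * hair_pts[i + 1]
    tangent = hair_pts[i + 1] - hair_pts[i]
    tangent = tangent / np.linalg.norm(tangent)
    return pos, tangent    # drives the chunk's transform each frame

strand = np.array([[0.0, 0.0, 0.0], [0.0, -0.5, 0.1], [0.05, -1.0, 0.25]])
print(mud_transform(strand, 0.6))
```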



Step 7:

The mammoths then needed to be placed into their environments. For the beginning of the film, the team realized that to have the feet and legs of the mammoths walk through the high grass of the open fields, each blade of grass would need to be rotoed. Instead, digital grass was added, again using Furtility. To match the on-set photography exactly, the digital grass needed husk or seed pod tops, not unlike wheat. Ironically, as the grass used the hair system, the same solution was applied as for the mud: geometry was again attached to the hair, this time resembling wheat. It moved and reacted just as the original grasses had during principal photography, but now trampled by the herd of mammoths.


Giza and a Crowd like Alice


For the Giza sequence, a similar problem needed to be solved. "An amazing miniature had been built in the desert in Namibia," explains Aithadi. Emmerich sought plates with the same sweeping aerial effect as some of the earlier helicopter shots in the film, so he enlisted his effects team to create giant models of the pyramids which he could shoot using a Spydercam, a remotely operated camera suspended from wires.

The team erected miniature replicas of the pyramids, the palace, the slave quarters and the Nile River close to the practical pyramid sets. Built at 1:24 scale in Munich and then transported to Namibia in fifteen sea containers, the miniature covered approximately 100 square metres. The Spydercam allowed the director to move freely through the set, providing spectacular 360 degree aerial shots that harmonized with the film's earlier aerial sequences.

Gary Brozenich (Rome, The Da Vinci Code, Kingdom of Heaven), MPC's on-set supervisor, photographed the model in every way and in every light. "When we got these plates we realized we needed to get people behind all sorts of things - including behind pieces of scaffold that were then less than one pixel in size. So to roto every aspect was just not going to be feasible," says Aithadi.

There were a large number of shots in which up to 40,000 digital extras needed to be added. So MPC came up with the plan to model the entire set in 3D. Based on the accurate photography they had from set, they spent a month modeling the whole environment, and then re-projected the photography of the miniature back over the 3D geometry. This was done meticulously to match the plate photography, just as in the grass scenes earlier in the film. When the environment was finished the slaves could be added and no roto was required.
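The underlying technique is classic camera projection mapping. As a minimal sketch (toy camera values, not the production setup), each vertex of the rebuilt set is pushed through the plate camera, and the resulting pixel coordinate becomes the UV used to sample the photograph:

```python
import numpy as np

def project_plate_uv(points_world, cam_matrix):
    """Project set geometry through the plate camera to get texture UVs.

    points_world : (N, 3) vertices of the rebuilt miniature set
    cam_matrix   : 3x4 projection matrix of the camera that shot the plate
    Illustrative sketch of projection mapping, not MPC's pipeline."""
    homo = np.hstack([points_world, np.ones((len(points_world), 1))])
    img = (cam_matrix @ homo.T).T        # homogeneous pixel coordinates
    return img[:, :2] / img[:, 2:3]      # perspective divide

# Toy camera: 1000 px focal length, principal point (960, 540), at origin.
K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
pts = np.array([[0.0, 0.0, 10.0], [1.0, -0.5, 12.0]])
print(project_plate_uv(pts, P))          # pixel coords into the plate photo
```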

MPC used their proprietary in-house crowd software 'Alice' to generate the digital extras. The program was developed specifically for Troy and at that time the company also made an investment in motion capture hardware. MPC also used this software for two other feature film projects the company was working on at the time: Oliver Stone's Alexander and Ridley Scott's Kingdom of Heaven. For both films, MPC completed standard motion cycles and infantry capture. The team also spent two weeks with Artem Digital capturing horses, producing data that was used for both Alexander and Kingdom of Heaven.

For the mammoths of 10,000 BC, the Alice software needed to be extended. "We had to upgrade it to handle the herd of mammoths running along at the beginning of the film, as we had 110 mammoths: all running -- all with fur," explains Aithadi. "So we had to update Alice to work with Giggle - which is our geometry scripting tool - which is the way we define geometry for RenderMan." Alice also had to be updated to include 'Alice Cloth', which allowed the crowd to wear correctly simulated rags and pieces of cloth. The render farm requirements were huge; without complex data handling the shots would not have been possible.
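Conceptually, a crowd system like Alice describes each agent compactly, picking a motion cycle, a time offset and look variations per agent, rather than hand-animating 110 mammoths. The sketch below is purely illustrative (made-up cycle and groom names, not Alice's actual format):

```python
import random

CYCLES = ["gallop_a", "gallop_b", "stumble", "slow_walk"]   # invented names

def build_herd(count, seed=10000):
    """Generate per-agent crowd records with cheap, repeatable variation."""
    rng = random.Random(seed)            # fixed seed = reproducible herd
    herd = []
    for i in range(count):
        herd.append({
            "id": i,
            "cycle": rng.choice(CYCLES),
            "time_offset": rng.uniform(0.0, 2.0),   # desynchronise strides
            "scale": rng.uniform(0.92, 1.08),       # animal size variation
            "groom": rng.choice(["matted", "dusty", "clean"]),
        })
    return herd

herd = build_herd(110)
print(herd[0])
print(sum(1 for m in herd if m["cycle"].startswith("gallop")), "galloping")
```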


Workflow


The team used Maya for animation and modeling. Lighting and shading used MPC's 'Tickle' software, which is also the bridge to RenderMan: Maya lights are no longer used - 'Tickle Lights' are. Everything is then generated from 'Giggle' script, MPC's geometry description script, which feeds RenderMan. About 20 to 25 passes go to Shake for 2D compositing. These include many of the standard passes such as specular and diffuse, as well as some special normals passes which are used to manipulate a pseudo-3D environment MPC has inside Shake. Even though Shake is 2D, the special 3D normals passes allow last minute adjustments in 2D that appear almost 3D. The mammoths required extra passes such as transmission and sheen, among others depending on the shot.
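The pseudo-3D trick can be sketched simply (an illustration of the idea only, not MPC's Shake setup): a dot product between the normals pass and a chosen light direction yields a shading mask that behaves like real 3D lighting when graded over the beauty pass:

```python
import numpy as np

def relight_from_normals(normals, beauty, light_dir, gain=0.6):
    """Fake a late lighting tweak in 2D using a rendered normals pass.

    normals   : (H, W, 3) world-space normals pass, components in [-1, 1]
    beauty    : (H, W, 3) beauty pass
    light_dir : direction of the extra 2D 'light' being dialled in
    Hypothetical sketch of relighting in comp via a normals pass."""
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    lambert = np.clip(np.einsum("hwc,c->hw", normals, l), 0.0, 1.0)
    return np.clip(beauty + gain * lambert[..., None] * beauty, 0.0, 1.0)

# Toy 2x2 frame: one pixel faces the new light, one faces away.
n = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]],
              [[0.7, 0.0, 0.7], [1.0, 0.0, 0.0]]])
img = np.full((2, 2, 3), 0.4)
print(relight_from_normals(n, img, light_dir=[0.0, 0.0, 1.0]))
```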

Though 10,000 BC is a live action film, many of the shots with the mammoths are almost entirely 3D. It is a testament to MPC's craft that they intercut perfectly with the full live action shots.

Nicolas Aithadi's next project is as VFX supervisor for MPC on Harry Potter and the Half-Blood Prince.