Jon Favreau and VFX Supervisor Rob Legato, ASC, revisit (a virtual) Africa for another Disney classic, The Lion King, shot by Caleb Deschanel, ASC.
by Kevin H. Martin
Walt Disney Feature Animation’s The Lion King was among the biggest hits of the early 1990s, cementing the studio’s resurgence in the industry. The coming-of-age tale of an African lion prince striving to succeed his late father as ruler was even successfully translated into a Tony-winning Broadway musical. Jon Favreau, who directed a hit reboot of another Disney animated classic, The Jungle Book, which employed virtual world-building, was a natural choice to lead a new version of The Lion King. But rather than incorporating a live-action element with virtual environments (as in The Jungle Book), The Lion King’s entire universe, with its abundance of animal life, would be simulated. This process was enhanced through the use of virtual reality (VR) to plan for the practical aspects of the shoot, which would be driven by real-world film methodologies alongside cutting-edge new tech.
Favreau reunited with many key players from The Jungle Book, including Visual Effects Supervisor Rob Legato, ASC, and Technicolor’s MPC Film (the only VFX house on the show). MPC Co-Supervisors Elliot Newman and Adam Valdez oversaw 1,200 artists and animators, whose efforts were enhanced via an innovative collaboration with Magnopus (headed by virtual production supervisor Ben Grossman), along with Vive, Oculus, and Unity, resulting in a unique workflow that leveraged VR, as well as CG, advances. New enlistees for the virtual odyssey included Production Designer James Chinlund and Director of Photography Caleb Deschanel, ASC, who was supported by a team of Guild members, including A-Camera/Steadicam Operator Henry Tirl, SOC, and Key 1st AC Tommy Tieche.
Jon Favreau (Director): When you’re taking on a well-known and beloved story, you know going in that the element of surprise is not going to be the factor it would be in a new film as it unfolds. There is some additional content here – we’re about a reel longer than the original – but the basic beats are essentially the same. As a result, the effort has to be focused on the quality of the thing, meaning you have to put on a wonderful production.
Rob Legato, ASC (VFX Supervisor): We’re using the same approach as The Jungle Book, but without having to insert live-action into our created-from-scratch virtual world. We got rid of the backbone of our old virtual camera system, [Autodesk] MotionBuilder, and did everything on the Unity game engine, for its VR capability. This let us walk around [the capture volume] with headsets on, “seeing” the environment like we were scouting an actual location. Working as if it were live action influenced the look of the film by taking it towards traditional cinematic language, shooting multiple takes, which let us get away from the more rigidly storyboarded execution of traditional animation. On Jungle Book I had worried we wouldn’t have any happy accidents, but then I found that by walking a couple of feet to the right, I’d see something that hadn’t been intended but looked better. I learned to take advantage of that, and I think Caleb went through a similar experience on this film.
Caleb Deschanel, ASC (Director of Photography): I think what Jon saw in me is a guy who would bring a realistic look, while still being open to exploration. Wherever you are shooting, the job is still to find the best angle, and then determine how to shoot it and move the camera to express the emotional content of that scene. At first I was thinking we would miss the serendipity of a sudden thunderstorm, or an actor doing the unexpected to add character nuance. But there were many examples of serendipity owing to being able to do so many takes quickly, which also allowed for experimentation.
Favreau: For a 2D movie, the 1992 film did a really nice job depicting Africa, so we took inspiration from that, decoding which parts of the real continent they referenced before taking our own excursion on location [with Blue Sky Films]. None of the location shooting was intended to be used for finals, just as reference for the animation. But I did throw one real shot into the movie – just to see if anybody can spot that out of the other 1,400 [laughs].
Deschanel: We spent two-and-a-half weeks in Africa, shooting with the [Panavision Woodland Hills-supplied] Arri Alexa 65, capturing animals, sunrises and different locations. For the spots we really liked, MPC Film’s guys photographed a lot of details – vegetation, rocks, sand and other textural bits, which helped us capture atmosphere.
Tommy Tieche (Key 1st AC): In Kenya, we used Panavision Sphero 65 prime lenses and a custom modified Panavision 150- to 600-millimeter zoom with a built-in 1.4 extender. Additional photography in the high desert of California for moon and night sky elements used a Panavision 2400 millimeter – [actually a] Canon 1200 millimeter with customized 2× extender [that] covered the Alexa 65 sensor.
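The arithmetic behind Tieche’s “Panavision 2400 millimeter” follows directly from how teleconverters work: the focal length multiplies by the extender’s factor, and the effective aperture slows by the same factor. A back-of-the-envelope sketch (the T5.6 base stop below is hypothetical, for illustration only):

```python
def with_extender(focal_mm, tstop, factor):
    """Effective focal length and T-stop after adding a teleconverter.

    A teleconverter multiplies focal length by its factor, and the
    effective aperture number grows by the same factor (a 2x extender
    costs two stops of light).
    """
    return focal_mm * factor, tstop * factor

# The Canon 1200 mm with a customized 2x extender becomes an
# effective 2400 mm; a hypothetical T5.6 base stop becomes T11.2.
focal, t = with_extender(1200, 5.6, 2)
```

This is why such extreme combinations are usually reserved for bright subjects like the moon, where the two-stop penalty is affordable.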
Favreau: These hundreds of textures and thousands of reference pictures let us build up Africa from scratch. As with any animated film, the artists we had were the key, and if you looked at our preproduction art, it was like walking the hallways of Pixar. We worked our way toward photorealism, first with animatics, then with more fully rendered CG 3D environments that we could pull into virtual reality. As a group, we’d do a scout – six of us wearing VR headsets, walking around the “location” we were filming. Caleb had only done live-action films, but he settled right in, saying, “I would put a crane here, and lay dolly track there,” just like he was doing a proper tech scout out in the field.
Deschanel: You still scout locations, but instead of getting in a 15-passenger van and driving on dirt roads, you put on goggles and can fly around the environment using these little handsets with laser pointers. That let us fly around Pride Rock and the various watering holes and other locales built for and featured in the movie. If we found a spot where we wanted the camera for a particular setup, we could set a marker that looked kind of like an iPad, dropping it in place with a particular lens on it as a reminder.
James Chinlund (Production Designer): We tried to incorporate what we found on location, while also remembering we had to create a bridge between our picture and the original. We built the whole location as one continuous environment with a true geography, with an eye toward the distinctive formations in the original, but assembled from the best of what we found on location. That film had a generic sort of jungle rainforest, but we discovered a beautiful “cloud forest” on the slopes of Mt. Kenya like nothing I’d seen in my life. Injecting that into the forest space gave us a new way to depict that locale while revealing an exciting and unique ecosystem.
Deschanel: I worked with [lead lighting artist] Sam Maniscalco, picking one of the 350 skies we had, then we’d move the sun around to where we wanted it. On a conventional movie, the sun comes up and moves across the sky while clouds come in and out. You take all that into account, shooting in one direction during the morning and the other in the afternoon. In the computer, we could have decided to not let the sun move at all – but found it worked better to move the sun for nearly every shot, marching it to our orders instead of nature’s.
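Placing a virtual sun “where we wanted it” ultimately reduces to choosing an azimuth and elevation and converting them to a direction the lighting engine can use. A minimal sketch of that conversion, assuming a common game-engine axis convention (y up, z north, x east) rather than MPC’s actual pipeline:

```python
import math

def sun_direction(azimuth_deg, elevation_deg):
    """Unit vector pointing toward the sun for a given sky position.

    Azimuth is measured clockwise from north; elevation is height
    above the horizon. Axis convention assumed: y up, z north, x east.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = math.cos(el) * math.sin(az)   # east component
    y = math.sin(el)                  # height above the horizon
    z = math.cos(el) * math.cos(az)   # north component
    return (x, y, z)

# "Marching the sun to our orders": a low western backlight for one
# shot, then a high southern key for the next, independent of time.
shot_a = sun_direction(azimuth_deg=270, elevation_deg=10)
shot_b = sun_direction(azimuth_deg=180, elevation_deg=60)
```

The point of the exercise is that, unlike on location, each shot gets its own sun, so continuity becomes a creative decision rather than a race against the clock.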
Legato: When you look through the portal, it excludes everything that is not in the composed frame, so you can’t plan your move around the off-camera logistical elements. You can switch and look at things in VR, making it all appear more like a live-action stage, because your brain is cemented into a whole space beyond what the camera sees – you know that there is a tree just off-camera to one side and can plan your move accordingly with the same sense of assurance that you’d have if you kept your other eye open while looking through a regular viewfinder. The VR approach can help inspire you. When you’re supposed to be atop a cliff, you’re going to have a visceral response to looking down from such a vertiginous space. It isn’t real, but it feels to you like you could fall a thousand feet if you missed a step, and that triggers you as a cameraperson to react accordingly.
Tieche: When we began stage work after Kenya, we filmed actors playing out different scenes in a small-scale theater, so animation could use the footage to create mannerisms and expressions for the characters.
Favreau: What’s nice about having a human performer as a basis for animation is that you inherit all the acting and vocal choices. Unlike motion capture, where you retarget a performance onto a CG rig, this was all keyframe animation. [Production Animation Supervisor] Andy Jones and his team would see Billy Eichner express an emotion, then have to figure out how a meerkat’s features would move to convey that same feeling.
Henry Tirl, SOC [A-Camera/Steadicam Operator]: I’ve operated camera on plenty of bluescreen, greenscreen and miniature work over the last 39 years. Shooting things that don’t exist isn’t usually a big deal, but this process was really exciting, to the point it twisted my head around in a circle. When I was first shown what I thought was a reference image of a baboon, I studied it and then asked to see their computer representation of this character. I got these funny looks and they pointed at the screen. It was only then it dawned on me that I was not watching some National Geographic 4K documentary footage!
Chinlund: We launched heavily into character development at the start. Our intention was to make the characters photoreal and believable as what they were, rather than caricatures. We looked at a lot of documentaries, and Jon’s goal was to have an experience resembling what is found in nature. There are some adjustments to that along the way, but what we really learned is that you can’t improve on nature.
Favreau: In the animal kingdom, you don’t see warthogs grinning and raising their eyebrows. Rather than mash in a human-looking performance, we leaned heavily into genuine animal behavior. If you listen to Bambi’s commentary, you hear Disney wanting it to feel more real than Snow White, and they discuss having to embrace animal behavior, and not draw too much attention to the artifice inherent with animals talking. With libraries of meerkat reference, you might find the animal expresses that emotion through jumping. So the actor performance is the emotional core of the animation, which in turn is influenced by what we observe in nature. With lions, they don’t have a lot of facial expression, so the animators had to become experts in body language as well, since the lion’s posture would reflect its emotional state.
Deschanel: When we shot on stage, [MPC Film] had already rendered a pretty good-looking but limited version of what we would ultimately wind up with – the animals had been animated very realistically, but the leaves and grass weren’t all the way there. Jon worked with Andy to get the performances out of the animals that were possible given the limitations of the game engine. The one rule that Jon really wanted was that the animals wouldn’t do anything they couldn’t do in real life. Lions don’t pick up food with their hands and eat with their fingers. They do talk and sing – but aside from that, the movements are very much like the real thing. Mandrills have hands like ours, so their gesturing can be much more human while still remaining in character.
Tirl: The filmmakers wanted something less clinical than a perfect computer camera move. Ultimately, we wound up with all the usual filmmaking tools when a human operates camera, including dolly, Steadicam and fluid heads. We worked from what was essentially a sophisticated previs animated in 3D space. I made meticulous tracing marks on the carpet showing where I needed to move. These indicators would show up on my monitor when I hit the right spot. That gave us such a good representation of the characters and environment playing back during shooting, I had to fight to keep from getting dizzy a couple of times.
Tieche: We began virtual production as scenes arrived from animation and Magnopus [responsible for the GUI interface], hybrids of the environments we filmed, like the stormy sky from the Masai Mara over the rolling hills of Borana Conservancy, or a Samburu sunset on the lush marsh and trees of Amboseli. [There was] a virtual dolly, a virtual crane and a virtual remote head, which all could be modified in terms of scale once we had plotted the shots through VR goggles. After the goggles came off, we’d hop behind our monitors – which essentially became the eyepiece of the virtual camera. Our virtual camera, designed with Alexa 65 specs, [let us] choose lens sizes and T stops and pull focus between characters.
Tirl: In traditional filming, I would have a transmitter on my Steadicam sending the image out, but here, they had the transmitter and I had the receiver on my Steadicam, so depending on where I was on stage, I would be seeing the correct view. Since I had no actual camera on my Steadicam, Panavision Woodland Hills helped fashion a camera plate that mimicked the weight and inertia of an actual camera as you panned it. They took extensive measurements, taking into account where the lens would be relative to the fulcrum and tilt points, then added a little helicopter blade of sensors on top of the plate arrangements.
Deschanel: There were OptiTrack cameras all the way around the stage, reading LEDs placed where the camera would be on the Steadicam, so when Henry moved up/down or right/left, the camera would move in the virtual space in a corresponding way. Henry had to learn to trust what he saw on a seven-inch monitor instead of what he felt under his feet. Steadicam operators typically learn from the feel of climbing stairs or hills, but in this case, the computer would build a ramp to represent the slope of a hill, while he was in actuality only moving on flat ground. So there’s a disconnect between what he saw on the Steadicam versus how he felt moving around. It was a different kind of choreography for him to learn.
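The stage-to-virtual mapping Deschanel describes amounts to a rigid transform: the tracking volume reports the rig’s position in stage space, and the engine re-expresses it in the virtual set. A minimal sketch of that idea (the scale, origin and yaw values are illustrative, not production numbers; real systems also track orientation):

```python
import math

def stage_to_virtual(stage_pos, origin, yaw_deg, scale=1.0):
    """Map a tracked position on the flat stage (meters) into the
    virtual set: rotate about the vertical axis, scale, then offset.
    Axis convention assumed: y up, ground plane is x/z.
    """
    x, y, z = stage_pos
    yaw = math.radians(yaw_deg)
    # rotate in the ground plane so the carpet can face any direction
    # relative to the virtual set
    rx = x * math.cos(yaw) + z * math.sin(yaw)
    rz = -x * math.sin(yaw) + z * math.cos(yaw)
    ox, oy, oz = origin
    return (ox + rx * scale, oy + y * scale, oz + rz * scale)

# A 1 m step on the carpet, lens at 1.7 m, lands the virtual camera
# 1 m from the chosen anchor point in the set (scale = 1).
pos = stage_to_virtual((1.0, 1.7, 0.0), origin=(100.0, 0.0, 50.0), yaw_deg=0)
```

The scale parameter is what lets a modest capture volume stand in for a location of any size: plot the shot in VR, then dial the stage meters up or down against the virtual world.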
Tirl: Nothing in Africa is flat. And Pride Rock was not plumb to the ocean, with an 8- to 10-degree slant. So when a character comes toward me and I’m backing up, I realized that I was suddenly ten feet above them, because while they were descending the rock, I was on a flat-carpeted surface. The programmers said, “No problem” and entered a correction that put me in sync with the characters, without having to squat or change my shooting. They tilted the previs image to suit my perspective. And if I ran on a path that would take me through a tree, VFX could tag the tree and pull it out during the part of the shot when I’m passing through, then plug it back in before it reenters frame. It was like being on a stage where you fly walls, but with infinite flexibility.
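The height mismatch Tirl describes is easy to quantify: on a flat floor, backing away from a subject descending a slope opens a vertical gap of roughly distance times the tangent of the slope angle. A toy calculation, using the 8- to 10-degree slant he mentions (the 18 m backing distance is an assumed figure for illustration):

```python
import math

def height_gap(horizontal_dist_m, slope_deg):
    """Vertical gap that opens between an operator on flat ground and
    a subject moving down a constant slope, after the subject covers
    `horizontal_dist_m` of horizontal distance."""
    return horizontal_dist_m * math.tan(math.radians(slope_deg))

# Backing up about 18 m (~60 ft) on flat carpet while the character
# descends a 9-degree ramp leaves the camera roughly 2.85 m (~10 ft)
# above the subject -- consistent with the "ten feet" Tirl noticed.
gap = height_gap(18.0, 9.0)
```

The programmers’ fix was the inverse of this calculation: offset (or tilt) the virtual camera by the same amount so the flat-floor operator tracks the sloped terrain.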
Tieche: The idea was to always achieve each shot with a practical approach as if we were making live action. We didn’t want to stretch beyond the use of any modern tool or device you’d find on a live production. For example, if a shot needed a crane move at the top, but [with] Steadicam stepping off the crane to follow, we would plot the camera move that way, always rooted to the reality of practical filming. Even if the shot essentially required an arm car or motorcycle with a stabilized head on the back, that’s what we’d try to emulate. We even had drone work at our virtual set with a real drone.
Legato: We wanted the drone pilot to do a VR walkthrough to get the sensation of flight beforehand, so he could sense obstructions like tree branches. That experience added to the flavor of the shot and the reality of the world, but this approach would be useful on a more traditional shoot, too, letting you work out all of your issues on the ground beforehand, so when you took the vehicle airborne, you could get the shot in record time.
Tirl: A bunch of the characters are walking along singing a song, with a meerkat bouncing on the warthog, and they walk for a mile during this passage. I was supposed to walk along with them, circling around behind, kind of dancing around them with the Steadicam for five or six minutes. No stage can accommodate that size move, but they devised “the magic carpet.” It was like I was attached via a very flexible bungee to the nose of the warthog. As the trio walks along, I remain tethered, so I can come in close enough to kiss them, but then range back twelve feet. It was like I was on a virtual circular treadmill.
Deschanel: After a while, it really felt like making a regular movie. Except that you didn’t need handlers to wrangle 500 wildebeest back to their starting point for another take. And then we didn’t have to wait for the dust to settle!
Favreau: All the elements of a real environment are integrated into our scenes. When Simba walks across dunes, the desert winds blow dust behind him. That kind of detail was a direct benefit of our committing to MPC Film as the sole vendor, because we could allocate the full effects budget in a way that let them make investments in R&D. They were able to develop and greatly refine fur simulations, atmospheric effects – stuff that may not make for headlines in periodicals, but when you watch the movie, it’s what makes the picture breathe with convincingly naturalistic detail.
Legato: We relied on photographic reference to stay close to the way dust hits look in sunlight. Though it is expensive computationally, these simulations can recreate the properties of real life, like water simulations when you have a hand stirring liquid. Light simulations deal with how it bounces off walls and absorbs certain colors. Shooting reference is now preferable to shooting actual elements. I tried to shoot a live dust element and couldn’t get it to work, because it just didn’t make for a perfect fit with the world we were building. Sometimes the real thing looked more impressionistic, strange as that sounds. A lot of our most difficult stuff gets simulated through real physics, but it means having to go all the way, because you don’t just get a rainbow from prismatics in the air. The caustics in light are valuable because when you bounce light off a shiny object, it reflects on a wall like an underwater effect.
Favreau: Another thing that helped was that we weren’t changing concepts or doubling back on creative decisions halfway through. Live-action films tend to adjust constantly throughout production and post, which leads to a lot of rushed work happening before release. If you don’t give the artists time to dig in and do the work properly, the results are not going to be at the same high level that they are otherwise capable of delivering.
Deschanel: You try to get things to come through as they were first filmed, but since every day had been a process of discovery and problem solving, there was still some serious DI. But what I found phenomenal was how excited I got when seeing these final tiny VFX details, like the way fur looks and moves when hit by wind.
Favreau: [Supervising finishing artist] Steve Scott is a colorist who came out of compositing, which is fitting given that there’s only a fine line separating VFX compositing and current DI tools. Getting all of the elements together to go into this film is like putting on a magic show. Part of fooling the audience is telling a compelling story. If you ever saw [magician] Ricky Jay perform, the storytelling aspect was even more compelling than the illusion. When I saw The Lion King on stage, I knew what was going to happen from the movie, but the way they did the puppets and staging along with how the music was interpreted all added up to something compelling and changed the way you watched it. There were a lot of clues for me when watching [the Broadway production] that I kept in mind while making this film.