Chris Edwards – The Third Floor

It’s a fair bet that a number of ICG cover stories from recent years include mention of one of the industry’s top previsualization firms, The Third Floor. The company’s CEO and creative director, Chris Edwards, began his career in layout and animation at Walt Disney Feature Animation before working for George Lucas’ Skywalker Ranch (on the third floor), previsualizing the final Star Wars prequel. Soon after, The Third Floor took form, tackling a variety of conceptual challenges ranging from sci-fi [Gravity, Oblivion, Divergent] and fantasy/action [Disney’s Oz: The Great & Powerful, Marvel’s The Avengers] to military movies [Battleship and War Horse] and motion-capture-driven features [Beowulf].

Completing as many as 60–70 projects annually from its L.A. base and satellite facilities in the UK, Canada and Australia, The Third Floor also previsualizes commercials and episodic TV [Black Sails, Game of Thrones season 4], designs video game cinematics, and has branched into theme park work. Edwards, a founding member of the non-profit Previsualization Society, which promotes previs as an art form, has actively lobbied for cinematographer participation throughout the previs process, even making a presentation at the ASC, as Kevin Martin discovered in their conversation.


For filmmakers used to working from shot lists and storyboards, previs was a hard sell. Has that changed? Edwards: Working with filmmakers who never considered adopting previs as a part of their toolkit is exciting for us. When I first met director/cinematographer Barry Sonnenfeld, it was clear he wanted to make sure we could communicate using his visual language. Once he was convinced we knew film language – that we weren’t just computer guys – our creative collaboration flourished.

Is previs for everyone? I think even low-budget shows without heavy VFX can benefit from the process, since it helps filmmakers focus their creative vision, and ultimately saves time and money on set. We have an in-house story department and use storyboards ourselves when appropriate. But there are some things that are more difficult to draw than to previsualize – and the opposite is also true. If a superhero is flying through the city, you’re going to need a highly skilled artist and a ton of boards to convey an accurate sense of motion and timing. The boarding and previs teams can riff off one another’s ideas to produce the most effective final scenes.

So your clients are a bit savvier about what you offer now? I used to get questions like, “Why do I need previs?” or “What is it?” Now I’m more likely to hear, “What type of previs should we use to tackle our production most efficiently?” To answer that, we consider the unique challenges and ambitions of each project, and develop an approach that we pitch back to our clients.

How extensive might such a plan be? If the project hasn’t been green-lit, it might start with pitchvis, which can be a trailer-like sales tool or a proof-of-concept for a studio. A final version of one of the pitchvis shots we worked on can be seen in one of the trailers for Godzilla. It’s a tilting shot up Godzilla’s torso, but not framed in a conventional see-everything way; we don’t even get to his head, which makes it fun and effective, since you’re really only seeing what a human on the ground would be able to see. Going beyond pitchvis, we have our conceptual previs process, which culminates in techvis to facilitate technical implementation of any previsualized shots on set. In many cases, postvis is the end of our creative journey with a filmmaker, who uses the previs-enhanced footage to focus their cut before final VFX are rendered. Postvis alone accounts for roughly 30 percent of our feature film business.

Techvis explores the logistics of achieving the shot? Techvis is the process of deriving real-world information from previs to aid the practical recreation of shots on set. It is something we keep in mind while fashioning the previs, to ensure that all the shots are actually achievable on set or on location. A techvis shot may illustrate that the camera will need to be underslung on a Technocrane, or show how the move might work if it were a 30-foot Techno instead of a 50-foot. We generate metadata as our move plays out, including camera speed, height, distance to subject, tilt angles, GPS coordinates and time of day. We often come up with multiple deliverables for any given shot, so it behooves productions to involve as many key people as possible up front, allowing the previs team to provide real-world results that represent the collective wisdom of VFX supervisors, production designers, cinematographers, et cetera.

What are the most significant developments in previs? One cutting-edge development is pre-lighting. Back when I spoke at the ASC, my focus was on composition and framing options. Several cinematographers indicated that while that would be of special interest to the camera operators, they themselves were more concerned with how lighting was used to put their unique stamp on the final imagery. We didn’t get any takers initially, probably due to the perception that it takes a long time to render accurate lighting; DPs weren’t interested if we could only provide a general idea of how bounce lighting might look in a scene. But now that graphics cards are powerful enough, you can approach this in a pseudo-real-time manner, which is akin to striking lights on a set. To me, that’s extremely exciting, because composition is the combination of the camera’s position, lens choice, and the subject’s position in frame. However, that subject’s silhouette changes dramatically, depending on the DP’s lighting choices. On Gravity, Chivo [Emmanuel Lubezki, ASC] was so involved in the virtual pre-light that those decisions were the driving force behind the practical lighting of the actors on set, leveraging Framestore’s lightbox.

What other creative benefits can you give filmmakers? When films shoot native 3D, they can benefit from stereoscopic previs to plan the depth settings from shot to shot. Stereo previs shots can be pre-edited and tweaked to design a smooth and meaningful depth flow, minimizing eyestrain and maximizing audience engagement. Also, as a result of Robert Zemeckis’ performance-capture films, we’ve developed a virtual camera apparatus.

Give us your take on virtual cinematography. It’s a hot topic. Virtual cameras are coming into their heyday as a standard tool for directors and DPs who want to shoot previs from specific perspectives. Once the virtual action is created, coverage can be shot from a variety of angles. There’s also a supercharged version of our virtual camera process we call VRP – virtual rapid prototyping – which features performance capture in concert with virtual camera and editorial simultaneously. This is especially useful for scenes that are best blocked live with actors, be it fight choreography or a dramatic dialogue scene. And then there’s Simulcam, a system that enables DPs to see real-time previews of virtual backgrounds and characters through their viewfinder, which tends to improve the spontaneity and authenticity of their camerawork when live-action and animated elements need to be composed together. And this isn’t just an approach for dealing with giant robots; I think it fundamentally changes the way visual effects supervisors and DPs can collaborate.

Can you discuss particulars of your work on Godzilla and Maleficent? Godzilla director Gareth Edwards was very specific about composition, lighting and mood. And then there was the matter of determining how 300-foot creatures fight and how to cover that cinematically, which convinced us of the need to set up a virtual camera device on location that Gareth could use to explore all of the epic possibilities. Gareth understood and embraced every aspect of the previs process, in part due to his background in VFX. Maleficent director Robert Stromberg also came out of visual effects, and we had worked with him when he was production designer on Avatar, Alice in Wonderland and Disney’s Oz: The Great & Powerful. On Maleficent, we had crews in Los Angeles and London that created previs for key scenes, then took those edits via iPad down to the director and DP Dean Semler [ASC, ACS] on set. Their daily feedback helped our team improve each iteration, until they were ready to shoot.

That’s the on-set visualization team? It’s what we used to call on-set visualization, but now call virtual production. The idea is to have a previs supervisor and a small support team on the shoot. We can generate validating imagery derived from spur-of-the-moment ideas devised by the crew. We can also do postvis right there on set, to be sure it all works before you strike the set, so everyone goes home with confidence. Building out a standard workflow for virtual production is a major priority for us; the phases of preproduction, as well as VFX, are increasingly converging on set. There’s a previs/techvis/virtual production/Simulcam/postvis synergy that can be useful to the entire chain of collaborators, all the way through VFX and the final edit. We’re rapidly approaching having a holodeck’s worth of possibilities available to filmmakers at a moment’s notice.

We’ve talked about what previs can offer production; how can production present the best model for working with you? The more input we have, the better, and the more useful the previs can be. Increased interaction with DPs – not just to find out what lenses they use, but to hear their approach – is one thing our team is enthusiastic about. We sometimes have that interaction, but often the DPs aren’t on the show during the early phases when we are working. It would be great to see changes that allow a DP’s time to be accounted for during early preproduction meetings and previs reviews, and even through postvis as the shots are evolving. A retainer could justify such consultations, even if they are done via Skype and cineSync.

Why the expansion into other industries, like gaming and theme parks? Many other markets can benefit from traditional filmmaking expertise, so diversification into these areas is a conscious plan and key to our survival in Hollywood. Industry-wide, I see a powerful trend toward convergence among games, film and themed entertainment. Many stories can benefit from such a transmedia launch, and the first teams to effectively coordinate these multifaceted creative efforts may find themselves leading the next generation of media.