Talk about a résumé builder: Douglas Trumbull’s first feature-film VFX credit was Stanley Kubrick’s groundbreaking visual symphony, 2001: A Space Odyssey. No one would have been surprised if subsequent efforts paled in comparison. But after directing the low-budget charmer Silent Running, a space film shot entirely aboard an aircraft carrier and inside a plane hangar, Trumbull went off and invented Magicam, an analog equivalent to today’s real-time virtual production techniques. He remained active in feature VFX work throughout the 1970s, first with Robert Wise on The Andromeda Strain and ending with the same director on Star Trek: The Motion Picture. While developing Showscan, a high-frame-rate 65-mm film process, Trumbull also supervised the stunning photographic effects for Close Encounters of the Third Kind and Blade Runner. More amazing than Trumbull’s work on those landmark films is that his achievements went unrecognized by the Academy until earlier this year, when he was finally given an Oscar® in the form of the Gordon E. Sawyer Award.

Although three decades passed before Trumbull returned to feature VFX with The Tree of Life, he’s remained busy directing Showscan shorts and creating Universal’s Back to the Future ride and a trio of special-venue films for the Luxor hotel. Every visual effects artist in the industry owes a debt to Trumbull, who continues to search out the highest possible quality in immersive theatergoing experiences.

ICG: In your eyes, is cinema starting to catch up with where you’ve been pushing it to go all these years? Trumbull: Yeah, I suppose. [Laughs.] The baggage carried by the industry for so long is no longer an impediment because we’re digital now, and this transition is enabling a new kind of movie experience in the vein of what I tried to do with Showscan. The Academy even wants to store the negative for New Magic [his first Showscan short] in perpetuity. I’m really proud of that film because it proved conclusively that a movie could be indistinguishable from reality.

So where are your goals and interests aimed now? I am on a particular trajectory toward extremely immersive cinematic experiences involving higher frame rates and brightness levels, curved screens, greater bit depth and other attributes that help put the audience in the movie instead of merely watching it. Bloggers seem to think this is a threat to the status quo, that all movies will have to be made this way. But the 24-fps standard is still perfectly appropriate for most movies, though some classics, like The Wizard of Oz, could have benefited from a more immersive presentation.

Does the industry have sufficient motivation to embrace these new approaches? Today’s Hollywood trend is big tent-pole films, but the search for a more spectacular means of exhibition isn’t getting anywhere. A lot of expensive movies aren’t doing great business, and it might be related to subconscious disappointment from audiences because of dim screens and presentation defects. Movies costing between $140 million and $400 million are bottlenecking, getting squeezed out through this tiny throttle of 4:2:2 narrowband, a presentation format that isn’t bright enough to be truly spectacular. With a higher frame rate and brighter screens you could actually show audiences all the millions invested in production values. But right now that all gets filtered out by the medium itself.

Have any recent films come close for you? I really admired Hugo, and Tintin exceeded my expectations. I just wish they were being seen in more spectacular venues. We’re behind the curve right now because statistically, young people aren’t going to theaters as much. More than 70 percent of movie revenues come from digital media, indicating a tremendous sea change in the way media is consumed.

Can 3D and IMAX be gateways for audiences to see filmgoing as an event? First we need to overcome the dimness of 3D. The average theater image measures 2.5 foot-lamberts. The eye is not sensitive to color at that level, so you can’t respond ideally to the 3D. By way of comparison, 16 foot-lamberts is the Academy standard, while Showscan was over 30 [post-polarization]. Twenty-four fps messes up 3D as well, due to blurring and strobing, which really bothers me and doesn’t sit well with younger eyes that are used to gaming visuals at 72 fps.

Your passion for immersive cinema dates all the way back to 2001. I got completely hooked on the idea while working for Kubrick. He went with Super Panavision, the 70-millimeter five-perf successor to three-strip, designed for deeply curved 90-foot screens, which was perfect for conveying a first-person experience. He was breaking out from traditional film grammar to take audiences on this trip. But today people aren’t aware of everything Kubrick tried to do because they can’t see the film in Cinerama. Small theatrical screens and home viewing are inadequate for conveying that experience. Even on a 70-foot flat screen, the effect is lost.

What are some other landmarks for you beyond 2001? A high point in my career was Universal’s Back to the Future ride, which let audiences feel like participants instead of [offering] just the usual passive viewing experience. I’m writing screenplays as highly immersive experiences and hope to explore the uncharted territory of this new cinematic language. It still shocks me nobody attempted this after Kubrick, though I tried with Brainstorm.

How did you determine the optimum higher frame rate when developing Showscan? I had audiences hooked up to monitors and graphed their responses, showing them the same story and actors at different frame rates. Recently Jim Cameron, who wants to shoot the Avatar sequels at a higher frame rate, studied material shot at 24 fps in 4K, comparing it with 48-fps material at 2K. He preferred the latter, as do I, though I found that at 60 fps the imagery, no longer suffering from blurring and strobing, becomes kinesthetically powerful enough that audiences begin accepting it as reality. Peter Jackson’s doing The Hobbit now at 48 fps, with the option of taking it down to 24 for conventional screening.

Is your capture system downward-compatible with traditional formats? My setup here lets me shoot 3D with two Red cameras running at 120 fps with a 360-degree shutter. The exposure is identical to 60 fps with a 180-degree shutter, just twice the data. But the 360-degree shutter lets you merge any number of frames together to perfectly recapture the blur seen at lower frame rates of 60, 48, 30 and 24, with no artifacts. Digital theaters can currently deliver 144 fps, though they’re just showing the same 24 frames several times over.
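The frame-merging arithmetic Trumbull describes can be sketched in a few lines. This is a hedged illustration, not his actual pipeline: assuming the 120-fps, 360-degree-shutter capture arrives as a list of NumPy image arrays, averaging each group of consecutive source frames reconstructs the motion blur of a slower rate, and integrating only part of each group approximates a narrower shutter angle.

```python
import numpy as np

BASE_FPS = 120  # capture rate; a 360-degree shutter means frames butt together with no gaps


def downconvert(frames, target_fps, shutter_deg=360):
    """Merge 120-fps frames into a lower-rate stream, synthesizing its motion blur.

    Each output frame period spans BASE_FPS / target_fps source frames.
    A 360-degree shutter averages all of them; a 180-degree shutter
    averages only the first half of each group and discards the rest.
    """
    assert BASE_FPS % target_fps == 0, "target rate must divide the capture rate"
    group = BASE_FPS // target_fps                 # source frames per output frame
    keep = max(1, int(group * shutter_deg / 360))  # how many of them to integrate
    return [np.mean(frames[i:i + keep], axis=0)
            for i in range(0, len(frames) - group + 1, group)]
```

For example, converting 120 fps to 24 fps at a full 360-degree shutter averages five source frames per output frame, which matches his point that the blur of 60, 48, 30 or 24 fps can all be recovered from the same high-rate master.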

Virtual production is making inroads in TV and beginning to factor more in feature work as well. What’s your take on it? Here on my property in Massachusetts I have a virtual stage with an 80-foot-wide greenscreen. Those 3D cameras running at 120 fps shoot live actors who are instantaneously composited into a virtual environment assembled beforehand. Our virtual production employs real-world lighting as well as lighting cues built into the CG environment. On the other side of the stage, I’m having a hemispheric high-gain screen installed next month with dual 120-fps 3D projectors. That will let me test on stage, then pipe it to the projector and watch right away, keeping this all an iterative learning process. My notion is to take the cast on stage and shoot a rehearsal, composite that with the best CG virtual background I can get in real time [utilizing Unreel Pictures], and then treat all of this as an animatic.

Pixar Animation goes through something like this several times prior to final rendering. Seeing a version of the whole movie cut together in advance lets you know what works in the story and what doesn’t. You can alter your approach as needed in terms of scenes, elements and characters, or relight and change environments, before you have the actors back for principal photography to do it all for real. Switching from any set to another location happens at the flick of a switch. Pre-lighting can be kept as metadata: push a button on your DMX controller, and boom, the desired lighting condition returns. You shoot the whole film in two to three weeks.
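The "pre-lighting as metadata" idea can be illustrated with a short sketch. Everything here is hypothetical; the cue names, channel numbers and levels are invented for illustration, not taken from Trumbull's stage. The core of the idea is simply that a saved lighting cue is a sparse channel-to-level map, expanded into a full 512-channel DMX512 frame the moment it is recalled.

```python
DMX_CHANNELS = 512  # a DMX512 universe carries 512 8-bit channel levels


def build_frame(cue):
    """Expand a sparse cue (1-based channel -> 0..255 level) into a full DMX frame."""
    frame = bytearray(DMX_CHANNELS)  # channels not named in the cue stay at 0
    for channel, level in cue.items():
        if not 1 <= channel <= DMX_CHANNELS:
            raise ValueError(f"channel {channel} out of range")
        if not 0 <= level <= 255:
            raise ValueError(f"level {level} out of range")
        frame[channel - 1] = level
    return bytes(frame)


# Cues captured during the rehearsal shoot become recallable metadata
# (illustrative names and values only):
cues = {
    "virtual_sunset": {1: 255, 2: 180, 10: 64},
    "interior_night": {1: 32, 5: 200},
}

frame = build_frame(cues["virtual_sunset"])  # one button press = one lookup
```

Sending the resulting frame to the console is hardware-specific and omitted here; the point is that an entire lighting state fits in a tiny, versionable piece of metadata.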

But aren’t there trade-offs shooting under those conditions? Most directors aren’t comfortable in a virtual world, something I found out long ago with Magicam. Many actors, on the other hand, having learned their craft on a near-empty theater stage, are quite comfortable there. And I found that showing actors the composite on stage thrills them: “Finally, I don’t have to fake it.” If you don’t have something to show them, you wind up like 300, where everybody’s faking it because they have no solid idea about the virtual environment! My next step, something I haven’t done before except in brief experiments, is to replace the computer-generated, real-time virtual set with a miniature, which I find much more photo-realistic and believable than anything generated in a computer. Then I use Nuke and other comp techniques as needed, though I’m aiming for every shot to have at least 80 percent physical reality, rather than settling for the algorithm of the month. My tastes have always run to more organic approaches to visual effects.

Your segments in The Tree of Life reflect that. We can make miniatures look absolutely real; that isn’t a variable. I recently looked at Blade Runner, Close Encounters and 2001 in my screening room on Blu-ray, and I could see everything that was in the original prints. Sometimes it’s even better, because the grain and slight weave of physical projection are gone. All these years later the miniatures hold up and aren’t the slightest bit obsolete next to CGI. Miniatures are used so rarely that they’re practically a lost art, though Hugo shows how successfully they can still be employed. It was sad to see Kerner Optical [the physical production/model work/motion-control arm of ILM] go belly-up; that had once been a big part of Lucas’ personal VFX facility. General Lift’s Joe Lewis has engineered a lot of motion-control work for decades, and he bought up a lot of Kerner’s equipment, which will help us with shooting the miniature end of things here.

Do you have something special in mind to showcase these innovations? There’s a space-based “hard” science-fiction feature that I hope to get made. Plus I’m thinking of redeveloping a very futuristic project called The Ride that was at Warner before management changed [laughs]; it’s a kind of Dr. Strangelove-meets-theme-parks idea.

Interview by Kevin H. Martin. Photo courtesy of Douglas Trumbull.