And The Winner Is…For Real!

Game engine technology is showing up all over the film and TV industry, and now it takes a bow in the world of live awards shows. 

by Debra Kaufman

Hollywood and game engines have forged an interesting alliance. The use of game engines to fuel virtual production has emerged as an unprecedented way to marry real-world and CG elements (ICG Magazine, October 2019, Epic Images). Less known, however, is the pairing of game engines with LED lighting and augmented reality to provide real-time graphics and imagery for live TV, including concerts, events, and even awards shows.

Unreal Engine Enterprise Team technical product manager Andy Blondin shares that real-time rendering in television is not brand new. “But the quality wasn’t amazing,” he notes. “A lot of us in the broadcast space started experimenting with game engines, and Unreal Engine, being one of the highest-fidelity tools, rose to the top.”

Courtesy of Philip Galler/Lux Machina

David Morin, head of Epic Games’ Los Angeles lab, reports that live events, concerts, and presentations are among the many uses of the Unreal Engine. “Displays in domes, virtual production with multiple screens, and live events can all be handled with the same toolset that Unreal Engine features,” Morin describes. “The Unreal Engine has been enhanced with a lot of features relating to broadcast. The engine ‘talks’ with traditional video hardware used in broadcast and on set – something you don’t normally get out of game engines.”

Pioneers in using the Unreal Engine for real-time broadcast graphics, virtual production, and visual effects include Zero Density, a Turkish company that early on worked with Dutch broadcaster NEP and has since partnered with The Weather Channel and FOX Sports’ Virtual NASCAR Studio, among others. The Future Group, based in Oslo, Norway, used the engine to marry actors and virtual backgrounds for Fremantle Media’s 2015 game show Lost in Time. Since then, the company has developed Pixotope, an “all-in-one, real-time virtual production system” integrated with Unreal Engine. The subscription-based software has been used for broadcast applications by The Weather Channel, as well as to create AR (augmented reality) elements for events such as Riot Games’ League of Legends 2018 Worlds opening ceremony and AR graphics for the Eurovision Song Contest 2019.

Early pioneers of Unreal Engine for real-time broadcast applications include Turkish firm Zero Density, which partnered with American broadcasters like FOX Sports on the broadcast giant’s Virtual NASCAR Studio (pictured above). / Courtesy of Epic Games

Los Angeles-based Lux Machina is one of a select few companies that have leveraged the Unreal Engine for real-time elements in an awards show – the 2019 Golden Globes. Lux Machina chief operating officer Zach Alexander and chief technology officer Philip Galler came out of PRG (Production Resource Group), one of the live event industry’s premier lighting/AV vendors, where they spent years working with film and TV clients. For Oblivion (2013), with cinematographer Claudio Miranda, ASC, they built custom lighting elements and worked on the movie’s many projections. Even after creating Lux Machina Consulting as a specialty boutique, the two men continued to work at PRG. “But we realized that a lot of the work we did on Oblivion was interesting to other people,” Galler recalls.

One of Lux Machina’s first big projects was the Claudio Miranda-shot Disney feature Tomorrowland (ICG Magazine, May 2015), for which the company helped design, build, and execute several immersive environment sets. “We created a giant array of LED fixtures surrounding a large gimbal turntable to create the Oracle set,” recounts Galler. “To get overhead reflections, we did one of the first-ever really large overhead LED wall deployments.” Lucasfilm reached out to the nascent company, and Galler and Alexander began consulting on display technologies, projection, and LED. “We tested products and also focused on getting a better understanding of how the film market worked,” Galler adds, noting that Lux Machina has worked on Star Wars: Episode IX – The Rise of Skywalker among other Lucasfilm projects.

Then came live events. “Everything changed in 2015 once Unreal Engine became free,” notes Blondin, who adds that it formerly required a custom license fee that could cost hundreds of thousands of dollars or more. “Epic founder Tim Sweeney wanted to open the doors to developers, so the company made the full source code freely available.” At that time, Blondin was designing virtual sets in Unreal Engine for the Men’s World Cup in Russia. Seeing game engines as the future, he moved from FOX Sports to Epic Games to “help build an ecosystem around Unreal Engine for live events and television.” In addition to Lost in Time, notable live TV broadcasts using the Unreal Engine for real-time graphics and AR elements include The Weather Channel’s mixed-reality broadcasts, the 2019 FIFA Women’s World Cup, and FOX Sports’ Virtual NASCAR Studio, among others. Unreal Engine also powered live in-person events such as Childish Gambino’s PHAROS tour, held inside a dome.

Los Angeles-based Lux Machina came out of premier live event/AV vendor PRG (Production Resource Group) and was later approached by Lucasfilm to create on-set immersive LED environments, like that shown here on Solo: A Star Wars Story, shot by Bradford Young, ASC. / Courtesy of Lucasfilm

Blondin says a few companies have ventured into live awards shows, producing AR elements and relying on Unreal Engine. Sequin AR produced Madonna’s five-minute dance-music routine for the Billboard Music Awards, in which she was scanned to create a 3D model that could be viewed from all angles. The Game Awards 2019 used the Apex Legends character Mirage, animated live by an actor in a mocap suit, talking with the host. But, in general, Blondin continues, “awards shows are usually characterized by big LED screens and dramatic, dynamic animations,” with Unreal Engine user Lux Machina leading the way in such setups.

For Lux Machina’s principals, live TV was a familiar environment. Galler says that background made it natural for the company to pioneer the use of display technologies for lighting on TV sets. It also, step by step, led them to innovate the technology for live awards shows. When Lux Machina opened its doors, playback of content on screens during awards shows relied on EVS playback devices, which were standard in the broadcast industry. Or, as Galler describes it: “Traditional broadcast shows are slow to evolve.”

The first evolution took place at PRG, where boundaries were stretched via the new technology of Versa Tube lighting. “It’s basically LEDs behind a diffuse cover,” Galler adds. “Element Labs made this LED fixture, and Barco later purchased the company. What’s interesting is that it was probably the first product intended for the live TV market that controls video as well as lighting.” At the time, everyone ran video with media servers – a single device to store and share media. “But TV lighting designer [and Local 600 member] Bobby Dickinson wanted us to use the Versa Tubes on the People’s Choice Awards and other shows. So, Zach designed a system that allowed media servers and Versa Tubes to interact, and that was the beginning of using display technology that required playback technology.”
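The article doesn’t detail how Alexander’s system worked, but the core idea – a media server sampling its video output and pushing the resulting color levels to LED fixtures over a lighting-control network – can be sketched. Below is a minimal, hypothetical Python example that builds an ArtDMX packet (Art-Net is a common DMX-over-Ethernet protocol in live events; whether PRG’s system actually used it is an assumption), with the fixture node address and sampled frame colors invented for illustration.

```python
import socket
import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port


def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDMX packet: 'Art-Net' ID, OpDmx opcode (0x5000,
    little-endian), protocol version 14, then the DMX channel data."""
    if not 2 <= len(channels) <= 512 or len(channels) % 2:
        raise ValueError("DMX payload must be an even length of 2-512 bytes")
    packet = b"Art-Net\x00"
    packet += struct.pack("<H", 0x5000)         # OpDmx opcode, little-endian
    packet += struct.pack(">H", 14)             # protocol version 14
    packet += bytes([sequence & 0xFF, 0])       # sequence number, physical port
    packet += struct.pack("<H", universe)       # 15-bit universe, little-endian
    packet += struct.pack(">H", len(channels))  # data length, big-endian
    return packet + channels


def samples_to_dmx(samples) -> bytes:
    """Flatten (r, g, b) samples taken from a video frame into consecutive
    DMX channels -- one RGB fixture per sampled region of the frame."""
    return bytes(v for rgb in samples for v in rgb)


# Two sampled regions of the current frame drive two RGB tube fixtures:
samples = [(255, 40, 0), (0, 128, 255)]
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(universe=0, channels=samples_to_dmx(samples)),
            ("10.0.0.50", ARTNET_PORT))  # hypothetical fixture node address
```

In a real rig, the samples would be pulled from regions of the playing video frame on every refresh, so the tubes track the picture content in step with the screens.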

Because so few media servers were being used in TV, Galler states, “there was a need for media server programmers to work with lighting designers for live TV. Zach and I both found it fascinating, as it was a new niche. We learned how to program and use media servers and worked with manufacturers to improve them and build ways that let us interact with lighting fixtures more effectively. It was pretty much the first time that media servers were used on TV shows, to drive the Versa Tubes, and we ended up on quite a number of live shows as media server programmers.”

Virtual production for broadcast applications has come from companies like Norway-based The Future Group, which introduced Pixotope, an “all-in-one, real-time virtual production system” integrated with Unreal Engine and available to broadcasters as subscription-based software. / Courtesy of Epic Games

The 2010 Univision Latin Grammys was the first show to use such a combination, with about 100 Versa Tubes, followed by the 2010 People’s Choice Awards, which used 800 Versa Tubes. “Once that was achieved, pretty much every other awards show around the world began using these products,” Galler says. “We used them on the Oscars, the Grammys. I can’t think of a single awards show that didn’t use them.”

Although neither Alexander nor Galler viewed the Versa Tubes as enabling virtual production, it was something new: interactive lighting and in-camera tools driven by an engine. “It was initially all pre-rendered,” explains Galler. “But we were interacting in real time, changing speeds, adding effects. First, it was all in 2D; that changed in 2015, when we started using 3D engines, and again in 2017, when it became real-time.”

When the possibility of LED video walls presented itself, Lux Machina transitioned away from Versa Tubes, which were low resolution and difficult to hang. Video walls were a much better solution for something that wanted to be seen, not just felt, and soon robust, touring-capable video walls began to replace scenery. Lux Machina built what they call “creative screen control” to manage the look and feel of video walls.

“By 2014 to 2015, we started to see production designers use technologies similar to the Versa Tube, with LED walls and projections,” Galler recalls. “Not for hero elements, but for background screens that need a bit of flair to keep things interesting on camera. This transition allowed the media server industry to adapt and create solutions to drive more of these screens. It was the advent of the first modern media servers and modern display technologies for use in the broadcast industry.”

Versa Tube use in awards shows was the first step toward enabling virtual production – interactive lighting and in-camera tools driven by an engine. Lux Machina’s Galler says it was initially all pre-rendered. “But we were interacting in real time, changing speeds, adding effects,” he adds. “We used them on the Oscars, the Grammys. I can’t think of a single awards show that didn’t use them.” / 2019 Oscars pictured above / Courtesy of Lux Machina

Among those products was the London-based disguise d3 software/hardware solution, which combined a stage simulator, timeline-based sequencer, video playback engine, and content mapper. Users could load it onto a laptop or run it on dedicated disguise hardware. “Disguise was born out of a need for a professional, flexible and efficient video production toolkit that supports you throughout the show process, from design to delivery,” notes the company’s press material.

Galler says that “disguise was the beginning of the replacement of traditional screen playback.” In 2014, for the American Music Awards, PRG had 16 EVS signals feeding different screens, plus one media server channel to handle video playback. Then, in 2016, came Notch Playback, licensed via a USB dongle, which allowed for easy transfer of content between media servers and other devices. Notch made it easier to work with live video feeds and footage on media servers – compositing, mapping, and outputting to screens or projectors.

“Our first use of Notch was in 2017, on a televised conference for the NBA, where we used it to handle motion graphics,” Galler remembers. “Screens producer Drew Findley made the best early use of it for Beyoncé’s 2016 tour. She was going to different arenas and wanted to make sure she always looked good on arena walls. They measured the LED walls and built a look-up table to transform color. They would put the LUTs into Notch for color generation.”
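Galler doesn’t specify the format of those LUTs, but the underlying operation – measuring how a given wall reproduces color, then remapping every pixel through a correction table before it hits the panels – is simple to illustrate. Here is a minimal NumPy sketch; the per-channel curves are invented purely for illustration (a real LUT would be derived from measurements of the wall, and production tools typically use 3D LUTs rather than the per-channel 1D table shown here):

```python
import numpy as np


def apply_lut(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Remap each 8-bit channel of an RGB frame through a per-channel
    look-up table of shape (256, 3)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return np.stack([lut[r, 0], lut[g, 1], lut[b, 2]], axis=-1)


# Invented correction curves for a wall whose panels run hot in the reds:
ramp = np.arange(256, dtype=np.float32) / 255.0
lut = np.stack([
    255 * ramp ** 1.1,                # red: gentle gamma to pull reds back
    255 * ramp,                       # green: pass through unchanged
    np.clip(255 * ramp + 6, 0, 255),  # blue: small lift in the shadows
], axis=-1).astype(np.uint8)

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a video frame
corrected = apply_lut(frame, lut)                  # what gets sent to the wall
```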

Notch excels at motion graphics and particle systems, but not world-building, which forms a significant portion of what Lux Machina does. That led Alexander and Galler to gravitate to Epic’s Unreal Engine. “We started to use Unreal Engine in 2017 and then realized we could do everything we did in Notch and then some, so we switched,” says Galler. “We built out a media server in Unreal Engine specifically for our use in live broadcasts.” 

Lux Machina came on as media server programmers for the 2018 iHeartRadio Music Awards (above), along with Loren Barton and screens producer Jason Rudolph. / Courtesy of Lux Machina

The 2019 Golden Globes was Lux Machina’s first use of Unreal Engine on a live awards show. “We’d been working on the Globes since 2013, so there is a workflow that never changes,” says Galler. “We were able to keep that workflow and do everything we normally do, but had the benefit of working on the graphics systems in real time. With Unreal Engine, we could explore how to build real-time content – like rain particle systems and 3D graphic elements – that we could control.”

Lux Machina prepared for the 2019 Golden Globes by building out drawings of scenic pieces for the LED walls, as designed by the production’s art department. 

“We’ll break them down and figure out what technology is needed,” Galler details. “We have discussions with production on budgets and what we need to do. Then we begin building templates – guidelines for the production and producers to get their content onto the screens. The art department designs things, but they don’t always know how they work, so they come to us to find out if it’s possible – or they assume it is. Sometimes, we’re the bearers of bad news. Once we deliver the templates, they start making the content to those templates, whether it’s pre-rendered or live.”
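The article doesn’t show what those templates contain, but in practice a screen template amounts to a per-wall delivery spec that content creators must hit. A hypothetical sketch in Python, with every field name and value invented for illustration:

```python
# Hypothetical delivery template for one LED wall (every value illustrative).
TEMPLATE = {
    "wall": "upstage_main",
    "resolution": (3840, 1080),  # pixel canvas of the wall
    "frame_rate": 59.94,         # must match the broadcast frame rate
    "codec": "ProRes 422 HQ",    # for pre-rendered deliveries
    "safe_area": (0.05, 0.05),   # fraction of width/height kept clear of text
}


def validate_delivery(width: int, height: int, fps: float) -> list:
    """Check an incoming content delivery against the wall's template
    and report mismatches before the piece reaches the show."""
    problems = []
    if (width, height) != TEMPLATE["resolution"]:
        problems.append(f"size: expected {TEMPLATE['resolution']}, got {(width, height)}")
    if abs(fps - TEMPLATE["frame_rate"]) > 0.01:
        problems.append(f"rate: expected {TEMPLATE['frame_rate']} fps, got {fps}")
    return problems


print(validate_delivery(3840, 1080, 59.94))  # [] -> delivery conforms
```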

“We still do screens producing, a role that developed on live TV shows – it’s like a lighting director, but dealing with the screens,” he continues. “The screens director determines what will happen on each screen – the screen traffic – and how to manage it and make it as interesting as possible. We make it look as creative as possible without making it too difficult to accomplish.” A week out from the show, Lux Machina checks all the hardware and finishes up any custom software before loading the hardware onto the stage or into trucks. “Then we go right into rehearsals, dress rehearsals, and the show,” he says. “On the screens side, a large show has a team of four to five people. Screens ADs call the show, and the screens producer makes it look good.”

Adding Unreal Engine to the mix has improved both the process and the result.

“Unreal Engine is an incredible toolkit for high-fidelity world-building,” Galler concludes. “It’s hard to do it any other way. It’s the only package that lets us build an entire photoreal environment, from Malibu to my apartment, with speed and fidelity. With pre-rendering, it could take days to render out a two-minute video. Now, we can do it in milliseconds.”

And while photoreal previsualization has already become commonplace, Galler says, “it’s only a matter of time before the broader industry adopts it for actual productions. It’s inevitable because it makes life so much easier.”


The 2019 Golden Globes (in prep, pictured above) was Lux Machina’s first use of Unreal Engine on a live awards show. Because Lux Machina has worked the show since 2013, the team was able to keep that same workflow in place “and do everything we normally do but with the benefit of working on the graphics systems in real time,” Galler explains. “With Unreal Engine, we could explore how to build real-time content – like rain particle systems and 3D graphic elements – that we could control.” / Courtesy of Dick Clark Productions