Thought Leader Of The Week: Jeroen Hallaert

Jeroen Hallaert, vice president for production services at PRG, oversees feature film and television offerings in virtual production. His team is the 'force' behind PRG’s Enhanced Environments feature film and television services, creating dynamic lighting so talent and crew alike see and perform against real-time backgrounds, without the guesswork of an empty green screen. Live Design chats with Hallaert about the new virtual production stage at PRG and what it has to offer.

Live Design: What is a virtual production or xR stage (not all of our readers may know), and how does this differ from a regular sound stage for film or TV?

Jeroen Hallaert: Put simply, a virtual production stage is an innovation, advanced by ongoing technological evolution, that allows any filmmaker to lay out and plan a shoot in a digital environment before going onto a physical set. This includes defining lensing, set dimensions, asset placement, and exact camera movements. Once on the xR stage, all of that is completely under the control of the filmmaker in real time. Everything can be changed within a set of parameters. Ultimately, filmmakers who want to utilize virtual production will have access to stage-like digital environments featuring virtual equipment that precisely replicates its real-world counterparts. You can, in real time, move objects and assets around your set as well as experiment with camera angles and lighting. Plotting these elements in real time lends itself to a more efficient workflow, without multiple recreations of assets and going back and forth. On a classic sound stage with painted set pieces and scenic elements, a whole lighting rig installed, props on stage, and so on, all of those elements require significant labor and time to change or manipulate.

In the same way that digital cameras allow immediate review of shots, virtual production (the xR workflow) allows us to create high-end images in real time, so we can see straight away if we got the shot or if we need to go again.

No physical stage means no limitations…

LD: What are the technologies involved in creating an xR experience...is it in the studio or the content creation?

JH: Creating an xR experience is quite involved and relies on both in-studio technologies and content creation.

Sometimes crisis breeds innovation, so with health and safety in mind, PRG redesigned a classic production setup, mixing elements from live touring, broadcast, and TV studios and combining that with the depth of PRG’s gear inventory and the knowledge of our technology experts. PRG’s xR studio is just under 4,000 sq. ft.; the actual LED floor is 25'x25', surrounded by a 50'-wide, 13'-tall LED wall, and both need to be perfectly aligned and level to the millimeter. The floor is ROE’s Black Marble 4 and the walls are ROE’s Black Onyx 2. There is a lighting rig built in; it currently has a variety of lights, including PRG’s very own (proprietary) Best Boy Spots, but also GLP impression X4s, Vari-Lite VL2600 Profiles, Robe Pointes BLK, and GLP JDCs…all this for not only key light but also show lighting. It also has a range of ARRI SkyPanels. Camera tracking is done through two REDSPY systems from STYPE. Lighting is tracked through BlackTrax. We’re also using ROE CB8 transparent panels for real-time environmental lighting, which is very useful for reflections and lifelike lighting.

In the control room, we have a Grass Valley Kayenne (6ME). Audio is run through the Calrec Artemis, and for live performances, we also run a DiGiCo SD10 console and a Yamaha CL5 Dante control mixer. Wedges and sidefills are d&b, which we can swap out for Meyer Sound depending on the requested flavor. We can leverage any camera in our inventory (Sony 4300, Venice, F55, Alexas, or REDs). With four disguise gx 2c units, we run a solid media server setup supported by two Barco E2s.

We need to handle everything: camera tracking, media server programming, camera calibration, color balancing, and so on. That is largely because this xR workflow acts as the keystone of all the other parts of production. Normally, we are used to video being a part of the show. In this case, video is the show. The whole concept is that we are telling a story that's a journey through a virtual world. In order to make that virtual world, we needed to align every element of the physical world into one system.

With this workflow, the graphics presented in a performance run completely in real time, and scenes instantly react to the movements of tracked cameras, people, and objects. Using this approach, digital graphics can be rendered and displayed on the physical LED screens, appear only in the broadcast feed, and/or be composited together. The result is 3D content that appears both in front of and behind the performer. Using real-time visuals allows talent to visually interact with their environment (the digital one combined with real props), adding a new layer of engagement that lends realism to the performance.

The xR stage is the only way to do these kinds of high-level real time in-camera shoots where green screen and VFX workflows would normally be used.

LD: How did the Katy Perry team use the xR Studio for her American Idol finale...were they the pioneers?

JH: Pioneering as in one of the very first high-end productions under COVID-19 restrictions? Yes…

First and foremost, Katy needed a safe and clean working environment. Because PRG has its own xR studio in its own facility, we could make this happen. True innovation also happened in defining the whole production schedule and process. It was through Baz Halpin, who has been Katy's longtime tour director, that the idea grew to do the xR video. Baz worked closely with JT Rooney on the content side. Aside from three weeks of content creation, which in this case was basically a combination of pre-vis and content building, the production took only three days.

Day one was used to load all content, from backgrounds and 3D assets to color changes in lighting, and to make sure all cameras, lighting, and tracking systems (STYPE) talked to the media server setup, with everything aligned and calibrated. Notch was used as the graphics engine for this specific performance, but we like to mention that the whole backbone is also set up for future Unreal and Unity shoots, lending a more photorealistic look to the shoot. Day two was used for rehearsals with a stand-in, in order to try all camera movements and lighting cues and to check the content. Because the performer sees the xR elements on the LED walls and floor in real time, it is easy to find references for the choreography. The flying and floating elements cast a shadow (in the content), and that also made it possible to react in a very natural way. It was only in the afternoon of day three that Katy came to the studio for a three-hour shoot. It didn’t take more than a run-through and some camera tests to do the shoot in a few takes. As she walked off the stage to her trailer, she could see the finished result on screen. xR doesn’t need fixing in post…it’s all very well prepped in pre-production.

Watch Katy Perry's American Idol finale here.

LD: What other projects have taken place there to date, and is LA the only PRG xR stage? If yes, are others in the pipeline?

JH: xR stages in London and Hamburg (EU) are in the making.

PRG is really focusing on Enhanced Environments (virtual production): replacing green screen (the fabric version) with LED walls, in feature film and TV and everything around that.

Here in our xR Studio in LA, we have a production every 10 days, from something as simple as a digital green screen to a complicated xR shoot for a commercial using the latest advancements in game engines (Unreal and Unity) for more photorealistic shoots. The stage we have set up is super agnostic; we have a range of cameras (even cinematic ones) available, but also lighting, LED, and broadcast gear – it’s all available to us because we operate our stage inside our warehouse. We welcome all sorts of productions.

LD: xR seems to be the future. What does this mean in terms of integrating live performance and video?

JH: I would prefer to call it virtual production, to be honest…that’s the umbrella xR really falls under. While virtual production can make producing a feature film, or any shoot in xR, easier in a post-COVID world, it also comes with a slew of challenges, such as a lack of training and experience among industry professionals, because the technology is constantly evolving. As virtual production experts, we at PRG believe that training is mostly a matter of exposure and hands-on experience that lays out what the creative possibilities and boundaries are, so existing crafts can combine with these new technologies. That exposure is something we provide on a daily and ongoing basis here at our xR stage in LA. The technology does not stop; it's an ongoing, ever-evolving process. Alongside our own expertise, there is a lot of bedroom R&D in virtual production underway on the many online forums. It is super interesting to see.

The film and TV business has always been powered by the capabilities of currently available technologies, and the adoption of new technology is clearly impacting the way we produce and consume movies and other video-related entertainment products. A few years from now, xR-powered experiences will become accessible to large, mainstream markets. We continue to explore workflows for creating Extended Reality (xR), applicable to a wide range of applications and productions. The workflow and graphics presented in xR run completely in real time; scenes instantly react to the movements of tracked cameras, people, and objects, and allow for creative changes on the fly. It fills the gap where today’s cinematographers struggle to have full control over both the lighting and the filmed scenes in a real-time recording environment. Big studios need to realize that audiences deserve and expect more than just the superficial trappings of what a big-budget film can deliver. Technology should serve the story, not the other way around.

xR, the technology that enables us to add a layer of digital information on top of the real world, is here to stay. The research firm Gartner publishes its “Hype Cycle for Emerging Technologies” every year, tracking the way new technologies go from hype to disillusionment to eventual adoption and growth in the market. xR has escaped from the Trough of Disillusionment this past year and is now riding up the Slope of Enlightenment towards the Plateau of Productivity.

By the way, Hollywood movie-making will become way more virtual in a post-coronavirus world...and xR will play a massive role in that.