Xite Labs Creates xR Worlds for Bryson Tiller’s Innovative Online Show, “Trapsoul Series”

With live concert performances and touring still suspended due to the coronavirus, musical artists are looking for new ways to present material and engage fans with virtual productions.  Singer/songwriter Bryson Tiller and his creative team at The 92 Group, working with production company HPLA, took advantage of the xR expertise of LA-based design studio Xite Labs for Tiller’s innovative ticketed online show, “Trapsoul Series,” which debuted March 18 on Moment House.

Xite Labs was the creative and technical resource for the Extended Reality (xR) portions of the show, which comprised a good portion of Tiller’s long-form performance.  The project showcased the Trapsoul genre introduced by Tiller on his 2015 album of the same name, which combines elements of Trap Hip-Hop, contemporary R&B and soul.  It featured songs from his new album, Anniversary, as well as fans’ favorite hits.

The show was directed by Mike Carson for The 92 Group, with content direction provided by Amish Dani and Samantha Ashcraft.  The program presented Tiller in a series of six different worlds linked by a narrative that flows through the songs.  Xite Labs was responsible for the stunning, immersive xR content for 14 songs performed by Tiller in four virtual worlds.  These are bookended by opening and closing live-action worlds, created by The 92 Group/HPLA, in which Louisville, Kentucky native Tiller performed solo at Louisville Slugger Field, then was joined by fellow Louisville artist Jack Harlow for a duet finale that ended with fireworks over the stadium.

The four xR worlds featured different themes and looks.  The first found Tiller in a virtual lounge that fell away to reveal a fractal world of galaxies, nebulae and spaceships.  The second had a time theme with clock faces, a clock tower and an hourglass, a mountain desert landscape and a flight through a moonlit sky.  The third opened with photoreal imagery of guerrilla warfare, which transformed into a neon jungle.  The fourth world took Tiller to stark hallways with bold, flat lighting, color-changing walls and silhouettes.  Throughout, Tiller appeared to perform on a moving platform, which served as the anchor point transporting him from one other-worldly environment to another.

“This project took virtual production to another level,” says Creative Director Greg Russell, who is partnered with Vello Virkhaus in Xite Labs.  “The difference was that Bryson doesn’t just stand on stage with the worlds behind him.  We did a lot of front plate work in the final output to marry Bryson in the worlds in a more believable way, using atmosphere, particles, lightning, plants and foliage to add mystery, intrigue and an extra layer of magic.  The show is several orders of magnitude more complex than any xR viewers have seen before and, hopefully, its complexity is transparent to them.”  An ardent gamer himself, Tiller had expressed interest in working with xR technology and environments created with Unreal Engine.

Russell notes that the opportunity to work with xR technology can be irresistible to an artist. “Being in a place which is not really there, where you can change locations and worlds with the click of a button is pretty awesome.  It’s exciting for anybody who’s ever gotten on a stage – a way to look impressive and cool.  Performing against greenscreen kind of removes you from the action, but with xR you’re bathed in light and the kinetic feel of that light and motion around you.”

Creative development for the show began at Xite Labs in October 2020, with the shoot scheduled for just before the holidays on Xite’s xR stage in Calabasas, California.  The sheer quantity of original content for 14 songs filled one marathon shoot day.  Producer Amish Dani helped keep the new and dynamic collaboration running smoothly among all the creatives.  He had previously teamed with Xite on stage elements and motion graphics for Bad Bunny’s world tour.

The xR stage, built out with generous partnership from Evolve Technologies, Lightswitch and Robe, is outfitted with an Absen LED video back wall with 45° angled wings left and right.  Tiller performed in this LED volume, where the ROE tiles that comprise the LED floor acted as the moving platform that transported him to different worlds.

Content for the 14 songs in the xR worlds was custom built in Unreal Engine and Quixel Mixer, using source models created in Cinema 4D.  Adobe After Effects was used to craft video textures; front plates were made in Notch.  Resolume software was used to play back pixel-mapped content into Unreal Engine for virtual screen surfaces.  A disguise gx 2c media server served as the mixed-reality controller system, paired with an rx render engine running RenderStream.
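Conceptually, pixel mapping of this kind assigns rectangular regions of a source video frame to individual virtual screen surfaces in the engine.  A minimal sketch of the idea in Python (the surface names, frame and coordinates below are hypothetical illustrations, not the show’s actual map):

```python
# Illustrative sketch of pixel mapping: slicing regions of a source
# video frame and assigning each slice to a named virtual surface.
# Surface names and coordinates are invented for this example.

def pixel_map(frame, regions):
    """frame: 2D list (rows of pixel values);
    regions: surface name -> (x, y, width, height).
    Returns surface name -> cropped sub-frame."""
    surfaces = {}
    for name, (x, y, w, h) in regions.items():
        surfaces[name] = [row[x:x + w] for row in frame[y:y + h]]
    return surfaces

# A 4x4 "frame" of brightness values, split across two virtual screens.
frame = [[r * 4 + c for c in range(4)] for r in range(4)]
regions = {"screen_left": (0, 0, 2, 4), "screen_right": (2, 0, 2, 4)}
mapped = pixel_map(frame, regions)
```

In a real pipeline the cropping, color processing and routing are handled by the playback software; the sketch only shows the region-to-surface assignment at the heart of the technique.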

Creative Director Vello Virkhaus supervised playback and the real-time technologies on set from the stage’s control room.  “I was like the conductor of an Unreal orchestra,” he laughs.  “Using set extensions to the degree we did required a high level of precision and perfection.”

The participation of Tiller, The 92 Group and HPLA’s creative teams and Xite Labs’ technical crew, all working within COVID-19 protocols, gave the feeling of a full-on production that everyone had missed during the pandemic, Virkhaus notes.

“The stage was running at top level and the production environment came to life again, which was a lot of fun after all the isolation.  Everyone enjoyed that,” he reports.  “Since the clients had not used xR before, they had to get the hang of it, which was very challenging but fun.  xR requires a great deal of focus on a lot of new things that you never considered before as well as some limitations you learn to work around and work with.”

Virkhaus explains that the cinematics which Xite Labs created in postproduction enhanced the creative possibilities of xR.  “Following the physical shoot we output the cinematics, turning off the stage cameras and putting cameras in different places to create untethered cinematics featuring jungle flames or sky sparks or more glowing rooms.  This provided richer visuals and more options for show editorial and potential use in other devices.  The ability to create cinematic sequences with ease once a show has been shot is a real value-added proposition.”

While Virkhaus calls the project “one of the most challenging I've done in my career,” he’s proud that Xite Labs “made the extra effort to push things as far as we could creatively and deliver the highest quality results possible.  For us it was about pushing boundaries with this new medium.”

Since xR has evolved so quickly during the pandemic, Russell sees the technology continuing to be applied to live music, corporate events and theater once the world returns “to normal-ish.”  He expects future concerts, for example, to be a “hybridized version” of xR and live staging.

“You’ll have a live audience and a much larger home audience paying to be there virtually.  The home audience will get to choose between seeing the FOH view or another vantage point, plus they’ll see a virtual performance for a higher-level experience than the straight stage show.  Meanwhile, the audience at the show will watch IMAG screens showing the virtual experience of the home audience, so they get a bonus too.  This may cost more to produce, but look how you’ve expanded your paying audience!”