U2 At Sphere: Q&A With Peter Kirkup, disguise Solutions & Innovation Director

The extraordinary video content on the massive LED screen for U2: UV Achtung Baby Live at Sphere in Las Vegas is run by a system of 23 disguise gx 3 servers, each fitted with 30TB of storage, for a total of 690TB of drives available to play content at a moment’s notice. Live Design chats with Peter Kirkup, disguise solutions & innovation director, about the company’s involvement with U2 and the challenges of this new show. Let's take a peek behind the screens.

Read more Live Design coverage of U2: UV Achtung Baby

Live Design: How/when did disguise get involved in this project; was it just on the U2 side of things for the new show... who was the initial contact and what were they looking for in terms of the media server system?

Peter Kirkup: We’ve worked with U2 since their Vertigo tour in 2005. Back then, it was difficult for the band to envision how the low-resolution content would look in its final version on the big LED video screens, so disguise founder Ash Nehru wrote code to help them pre-visualize the content. That was the origin of disguise’s Designer software. We first became involved with the Sphere through our partners at Fuse, the rental company that installed disguise servers and software into the venue for the U2 show. The project was about a year in the making from concept to show time, and disguise were involved for six months. Our initial contact was Stefaan “Smasher” Desmedt, creative technical director at Fuse. Fuse was looking for a media server system that could seamlessly integrate content, real-time visuals, and IMAG, and they’d been working with disguise on U2 shows for nearly two decades, so it was a perfect fit.

LD: With the 23 disguise servers—where/how do they power the show's visuals onto the massive LED screens?

PK: Smasher needed to engineer a disguise system powerful enough to enable playback of pre-rendered NotchLC content at 60fps, as well as the option to run real-time Notch effects at the record-breaking resolution. This involved 23 disguise gx 3 media servers.

First, creatives used our Designer software to help pre-visualize and edit all the visual content (created by content agency Treatment Studio) on their computer, with the help of a 3D digital twin of the Sphere stage. Our software then split and distributed the 16K video into sections across all 23 media servers.

This powered the content on Sphere’s wraparound interior LED screen, which lined 15,000 square meters of the dome’s interior. It wasn’t a single piece of video; it was video carved into 23 individual pieces—each perfectly in sync—so that it could play out seamlessly across the dome.
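
To make that splitting concrete, here is a minimal sketch of how a 16K-wide canvas might be divided into one slice per server. It is an illustration only, assuming simple equal-width vertical strips; disguise's actual region mapping for the Sphere is not described in this interview, and all names in the snippet are hypothetical.

```python
# Rough illustration only -- not disguise's actual slicing logic.
# Splits a 16,000 x 16,000 px canvas into equal-width vertical slices,
# one per server, so each machine only decodes and outputs its own region.

CANVAS_W, CANVAS_H = 16_000, 16_000   # total canvas in pixels (per the published stats)
NUM_SERVERS = 23

def slice_canvas(width, height, servers):
    """Return (server_index, x_offset, slice_width, height) for each slice."""
    base = width // servers
    remainder = width % servers       # spread leftover columns across the first slices
    slices, x = [], 0
    for i in range(servers):
        w = base + (1 if i < remainder else 0)
        slices.append((i, x, w, height))
        x += w
    return slices

for idx, x, w, h in slice_canvas(CANVAS_W, CANVAS_H, NUM_SERVERS):
    print(f"server {idx:2d}: x={x:6d}  width={w}  height={h}")
```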

(Photo by Rich Fury)

LD: What were the challenges in such a large system (is this the largest number of servers for one project)?

PK: A big challenge was handling such a large volume of content across 256,000,000 pixels—in real time. There were 18,000 people watching the show, and they all had their camera phones ready to broadcast to even more people, so we really had to make sure the show went well.

This is definitely one of the largest numbers of servers we’ve seen on one project. We’d done big projects before, but they involved breaking up pieces of canvas across separate areas—like the wall and floor, for example. With the Sphere, we had one single, continuous canvas that had to stay entirely in sync. Last year, we developed a technology called single large canvas, or SLC, to handle this. SLC lets you allocate pieces of the canvas to different machines so that the whole canvas can be rendered and synced in high resolution, in real time. This was applied to the Sphere project.
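
As a conceptual illustration of the sync problem SLC solves, the sketch below shows one simple way a group of render nodes could agree on which frame to present. It is not SLC's actual mechanism; the node names and frame numbers are invented.

```python
# Conceptual sketch of frame-locked playback, not SLC's real implementation.
# Each render node reports the frame it has buffered; the group only advances
# when every node has that frame, so all 23 slices stay on the same frame.

def next_frame_to_present(ready_frames):
    """ready_frames: {node_id: highest frame number decoded and buffered}.
    All nodes present min(ready_frames) so no slice ever runs ahead."""
    return min(ready_frames.values())

ready = {f"gx3-{n:02d}": 1_441 for n in range(23)}
ready["gx3-07"] = 1_440                # one node is a frame behind
print(next_frame_to_present(ready))    # -> 1440: everyone holds until gx3-07 catches up
```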

To do this we used 23 gx 3 servers—our flagship server for the live events industry. Each of those servers was upgraded with 30 terabytes of hard drive storage, so every machine had local storage for playout, plus 100Gb networking back to the content store for file transfers and media management.
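
A quick back-of-envelope calculation, using only the figures quoted in this interview, shows what that storage and networking adds up to. The transfer-time estimate assumes an ideal 100Gb/s line rate and ignores protocol and disk overheads.

```python
# Back-of-envelope numbers from the figures quoted above (ideal line rate,
# ignoring protocol overhead and drive speed).

servers, storage_tb = 23, 30
total_tb = servers * storage_tb                       # 690 TB across the system

link_gbps = 100                                       # 100G uplink to the content store
seconds_to_fill_one_server = storage_tb * 1e12 * 8 / (link_gbps * 1e9)
print(total_tb, "TB total")                           # 690 TB total
print(round(seconds_to_fill_one_server / 60), "min")  # ~40 minutes per full 30 TB drop
```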

(Photo by Rich Fury)

LD: What was the process between the content creators/visual artists and disguise from beginning to end of the workflow?

PK: The content roster for U2: UV Achtung Baby Live at Sphere was the work of leading artists including Es Devlin, Marco Brambilla, and John Gerrard, plus a cinematic piece overseen by Industrial Light & Magic, all brought together by the creative minds at Treatment Studio and Willie Williams, U2’s longtime creative director and set designer.

disguise and Treatment have worked together on a number of shows in the past, most recently the Adele residency in Las Vegas (which likewise runs on gx 3 servers). This involved largely the same team, including Brandon Kraemer and Lizzie Pocock, so there was a lot of workflow experience in the room, so to speak. The wonderful thing about working together on consecutive projects is that you can build on the collective knowledge of the team—an essential part of the success of a project like Sphere’s U2 show.

Nevada Ark by Es Devlin (Photo by Kevin Mazur/Getty Images)

LD: How does the technology serve the aesthetic concept for the show?

PK: Our technology is perfectly suited for these kinds of productions. It allowed the creatives involved in the show to preview their ideas on the computer using a digital twin of the Sphere stage, so they could test concepts before going on site. It then enabled those ideas to be taken from the computer through to the massive LED screen on the day of the show.

This was crucial considering we couldn’t just press ‘play’ and let the show run. If the band wanted to spend an extra five minutes doing a new riff, the visuals had to change accordingly. That meant the creatives had to be able to adapt the visual content on the fly, and iterate it in the space so that it was always in time with the music the band was creating. disguise’s technology helped do this by enabling real-time Notch content to be rendered, so creatives could adapt to the changing show as the run progressed.

Our technology also helped to composite content on the fly—so, for example, we could take real-time footage from cinematic cameras filming the band during the show. We could then composite the band’s faces into graphic bubbles floating all around the Sphere screen. That involved handing off information between all 23 machines, as they had to be perfectly in sync during the show. So we were doing a full real-time 3D render of the bubbles, plus playing back two layers of video, with and without alpha, and compositing it all together. It was a lot of work!
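
As a rough illustration of that fill-plus-alpha idea, the snippet below composites a camera-derived layer and its separate alpha matte over a rendered background for one frame. It is a generic "over" operation at an arbitrary resolution, not the production pipeline, and every name and value in it is purely illustrative.

```python
# Illustrative "fill + key" composite, not the production pipeline:
# a camera-derived fill layer and its separate alpha (key) layer are
# composited over the real-time rendered background for one frame.

import numpy as np

def composite_over(background, fill, key):
    """Standard 'over' operator: key is a 0..1 alpha matte for the fill layer."""
    alpha = key[..., None]                    # broadcast the matte across RGB channels
    return fill * alpha + background * (1.0 - alpha)

H, W = 1080, 1920                             # illustrative slice resolution, not Sphere's
background = np.random.rand(H, W, 3)          # stand-in for the real-time 3D render
fill = np.random.rand(H, W, 3)                # stand-in for the camera (IMAG) layer
key = np.zeros((H, W)); key[300:800, 600:1400] = 1.0   # stand-in matte for a "bubble"

frame = composite_over(background, fill, key)
print(frame.shape, frame.min() >= 0.0, frame.max() <= 1.0)
```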

Stats:

● A 16K x 16K display canvas, totaling 256,000,000 rendered pixels (a rough throughput sketch follows this list).

● All content is output at 60 frames per second.

● 23 disguise gx 3 servers, each fitted with 30TB of storage, for a total of 690TB of drives available to play content at a moment’s notice.

● Upstream 100G networking to connect storage and slicing servers to the production system.

● The project was about a year in the making from concept to show time.

● This show marks 18 years of partnership between disguise and U2.
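
For scale, here is a rough calculation from the pixel count and frame rate above. It assumes uncompressed 8-bit RGB purely for illustration; the show actually plays NotchLC-compressed content, so real data rates are much lower.

```python
# Rough raw-throughput estimate from the stats above (assumed uncompressed
# 8-bit RGB, illustration only -- NotchLC compression keeps the real rate far lower).

pixels = 256_000_000
fps = 60
bytes_per_pixel = 3                       # assumed 8-bit RGB, for illustration only

raw_gbps = pixels * fps * bytes_per_pixel * 8 / 1e9
print(round(raw_gbps), "Gb/s raw")        # ~369 Gb/s for the whole canvas
print(round(raw_gbps / 23, 1), "Gb/s per server across 23 slices")
```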