Jay-Z (Shawn Carter) is one of the most—okay, perhaps the most—successful hip hop artists in the world, with over 30 million copies of his albums sold in the US alone and a trophy case’s worth of Grammy Awards, MTV Video Music Awards, BET Awards, and American Music Awards from his 13 years of recording thus far. Upon his latest release, The Blueprint 3, Jay-Z became the record holder for most #1 albums by a solo artist (at 11), beating out none other than Elvis Presley—not bad for a musical genre that has only been recognized by the Grammy Awards for 20 years.
In support of the album, Jay-Z hit the road with a design originally unveiled during his Answer the Call concert at Madison Square Garden, which aired live on Fuse and benefited the New York Police & Fire Widows’ and Children’s Benefit Fund. The production features show design by UnitedVisualArtists (UVA)—a team that includes creative director Matt Clark, technical director Chris Bird, and animation designer Dave Ferner working on the stage concept and video content—plus lighting design by Patrick Dierson of Artfag, LLC, and additional content and screens direction by Drew Findley.
The team at UVA came recommended by Jay-Z’s art curator and, from the beginning, wanted to move away from the look of more conventional concert video and lighting. “Our aim from the start was to create a stage design more like a light sculpture—an abstract, digital city,” says Clark. “The screen was never really designed for pure video playback; it was deliberately a step away from that. It was designed, though, to also support I-Mag, so it is a very versatile video surface and light sculpture in one.”
That screen consists of 265 PixLED F-15 modules built into 15 custom double-sided frames, all of which came from XL Video. “That particular LED was chosen due to its light weight and high brightness, and the size was really as big as we could make it without hitting the ceiling,” says Bird. The module edges meet to form right angles that face the audience, creating the appearance of square columns that mimic, among other things, the New York City skyline.
Findley, who has worked with the artist for six years, calls this “the most versatile video playback and control system out there.” He designed the control system and controls all the visuals for the show, working alongside video director Dirk Sanders. “The overall design goal was to try to find different ways to help accentuate the vertical towers of the screens,” Findley says. “Although the towers were designed with an urban cityscape in mind, we needed to find different ways to transform them for different songs. For instance, in ‘99 Problems,’ the towers become speaker stacks thumping to the beat, and in ‘Big Pimpin’,’ they become a full stage equalizer. As Jay put it, he wanted to ‘see the music.’”
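The “full stage equalizer” treatment can be sketched in a few lines: per-tower audio levels drive how many pixels light up on each vertical column. This is a conceptual illustration only; the band count, tower height, and function names here are invented, not drawn from the production’s actual content system.

```python
# Conceptual sketch of an equalizer look on vertical LED towers: each tower
# gets one audio level (0.0-1.0), converted into a lit-column height in
# pixels. Tower height and levels are illustrative, not the tour's values.

def equalizer_heights(band_levels: list[float], tower_px: int = 1080) -> list[int]:
    """Map one audio level per tower to a lit-pixel column height."""
    return [round(max(0.0, min(1.0, lvl)) * tower_px) for lvl in band_levels]

if __name__ == "__main__":
    # Three towers at silence, half level, and full level on a 100px tower.
    print(equalizer_heights([0.0, 0.5, 1.0], tower_px=100))  # [0, 50, 100]
```

Levels are clamped before scaling so an over-range input can never light more pixels than a tower has.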
The content itself was created by UVA, Findley, and William Hines of Skitch TV. UVA used Autodesk Maya for initial renderings of the stage, says animation designer Ferner, “as a way of quickly showing what it might look like and then built the entire set in our custom, in-house software, d3. This allowed us to edit content in realtime while seeing how it would map onto the unusual screens, and it was invaluable in communicating with the client. It’s hard to explain how an image will look when it’s wrapped across a city of LED without it.”
When it comes to show time, Findley mixes the content live, handling video cues and cameras. The production uses an MA Lighting grandMA driving four PRG Mbox Extreme media servers, all outputting 1080p HD-SDI and running into a custom Control Freak Encore system. “This gives me 18 layers of Encore, nine cameras, the director’s cut from Dirk, and all the Mboxes,” says Findley. “This whole system is controlled from FOH. All the LED towers are contained in one HD-SDI pixel-map that allows a lot of flexibility in how we construct images for the screen.”
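The idea of containing all the towers in a single HD-SDI pixel-map can be sketched as follows: each tower is assigned a rectangle inside one 1080p raster, so one video feed addresses the whole sculpture. The tower width, coordinates, and helper names below are assumptions for illustration, not the tour’s actual map.

```python
# Hypothetical sketch of a single-raster pixel map: every LED tower owns a
# rectangular region inside one 1920x1080 HD-SDI frame, so a single feed
# can carry content for the whole sculpture. Geometry is illustrative.

from dataclasses import dataclass

RASTER_W, RASTER_H = 1920, 1080  # one 1080p HD-SDI raster

@dataclass(frozen=True)
class Region:
    name: str
    x: int  # left edge in the raster, pixels
    y: int  # top edge
    w: int  # width
    h: int  # height

def build_tower_map(num_towers: int, tower_w: int, tower_h: int) -> list[Region]:
    """Lay identical tower regions side by side across the raster."""
    if num_towers * tower_w > RASTER_W or tower_h > RASTER_H:
        raise ValueError("towers do not fit in one raster")
    return [
        Region(f"tower_{i + 1}", x=i * tower_w, y=0, w=tower_w, h=tower_h)
        for i in range(num_towers)
    ]

def crop_for(region: Region, frame):
    """Pull one tower's pixels out of a full-raster frame (list of pixel rows)."""
    return [row[region.x:region.x + region.w]
            for row in frame[region.y:region.y + region.h]]

if __name__ == "__main__":
    # 15 towers (matching the 15 custom frames) at an assumed 128px width.
    towers = build_tower_map(num_towers=15, tower_w=128, tower_h=1080)
    print(towers[0].name, towers[-1].x)  # tower_1 1792
```

Keeping every tower in one raster is what makes the single pixel-map flexible: any image composited into the frame lands on the right physical columns automatically.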
Findley notes that each song in the set is approached differently. “Some songs have content that is SMPTE-tracked, others add cameras to the content, some are completely driven by cue points, and others have a mix of SMPTE layers and live content playback. Our system is extremely flexible, and we work it to the max.” Video also has an additional networked grandMA for backup, and the system runs Art-Net and MA-Net over a PRG Series 400 Fiber Ethernet snake to backstage, where all the Mboxes, MA Lighting Network Signal Processors, and Control Freak racks are located. The whole system can be managed remotely from FOH using Apple Remote Desktop.
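Art-Net, one of the two protocols carried over the fiber snake, is itself a simple UDP format. As a generic illustration of the wire format (following the published Art-Net specification, not the tour’s configuration), a minimal ArtDmx packet can be built by hand:

```python
# Minimal sketch of an Art-Net ArtDmx packet: the UDP message that carries
# one universe of DMX levels over Ethernet. Field layout follows the
# published Art-Net spec; the universe and channel values are illustrative.

import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port
OP_DMX = 0x5000     # ArtDmx opcode

def artdmx_packet(universe: int, levels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet carrying up to 512 DMX channel levels."""
    if not 2 <= len(levels) <= 512 or len(levels) % 2:
        raise ValueError("DMX payload must be an even length, 2-512 bytes")
    return (
        b"Art-Net\x00"                   # 8-byte packet ID
        + struct.pack("<H", OP_DMX)      # opcode, little-endian
        + struct.pack(">H", 14)          # protocol version, big-endian
        + bytes([sequence, 0])           # sequence, physical port
        + struct.pack("<H", universe)    # SubUni + Net bytes, low byte first
        + struct.pack(">H", len(levels)) # data length, big-endian
        + levels
    )

if __name__ == "__main__":
    pkt = artdmx_packet(universe=0, levels=bytes([255, 0] * 8))
    print(len(pkt))  # 18-byte header + 16 channel bytes = 34
```

MA-Net, by contrast, is MA Lighting’s proprietary console-network protocol, which is why both travel side by side on the same Ethernet snake.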
Making use of a new feature on the Mbox that allows content on a layer to be targeted to only certain sections of the screen, Findley says, “Having both the Mbox and Control Freak setups is clutch. I have the ability to source multiple layers of cameras, content, and affected cameras with low latency and exact precision. When I asked Matt Corke and Mark Hunt at PRG how many of these screen sections I could create, they told me I could have up to 255 but that I would never use that many. They were right, but I do use almost 100. It lets me very quickly target exact areas of the screen to play back and manipulate content. To chase it is as simple as rolling a dial.”
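The section-targeting workflow can be sketched as a small registry: named sections (capped at 255, per the article) are defined up front, and “chasing” steps a layer’s target from one section to the next, as if rolling a dial. The class and section names below are invented for illustration; this is not the Mbox API.

```python
# Hypothetical sketch of layer-to-section targeting: named screen sections
# are registered (up to 255, per the article), and a content layer can be
# chased through them one dial-click at a time. Names are illustrative.

class SectionTargeter:
    MAX_SECTIONS = 255

    def __init__(self) -> None:
        self.sections: list[str] = []
        self.index = 0

    def define(self, name: str) -> None:
        """Register a new named screen section."""
        if len(self.sections) >= self.MAX_SECTIONS:
            raise ValueError("section limit reached")
        self.sections.append(name)

    def current(self) -> str:
        return self.sections[self.index]

    def chase(self, clicks: int = 1) -> str:
        """Roll the 'dial' by some clicks, wrapping around the section list."""
        self.index = (self.index + clicks) % len(self.sections)
        return self.current()

if __name__ == "__main__":
    t = SectionTargeter()
    for i in range(1, 16):  # one section per tower, as an example
        t.define(f"tower_{i}")
    print(t.current())  # tower_1
    print(t.chase(3))   # tower_4
```

The wrap-around modulo is what makes a continuous dial roll cycle cleanly through every defined section and back to the start.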
As for challenges for the video department, Clark says the artist’s deep catalog and the spontaneity of the show meant creating enough content to keep it fresh. “Jay-Z has far more tracks in his set list than most artists, often playing different songs from night to night and switching the set list according to the mood of the crowd,” he says. “Add to this the many guest stars in the first show, and it was a very challenging show to create but an exciting one to watch.”