In Control For Bon Jovi’s Because We Can: The Tour, Part 2

Dirk Sanders, technical designer for Control Freak Systems (CFS), worked on both the arena and the stadium versions of Bon Jovi's Because We Can: The Tour, putting together the complex video control systems. In this second part of our Q&A, Sanders discusses the integration of the lighting and video control systems.

Check out part one here.

LD: What are some of the moments where the control capability is really integrated into the way the content is presented?

Dirk Sanders: We were really successful at leveraging the content and the control together. Moment Factory's content for "We Got It Goin' On," "It's My Life," and "We Weren't Born to Follow" is a good example because, for all three of those songs, they wanted to use all of the screens, yet we always wanted to keep keying IMAG up there. So we developed a way where they could control where the IMAG goes through a keying mask. It allowed us to transition the IMAG placement cleverly. "It's My Life" starts out with this big machine content, where gears appear to be turning over all the screens; then a gear smashes in, sparks fly, and it opens up to reveal a camera image. That closes up, and another gear cog turns and reveals IMAG on the side screens. We could map out and direct focus with the content linked to the IMAG.

Photo: Ryan Mast/Meteor Tower ©2013

One PRG Mbox was configured for dual output and to play only alpha images. That server plays different masks, basically serving up masked images that we leverage in [Barco] Encore. At the same time, the other three Mboxes, each driving a set of screens, play the other content, and the key is done in Encore. Other times, the key is done using the live input of the media, and the content has an alpha channel built into it. So we can pick the right approach, depending on how the layering of the content needs to integrate with the IMAG.
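As a rough illustration of the mask-based keying idea (this is not CFS's actual code; the real composite happens in the Encore hardware, and the frame shapes here are hypothetical), a grayscale mask frame can decide, per pixel, whether the content or the live IMAG wins:

```python
import numpy as np

def key_imag(content: np.ndarray, imag: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Composite live IMAG into content using a grayscale keying mask.

    content, imag: (H, W, 3) uint8 frames; mask: (H, W) uint8,
    where 255 = show IMAG, 0 = show content, values between = blend.
    """
    alpha = mask.astype(np.float32)[..., None] / 255.0
    out = imag.astype(np.float32) * alpha + content.astype(np.float32) * (1.0 - alpha)
    return out.astype(np.uint8)

# The mask server can animate this frame over time, e.g. a gear iris
# opening to reveal the camera image in one region of the raster.
h, w = 1080, 1920
content = np.zeros((h, w, 3), np.uint8)    # stand-in for the rendered content
imag = np.full((h, w, 3), 200, np.uint8)   # stand-in for the live camera feed
mask = np.zeros((h, w), np.uint8)
mask[300:780, 700:1220] = 255              # "open" rectangle where IMAG shows
frame = key_imag(content, imag, mask)
```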

Integrating Lighting and Video

LD: Any other thoughts about the integration of lighting and video on this control system?

DS: It is impressive, the amount of data moving across that Art-Net network, and it offers us a lot of options, since both lighting and video are running control over it at the same time. It allows us to put live IMAG in those headlights with a minimal amount of delay, considering it is a live input going from the media server, then merged in the lighting console, and then going out to the LED fixtures. You can't really tell; the lines are incredibly blurred between those being lighting fixtures or an LED wall.
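For a sense of what that data actually looks like on the wire, here is a minimal sketch of the standard Art-Net ArtDmx framing, one UDP packet per DMX universe (the universe number and target IP are made up for illustration, not the tour's patch):

```python
import socket
import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port

def artdmx_packet(universe: int, data: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx (OpCode 0x5000) packet carrying one DMX universe."""
    if not 1 <= len(data) <= 512:
        raise ValueError("DMX payload must be 1-512 bytes")
    packet = b"Art-Net\x00"                        # protocol ID
    packet += struct.pack("<H", 0x5000)            # OpCode: ArtDmx, low byte first
    packet += struct.pack(">H", 14)                # protocol version 14
    packet += bytes([sequence & 0xFF, 0])          # sequence, physical port
    packet += bytes([universe & 0xFF, (universe >> 8) & 0x7F])  # SubUni, Net
    packet += struct.pack(">H", len(data))         # payload length, high byte first
    return packet + data

# Hypothetical node and universe: flood 170 RGB pixels (510 channels) red.
dmx = bytes([255, 0, 0] * 170)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(universe=0, data=dmx), ("10.0.0.50", ARTNET_PORT))
```

Pixel-mapped video eats universes quickly (a single 170-pixel RGB run fills one), which is why the volume of traffic Sanders describes adds up so fast.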

Photo: Ryan Mast/Meteor Tower ©2013

With the Chromlech Elidy fixtures specifically, we use Mbox to pixel-map them and send an image out. So if we want to send complex animation or timecode-driven content to the headlights, we can do that, and that merge happens inside the lighting desk, so lighting can paint on top of them, use them as a traditional blinder, or add another layer of chasing tied in with what they are doing. Read more about the car setpiece.
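Conceptually, pixel-mapping just samples the media server's output raster at each fixture cell and converts it to channel data for the merge at the desk. A toy version of that sampling step, with the fixture layout and channel order invented for illustration rather than taken from the actual Mbox patch, might look like:

```python
import numpy as np

def pixel_map(frame: np.ndarray, cells: list[tuple[int, int]]) -> bytes:
    """Sample a rendered frame at each mapped cell and emit one
    intensity byte per cell (for single-channel white emitters)."""
    # Luma-weighted sample so colored content still drives white LEDs sensibly.
    luma = (0.2126 * frame[..., 0] + 0.7152 * frame[..., 1]
            + 0.0722 * frame[..., 2]).astype(np.uint8)
    return bytes(luma[y, x] for (x, y) in cells)

# Hypothetical 5x5 grid of cells for one headlight, spaced 10 px apart.
cells = [(x * 10 + 5, y * 10 + 5) for y in range(5) for x in range(5)]
frame = np.random.randint(0, 256, (64, 64, 3), np.uint8)  # stand-in render
dmx_values = pixel_map(frame, cells)  # 25 channels, mergeable at the desk
```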

The reverse of what I just explained is when the lighting guys have the ability to control the CFS Multi Tap server, which opens up some interesting ideas. Multi Tap is a core piece of software that CFS originally developed for the Beastie Boys. It is built around the idea of a tap: a bunch of small screens that reside in one raster, with an effective way to control those individual screens where a typical media server would run out of layers. We used the Color Block mode inside Multi Tap extensively for Bon Jovi; it lets us treat each grille tile as an RGB fixture. Each one of the V-9 Lite LED tiles in the grille and each one of the tiles in the turn signals can be targeted and used like a light, like an actual RGB fixture.
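To make the tap idea concrete, here is a hedged sketch (the names and layout are invented, not Multi Tap's internals): each tap is a sub-rectangle of one shared raster, and a Color Block-style mode simply floods a tap with the RGB value the lighting desk sends for that "fixture":

```python
import numpy as np

class Tap:
    """One small screen carved out of the shared raster."""
    def __init__(self, x: int, y: int, w: int, h: int):
        self.x, self.y, self.w, self.h = x, y, w, h

def color_block(raster: np.ndarray, taps: list[Tap],
                rgb_values: list[tuple[int, int, int]]) -> None:
    """Flood each tap with its RGB value, as if it were a single RGB fixture."""
    for tap, rgb in zip(taps, rgb_values):
        raster[tap.y:tap.y + tap.h, tap.x:tap.x + tap.w] = rgb

# Hypothetical grille: a row of tiles, each addressable like one light.
raster = np.zeros((32, 256, 3), np.uint8)
taps = [Tap(x=i * 32, y=0, w=32, h=32) for i in range(8)]
color_block(raster, taps, [(255, 0, 0)] * 8)  # desk says "all red"
```

The appeal is that one server output can carry dozens of these "fixtures" without burning a media-server layer on each one.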

See the full stadium leg photo gallery.

That is what led to the idea of creating the virtual GLP impression X4 lights, where we could have the lighting guys coloring them, and then, inside the Mbox, we would add a multiply layer, taking the color coming in off the live input and merging it onto the texture coming from the Mbox. That look is for "Born To Be My Baby," where a picture of an X4 light is created on all the V-9 tiles, and then the lighting guys send the color information to the media server; basically, they control the media server to change the color. It makes it look like a wall of lights, virtual X4 fixtures. The actual grille configuration is a V-9 tile next to a real X4 light, next to another V-9 tile, next to another real X4 light. So when you create these virtual X4s on the tiles, it becomes this huge light wall. It was a really cool look, and it is a great example of the high level of integration between the video and lighting control.
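The multiply merge Sanders describes is a standard blend mode; a minimal sketch, assuming 8-bit RGB frames, with the live input supplying the desk-driven color and the Mbox texture supplying the X4 picture:

```python
import numpy as np

def multiply_merge(texture: np.ndarray, live_color: np.ndarray) -> np.ndarray:
    """Multiply the incoming live-color frame onto the texture.

    Both are (H, W, 3) uint8; per channel, result = texture * color / 255,
    so a white input leaves the texture alone and any other color tints it.
    """
    out = texture.astype(np.uint16) * live_color.astype(np.uint16) // 255
    return out.astype(np.uint8)

# The X4 "picture" texture tinted by whatever color the lighting desk sends.
h, w = 32, 32
texture = np.random.randint(0, 256, (h, w, 3), np.uint8)  # stand-in X4 image
live = np.full((h, w, 3), (0, 128, 255), np.uint8)        # desk-driven color
tinted = multiply_merge(texture, live)
```

Because white leaves the texture untouched while any saturated color tints it, the lighting desk can recolor every virtual X4 at once without the media server re-rendering the underlying image.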