MY TWO-WEEK VACATION WITH BOB & COLLEEN

Well, on day one, the “vacation” got extended to 10 weeks. I'd craved working for Bob and Colleen Bonniol again since the Nokia project. Their creativity can truly go in any direction, and they came into this business as people with wrenches, same as me. Sinatra represented my biggest career challenge to date, and I felt in good company. I can't thank Bob or Josh Weisberg enough for letting me take a crack at programming Sinatra and having faith in my programming skills.

From the first day in rehearsal at Bromley-by-Bow to the days before opening night, the gig itself was a roller coaster smashing through obstacles all the way. The boys at Green Hippo have had our backs for a while, but for this job, they treated us like family and never left us. Aside from the massive resolutions, the amazing thing is how many different feeds the final video system weaves together to keep the content glued to the screens and the lead in lip sync, and in the right suit, all night.

A Highly Technical Rundown

Before I flesh out the details of the Hippotizers, I want to go over how the grandMA was set up. “Gobo” and “Pan/tilt” got relabeled “Content” and “Automation,” and the somewhat derogatory-feeling “video” section is where I left the timecode channels. After a few inefficient programming paradigms, what I settled on was keeping the “faked” automation cue stack separate from the one that set up the layers for content, effects, cross-fading, timecode values, and so on, because where the screens moved really was a separate beast from what was on them (especially after rehearsals, once automation moved the screens and our system had to follow).

These two cue stacks, as well as edge blend changes, got stretched along a timeline in the desk. The grandMA can think of a show as a timeline of events. When you're discussing a sequence (between audio guys, who have the hard timecode value for something; video designers, who can say how far into a clip; and directors trying to give you a word in the song or script), you want a window you can drag cues around in. This way, you don't have to constantly update delay/wait times to make one event a second later and the next a second sooner. The experience of building Act I, where every song was initially cued while rehearsing the previous song, let me lay out a preset structure that made Act II monumentally more stable: a barebones modular structure to drop in what each layer would need to do song to song.

The Hippotizers' flexibility showed in the customization required to deliver the direct pixel output for the content and keep it glued right to the screens. The DMX over Art-Net feed from the console lays in the bulk of their commands, but they also listen intently to the timecode feed and the automation feed at all times for when and where to spin their magic, and they take in five different camera feeds that any machine can use for live content.
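
For the curious, here's roughly what reading that console feed looks like on the wire. This is a minimal Python sketch of an ArtDmx listener following the published Art-Net packet layout; it's an illustration, not a piece of the Sinatra rig.

    import socket
    import struct

    ARTNET_PORT = 6454   # standard Art-Net UDP port
    OP_DMX = 0x5000      # ArtDmx opcode

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ARTNET_PORT))

    while True:
        packet, addr = sock.recvfrom(1024)
        if packet[:8] != b"Art-Net\x00":        # every Art-Net packet starts with this ID
            continue
        opcode = struct.unpack("<H", packet[8:10])[0]     # little-endian opcode
        if opcode != OP_DMX:
            continue
        universe = struct.unpack("<H", packet[14:16])[0]  # 15-bit port-address
        length = struct.unpack(">H", packet[16:18])[0]    # big-endian data length
        dmx = packet[18:18 + length]                      # up to 512 channel levels
        # dmx[0] is channel 1 of this universe, and so on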

Automation

The automation feed is quite simple from this side of the gig. The various computers reading the motor encoders output a value in a predetermined range for where a screen is in the X and Y directions, so the first step is to give each layer the minimum and maximum values for this travel. Green Hippo did have to write a look-ahead algorithm, with a value you dial in, to smooth out the somewhat jittery feed and overcome its general quarter-second lag. (Weisberg specifying nonreflective black edges on the screens gave the system the leeway it needed to overcome the delay perfectly.)
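
Green Hippo's algorithm is their own, but the general technique is well known: smooth the incoming samples, estimate velocity, and extrapolate forward by the feed's lag. Here's a minimal sketch assuming exponential smoothing plus linear prediction (the class and parameter names are mine):

    class LookAhead:
        """Smooth a jittery position feed and predict ahead to cover its lag.
        A guess at the general technique, not Green Hippo's actual code."""

        def __init__(self, smoothing=0.2, lag=0.25):
            self.smoothing = smoothing  # the dial-in value: 0 = frozen, 1 = raw feed
            self.lag = lag              # seconds the feed runs behind reality
            self.pos = None
            self.vel = 0.0

        def update(self, raw, dt):
            """Feed one raw encoder sample; dt is seconds since the last one."""
            if self.pos is None:
                self.pos = raw
                return raw
            prev = self.pos
            self.pos += self.smoothing * (raw - self.pos)  # knock down the jitter
            if dt > 0:
                self.vel = (self.pos - prev) / dt          # estimate velocity
            return self.pos + self.vel * self.lag          # extrapolate past the lag

The smoothing value is the knob you dial in: lower settings calm a jittery feed but slow its response, which the prediction term then has to make up.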

The “smoothing” values for the algorithm generally were the same for all screens, so once that is set and the min/max is put in, control gets passed to the lighting console. The console retains the 16-bit X and Y values for where a layer needs to live, and then has additional 16-bit channels for how much to move the video in response to changes in the values from the automation system. So within an hour or so of moving the screen back and forth, watching how fast your layer moves, and updating the speed response until the layer lands where it should in the output, it's done and can be locked.
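
The arithmetic behind that hour of calibration is simple enough to sketch, assuming the usual coarse/fine pairing for 16-bit DMX channels (the function names and the linear response are my illustration, not the Hippotizer's actual internals):

    def dmx16(coarse, fine):
        """Combine two 8-bit DMX channels into one 16-bit value (0-65535)."""
        return (coarse << 8) | fine

    def travel(raw, enc_min, enc_max):
        """Normalize a raw encoder reading against its calibrated min/max."""
        return (raw - enc_min) / (enc_max - enc_min)

    def layer_x(home, t, response):
        """home:     16-bit 'where the layer lives' value from the console
        t:        normalized 0-1 screen travel from the automation feed
        response: 16-bit 'how far to move per unit of travel' channel"""
        return home + t * response

Dialing in that response channel while someone runs the screen back and forth is exactly the hour of work described above.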

Once done, the servers listen to that feed and move the content around. You can watch layers move around in their preview screen and know when automation has moved a screen you can't see, which is very neat. Two nights with Nigel Sadler of Green Hippo, and this was dialed in for the entire system; it only needed to be changed when projector positions changed (or when I made a bad system edit).

Breaking Rules: Layers

We spent a lot of time breaking rules. We weren't supposed to be able to play files with different frame rates on different layers. Sean Westgate of Green Hippo did some math and sent me into the XML files of the Hippotizer to adjust the layout of the play speed channels, which let me specify what frame rate to play a file at. The Hippotizer deals with files and frame rates much more accurately than firing a frame at a specific timecode value: it takes the frames like a stack of cards, plays them at a rate determined by the play speed channel, and stretches them along time, so each layer can easily play files at whatever rate you want. Brilliant design! Our flavors were 23.976, 24, and 29.97; it can do 60fps, but be cautious of your now-doubled bit rate versus the read-rate ceiling of your storage media.
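
The math Westgate did boils down to a speed multiplier, with the channel layout in the XML deciding how that multiplier is encoded. A worked sketch (the 16-bit mapping here is hypothetical):

    def play_speed(native_fps, desired_fps):
        """Multiplier that stretches a clip's frame stack along time:
        1.0 plays the frames at their native rate."""
        return desired_fps / native_fps

    def speed_to_dmx16(multiplier, max_multiplier=4.0):
        """Map a speed multiplier onto a hypothetical 16-bit play-speed channel."""
        value = round(multiplier / max_multiplier * 65535)
        return min(max(value, 0), 65535)

    # e.g. running a 29.97 fps file in step with a 23.976 fps timeline:
    # play_speed(29.97, 23.976) -> 0.8, i.e. the clip plays at 80% speed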

Breaking Rules: Screens

The rule I most enjoyed debunking was that screens A and B (the front two screens, covered with video by two projectors in an edge blend) couldn't be in at the same time. The blend means rearranging the pixels in the middle of the raster, and it won't look right if the wrong blend values are loaded. The rule held only until the second song, where screen A is stage right and screen B comes in with Frank snapping his fingers right down the middle, in the convergence area. Bob turned to me and said, “Okay, figure this one out, Mr. Brainiac.”

The solution? By now, our other programmer, Peter Acken, was there, so I had someone else who could slam the daily tome of notes into the desk. I had him move the blend change to a safe point before the song, and then, after watching it pop over, he said, “Let's toss a half-second fade on that business.” Translation: the two cues that toggle the edge blend change back and forth. Thus, discreetly before Frank comes in, the content slides over slowly without anyone noticing, and screen B drops right in, with Frank on cue looking sharp as ever. It is very cool to be testing files with all the screens in, have someone say, “Hey, you're off the screen,” and hit that cue stack to watch the layer glide and stretch itself back onto the screen.
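
Under the hood, that half-second fade is just an interpolation between two blend states. A tiny sketch of the idea (the parameter dictionary is a placeholder; the Hippotizer's real blend controls have their own names):

    def lerp(a, b, t):
        """Linear interpolation between a and b as t runs 0 to 1."""
        return a + (b - a) * t

    def blend_fade(state_a, state_b, elapsed, fade=0.5):
        """Crossfade every edge-blend parameter over `fade` seconds."""
        t = min(elapsed / fade, 1.0)
        return {key: lerp(state_a[key], state_b[key], t) for key in state_a}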

Show Control Wizardry

The show control wizardry was accomplished by Digital Antics, super-genius show control programmers who don't like using any off-the-shelf software platform, preferring to do their work the real programming way, in Visual C++ or other true programming languages. The show control we had as previews started was primarily a way for the grandMA to communicate with the projectors and video switcher, as well as to browse into the DP Lightning 30SX projectors. It could be operated from its own control screen, but the goal was to have one operator. The shutter control was critical, as the three machines at the back of the stage were blinding to the people at the balcony rail, the best seats in the house.

Our project manager John Ackerman sat at the show control computer for the first shows and took the shutter cues as screens came in and out. We originally intended to record timecode values and put shutter cues along our timeline, but automation was not running to timecode: each cue gets called, and the automation crew has to watch monitors to make sure they don't crush someone, so the relative timecode points of these movements drifted by several seconds, while we wanted the shutters to pop open at the exact moment the screens dipped into view.

If our system was already tracking the screens, why not just link the shutter cues to the automation values and let them operate themselves? Sure enough, Digital Antics' Craig Edwards and Quintin Willison whipped up that solution in short order. You can hear these guys' brains throb with energy when they start thinking hard (Bob would make the sci-fi “whob, whob, whob” noise when he saw them). To his mad genius credit, after about three days of watching and playing on the spare desk, Edwards was able to jump right into the grandMA and read values and make his own test show control cue stacks. This business is lucky he's not building space stations.
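
Their Visual C++ is theirs alone, but the core idea reads like a threshold with hysteresis: open the shutter as the screen's automation value travels into view, close it as it leaves, and keep the two thresholds apart so a screen parked near the edge can't chatter. A sketch under those assumptions (names and thresholds are mine):

    class ShutterTrigger:
        """Tie a projector shutter to a screen's automation value.
        A guess at the logic, not Digital Antics' actual code."""

        def __init__(self, open_at, close_at):
            self.open_at = open_at    # raw automation value where the screen is in view
            self.close_at = close_at  # slightly earlier value, for hysteresis
            self.is_open = False

        def update(self, position, send):
            """Call on every automation sample; send() fires the projector command."""
            if not self.is_open and position >= self.open_at:
                self.is_open = True
                send("shutter open")
            elif self.is_open and position <= self.close_at:
                self.is_open = False
                send("shutter close")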

So after near-daily inventions, the result is a system that reads timecode triggered from the musical director on stage, screen movements, cameras throughout the theatre, and 300 to 400 cues in a lighting desk. And it only requires 10 go cues for the operator throughout the show. It is a living, breathing entity that reacts to everything else in the theatre. Call me crazy, but programming Sinatra was the most fun I've ever had.