Timing Is Everything: The Birth Of Right on Cue Systems® (ROCS)

When a small, semi-professional community theatre in Utah brought two Broadway stars to town for a benefit production of 110 In The Shade, the show’s music director faced a big challenge: how to conduct a MIDI orchestration for the performances with the same nuance and control as he would a live orchestra.

Thanks to a family relationship with the theatre founders, four-time Tony Award-winning actress Audra McDonald (Carousel, Master Class, Ragtime, A Raisin in the Sun) and Tony-nominated actor Will Swenson (the Broadway revival of Hair) came to the Hale Center Theatre in Orem for a couple of weeks this past July to reprise roles they had performed in New York City.

When McDonald starred as Lizzie in 110 In The Shade on Broadway in 2007, Swenson, the brother of Hale Theatre co-managing director Cody Swenson, had a small role in the show (and was understudy for the role of Starbuck). During the production, he and McDonald formed a strong connection. After the two started dating, they talked about working together again. At one point, when the couple was visiting the Swenson family in Utah, Cody Swenson proposed that they do a master class at the family’s 300-seat theatre in association with its 2010 summer production of 110. McDonald and Swenson considered the invitation and then countered with the proposal that they perform in the show.

Cody Hale, another of the theatre’s managing directors, says that things then fell into place. “While both actors were extremely busy, it just so happened to fall in the slot where Will was done with his run in the London production of Hair and Audra had a break in the filming of Private Practice.” So the two Broadway actors committed to perform in the show during two weeks of the planned six-week run, and those performances became a fundraising event for the non-profit company, which is hoping to move to a new, larger theatre.

Due to space and budget limitations, most musical performances at the Hale, as at many similar small theatres around the country, have to rely on simple accompaniment. “What we’ve used in the past is prerecorded tracks or, at the most, just a piano,” says Hale. “When we asked Audra and Will what they wanted, they said they were so used to performing with an orchestra that could follow them, and asked if we couldn’t put together a small ensemble.”

So the producers began exploring how they could make that happen. The first idea was to have Dave Zabriskie, the music director, re-orchestrate the score, written for 19 musicians, for an ensemble of two keyboards and four to six other players. There wasn’t enough space inside the theatre for even that small ensemble without removing seats, so Zabriskie proposed placing the musicians in the theatre’s basement, below the tiny 320-sq.-ft. stage, and connecting them to the theatre space with a video camera, monitors, and microphones. After struggling to adapt the score for a while, however, Zabriskie shared his frustration with business partner Mike Leavitt.

Leavitt offered up some tracks he had done for other musical productions, which Zabriskie liked, but he couldn’t see how control of the tracks would be possible. They were familiar with existing MIDI controllers and sequencers, such as OrchExtra®, which requires a pianist to play the melody in rhythm on a mini-keyboard, or Notion®, which requires notations to be made on a computer score and triggered by rhythmically pressing a key on a computer keyboard, but they knew that neither they nor the actors would be happy with the musical inflexibility and complex controls of those technologies.

So Zabriskie and Leavitt set out to develop a new MIDI tempo control system that would offer simple, intuitive management of the sequence score and give the conductor the same control he would have with live musicians: control of the tempo, pacing, and other dynamics of the accompaniment in response to the variable performance requirements of the actors on stage. Leavitt, who has sequenced about 70 shows, first demonstrated how they could control playback on his Apple iPod Touch. “It was controlling it,” says Zabriskie, “but it wasn’t musical. You had to keep a perfect beat to make the thing work; it was exciting, but very mechanical.”

Leavitt drove to a local store, bought a Nintendo Wii game console, and implemented its wireless controller as the interface; the pair loved the results. Zabriskie immediately sat down to write some “humanizing software” to provide nuance and subtlety to the control. “I had all the algorithms figured out, but when I looked for a programmer to move the project forward, they said it would take six months to a year,” says Zabriskie.
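
Zabriskie has not published those algorithms, but the problem he describes (turning slightly uneven human beat taps into stable, musical tempo changes) can be illustrated with a short sketch. The Python below is a hypothetical illustration rather than ROCS code: it smooths successive tap intervals with an exponential moving average, so the playback tempo bends toward the conductor instead of lurching with every tap.

```python
import time

class TempoFollower:
    """Hypothetical 'humanizing' tempo follower: smooths beat taps into BPM."""

    def __init__(self, initial_bpm=120.0, smoothing=0.3, min_bpm=40, max_bpm=220):
        self.bpm = initial_bpm        # current (smoothed) tempo
        self.smoothing = smoothing    # 0 = ignore taps, 1 = follow each tap exactly
        self.min_bpm, self.max_bpm = min_bpm, max_bpm
        self._last_tap = None

    def tap(self, now=None):
        """Call once per conductor beat (e.g. on a controller button press)."""
        now = time.monotonic() if now is None else now
        if self._last_tap is not None:
            interval = now - self._last_tap
            if 60.0 / self.max_bpm <= interval <= 60.0 / self.min_bpm:
                tapped_bpm = 60.0 / interval
                # Exponential moving average: lean toward the new tap,
                # but keep some of the previous tempo for stability.
                self.bpm += self.smoothing * (tapped_bpm - self.bpm)
        self._last_tap = now
        return self.bpm

# Example: simulated taps slightly ahead of 120 BPM gradually pull the tempo up.
follower = TempoFollower()
t = 0.0
for _ in range(8):
    t += 0.48                 # 0.48 s between taps, roughly 125 BPM
    print(round(follower.tap(t), 1))
```

The smoothing factor trades responsiveness for stability; part of the “humanizing” problem is deciding how tightly to follow the conductor in a given passage.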

With rehearsals starting in a couple of months, Leavitt suggested they use Cycling '74's Max interactive visual programming environment, and Zabriskie, who had been a computer programmer earlier in his career, sat down, learned Max, and wrote the MIDI control software. “I tried out my algorithms, they worked, and here we are,” he says. And with that, Right on Cue Systems® (ROCS) was born.

The ROCS system comprises three main components (a rough sketch of how they might fit together follows the list):

- an Apple Mac computer running the Right On Cue software, the sequence score for a specific production, and a MIDI sample engine providing the sounds of the actual musical instruments (the Vienna Symphonic Library was used for 110 In The Shade)
- a MOTU 828 audio interface to connect with the theatre sound system
- standard Nintendo Wii Remote™ and Nunchuk™ controllers
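
How these components talk to one another is not spelled out here, so the sketch below is only a rough, hypothetical illustration of the data flow: controller events come in, and MIDI goes out to the sample engine. It uses the mido Python library as a generic MIDI transport; the button assignments, function names, and port name are invented for the example and are not ROCS’s actual mapping.

```python
import mido  # real Python MIDI library, used here only as a generic transport

# Hypothetical mapping of Wii Remote buttons to conductor functions.
BUTTON_MAP = {
    "A":     "beat",    # tap the next beat of the sequence
    "B":     "hold",    # freeze on a fermata
    "MINUS": "softer",  # pull the overall dynamics down
    "PLUS":  "louder",  # push the overall dynamics up
}

def handle_button(button, outport, state):
    """Translate one controller press into MIDI sent toward the sample engine."""
    action = BUTTON_MAP.get(button)
    if action == "beat":
        state["holding"] = False
        # Nudge the sequencer clock forward (a real system would emit many
        # clock ticks per conducted beat, not just one).
        outport.send(mido.Message("clock"))
    elif action == "hold":
        state["holding"] = True          # stop advancing; let notes sustain
    elif action in ("softer", "louder"):
        step = -8 if action == "softer" else 8
        state["volume"] = max(0, min(127, state["volume"] + step))
        # CC 7 (channel volume) on channel 0 as an illustrative dynamics control.
        outport.send(mido.Message("control_change", control=7,
                                  value=state["volume"], channel=0))

if __name__ == "__main__":
    state = {"holding": False, "volume": 100}
    # The port name is illustrative; in practice this would be the virtual or
    # hardware port feeding the sample engine behind the MOTU interface.
    with mido.open_output("Sample Engine", virtual=True) as outport:
        for press in ["A", "A", "PLUS", "B", "A"]:
            handle_button(press, outport, state)
```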

The developers applied for a patent on ROCS in April, two months before rehearsals began, and continued to refine the system during the rehearsal period, giving the conductor a greater range of control. At first, they pursued the idea of hiring a small ensemble and then augmenting those live musicians with MIDI for a richer, fuller sound that would reproduce the original 19-instrument score, but when the costs of the extensive wiring, cameras, and monitors, plus contracting top-notch musicians, proved prohibitive, Zabriskie and Leavitt proposed that they simply use a MIDI sequence, but one that could now be “conducted” live.

Zabriskie and Leavitt alternated conducting the show from the sound booth for the first two weeks, and Zabriskie then left to focus on developing their new ROCS business venture. With the two Wii controllers in hand, every button programmed to control some musical or technical function of the sequence, the conductor could lead the music as if conducting a live orchestra, using the controllers rather than a baton.

Once the accompaniment was working well with the performers in rehearsal, Leavitt also collaborated with the show’s lighting designer. Together they integrated MIDI timecode into the sequence so that the light cues could be triggered as part of the musical conducting. This worked well since the show had only a handful of non-musical light cues.
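
The production team has not described exactly how that timecode was generated, so the following sketch is only an illustration of the general idea: the conducted sequence’s playback position is encoded as MIDI timecode (MTC) quarter-frame messages, which a lighting console can chase to fire its cues. The mido library, the 30 fps frame rate, and the port name are assumptions made for the example.

```python
import mido  # real Python MIDI library, used here only as an example transport

FPS = 30  # assumed frame rate; must match the lighting console's MTC setting

def mtc_quarter_frames(seconds):
    """Encode a playback position (seconds into the sequence) as the eight MTC
    quarter-frame messages that spell out one hours:minutes:seconds:frames value."""
    total_frames = int(seconds * FPS)
    ff = total_frames % FPS
    ss = (total_frames // FPS) % 60
    mm = (total_frames // (FPS * 60)) % 60
    hh = (total_frames // (FPS * 3600)) % 24
    rate_code = 3 << 1  # 30 fps non-drop, per the MTC spec
    nibbles = [
        ff & 0x0F, ff >> 4,
        ss & 0x0F, ss >> 4,
        mm & 0x0F, mm >> 4,
        hh & 0x0F, (hh >> 4) | rate_code,
    ]
    return [mido.Message("quarter_frame", frame_type=i, frame_value=v)
            for i, v in enumerate(nibbles)]

if __name__ == "__main__":
    # Illustrative port name; in practice this would feed the lighting console.
    # (A real MTC stream sends quarter frames continuously, two frames' worth
    # at a time, rather than as a single burst like this.)
    with mido.open_output("Lighting Console", virtual=True) as port:
        for msg in mtc_quarter_frames(83.5):  # 0:01:23 and 15 frames into the show
            port.send(msg)
```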

“There was a moment when Audra first appeared on stage, when she ran through a door, jumped on the stage, and then, on a specific dramatic beat, she put up her arms,” says Leavitt. “At that moment, we ‘hit it,’ and the music and lights transformed together. It was real; it was live. And that happened throughout the show.”

While the original plan to integrate live musicians with additional MIDI orchestration for the production was abandoned, the team still had the opportunity to explore that capability, given that one of the cast members played harmonica and guitar live onstage along with the digital orchestra. “Ultimately, the purpose of the system is to bring back ‘live’ to the theatre, not to replace musicians,” Zabriskie says. “The system can be used to augment a small live ensemble with a full orchestra sound, just like many of the national tours are now doing, but instead of a rigid click track, we get rid of that. We give the conductor full control.”

During the Hale production of 110, Zabriskie recalls, “There were many times in the show when Audra would take an artistic moment, a pause in the music. Well, you can’t do that with tracks, but she knew she could with our system. She’d take a break, we’d wait with her, and when she was ready, we could follow her and come right in with her.”

The system allowed the conductor to hold—like on a fermata—when the performer wanted or when the audience response warranted. Or take a break between musical phrases. Or vamp a musical phrase when desired. Ultimately, the cast loved the experience of performing to the digital “orchestra,” feeling comfortable that they were being followed and able to do what they wanted during each individual performance. Of the experience of using the new technology for the production, Hale says, “It worked beautifully. The timing was just so impeccable on every single number. It was just like having a live orchestra.”
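
Functionally, holds and vamps are simple extensions of a conducted playhead: a hold stops the beat clock without cutting off the sounding notes, and a vamp loops over a marked span of bars until released. The Python below is a hypothetical sketch of that logic; the class, method names, and bar numbers are illustrative, not ROCS internals.

```python
class SequencePlayhead:
    """Hypothetical playhead supporting conductor-style holds and vamps."""

    def __init__(self, total_bars):
        self.bar = 1
        self.total_bars = total_bars
        self.holding = False
        self.vamp = None                  # (start_bar, end_bar) while vamping

    def hold(self):
        """Fermata: stop advancing the beat clock (sounding notes sustain)."""
        self.holding = True

    def release(self):
        """Resume advancing on the next conducted beat."""
        self.holding = False

    def start_vamp(self, start_bar, end_bar):
        """Loop the given span of bars until end_vamp() is called."""
        self.vamp = (start_bar, end_bar)

    def end_vamp(self):
        self.vamp = None

    def next_bar(self):
        """Advance one bar, honoring any hold or vamp currently in force."""
        if self.holding:
            return self.bar               # stay put
        if self.vamp and self.bar >= self.vamp[1]:
            self.bar = self.vamp[0]       # loop back to the top of the vamp
        elif self.bar < self.total_bars:
            self.bar += 1
        return self.bar

# Example: vamp bars 12-15 until the singer is ready, then continue.
playhead = SequencePlayhead(total_bars=120)
playhead.bar = 11
playhead.start_vamp(12, 15)
print([playhead.next_bar() for _ in range(8)])   # [12, 13, 14, 15, 12, 13, 14, 15]
playhead.end_vamp()
print([playhead.next_bar() for _ in range(3)])   # [16, 17, 18]
```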

Since the show closed in August, Zabriskie and Leavitt have continued to refine the system and explore options, including integration with the Wii Balance Board. “All orchestras are based on four sections: strings, brass, woodwinds, and percussion,” says Leavitt. “We can set it up so each of the four quadrants of the Wii Balance Board controls one of the four orchestral sections; the more weight you place on a certain section, it will increase the volume of that a certain amount.” They have also continued to develop the software, making it easier to jump forward and backward to specific points in the music or to specific songs, which is particularly useful in a rehearsal setting.
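
That quadrant idea maps naturally onto standard MIDI: each section’s share of the total weight on the board can scale a channel-volume (controller 7) message for that section. The sketch below is hypothetical and again uses the mido library as a stand-in transport; the channel assignments and scaling constants are assumptions.

```python
import mido  # real Python MIDI library, used here only as a generic transport

# Assumed mapping of orchestral sections (one per Balance Board quadrant)
# to MIDI channels in the sample engine.
SECTION_CHANNELS = {"strings": 0, "brass": 1, "woodwinds": 2, "percussion": 3}

def section_volumes(quadrant_weights, base=64, span=63):
    """Convert four quadrant weight readings (kg) into per-section CC 7 values.

    Each section starts at `base`; its share of the total weight adds up to
    `span` more, so leaning toward a quadrant swells that section."""
    total = sum(quadrant_weights.values()) or 1.0
    return {section: min(127, int(base + span * (weight / total)))
            for section, weight in quadrant_weights.items()}

def send_volumes(volumes, outport):
    """Send one channel-volume (CC 7) message per orchestral section."""
    for section, value in volumes.items():
        outport.send(mido.Message("control_change", control=7, value=value,
                                  channel=SECTION_CHANNELS[section]))

if __name__ == "__main__":
    # Conductor leaning toward the strings quadrant (illustrative readings, kg).
    weights = {"strings": 40.0, "brass": 15.0, "woodwinds": 15.0, "percussion": 10.0}
    print(section_volumes(weights))
    # {'strings': 95, 'brass': 75, 'woodwinds': 75, 'percussion': 71}
```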

“We’re really striving to make this musical, intuitive, and easy,” says Zabriskie. “We’ve lost a lot in theatre by going to straight tracks, and we need the ‘live’ back.” Leavitt adds, “Our super highest priority is to make this sound as real and rich and full as possible, so that people don’t go, ‘Oh, that was computerized.’”

Economic realities force many smaller and mid-sized theatres and tours to rely on prerecorded accompaniment or a click-track recording with a small live ensemble. But there are very real drawbacks to those options, as the existing technologies greatly limit the ability to make creative changes in dynamics and tempo during each performance. “As exciting as this technology is, we have not forgotten that our real true purpose is to make the sound and the artistic endeavor the best it can be,” Zabriskie concludes.

Eric Fielding is professor of scenic design and resident set designer for the department of theatre and media arts at Brigham Young University. He has also taught theatre design at the Goodman School of Drama, the University of Texas at Austin, and the University of Utah. A 30-year member of USA 829, he has freelance design credits that include scenery and/or lighting for more than 250 productions. He is a fellow, former vice-president, Founders’ Award recipient, and Lifetime Member Award recipient of USITT. He was editor of Theatre Design & Technology journal from 1988 to 1995.