One of the most celebrated musicals on Broadway and a contender for multiple Tony Awards, Natasha, Pierre and The Great Comet of 1812 (hereafter abbreviated “The Great Comet”) has earned critical acclaim for electrifying performances by a cast that moves freely throughout a set encompassing nearly the entire audience.
From the outset, The Great Comet was written and designed with audience and performers commingling in a common space. Sound design was manageable in the production’s first small venues, but the challenge of keeping amplified voices and instruments – as some orchestra members roam about as well – properly localized in space became increasingly vexing as The Great Comet advanced into larger theaters. When the show prepared to open in Broadway’s expansive Imperial Theater, sound designer Nicholas Pope was tasked with devising a system that offered a degree of dimensionality and fluidity never before realized in any theatrical production.
The Great Comet was first presented in a venue seating about 100. Although the musical style required basic amplification, direct sound from the vocalists and instrumentalists was sufficient to localize their positions sonically. And because distances between performers were short, the time it takes sound to travel between performers, and from loudspeakers to performers, posed no timing problems.
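Why travel time matters at scale can be seen with a quick calculation. The sketch below is illustrative, assuming a speed of sound of roughly 343 m/s at room temperature; the distances are hypothetical, not measurements of either venue.

```python
# Illustrative only: acoustic propagation delay over distance,
# assuming a speed of sound of ~343 m/s at room temperature.
SPEED_OF_SOUND_M_S = 343.0

def propagation_delay_ms(distance_m: float) -> float:
    """Time in milliseconds for sound to travel distance_m metres."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

# In a small ~10 m room the delay stays under ~30 ms, but across a
# hypothetical ~40 m span in a large theatre it exceeds 100 ms --
# easily enough to pull distant musicians and singers off the beat.
print(round(propagation_delay_ms(10.0), 1))   # small venue
print(round(propagation_delay_ms(40.0), 1))   # large theatre
```

Delays in this range are why a large-room system must compensate electronically rather than rely on direct acoustic sound.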
Moving into a Broadway theatre normally seating over 1,400 introduced unprecedented challenges for keeping amplified sound constantly localized to the roving actors and musicians, and for keeping musicians and singers on the beat when performing at opposite sides of the theatre.
At the heart of Pope’s solution is Meyer Sound’s D-Mitri digital audio platform, the largest of its kind ever used on Broadway. The system’s 288 x 288 signal matrix allows any combination of input signals to be routed independently to any of the XXX loudspeakers placed throughout the space. Meyer Sound loudspeaker arrays and clusters are placed throughout the theatre so that a performer’s sound can originate in any plane, at any level, anywhere in the space. Loudspeakers had to be selected carefully to ensure consistent voicing so that voices and instruments would move seamlessly anywhere in the room.
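The routing concept behind such a matrix can be sketched in a few lines. This is a minimal illustration of an N x N gain matrix, where any input can feed any output at any level; it is not the D-Mitri API, and the class and method names are hypothetical.

```python
# Hypothetical sketch of an N x N routing matrix, in the spirit of a
# 288 x 288 digital matrix: any input can feed any output at any gain.
# This is NOT the D-Mitri API -- just an illustration of the concept.
class RoutingMatrix:
    def __init__(self, n_inputs: int, n_outputs: int):
        # gains[i][o] is the linear gain from input i to output o (0 = unrouted)
        self.gains = [[0.0] * n_outputs for _ in range(n_inputs)]

    def route(self, inp: int, out: int, gain: float = 1.0) -> None:
        self.gains[inp][out] = gain

    def mix(self, input_samples: list) -> list:
        # Each output is the gain-weighted sum of every input sample.
        n_outputs = len(self.gains[0])
        return [
            sum(self.gains[i][o] * s for i, s in enumerate(input_samples))
            for o in range(n_outputs)
        ]

matrix = RoutingMatrix(288, 288)
matrix.route(0, 12, 0.5)   # e.g. a vocal channel to one cluster at half gain
matrix.route(0, 40, 1.0)   # ...and simultaneously to another array at full gain
```

Because every input-to-output gain is independent, a single performer's signal can appear in several loudspeakers at once at different levels, which is the basis of spatial localization.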
The free roaming of performers throughout the show required pre-programming hundreds of trajectories into Meyer Sound’s Space Map 3D panning software. (Meyer Sound states that, to their knowledge, it is the most complex Space Map program ever assembled.) And because real-time control of panning was essential, Pope worked with programmers to devise a custom iPad interface that enables a dedicated “follow sound” operator to fine-tune the apparent localization of any performer with “hot grabs” on the screen, ensuring the sound never wanders from the visual location.
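The idea of a pre-programmed trajectory can be illustrated with a simplified two-loudspeaker sketch: a pan position is interpolated between timed waypoints, and a constant-power law splits the signal between speakers. The function names and the two-speaker reduction are assumptions for illustration; Space Map 3D pans across many loudspeakers in three dimensions.

```python
import math

# Hypothetical sketch of a pre-programmed pan trajectory: the apparent
# source position is interpolated between timed waypoints, and a
# constant-power law splits the signal between two loudspeakers.

def position_at(waypoints, t):
    """Linearly interpolate a 0..1 pan position from (time, pos) waypoints."""
    if t <= waypoints[0][0]:
        return waypoints[0][1]
    for (t0, p0), (t1, p1) in zip(waypoints, waypoints[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return p0 + frac * (p1 - p0)
    return waypoints[-1][1]

def constant_power_gains(pos):
    """Split a 0..1 position into two speaker gains with constant power."""
    theta = pos * math.pi / 2
    return math.cos(theta), math.sin(theta)

# A performer crosses the room between t = 0 s and t = 8 s.
trajectory = [(0.0, 0.0), (8.0, 1.0)]
left, right = constant_power_gains(position_at(trajectory, 4.0))
# Midway, both speakers carry equal gain while total power stays constant.
```

An operator's "hot grab" would amount to overriding the interpolated position with a live value from the touch interface while leaving the stored trajectory intact.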
The extraordinary challenges of the production also required the creation of a unique “hybrid” digital mixing system. A DiGiCo SD7 console served as the front end for inputs, augmented with a Waves SoundGrid server for effects. Using multiple AES3 digital connections, the DiGiCo was spliced into the D-Mitri platform so that about 250 inputs could be freely routed to – and panned among – any combination of outputs for loudspeakers.