Webcast Preview: Q&A With Nicholas Pope On Great Comet

Live Design, in partnership with Meyer Sound, invites you to join our free webcast with Nicholas Pope, who will break down his brilliant and complex sound design for the hit musical Natasha, Pierre & The Great Comet Of 1812, which has taken Broadway by storm and received 12 Tony nominations, including Best Musical and Best Original Score. 

The webcast will take place on June 28 at 2pm; click here to register.

In the meantime, get a jump start with our preview chat with Pope, who was recently awarded a Live Design Award, along with the rest of the show's design team.


Sarah Jones: This show has evolved from an 87-seat production to a full-scale Broadway event for an audience of more than 1,100. Can you talk about how that growth informed your sound design?

Nicholas Pope: In the smaller venues, we were able to rely very heavily on acoustical energy from the performers themselves, but now obviously all that's gone. By the time we've gotten to this size theatre, you don't hear anybody or anything unless it's coming out of the P.A.

SJ: You are tracking performers in a 3D space, correct?

NP: That's correct. I knew going into this version of the show that we were going to be heavily amplified, and I wanted the actors to keep a realistic aural presence in the show. I feel it allows the audience to have a much stronger emotional connection with the performers when they sound real, instead of like the disembodied voices you so often hear. That was the driving force behind doing all that tracking, so the sound system knows where every performer and every instrument is, and localizes to them, at all times during the show.

SJ: Is the show immersive the whole time?

NP: That's a very common term used to describe the piece. The audience members for the most part are not part of the performance of the show, which is why we kind of stay away from that “immersive” term. It generally implies that the audience has some role in the piece. There are a couple of minor moments when the audience becomes a part of the piece, but for the most part there's a pretty strong fourth wall in the show. That fourth wall just happens to be inches away, if you will, instead of the more traditional 60'.

But yes, we perform throughout the room, throughout the entire performance. It doesn't matter where you sit in the room; there are actors around you throughout the performance.

SJ: With the audience being just inches away versus being, say, 20' from a proscenium, how does that direct the aesthetic of your sound design?

NP: It's a huge aspect of it. It allows the audience to be brought into the world we have created; you're not sitting back and observing something as an audience member. You really become part of the world. Creating that world drove a lot of decisions in the sound design: the localization, the room acoustic manipulation, all kinds of stuff. It also means that the P.A. for the audience doubles as much of the monitoring system for the performers. There is still an extremely large monitoring system for the musicians, though, because there's a really interesting musical time problem in the show. Due to the size of the room and the fact that we perform throughout the entire space, there are about 75 milliseconds of delay differential between the different locations, which means you're out of musical time; you're not even close. The monitoring system provides the information to stay in musical time.
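
To put that 75-millisecond figure in perspective, here is a rough back-of-the-envelope calculation of our own (not from the production's paperwork): at roughly 343 m/s, 75 ms of delay corresponds to about 25 m of path difference across the room, and at an assumed tempo of 120 BPM it eats up more than half a sixteenth note. The speed of sound and the tempo below are illustrative assumptions, not show data.

```python
# Rough sketch: what a ~75 ms delay differential means physically and musically.
# Constants are assumptions for illustration, not figures from the production.

SPEED_OF_SOUND_M_PER_S = 343.0   # dry air at roughly 20 degrees C (assumption)
DELAY_DIFFERENTIAL_S = 0.075     # the ~75 ms figure Pope cites

path_difference_m = SPEED_OF_SOUND_M_PER_S * DELAY_DIFFERENTIAL_S
print(f"Path difference: {path_difference_m:.1f} m "
      f"(~{path_difference_m * 3.28:.0f} ft)")

# At an assumed 120 BPM, a sixteenth note lasts 125 ms, so a 75 ms skew
# is well over half a sixteenth note -- enough to pull performers audibly
# out of musical time without a dedicated monitoring feed.
ASSUMED_BPM = 120
sixteenth_note_ms = 60_000 / ASSUMED_BPM / 4
print(f"Sixteenth note at {ASSUMED_BPM} BPM: {sixteenth_note_ms:.0f} ms")
print(f"75 ms is {75 / sixteenth_note_ms:.0%} of a sixteenth note")
```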

SJ: You're using the Meyer Sound SpaceMap and D-Mitri system. What does that setup allow you to do?

NP: That's the driving force behind the system. I took an SD7, which is our front-of-house console, chopped off its back end, chopped the front end off of D-Mitri, and glued them together to kind of make one console. I have the best of both worlds: I get all the front-end processing out of the DiGiCo, and then I can take advantage of the truly enormous 288x288 matrix that can live in a D-Mitri system. We're using almost all of that 288x288. Its 80,000-plus cross points allow me to track all of the movement and all of the localization in the show.
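
As a quick sanity check on those numbers (ours, not part of the interview): a fully populated 288x288 matrix has one cross point for every input/output pair, which is where the "80,000-plus" figure comes from.

```python
# Each of the 288 inputs can feed each of the 288 outputs,
# so the matrix has one cross point per input/output pair.
inputs = outputs = 288
cross_points = inputs * outputs
print(f"{inputs} x {outputs} = {cross_points:,} cross points")  # 82,944
```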

SJ: Do you think that it's becoming more common to take this kind of complex approach to theatre design?

NP: Well, yes and no. This show definitely is pushing boundaries in the industry. This show literally couldn't have been done a couple of years ago. The processing horsepower just didn't exist. In that sense, we're definitely pushing boundaries with the piece. The number of digital protocols in the show is just extraordinary, and five years ago, half of them didn't exist. From that standpoint, absolutely. From another standpoint, I think that producers have become more willing to spend a little bit more on sound. I think they understand that sound plays a huge role in the audience understanding and enjoying the piece. I think there has been a willingness to provide the necessary funds to make the music and lyrics as clear and transparent as possible.

SJ: What can people expect to learn about in your webcast?

NP: I'm definitely going to talk about the localization, and I'll talk about the system in general with Comet. It's an extremely large, very complex P.A.; I’ll go into that and certainly talk about the show in general.

SJ: We look forward to it!

Sarah Jones is a writer, editor, and content producer with more than 20 years' experience in pro audio, including as editor-in-chief of three leading audio magazines: Mix, EQ, and Electronic Musician. She is a lifelong musician committed to arts advocacy and learning, including acting as education chair of the San Francisco chapter of the Recording Academy, where she helps develop event programming that cultivates the careers of Bay Area music makers.