Game Of Thrones Live FOH Engineer Greg Collins: Mixing The Sounds Of Westeros

It’s difficult to overstate the lavish production values or dramatic complexity of HBO’s epic Game Of Thrones medieval fantasy series, so taking the show on the road was bound to be an exercise in overachievement.

The Game Of Thrones Live Concert Experience arena tour, a joint production between HBO and Live Nation, gave the series’ legions of devoted fans a chance to re-experience the world of Westeros in a rich, live setting that combined an orchestral experience with a dazzling multimedia production. VER provided LED, lighting, media servers, and audio for the tour.

The two-and-a-half hour stage spectacle, built around Ramin Djawadi’s iconic score, featured the composer conducting a 65-piece orchestra and choir on an elaborate multilevel stage flanked by actors, stage sets, pyrotechnic effects, and enormous moving video screens offering more than 800 linear feet of video wall.

Sound reinforcement meant managing 136 speakers, handled on a DiGiCo SD7 at FOH by engineer Greg Collins—a task he likened to “mixing a film in an arena every night.”

I caught up with Collins in Los Angeles, shortly after the tour completed its 24-city run, to learn more about mixing the sounds of Westeros live.

Sarah Jones: You’ve worked in both live sound and the studio world; how did you find yourself out mixing this tour?

Greg Collins: I was out with Prophets Of Rage last year and Robert Long, the PM on that tour, showed me the Game Of Thrones concept that he and his partner Sooner Routhier were working on and asked if I’d be interested. It’s an incredible design, and I nerd pretty hard on Game Of Thrones, so I said, “I’m in!” Having done a lot of work with orchestras, and mixed for film & TV, this felt like familiar territory.

SJ: This was such an epic stage production. How many elements were you dealing with, and how were they synced?

GC: Visually, the show is built around scenes from the Game Of Thrones series, presented over several massive LED structures, many of which are automated. We had lighting, video, pyrotechnics, automated sets, and audio all running in timecode sync. Our master sync source was the audio playback rig, which fed timecode to all the other departments. Our playback operator, Clayton Janes, was the "trigger man" for the whole show. 

SJ: Working in the round always brings its own unique audio challenges. How were you managing sound across more than 130 speakers? Can you give me an idea of how that broke down?

GC: The system consisted of eight arrays of Meyer Sound Lyon in a left-right alternating configuration, presenting a stereo mix to the audience in four zones. Each array consisted of Lyon-M, Lyon-W, or a combination. Due to weight restrictions, all subs were located on the ground, and we had UPJunior front fills around the various stages. The inter-element angles of the flown arrays would change daily based on the venue's design, but the point locations stayed the same.

Our system engineer, the fabulous Michael "Monk" Shear, started his tuning process from the middle and worked out from there, making the center of the arena the "zero point." Tuning was done with multiple wireless mics, depending on the size of the venue, and a tablet running VNC software that allowed remote access to Smaart and Compass. Once the arrays were aligned and EQ'd, Monk continued using the Compass software to finalize the alignment of all the fills and subs on the ground. Those, too, were built into zones that kept a consistent position relative to the stage, so once they were in their usual place, we would tweak for the room and fine-tune delay times.

SJ: Original plans called for a DiGiCo SD7 at FOH and an SD11 to deal with P.A. tuning; how did the workflow evolve?

GC: The earliest version of our production plan had us running the front of house mix from a control room setup backstage, so the SD11 was there to give Monk test-and-tune capabilities independent from FOH. During production rehearsals, we decided to move the FOH position into the arena, but having the SD11 still proved very helpful in that configuration as well.

SJ: You were managing 156 inputs. What kind of real-time challenges did that pose, and how did you deal with them?

GC: It really was a lot like mixing a film in an arena every night, although on a film mix, you often have two or three different mixers working independently on the dialog, sound effects, and music. Being able to fire snapshots via timecode was utterly essential. We had more than 100 mics on stage, many of which were wireless and mobile. The complexity of the music itself also meant a lot of drastic shifts in focus. The detailed "scope editing" capabilities of the SD7 snapshot system allowed me to automate only the parameters I needed, and manually balance the music.

SJ: What did you like most about working on this tour?

GC: Well, as a huge Game Of Thrones fan, it's a lot of fun to re-experience the show's storylines through Ramin's amazing music. We also had one of the best audio crews I've ever worked with, and on a long tour it really comes down to the personalities you're working with day in and day out. With this group, it was truly a pleasure!

Game Of Thrones Live Concert Experience Audio Crew:

Monitor Engineer: Adam Stewart

Playback Operator: Clayton Janes

Audio Crew Chief: Shaun Henry

System Engineer: Michael "Monk" Shear

System Tech: Joey Armada

Wireless/Audio Tech: Ashley Zapar

Audio Techs: Shannon Fitzpatrick, Lliam Von Elbe

Backline Techs: Tanner Robbins, Redd Yoakum

Sarah Jones is a writer, editor, and content producer with more than 20 years' experience in pro audio, including as editor-in-chief of three leading audio magazines: Mix, EQ, and Electronic Musician. She is a lifelong musician and committed to arts advocacy and learning, including acting as education chair of the San Francisco chapter of the Recording Academy, where she helps develop event programming that cultivates the careers of Bay Area music makers.