Following The Action: zactrack Does The Math At The Super Bowl

Eric Marchwinski, cofounder and partner at Earlybird, took on the role of lighting programmer and director for the Apple Music Super Bowl LVII Halftime Show at State Farm Stadium, Glendale, AZ. He talked to Live Design about the team's use of grandMA3 as well as zactrack's automated tracking system on Rihanna's spectacular, multi-level halftime show.

Live Design: Had you used zactrack or another tracking system before?

Eric Marchwinski: Over the years we have had some great opportunities to utilize all of the tracking systems available on the market, integrated with lighting, automation, and video. The advantages and implications of automated tracking are something we've gained a lot of experience with over the years, and that experience has allowed us to assess which system is the best fit for a given application and the creative needs of a particular show.

We spent the majority of 2022 working on a large zactrack system in Las Vegas for the show Awakening at the Wynn, where we became extremely comfortable with the software and its more advanced features.

LD: Did you specify zactrack for this project?

EM: Yes, we chose zactrack because we were very familiar with it after our time in Las Vegas and we really like the way it manages data at a console and fixture level when it comes to control and programming. We have a strong relationship with the developers in Austria and the distributor here in the US, ACT Entertainment. That additional support also solidified the choice, as the stakes were high. It felt like the natural choice because we knew we could rely on the software and its features in this environment, which offers plenty of obstacles and no time. Having worked on the show in different capacities for several years, we quickly learned how many logistical hurdles there are to putting on a show in the middle of a football game. Even given the rehearsal days, there is no time for development or exploration within the production schedule, so we didn't want to experiment with something newer or less proven.

When we saw the platform design and realized how dynamic it was going to be, my immediate thought was that we needed a way to actively light these positions. The alternative would have required us to use up valuable rehearsal time constantly updating focuses and timings or, even more extreme, to add 14 additional follow spot operators to the show. Neither of these was going to be feasible, so zactrack became the clear choice.

The zactrack system allowed us to change the way we lit the platforms depending on how important they were to the visual story being told throughout the 13-minute number. At times we could create strong graphic focuses onto each; at other times we could use more oblique angles; and for a hero look we could assign an entire truss of fixtures to a single platform. Any moment when Rihanna was on one, or the choreography was central, the platforms were featured. When Rihanna was at ground level or performing in another area of the stage, we would light the remaining dancers more off-axis for a more theatrical or scenic look.

LD: How many tags and anchors did you use on this project?

EM: We actually did not utilize the RF trackers and anchors on this show, due to the saturated RF spectrum.

In a traditional zactrack system you would set up a series of anchors, which in turn communicate with RF trackers that a performer would be wearing. The combination of these two devices results in a "localized GPS": the system knows where each anchor is and then triangulates where each tag is in relation to them at a rate of up to 30Hz.
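The geometry behind that "localized GPS" can be sketched in a few lines. This is a minimal 2D illustration of position-from-distances (trilateration), not zactrack's actual solver, which is proprietary and works in 3D: given known anchor positions and measured tag-to-anchor distances, subtracting the first circle equation from the others yields a small linear system.

```python
import math

def trilaterate_2d(anchors, dists):
    """Estimate a tag's (x, y) from three fixed anchors and measured
    distances. Subtracting the first circle equation from the other two
    linearizes the problem into a 2x2 system solved by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linearized system A @ [x, y] = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = (x2**2 - x1**2) + (y2**2 - y1**2) - (d2**2 - d1**2)
    b2 = (x3**2 - x1**2) + (y3**2 - y1**2) - (d3**2 - d1**2)
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Tag at (3, 4); distances "measured" from three anchors.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
tag = (3.0, 4.0)
dists = [math.dist(a, tag) for a in anchors]
print(trilaterate_2d(anchors, dists))  # recovers (3.0, 4.0) up to float error
```

A real system repeats this solve (with more anchors and a least-squares fit) for every tag, up to 30 times per second.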

Earlier in 2022, we had asked the developers to give the zactrack system the ability to receive incoming PSN data and tie that positional data to a "virtual tracker," treated exactly the same as one of the RF trackers above. This feature opened up the software to receive the absolute position of any axis of automation in the system, and in turn we could assign fixtures to that axis. For the halftime show, we were simply using this feature with PSN input from TAIT's Navigator Automation Platform.
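The key idea here is that a position fed in from outside is indistinguishable, downstream, from a position triangulated from an RF tag. A minimal sketch of that pattern (all names hypothetical; this is not zactrack's internal model or the PSN wire format):

```python
from dataclasses import dataclass

@dataclass
class VirtualTracker:
    """A tracker whose position comes from an external feed (e.g. an
    automation axis) rather than an RF tag. Hypothetical sketch."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)

class TrackerRegistry:
    """Holds trackers by name; fixture-assignment code reads the same
    interface whether the position came from RF triangulation or an
    incoming positional stream."""
    def __init__(self):
        self._trackers = {}

    def virtual(self, name):
        # Create the tracker on first reference, then reuse it.
        return self._trackers.setdefault(name, VirtualTracker(name))

    def on_position_update(self, name, x, y, z):
        # Called for each incoming positional packet from automation.
        self.virtual(name).position = (x, y, z)

registry = TrackerRegistry()
# An automation axis reports the center platform's current position.
registry.on_position_update("platform_center", 0.0, 0.0, 18.3)
print(registry.virtual("platform_center").position)  # → (0.0, 0.0, 18.3)
```

Because fixtures are assigned to trackers rather than to tags, swapping the data source from RF to automation input leaves the rest of the pipeline untouched.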

We worked with ACT and ZT to enable a way for the system to be run without any anchors, on PSN input alone. This was a new request, as the anchors are integral to the system understanding the space it is working within, but both ACT and ZT were able to help us implement the software on this show without any anchors. Aria Hailey was onsite working closely with Mark Humphrey on the system calibration and integration. They spent about two nights dialing the system in, and the first time we started assigning lights to platforms, the results were perfect.


LD: Was this the main challenge that you had on the project?

EM: The main challenge on this project is always getting to the starting line. Putting on a stadium concert in the middle of a football game is one of the most unique logistical challenges you could encounter in this industry. There is a lot about the way we all create shows that simply can't happen in this environment. It was a sharp learning curve our first year on the show, but since then we have figured out how to tackle this unique challenge. For this show, we only get to see the full-scale model (the whole set, all dancers, all cameras, the full lighting system, etc.) for three 3-hour rehearsals. For most of that time the set is not just sitting there, either; its load-in and load-out are being rehearsed simultaneously while we attempt to do notes in between passes. All of that being said, we have to show up to the first rehearsal with our homework done. We spend a lot of time in previz simulating the camera direction, platform moves, choreography, and creative direction in order to program the lighting to about 90% of what ends up on air.

LD: Were there any unique features or programming opportunities that arose from using the zactrack system?

EM: There was one very unique circumstance that was only made possible by the combination of our zactrack system and TAIT's "Nav Cam" both being on the show. In addition to the halftime cameras around the set, there were two automated flying cameras being run off of the same system as the platform automation. This created an opportunity for zactrack to receive the position data from these two additional axes, which in turn allowed us to assign lighting fixtures to a moving camera. This is the same concept as the platforms, but a rarer opportunity with outstanding results. The shots being taken from these two cameras were meticulously scripted, and this allowed us to cue in the fixture assignments to these cameras when they were live. On one of the last shots in "Diamonds," Rihanna is on the center platform about 60' above the field; the Nav Cam is a few feet from her, and there are about 200 lights on the rail behind her, all perfectly pointed into the camera, creating a beautiful textured effect from these fixtures. As the camera pulled out along its profile, all of those fixtures remained focused directly into the lens, a feat that would be impossible without all of these systems in place. The camera paths were very dynamic, and we were able to find about five separate opportunities to utilize tracking the cameras.
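Keeping 200 fixtures pointed into a flying lens comes down to re-solving a simple aiming problem every frame: from each fixture's position and the camera's current position, compute a pan and tilt. The sketch below shows that geometry under simplified assumptions (pan measured around the vertical z-axis, tilt from the horizontal plane, no fixture-hang offsets); a real calibration model like zactrack's accounts for each fixture's mounting orientation.

```python
import math

def aim(fixture_pos, target_pos):
    """Pan/tilt in degrees to point a fixture at a target.
    Assumes pan rotates about the vertical (z) axis and tilt is
    measured up from the horizontal plane. Illustrative only."""
    dx = target_pos[0] - fixture_pos[0]
    dy = target_pos[1] - fixture_pos[1]
    dz = target_pos[2] - fixture_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# A rail fixture re-aims each frame as the camera flies along its path.
fixture = (0.0, 0.0, 20.0)
camera_path = [(10.0, 0.0, 18.0), (20.0, 5.0, 15.0), (30.0, 10.0, 12.0)]
for cam in camera_path:
    print(aim(fixture, cam))
```

Run per incoming position update, this keeps the beam locked to the lens regardless of how dynamic the camera's profile is.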

LD: Do you feel this type of technology will become more prevalent in the future?

EM: The zactrack system gave us the ultimate flexibility, and brought incredible efficiency to a situation where time and speed were paramount. The end result was almost imperceptible, or even just assumed, if you didn't know how it was being done. I do believe that someday this will be as commonplace as previz or even moving lights on a show. Someday we will know where everything in the room physically is, and be able to take advantage of that spatial information across all departments.