The Treble Alliance, Part 3: Video For Star Wars: In Concert

Cohen says he was "fortunate to have a design brief that was full of images for me, so I took the design language of the films and interpreted it for a live audience—the lighting rig, set, video display—without touring too much expensive video. The style was given to me, but I really had to translate it."

Jackson, who worked closely with Cohen on all the visual concepts, got involved with the production about a year ago, when he and video director/content manager Mark Haney were having "an especially difficult period with an artist we were working with and sent an email to Steve expressing all the reasons he should let us work with him on Billy Joel," he says. "He responded in hilarious form, and we figured we had our fun, but a few weeks later, he called about this project. With the timetables he was in with Billy Joel/Elton John and Star Wars, he needed a lateral partner who could handle design, directorial decisions, and the like while he was jumping between Star Wars rehearsals in London and Billy/Elton on the West Coast."

Haney, who worked with Cohen on Britney Spears’ 2001-02 Dream Within a Dream Tour, adds that it was crucial to determine how to approach the show from a video perspective. Eighteen film sequences, reedited by Lucasfilm video editor Jeremy Stuart, were used for the show, all with timecode. From those clips, a master script was created on which to build the show, noting video routing, lighting cues, and the general look of each track. "Jeremy is also a drummer and has great rhythm and timing," says Haney. "Everything is cut to the beat—lighting, video, pyro, and laser cues—and built around Williams’ music and what is playing on the video screen."
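That master script is, in effect, a cue sheet keyed to the timecode of the reedited clips. As a purely illustrative sketch of that structure (the production's actual script is a document, not software; the sequence title, timecodes, and cues below are invented):

```python
from dataclasses import dataclass, field

@dataclass
class Cue:
    """One beat-synced event in the master script (illustrative only)."""
    timecode: str    # HH:MM:SS:FF within the reedited clip
    department: str  # "video", "lighting", "pyro", or "laser"
    action: str      # what happens at this moment

@dataclass
class Sequence:
    """One reedited film sequence and the cues built around it."""
    title: str
    cues: list[Cue] = field(default_factory=list)

# Hypothetical entry: hits cut to the beat of the score
example_sequence = Sequence(
    title="Invented example sequence",
    cues=[
        Cue("00:00:12:03", "lighting", "red wash on the orchestra"),
        Cue("00:00:12:03", "video", "cut side SoftLEDs to engine loop"),
        Cue("00:01:04:18", "pyro", "stab on the downbeat"),
    ],
)

for cue in example_sequence.cues:
    print(cue.timecode, cue.department, cue.action)
```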

The visual design had to create an environment that celebrates the orchestra and the footage fed to a 60'x30' Daktronics Mag10-HD LED screen (processing also by Daktronics), without the screen looking like a giant TV. "The screen and surround setup is an homage to the window of [Han Solo’s ship] the Millennium Falcon—like peering through the window," says Cohen.

"It needed an organic quality to it, so you forget that it is a screen," adds Jackson. "The images on screen become ‘real,’ so that you start to lose yourself in the imagery and the music and forget you are watching a movie." To enhance the main screen, two triangular Upstaging HUD trusses, each holding four Main Light SoftLED screens—dubbed "Star Destroyers" because they mimic the Empire’s massive spacecraft from the films—form a sort of roof over the orchestra, angled as if one is coming toward the audience and one away. The latter is mounted on the audience side with iPix BB7s to look like firing engines.

Flanking the main screen are eight additional SoftLED curtains—four on each side—as well as an expansive aluminum frame. "Everything on the set is themed from the Star Wars films," says Jackson. "The frames, the tubular surrounds, and even the conductor’s platform are a nod to something in one of the six films. For example, the LED ring of light around the deck of the conductor’s platform is straight out of something in Vader’s quarters in Empire Strikes Back. We pored over the movies looking for details." Jackson and Cohen usually both work in Autodesk 3ds Max. "He did an initial concept design, gave it to me, and then I took his file and started transforming that into the actual structures," Jackson adds. All Access built the set and provided staging.

The live feed itself mixes the film clips—comprising about 90% of the content—with shots of the live orchestra and Daniels’ narration from six cameras: four operated Sony 1500 HD broadcast cameras and two Sony BR-700 HD POVs. The film footage had to be transferred from its original 35mm format to 1080i HD, so this tour marks the first time many audiences have seen parts of the films in high definition. Two Martin Professional Maxedia media servers, one each for the overhead and side SoftLEDs, are triggered by Jackson from a Martin Maxxyz Plus console. The content is routed to Haney, who handles a Ross Synergy 3ME HD digital production switcher and mainframe with three buses of buttons, one each for the main screen, the overhead SoftLEDs, and the side SoftLEDs.

"I could have built it differently, but I like the ability to finesse all three separately," says Haney. During playback, a streaming track, similar to those used in scoring sessions, is used for timing. Occasionally, the look goes to what the team calls a "global," where the content on the SoftLEDs matches the main screen. Curtis Cox created much of the Maxedia and media content, while Chris West was responsible for low-res content and is the SoftLED tech and jib operator on the tour.

NEP/Screenworks (Danny O’Bryen) supplied all video gear for the production, including the high-definition video playback system designed by Gen2Media (Ian McDaniel and Mark Argenty). The Green System, as it’s called, comprises four Apple Mac OS X Servers—two HD and two SD—and custom software that drives the video. A DoReMi Labs VI-UHD high-def disk recorder and a V1x2 standard-def disk recorder are used for redundancy, rehearsal, and backup, while an Akai APC40 Ableton Performance Controller triggers the servers via MIDI.
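Because the APC40 is a generic MIDI controller, the triggering concept boils down to mapping incoming note messages to clips on the playback machines. Below is a minimal sketch of that mapping in Python using the mido library; it is not Gen2Media's Green System software, and the note numbers and clip filenames are invented:

```python
import mido  # third-party MIDI library; pip install mido

# Invented mapping: APC40 clip-launch pads send note-on messages, and each
# note number is mapped to a video clip on the playback servers.
NOTE_TO_CLIP = {
    53: "sequence_01_example.mov",
    54: "sequence_02_example.mov",
    55: "sequence_03_example.mov",
}

def handle(message: mido.Message) -> None:
    """Fire a clip when its pad is pressed (note-on with nonzero velocity)."""
    if message.type == "note_on" and message.velocity > 0:
        clip = NOTE_TO_CLIP.get(message.note)
        if clip:
            print(f"trigger playback: {clip}")

# Simulated pad presses; a live rig would read from mido.open_input() instead.
for msg in [
    mido.Message("note_on", note=53, velocity=127),
    mido.Message("note_off", note=53, velocity=0),
    mido.Message("note_on", note=55, velocity=100),
]:
    handle(msg)
```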

The video crew also includes engineers Jim "Coach" Malone and Bob Larkin, lead LED engineer Bradley Reiman (FOH camera op), LED engineer Phillip Evans (HH camera op), lead camera and jib operator Brian Littleton, and playback AD/utility Dan Savage. The NEP/Screenworks support team includes director of tour engineering Ryan Ratajczak and director of LED technologies Wally Crum, with tour support from Marty Kell.