Add It Up

George Brown is a man with a mission. As chair of the Department of Theatre Arts at Bradley University in Peoria, IL, he stepped up to the plate in a big way when asked if he could take advantage of the university's membership in Internet2, an open-fiber connection with a data-transfer speed of 10 gigabits per second. “They were looking for people to use it,” explains Brown. “Dean Jeffrey Huberman asked if we could use it for theatre.”

Brown's answer was an emphatic, “Yes,” as Internet2 allows for telematic (long-distance via Internet) performances, with actors in multiple locations appearing in the same production. The first foray into this world of high-tech theatre was The Antigone Project, produced with the University of Central Florida (UCF) in 2004. “This was a 20-minute experiment with an edited, contemporary version of Sophocles' text with the teleconferencing as part of the story,” explains Brown.

After an additional series of short projects with UCF and the University of Waterloo in Canada, Brown opted to produce and direct a longer project: Elmer Rice's 1923 expressionistic play, The Adding Machine, which examines man's fears of being overcome by technology. The result was an innovative, interdisciplinary, inter-institutional collaboration that integrated virtual scenery, video, and sound via Internet2, combining recorded images, avatar performers, photographs, and graphics. Almost 100 students, staff, and faculty were involved in the process.

For Brown, the choice of the script was largely structural: “The scenes had two or three characters on stage at one time, allowing you to video-conference individual characters. The script was very conducive to this type of project,” he explains. “Rehearsals and production meetings took place via teleconferencing using Polycom and Apple iChat programs. For the performances, we used Digital Video Transport System, or DVTS, which enables digital video distribution via the Internet.” Four recycled PCs with Pentium 4 1.5GHz processors and 512MB of RAM ran Windows XP with FireWire cards. “These systems required 30 megabits per second of bandwidth each, and at times, we required up to 120 megabits per second for the production,” Brown adds.
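Brown's figures are easy to sanity-check: four DVTS machines at roughly 30Mb/s each peak at 120Mb/s, a small fraction of the 10Gb/s Internet2 link. A minimal sketch of that arithmetic (the stream count and rates come from the article; the helper function is illustrative):

```python
# Sanity check of the quoted bandwidth figures. Stream rate and count
# are from the article; the link capacity is the 10Gb/s Internet2
# connection described earlier.
STREAM_RATE_MBPS = 30        # one DVTS video stream, ~30 Mb/s
NUM_STREAMS = 4              # four PCs, one stream apiece
LINK_CAPACITY_MBPS = 10_000  # 10 Gb/s Internet2 connection

def peak_bandwidth_mbps(streams: int, rate_mbps: float) -> float:
    """Aggregate bandwidth when all streams run simultaneously."""
    return streams * rate_mbps

peak = peak_bandwidth_mbps(NUM_STREAMS, STREAM_RATE_MBPS)
print(peak)                       # 120 Mb/s at peak
print(peak / LINK_CAPACITY_MBPS)  # 0.012 -- about 1% of the link
```

In other words, the production's peak demand used only about 1% of the available pipe, which is why uncompressed video was feasible at all.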

A video camera was connected directly to each PC, which, in turn, was connected to Internet2 to stream uncompressed video signals, images, and sound 1,000 miles in about a second. Isadora software sent the images to one of three Panasonic PT D3500U projectors in the theatre. “The projectors created the illusion of one large image, 36' wide across the stage,” says Brown. “Much of the video was shot as greenscreen to key images into the graphics.”
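The keying Brown describes follows the standard chroma-key idea: any pixel that reads as "mostly green" is replaced by the corresponding pixel of the CGI background. A minimal per-pixel sketch of that logic (the production used Isadora, not this code, and the threshold value is an assumption):

```python
# Minimal green-screen keying sketch: pixels where green dominates
# both red and blue by more than `threshold` are treated as screen
# and replaced by the background pixel.
def key_green(frame, background, threshold=60):
    """Composite frame over background, dropping green-screen pixels.

    frame, background: equal-length lists of (r, g, b) tuples.
    """
    out = []
    for fg, bg in zip(frame, background):
        r, g, b = fg
        is_green = g - max(r, b) > threshold
        out.append(bg if is_green else fg)
    return out

actor = [(200, 180, 170), (20, 230, 30)]   # skin tone, green screen
scenery = [(40, 40, 40), (40, 40, 40)]     # dark CGI background
print(key_green(actor, scenery))
# → [(200, 180, 170), (40, 40, 40)]
```

Real-time keyers work on full frames and soften the edge of the matte, but the decision per pixel is essentially this test.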

The audience was seated in the Hartmann Center for the Performing Arts at Bradley University, where Zero, the lead character, and the role of Daisy were seen live, with additional actors beamed in from three remote sites — Waterloo, UCF, and a studio space at Bradley. “One actor, John Wayne Shafer at UCF, played three characters: a boss, a judge, and a lieutenant via makeup and costume changes. He sat at the computer in his office with a greenscreen on the wall behind him. At Waterloo, Brad Cook played the role of Shrdlu,” Brown points out. Cook was in a small studio theatre with a greenscreen and virtually whisked to the stage at Bradley via Canada's CANARIE high-speed research network.

“The production was almost two years in the making,” says Brown. “I teach a class in theatre and new media, and we developed some of the elements as part of that class. We started the brainstorming and storyboarding in December 2005.” In December 2006, the production process ramped up, with barely three months to opening night on March 6.

Going Digital

Jim Ferolo, director of Bradley's multimedia program, was co-creator and art director, working on the conceptual development. “I created the CGI backgrounds, projection system, and a series of predesigned composites and backgrounds,” he says.

The computer-generated scenic elements ranged from a New York City tenement to a courtroom scene in a black-and-white, monochromatic palette and were architectural in nature. “The contrast was in the Elysian Fields scene, with heavenly animations and digital trees, organic forms, and water,” says Ferolo. “In a version of hell, the adding machine represents a dystopian, post-industrial blending of man and machine in a degraded technological state.”

The actors were keyed into the CGI composites in real time. “It's like live television,” Ferolo notes. “All of the elements pulled together to tell the story. The audience was interested in the story, not where the bits and bytes were coming from.” Ferolo found that the biggest design challenge was making sure that the quality of the work was up to par: “We built a rendering farm with 20 to 40 machines running at the same time in college computer labs at night. We would run back to the stage to see what worked and what didn't, like looking at dailies on a film,” he recalls. “We needed to know what we wanted to build in CGI, so we did a lot of drawing and planning in advance.”

Luckily, the university had the computer labs available — Ferolo guesses that the render time alone would have cost $90,000 to $100,000. “We not only had all these resources available, but it was a great experience for the students in compositing and live-keying software. There were thousands of pieces of media by the end. We replicated a commercial production environment, which is a pretty big deal in a university setting.”

Erich Keil, a scenic and lighting design professor at Bradley University, served as the production's technical director, scenic designer, and lighting designer. “It was really interesting for me,” he says. “I had done shows with lots of projection, where the media was developed first. In this case, we wanted physical parameters and a definition of space first, then the media came later, for the most part. The initial conception was a question of what could we actually do, as well as what we wanted to do. What is the biggest image we can have with the virtual actors? We wanted the media to have a big physical presence. I like non-traditional surfaces and odd shapes.”

That said, the goal was to give the students as few hurdles as possible, and a rectangular shape was selected for the screen that measured 9' tall × 36' long and was made up of three 9'×12' sections of Rosco Light Translucent RP Screen. The screen was placed upstage to give actors playing area downstage, with the screen 8' off the deck.

Stairs on the stage led to a platform allowing the live actors to get closer to the actors on the screen. The live actors in scenes without video played on the lower level to help pull the audience's attention to the action.

“Everything in Zero's life is work, and he lives in a very industrial environment,” says Keil. “His life is in a factory or seems like it, even if it's in his head/nightmare. The three screens looked like windows in a factory setting that shifted to projection surfaces.” The back wall, which encompassed the screen, was 20' tall, with the screen/windows 9' high and 36' across, with 2"-wide mullions in 3'-square grids.

“Some of the media pieces worked really well with the mullions, with different images in the squares for the death and murder montage, for example. And in the crypt screen, each square was a projected crypt front,” notes Keil.

“Having no projections in the mullions also worked well. The mullions did cause some problems when some of the large-scale images of the actors were projected,” he continues. “I asked, ‘Did we want the mullions or just big screens?’ but ultimately we liked the aesthetic of the mullions. I might have sized them or shaped them differently, as some of the mullions hit where the actors' heads might be, but a large white rectangle wouldn't have looked so good in the middle of the set. One idea was an entire theatre of 3" squares with more projectors.”

The tall upstage wall with the RP screens blocked many of the theatre's upstage lighting positions. “We didn't want backlight on the screens except for a few color washes when there were no images on the screens,” says Keil. “Basically, we had a unit set with furniture and props to indicate various locales, with some scenes in Zero's head versus reality.”

The theatre has no fly system and no electrics pipes, just basic booms on the floor and catwalks spaced 12' apart. It also has just 96 dimmers, all wired dimmer-per-circuit in raceways along the catwalks. “We had to use a lot of cable to use the upstage dimmers because of the wall,” he points out. “The lighting was primarily to help establish the locales,” says Keil, who describes Zero's world as “an old muted photograph.”

A $1,500 grant (part of a $12,000 Special Emphasis grant from Bradley University for student/faculty collaboration; the production's overall budget was $50,000) augmented the $1,000 allotted for lighting in the production budget. Keil turned to Design Lab in Chicago, where he rented two High End Systems Studio Spot 250s. These were used primarily as specials for Zero and hung on the first electric as principal key lights to match the color temperature of the projectors. He also rented 12 Wybron Forerunner scrollers to use with ETC Source Fours as a back diagonal wash to color the stage floor, which had a neutral-gray concrete look, and the rusty gray of the wall. “The lighting varied between a dingy look and a brighter look when it needed to,” Keil explains.

For the Elysian Fields scene — the one moment not in a factory environment — a Le Maitre G300 fogger (with a low smoke-generator accessory to chill the fog with CO2 to keep it on the floor) was used to make the acting platform look like it was floating in the clouds and make the industrial world disappear. A Rosco 1600 fogger was located upstage for entrances into the crypt and the hell scene at the end.

An ETC Expression 3 is the theatre's resident console and was employed for the moving lights and scrollers. “We used a lot of DMX cable as there is only one DMX outlet in the booth. So DMX cable ran from the console in the booth to the catwalks. We must have run 600' to 700' of DMX cable,” says Keil, who also used fixtures from the in-house inventory, including 46 ETC Source Four ellipsoidals of various sizes, 41 Altman 360Qs, and eight PAR64s. To create a big exhaust fan over the factory door, a ceiling fan was mounted sideways and run from the lighting board via a dimmer.

“My immediate concern was to modulate intensity and color temperature of lights versus projections,” says Keil. “Bright projectors on low lamp mode helped, as I could never match the color temperature with the rig we had. A lot of the media was inherently pretty dark, aesthetically, so that helped.” Keil also played with the light and projections to create various looks. “There is one scene where Zero is alone on stage with his wife projected and screaming at him,” says Keil. “I put dark space between them, with high-intensity light on Zero and the high-intensity video images on an overall dark stage to create tight isolation for the characters.”

In the Elysian Fields scene, everything was more open and brighter, with lavender and magenta in the scrollers, and the same colors in the fog and on the floor. For the crypt scene, which has almost all projected actors, Keil put a deep blue in the scrollers. For the office scene, once again he created very tight isolation with two actors sitting at a table talking. He added four archaic scoops as practicals on the walls for backlight, with an amber wash for the office.

In the hell scene, Zero is attached to the machine. “George wanted a rock look, but we'd already used most of the dimmers and fixtures,” says Keil. “I used the HES fixtures with movement and gobos, not a big ballyhoo, and went back and took the color palettes from the earlier scenes to create a rock 'n' roll show as if it was all passing before his eyes.” Thirty-year-old Altman strip lights and old 1kW PAR64s were plugged into upstage floor pockets and served as practicals when the adding machine was rolled out onto the stage.

“This was a big production, especially based on the resources at Bradley, where the shop is a quarter of the size of the stage and only 8' tall. We built the wall in 22 sections of flats put together with bolts on ladders — the old-fashioned way,” says Keil.

Sound Delay

Rand Kuhlman of Kuhlman Companies (an AV company in Peoria, IL) specified and provided the Panasonic projectors for the production. He was also asked to solve a very specific sound problem encountered along the way: “The initial problem was the latency between the remote locations and the main stage, with about a 750ms delay,” says Kuhlman. “In addition, the microphones on the stage picked up the delayed signal and sent it back to the remote locations, as did the headworn mics used by the live actors onstage, creating several feedback loops.”

To solve this problem as much as possible, Kuhlman took an old Behringer XR2000 Intelligate expander/gate/ducker and used it “upside down” using all of the sound being sent to the house speakers as the key signal to shut off the audio feed that was being sent back to the remote locations. When the remote actors finished their lines, the gate would instantly open again so they could hear the lines being spoken by the actors in the theatre. This process allowed the audience (and the actors on stage) to hear everything from all sources, but prevented the unwanted and confusing delayed and echoed sounds from being sent back to the actors at the remote locations.
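The logic of that "upside-down" ducker can be sketched in a few lines: treat the level of the house mix as the key signal, and whenever it is hot, mute the feed going back to the remote sites so the actors there never hear their own delayed audio. The thresholds and block-by-block processing below are illustrative, not the XR2000's actual behavior:

```python
# Sketch of a sidechain-keyed ducker: the house mix is the key signal;
# while it is above threshold, the return feed to the remote sites is
# muted, and it passes through again the instant the key drops.
def duck_return_feed(house_levels, remote_levels, threshold=0.1):
    """Gate the feed sent back to remote sites, block by block.

    house_levels: per-block levels of everything in the house mix
    remote_levels: per-block levels of the return feed
    """
    gated = []
    for house, remote in zip(house_levels, remote_levels):
        if house > threshold:
            gated.append(0.0)     # key signal hot: mute the return
        else:
            gated.append(remote)  # gate reopens instantly
    return gated

# Stage actor speaks during blocks 1-2; the remote sites hear only
# blocks 0 and 3.
print(duck_return_feed([0.0, 0.8, 0.6, 0.05], [0.3, 0.3, 0.3, 0.3]))
# → [0.3, 0.0, 0.0, 0.3]
```

The hard binary cut here is also why Kuhlman had to ride gain by hand, as he describes next: any noise from a remote actor while the gate was closed simply vanished.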

“I still had to ride gain throughout every performance and make instantaneous decisions about when to override the ducker, because sometimes the remote actors would make unconscious or extraneous noises while listening to the lines being spoken by the actors on the main stage. This had the unintended effect of cutting off the lines the remote actors relied on as cues, but by the third or fourth performance, we all adjusted nicely and everything worked pretty well after that,” Kuhlman notes.

All sound was mixed through a Mackie 16.8 board, with a submix sent to Kuhlman to manage feedback and other problems. His rig included a small powered desktop monitoring system using JBL multimedia speakers; a Behringer Shark DSP110 for gating unwanted stage, audience, and line noise prior to processing; the aforementioned Behringer XR2000; and a Behringer DSP1124P to improve overall sound quality within the theatre and suppress feedback and unwanted line noise.

In addition, one Behringer EX1200 Ultrabass Pro two-way crossover and LFE processor was used with the theatre's existing sound equipment: two Crown CH1 power amps, four Electro-Voice S-300 PIX two-way loudspeakers, and one custom-built 18" bass-reflex subwoofer. The two actors in the remote studio at Bradley used earbuds, while the stage managers used Clear-Com headset systems. Audio editing was done using sound effects software on a Windows platform.

In the end, The Adding Machine charted new territory in theatre via Internet2. “The irony is that all of this high-tech gear is used to tell a story that is basically anti-technology,” says Brown. “This was the most extensive theatre piece ever done like this. And, of course, the cast party was held in cyberspace.” Mission accomplished!