Creating A New Reality For Live Theatre At Honda Dealer Meeting

 
Projection mapping has been all the rage for several years now. It has exploded onto buildings and show sets everywhere. Performers have bounced off the walls of skyscrapers and danced across projected scenery transformed into virtual worlds, to the delight of audiences around the globe.
 
At Martin Brinkerhoff Associates (MBA), we have been pleased to be among those creative groups at the center of this revolution. We have produced projection mapping spectacles for more than six years, including the 2012 IAAPA award-winning Disney Dreams® in Paris, but this is really just the most recent extension of our work.    
 
Our passion has always been live performance and new technology, and for many years, we have worked with programmers and tech developers to help us deliver on our ideas as we dreamed up each show. On one level, this year is no different, but on another level, this year is very different. Why? Because many traditional cost/technical barriers to live show production have finally begun to fall.
 
This year’s Honda National Dealer Meeting provided us an opportunity to merge live performance with an array of brand new media technologies to create a theatrical experience unlike any other. 
 
 
Inception
 
Honda has long been known for its clean, safe, and fun products. It is also a technological innovator across many platforms (automotive, motorcycle, power equipment, jets, and robotics). So when the manufacturer requested that this year’s meeting embody the youthful fun and the technologically advanced aspects of its brand, our creative wheels began to turn. The idea, originally inspired by the TED Talks series, was to bring the series’ technology, entertainment, and design concepts to life on the live stage.
 
5D Virtual Reality
 
We have been producing stereoscopic 3D shows for more than 20 years, exploring its use with live performers in particular. 
 
Recently, we have been developing a “5D” theatrical format that nests rear-projected (RP) stereoscopic 3D within a front projection-mapped (FP) dimensional set: a theatrical world where live onstage performers take the audience on a hybrid journey through “real” and “virtual” dimensions.
 
 
For the Honda show, we did all of this and added live motion-tracking of projected props to the mix. Honda allowed us to bring this concept to life on stage, blurring the border where live performance ends and virtual worlds begin, thus creating live 5D virtual reality theatre.
 
The In-House Laboratory 
 
Months before the show, our R&D process began with the construction of a scale physical model of the actual set at MBA’s studio located in Irvine, CA. 
 
Combined with an array of small video projectors, this model allowed us to continually pre-test, visually brainstorm, choreograph, and refine media while interacting with all departments: scenic design, motion graphics, choreography, and lighting. 
 
Cross-department collaboration has always been important but never more so than in today’s shows. If one design element is off, the entire production is off.
 
We have found that merely pre-visualizing live shows in CGI is too abstract and limiting. So we always push to make the creative process tangible and physical. 
 
 
Scale scenic models with simulated projection are a great first step, but we needed to go further, since we also wanted to explore live performance with live motion-tracking projection. We then built a nearly one-to-one performance space in our studio, where d3 Technologies’ d3 media servers and Cast Software’s BlackTrax live motion-tracking system were installed so we could try out various ideas and play with the medium for several weeks.
 
Our business is, of course, a highly collaborative art. So our show team shared our daily discoveries and creative goals with software engineers at both d3 Technologies and Cast. Their dedicated teams regularly wrote original code to expand their platforms’ capabilities to meet our show’s needs.
 

Exploring New Ground

 
As you might expect when exploring new ground, there were many challenges along the way. First, we had to match the center stereoscopic 3D RP projection zone to the flanking projection-mapped, dual-level scenery. We ended up with a well-blended 6,500x1,200-pixel media canvas across the entire set, and to ensure that the stereo 3D media would line up with the framing 2D media, we developed our own custom 3D production process for this project.
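
To make the zone bookkeeping concrete, here is a minimal Python sketch of how a blended canvas like this can be partitioned so that content coordinates translate cleanly into canvas coordinates. The 2,500-pixel center-zone width, the names, and the layout are our illustrative assumptions, not figures from the show.

```python
from dataclasses import dataclass

# Full blended media canvas across the set (from the show: 6,500 x 1,200 px).
CANVAS_W, CANVAS_H = 6500, 1200
CENTER_W = 2500  # assumed width of the center stereo 3D RP zone

@dataclass
class Zone:
    name: str
    x: int       # left edge in full-canvas pixels
    width: int

    def to_canvas(self, local_x: int) -> int:
        """Map a pixel x-coordinate inside this zone to full-canvas space."""
        return self.x + local_x

center_x = (CANVAS_W - CENTER_W) // 2
zones = [
    Zone("left FP scenery", 0, center_x),
    Zone("center stereo RP", center_x, CENTER_W),
    Zone("right FP scenery", center_x + CENTER_W, CANVAS_W - center_x - CENTER_W),
]

# A feature at local x=100 inside the center RP zone lands here on the canvas,
# the kind of bookkeeping that keeps the framing 2D media lined up with the
# stereo content.
print(zones[1].to_canvas(100))  # -> 2100
```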
In front of this 5D media world, we then sought to create an independent motion-tracking FP projection zone for our live performers. For this, we needed FP media to track on performer-manipulated props while masking out unwanted side FP scenic media, and vice versa. Not an easy task.
 
To pull this off, we collaborated with d3 Technologies to write custom code to dynamically “counter-mask” front-projected media as desired. Entrances and exits for FP props and performers required custom mattes, as well as the continuous rewriting of code to allow projection to seamlessly appear/disappear instead of moving onto the set walls.
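
The custom d3 code itself is not ours to show, but the counter-masking idea can be sketched in a few lines: each frame, the tracked prop’s footprint is punched out of the scenic layer, and the prop’s own media is clipped to that footprint intersected with a stage matte so it disappears at entrances and exits rather than landing on the set walls. Everything below, from the function name to the mask layout, is an illustrative assumption.

```python
import numpy as np

def composite(scenic, prop_media, prop_mask, stage_matte):
    """One frame of dynamic counter-masking (illustrative sketch only).

    scenic, prop_media: float RGB frames, shape (H, W, 3)
    prop_mask:   1.0 where the tracked prop currently sits, else 0.0, shape (H, W)
    stage_matte: 1.0 inside the legal FP performance zone, else 0.0, shape (H, W)
    """
    # Prop media may appear only on the prop AND inside the stage matte,
    # so it vanishes at entrances/exits instead of sliding onto the set walls.
    visible = (prop_mask * stage_matte)[..., None]
    # Counter-mask: carve the prop's footprint out of the scenic layer,
    # then lay the prop media into the hole.
    return scenic * (1.0 - visible) + prop_media * visible
```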
 
Of course, anyone working with live interaction between performers and media has to deal with a long-dreaded tech barrier: signal latency. It has been a drawback for the 15 or so years that we have been working with live interactivity. We worked with both d3 Technologies and Cast to bring the issue within very acceptable limits for the Honda show; their software coders were a huge help.
 
 
We also routed signal paths a bit differently. Sending the signal from d3 to the DVI matrix, then to the Barco Image Pro, on to the SDI router, and finally to the projectors significantly reduced latency.
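
Routing and the vendors’ code were the fixes on this show. A complementary trick common in motion-tracking pipelines, offered here purely as an illustrative sketch and not as anything d3 or Cast necessarily did, is to extrapolate each tracked point forward by the measured end-to-end delay, so media lands where a prop will be rather than where it was.

```python
def predict_position(pos, prev_pos, dt, latency_s):
    """Linear extrapolation of a tracked 3D point to hide pipeline latency.

    pos, prev_pos: (x, y, z) tracker samples taken dt seconds apart.
    latency_s: measured tracker-to-photons delay for the whole signal path.
    """
    velocity = tuple((p - q) / dt for p, q in zip(pos, prev_pos))
    return tuple(p + v * latency_s for p, v in zip(pos, velocity))

# Example: a prop moving ~0.5 m/s along x, with ~50 ms (3 frames at 60 Hz)
# of end-to-end latency; the media is aimed ~25 mm ahead of the last sample.
print(predict_position((1.0, 0.0, 0.0), (0.99167, 0.0, 0.0), 1 / 60, 0.050))
```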

New Discovery

However, overcoming latency and counter-masking issues was just the start of our work. After playing with the setup for a couple of weeks, we became very excited by a discovery we stumbled upon: dynamically tracked x/y/z-axis mapping with transparency.
 
It’s a mouthful to say, and we’re still struggling to find the right words to describe it, but here goes. Basically, we built a few white foam core boxes of different sizes and asked performers to move them through the FP tracking zone, each box tracking its own test “video cube” file.
 
 
What did we see? Well, yes, each virtual cube revolved in sync with the real manipulated cube, maintaining correct perspective. This, we expected. But then we added transparency to each virtual video cube file in an attempt to create the illusion of a virtual object contained inside the real cube (all while spinning and moving). This also worked well.  
 
But then something surprising happened. As the real cubes passed in front of one another, we saw through the downstage cubes to the ones moving behind them, all manipulated by the live performers. Very interesting.
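
In rendering terms, that see-through layering is back-to-front alpha compositing of the tracked cubes. Here is a minimal sketch of the ordering step for a single pixel covered by several translucent props; the data layout and the numbers are our own, purely for illustration.

```python
def composite_tracked_props(props):
    """Painter's-algorithm pass over tracked transparent props (illustrative).

    props: list of dicts with 'z' (distance downstage, toward the audience),
           'color' (r, g, b), and 'alpha' in [0, 1], one per tracked cube.
    Returns the blended color for a pixel covered by all of them.
    """
    out = (0.0, 0.0, 0.0)
    # Blend upstage-to-downstage so nearer cubes are laid over farther ones.
    for prop in sorted(props, key=lambda p: p["z"]):
        a = prop["alpha"]
        out = tuple(a * c + (1.0 - a) * o for c, o in zip(prop["color"], out))
    return out

# Two overlapping cubes: the downstage one is only 40% opaque, so the upstage
# cube's color shows through it, just as the real foam core boxes appeared to.
print(composite_tracked_props([
    {"z": 1.0, "color": (0.0, 0.0, 1.0), "alpha": 1.0},  # upstage, opaque blue
    {"z": 2.0, "color": (1.0, 0.0, 0.0), "alpha": 0.4},  # downstage, translucent red
]))  # -> (0.4, 0.0, 0.6)
```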
 
So we then added more live foam core objects to the mix, finally ending up with 14 independently tracked projection props on stage. We even played with virtual beams of light bounced from object to object as they moved. Honestly, it felt like we had cracked open a door to a fully dimensional world of virtual media for live performance. We are currently exploring this further for upcoming productions.
 
 
New Experience
 
We usually experience virtual reality through media such as video games, projection mapping on buildings, and stereo 3D movies, but combining these techniques with live performance for a theatrical audience is an exciting new frontier for dynamic storytelling.
 
Our show began with a completely projected closed curtain (2D and 3D) on what appeared to be the front of a proscenium stage. When a projected biplane instructed the audience members to put on their 3D glasses and then triggered the curtain to open, revealing the 3D space, there was an audible gasp throughout the theatre, and we were on our way.
 
 
It’s so powerful when technology can be harnessed and expressed in such a way that the audience feels its magic and is drawn into the story with anticipation and youthful enthusiasm. And because this was a business meeting, the 5D process also allowed speaker support artists to be more creative throughout the executive speeches. Since the entire speaker support structure was mapped to the same system, it was never formulaic or predictable. The artists had a new range of freedom to build fresh environments that could transform in an instant.
 
For us, this meeting was a milestone event. We were able to bring live dynamic performance onto a media-immersed stage and create an exciting 5D theatrical experience for our audience. 
 
We’re only just beginning to imagine what’s next.
 
For more, check out the October issue of Live Design, which is now available for free download for iPad or iPhone from the Apple App Store, and for Android tablet and smartphone from Google Play.