XR

Shocap and The 7 Fingers: Transforming Live Circus With Virtual Production

Athomas Goldberg, co-founder of Shocap Entertainment, and Samuel Tétreault, artistic director of The 7 Fingers, discuss new technology that will revolutionize circus performances. Shocap Entertainment, created in partnership by Lifelike & Believable and Animatrik, produces live cross-reality entertainment, using real-time visual effects technology and on-stage human performance to create unique shared experiences performed simultaneously for live audiences and for online audiences connecting from VR headsets, gaming PCs, and video streaming services. They have been working with the Canadian contemporary circus collective The 7 Fingers (Les 7 Doigts de La Main), which combines acrobatics with theatricality and storytelling. Founded in 2002, the collective is now expanding its sense of reality.

Live Design: What is the new technology and how does it work?

Athomas Goldberg: Ultimately, we’re trying to find ways to link multiple new technologies together into one seamless experience. At Animatrik’s studio in Vancouver, we’ve already got access to motion capture equipment, virtual cameras, a giant LED wall, and much more. Rather than using just one of these technologies, our upcoming series, LiViCi, is a unique performance brought to life through mixed reality, combining these technologies with Epic’s Unreal Engine to generate live shows that exist simultaneously in person and in the digital sphere.

Samuel Tétreault: The 7 Fingers’ performers drive the movement of digital characters and animated effects in a fully realized interactive virtual world while wearing motion-capture suits, something previously reserved for the film and video game industries. It’s a true cross-reality achievement, demonstrating the incredible potential of the Unreal Engine and its ability to enhance live productions in new and exciting ways. With an LED wall and other projection surfaces portraying live graphics behind the on-stage performers, in-person viewers gain access to an extended reality shared with the remote participants.

AG: In terms of a new individual technology, there is one truly unique aspect to our LiViCi offering. While still in development, we have been actively building the Ringmaster Virtual Performance System, a new tool for live show creation and control that connects a range of technological solutions, including motion capture systems, virtual cameras, LED walls and projectors, as well as DMX and OSC devices, under centralized real-time control. While there are a number of other commercial solutions for live event production, what makes Ringmaster unique is its ability to simultaneously control virtual lighting, cameras, and effects in connected virtual reality and multiplayer game environments as well.
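Ringmaster itself is proprietary and still in development, but as a rough, purely illustrative sketch of the kind of centralized cueing Goldberg describes, the snippet below uses the open-source python-osc library to fan a single cue out to several OSC endpoints at once. The endpoint addresses, ports, and cue names are hypothetical placeholders, not details of Shocap's actual system.

# Illustrative only: fan one cue out to several OSC endpoints at once.
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical endpoints: a lighting console and an Unreal Engine instance
# listening through an OSC plugin. Real devices and addresses will differ.
endpoints = {
    "lighting_console": SimpleUDPClient("192.168.1.10", 8000),
    "unreal_engine": SimpleUDPClient("192.168.1.20", 8001),
}

def fire_cue(cue_name: str, value: float) -> None:
    """Send the same cue to every connected endpoint simultaneously."""
    for name, client in endpoints.items():
        client.send_message(f"/cue/{cue_name}", value)
        print(f"sent /cue/{cue_name} -> {name}")

if __name__ == "__main__":
    # e.g. dim the physical and virtual lights together at the top of an act
    fire_cue("act1/blackout", 0.0)

In practice a physical rig would more likely be driven over DMX (for example through an Art-Net node) with OSC reserved for the game-engine side, but the fan-out pattern, one operator action reaching both worlds, is the point of the sketch.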

LD: How will The 7 Fingers incorporate this to transform live performances?

ST: From our point of view, we intend to extend live performance art to a wider audience than ever before. In order to achieve that and reach people in new ways, we need new and exciting storytelling tools. With Shocap Entertainment, we are able to expand our arsenal, using technology to create new artistic languages. We don’t want to just showcase video effects like they’re tricks; we want them to be meaningful and to serve the story directly.

AG: The idea is to introduce a synthesis of on-stage acrobatics and computer-generated, real-time visual effects where performers interact with digital counterparts live and on-stage. The 7 Fingers are already incredibly talented – this isn’t about revolutionizing every core element of their performance, but rather, using technology in a way that uniquely extends what’s possible for both the acrobats and the viewers. 

LD: Is there also a virtual component for people who cannot attend "live"?

AG: With live virtual performances like LiViCi, we’re aiming to enhance and extend the audience experience, giving remote viewers access to entirely unique and interactive perspectives while in-person audiences witness an expanded on-stage experience. The interactive possibilities when viewing virtually include navigating the virtual space of the live event as an avatar using a virtual reality (VR) headset. VR is one of the most transformative aspects of the LiViCi experience and quite literally acts as a door to a new reality.

ST: In order to reach as wide an audience as possible, our approach to performance will allow a show to be viewed simultaneously by an in-venue audience and by remote viewers. The remote experience of the performances will provide each person with an element of direct participation. I think this is really important and often lacking from other approaches to virtual performance. There is a great opportunity to offer new, exciting, and engaging forms of virtual interactivity that make it possible to match the inherent intensity of an in-person event.

LD: What hardware and software is required?

AG: For each performance, there will be an in-person perspective, a cinematic stream viewable on a computer, a virtual reality (VR) headset experience, and an option to enter the show as an avatar, much like in a video game. This last option can be accessed with an everyday computer and an internet connection, or with a VR headset. Each point of view will have its own unique benefits and ways to interact with and influence the performance, and viewers won’t need high-end hardware to take part. In terms of streaming the performance live from our studio, we are working with a number of innovative partners who specialize in facilitating live, virtual performances.

LD: What are the technical staff requirements? Is this where the partnership comes in?

AG: When introducing new technologies, and new ways for audiences to connect, there’s always going to be a learning curve, but our goal with Ringmaster and all of our tools is to leverage existing workflows as much as possible, and to make show creation and control as intuitive as possible. Our DMX and OSC integration is intended to enable lighting and sound operators to use their existing devices and consoles to create and control cues in both the physical and virtual environments.
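As a complementary, equally hypothetical sketch of the receiving end of that DMX/OSC integration, the snippet below (again using the open-source python-osc library; the address, port, and VirtualLight class are placeholders) listens for a cue sent from an existing console and applies it to a stand-in for a light in the virtual scene.

# Illustrative only: receive a console cue over OSC and drive a virtual light.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

class VirtualLight:
    """Stand-in for a light in the game-engine scene."""
    def __init__(self, name: str):
        self.name = name
        self.intensity = 0.0

    def set_intensity(self, value: float) -> None:
        self.intensity = max(0.0, min(1.0, value))
        print(f"{self.name} intensity -> {self.intensity:.2f}")

key_light = VirtualLight("stage_key_light")

def on_cue(address, *args):
    # A console fader mapped to /virtual/key/intensity sends a single float.
    key_light.set_intensity(float(args[0]))

dispatcher = Dispatcher()
dispatcher.map("/virtual/key/intensity", on_cue)

if __name__ == "__main__":
    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    server.serve_forever()  # block and apply cues as they arrive

The appeal of the approach Goldberg outlines is that the operator's side stays unchanged: the same fader or cue stack can address a conventional fixture and its virtual counterpart.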

Creators of the digital content used in onstage projections and online immersive environments will need to be experienced with Unreal Engine and other digital content creation tools, but our tools do not add any additional requirements to this part of the production pipeline.

This is the real strength of our partnership. Between The 7 Fingers’ deep history of live theatrical circus production and our decades of experience in virtual production and real-time computer graphics, we’re able to make the most of this new medium for live performance, and to do it at a level of production value that we hope will lead audiences to forget about the technology altogether and simply immerse themselves in the life-affirming human stories, stunning visual artistry, and incredible death-defying performances.

LD: What/when is the first live performance in this vein?

ST: We’re aiming to premiere the first in our LiViCi Concert Series in early 2022. We have already shared multiple teasers, completed a number of workshops, and produced a preview performance, including a live demo as a taster for audiences, showcasing the project’s remarkable technological achievements and offering. This is a long-term partnership and project, and we’re still developing all of the necessary components for a large-scale acrobatic performance using Shocap’s mixed reality technology.

AG: The workshops completed so far have already successfully demonstrated what LiViCi can achieve, and we can’t wait to share more over the next year.
