Hybrid Events and Extended Reality: Q&A With XiteLabs’ Greg Russell and Vello Virkhaus

The bar for event production and innovation is constantly being raised as organizations continue to explore digital offerings while simultaneously looking for creative ways to draw audiences back to in-person events.

LA-based design studio XiteLabs has been pushing the envelope on event tech throughout the pandemic, particularly when it comes to extended reality (XR) and augmented reality (AR) productions and blurring the lines between reality and the emerging metaverse. Recent projects include tech design for Lil Nas X’s “Montero Show” in September, 3D stage design for Tame Impala’s and Bad Bunny’s most recent tours, and a world performance for Grammy-nominated R&B artist Bryson Tiller. They’ve also designed full-scale productions for corporate clients like Microsoft, Facebook, and Fox Sports, and supported production teams for events with Walmart and Vizio.

Last month, Xite worked with Volo Events on Galacon: Into the Galaverse, an innovative three-day event hosted by blockchain gaming company Gala Games. The event included two days of immersive in-person sessions and activities as well as a global livestream on the final day. For the event’s production, Xite implemented technology including nDisplay and Unreal Engine for real-time rendering of graphics and visuals.

XLIVE caught up with XiteLabs co-founders Greg Russell and Vello Virkhaus to discuss their recent work on Galacon and the implications of this type of technology for the future of events and productions.

XLIVE: Could you share a bit more about Galacon and your work on both the physical event and the livestream components?

Greg Russell: We weren't really that familiar with Gala before. They're very much an up-and-coming company, but they're founded by people who have been very successful in making video games. I think the basic way to look at this show is that this is their coming-out party on some level, after their coin was released on Coinbase. This was a live event to let the world know what kind of games they're building, how those games are going to be backed by IPs like The Walking Dead, and how those games allow players to own parts of the game through the sale of NFTs and earn cryptocurrency on the blockchain through gaming. A lot of the stories that they highlighted are about people who have been able to make money playing these games, get out of debt, and make positive progress in their lives as a result of just being excellent gamers and partaking in this whole process.

The exciting parts about the production, which we worked on with Volo Events, were severalfold: it involved super large-format projection mapping, which we’re well known for, and also using the Unreal game engine to play back some of this content in real time so that lighting programmers could control what's happening on the screens and what's happening in those 3D worlds. So us using a game engine to build some of the visuals in this show, I think, has an interesting relationship to the fact that we're dealing with gaming companies and their shows.
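For readers curious what "lighting programmers controlling the real-time scene" can look like in practice, show-control rigs and consoles commonly talk to a game engine over OSC, and Unreal Engine ships an OSC plugin. The sketch below is a hypothetical illustration, not XiteLabs' actual setup: the host, port, and "/galaworld/..." address space are made up, and it assumes an Unreal project with OSC handlers wired to those addresses. It uses the python-osc library.

```python
# Hypothetical example: pushing lighting-desk cues into a real-time Unreal scene over OSC.
# Assumes the Unreal project has the OSC plugin enabled, an OSC server bound to port 8000,
# and Blueprint/C++ handlers for the made-up "/galaworld/..." addresses used here.
from pythonosc.udp_client import SimpleUDPClient

UNREAL_HOST = "192.168.1.50"   # render node running the Unreal scene (placeholder)
UNREAL_OSC_PORT = 8000         # port the project's OSC server listens on (assumption)

client = SimpleUDPClient(UNREAL_HOST, UNREAL_OSC_PORT)

# Dim the virtual environment's key light to 40% when the console fires a cue.
client.send_message("/galaworld/lighting/key_intensity", 0.4)

# Shift the scene's color grade toward blue (normalized RGB values).
client.send_message("/galaworld/grade/tint", [0.2, 0.3, 0.9])

# Trigger a scripted world event tied to a moment in the live show.
client.send_message("/galaworld/events/reveal", 1)
```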

As far as we know, no one has used Unreal in this type of manner where it's being used in a live setting, being modified and updated and reactive in real time with live performers on the stage. We're breaking the mold there a little bit in terms of how the game engine is used. Things like projection mapping and renders in Cinema 4D are beautiful, but we've done that before. And other people have done that before. So I think what's interesting is the implementation of Unreal Engine in terms of content.

The livestream featured a bunch of content — a mix of things that were shot onsite during the show and for a week or two before that as things were being built, like giant character props and a 30-foot-tall dragon. Xite went out and did all the video documentation, capturing all the behind-the-scenes footage, edited all this material together onsite, and turned around and delivered those edits as part of what we do. That was the second major component of our work on the job. This wasn’t necessarily the most unique part of the project, but it speaks to the breadth of capability of our company.

XL: How has the pandemic affected the popularity of these types of projects? Do you see more of this type of technology being implemented moving forward?

Vello Virkhaus: It's definitely growing. I feel like we’re a part of something really exciting — a great tool that was a little experimental. But what we were able to do in real time blew all of us away. It was unexpectedly successful. What we did would have been really hard to do in a pre-rendered way; instead, we had this strong flexibility in the worlds to create these theater shows. And also the timetable — that's what is fascinating about it. We had like three weeks to build these worlds out. So being able to work in a way that's familiar to the game business, like using a source control system to manage files, kind of takes two different workflows: one that we're still learning and integrating, and one that we're familiar with, which is the standard modeling/building/texturing method.

We took this real-time workflow, which was using source control and allowing multiple people to work inside files at the same time. Using that technology combined with our experience in post and projection mapping has been the perfect fusion, because we would have never finished if we didn't have source control. Multiple people can't work at once in a Cinema 4D file. So we're able to leverage this technology now in ways we hadn't expected, and now it's a core thing that's built into what we do. It’s super important because we can do things on tighter timetables, and we were able to deliver because we didn't have to render it. You could hit play, and it would run the show. And you can make changes to it in real time.
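The "multiple people in the same world at the same time" workflow Virkhaus describes rests on a version control system; the interview doesn't name the specific tool, but Perforce is the one most commonly paired with Unreal Engine. Below is a minimal, hypothetical sketch of that check-out/submit loop using the P4Python client library, with placeholder server, workspace, and file paths.

```python
# Hypothetical sketch of the edit/submit loop a real-time artist might run against a shared
# Unreal project in Perforce (P4Python client library). Server address, workspace, user, and
# file paths are placeholders, not details from the interview.
from P4 import P4, P4Exception

p4 = P4()
p4.port = "ssl:perforce.example.com:1666"   # placeholder server
p4.user = "artist01"                        # placeholder user
p4.client = "artist01_galaworld"            # placeholder workspace

try:
    p4.connect()
    p4.run_sync()                                            # pull the team's latest world files
    p4.run_edit("//depot/GalaWorld/Maps/ShowWorld.umap")      # check out a level for editing

    # ... artist edits the level in the Unreal editor ...

    change = p4.fetch_change()
    change["Description"] = "Adjust world lighting and grade for the live show"
    p4.run_submit(change)                                    # push the change back to the team
except P4Exception:
    for err in p4.errors:
        print(err)
finally:
    p4.disconnect()
```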

GR: A lot of that did come out of Covid specifically for us, because we made a decision when Covid hit, and all of our live concerts and corporate events went down, to dive very hard into XR. We built our own LED stage, we brought partners in with us, we shifted our entire animation team over to Unreal Engine, and did this whole lengthy paid training process to get everybody up to speed. They're great artists, but now they had to learn a new technology. That helped us do XR projects, and the XR knowledge we had from doing live XR shows ported its way into this job, because now we have a team that's really knowledgeable about Unreal Engine and real-time rendering, and now we're doing it in a live environment. So that's the arc of the story that's the most interesting — how Covid and that challenge drove virtual production, and as things go partly back to live, how that technology is finding its way into the live environment. To me, that’s the most important message.

XL: That’s something I’d like to dig into a little bit more, especially when it comes to things like hybrid environments. Do you see tools like Unreal Engine being used more to bridge the gap between a virtual world and what's going on in real life?

VV: Yeah, that's the talk of the town, if you will. As you've noted, that's what everybody is interested in. You have this desire to have these meta spaces at an event or concurrent to an event. We're linking live music or live performance with a virtual world that’s concurrent, or using that technology to engage audiences. I think we're going to see more and more of that. And with some of the work we're going to be doing in 2022 on metaverse-type worlds, we're going to start seeing more and more features and circular interactivity between the audience, the performer, and the live show, so I feel like it's bringing something new. Gamification of entertainment is really what's happening. Live entertainment and the video game industry are merging, kind of like when lighting and video merged and created some of the first media servers. I feel like we're seeing a new convergence of industries, technologies, talent, and tools. The new media servers are incorporating extended reality and hosting game engines inside the servers. So everybody's looking to these new rendering technologies and how we're going to use this new tool. And of course, the music industry is fascinated by how to engage people both at home and live. There’s going to be a lot of development, and there are a lot of heavy hitters putting millions of dollars into it.

XL: As more people get into the space, particularly artists and others in the music industry who are exploring this technology but may not have much experience with it, how do you approach working with new clients who are unfamiliar with it?

VV: I think it's a learning curve for everybody. There's always the thing of managing client expectations. What's interesting is explaining to people that it could still take as much time to create an experience like this using real-time technology as it may have taken in your previous pipeline, but that the more organized you are and the more time you spend prepping, the more flexibility you're going to have to do things you wouldn't have been able to do in the live environment. I'll give you a great example: the world we built for The Walking Dead [at Galacon]. If they had come around and said, “Hey, our environment was more blue, and we need this to be more blue and not look so greenish and dark,” and you were in the pre-rendered content world for a live show, you would have had a pretty horrible experience rendering very high-resolution stuff at the last minute, making versions, and trying to figure out how you can correct it. It’s also explaining to people that you can't just go to the Unreal Marketplace, buy a $20 scene, and throw that up on screens. There's definitely education there, where people are going, “Oh, wait, I'm just going to buy this and throw that up. It's like 20 bucks.” Well, it's not quite that easy. It’s definitely all about education. I feel like every job, every producer, every team we've worked with has learned something new every time, including us.

GR: What I find interesting is I remember when I was in college — believe it or not, before I got into the arts, I was a neuroscience major — and when we had to do papers, there were no textbooks. You had to go to a library and find scientific periodicals, because you could only work with the most recent research when it came to doing anything around the brain or nervous system. And this kind of reminds me of that, where there is no textbook. There's barely any research you can read about a lot of the things we're doing in real time on virtual stages, in live shows, and in the game world, or about how that works in a live environment. And at some level, it is kind of like brain surgery. You're having to figure a lot of things out: who's got the most recent information? Are they a colleague of yours? Do you have to have an uncomfortable conversation to get information from each other in an environment where knowledge is hard to come by and experience is knowledge? To me, there are a lot of parallels between science and what we're doing, and that's in part why, when we formed our company from our previous two companies, we decided to make our tagline “Creative Sciences.” It really is a lot of what we do, and the people we work with are shockingly smart in very laboratory-testing type ways.

XL: Many event professionals have noted the increased collaboration and best practice sharing between competitors that has come out of the pandemic — from what you’ve mentioned it sounds like that’s something you’ve also experienced.

VV: Absolutely. And that’s something that's been helpful for everyone. It’s a really good thing, because people realized that this information needs to be shared for overall advancement, and that promoted more collaboration. We've collaborated with people that I never thought we would really work with. New doors have opened up because of new technology and the desire to be experimental. And if you're going to be experimental, why not put some of the best people in our industry together, who may be competing against each other, to try to advance things? I think that's what we're offering.

XL: What are you most excited for looking into 2022 and beyond as all these changes are happening in the industry?

VV: I'm excited to do a lot more with Unreal, and for the possibilities once there are fewer limitations on the technology, on how complex things can be, and on how you're able to move around. It's going to make almost anything possible, in a way, when you're not limited by the number of triangles you can draw and you can create these kinds of photorealistic spaces. I think we're at the beginning of a new approach to live entertainment right now, where, like I said, gaming and entertainment are coming together and furthering that concept and people's connectivity. Also the ability to collaborate more, even with lighting. This last show we did was about integrating the right tools for the right approach so you can deliver: collaborating with lighting and pyrotechnics and effects and live camera directors, and folding these things into these virtual platforms or portals or worlds. I'm really excited about that. I know Greg is too. There's so much potential, you know?

GR: I'm very excited for the hybrid events where we get to be live and doing virtual production at the same time. I’m loving it. Can't wait to do more of it.

XL: Is there anything else you’d like to mention?

GR: The other thing I would like to touch on in the virtual production world, in terms of what you were asking earlier about guiding clients, is that one of the things we've learned is that there's no perfect solution for every need in our current world. People think, “Oh, we're going to get an LED screen and it's going to solve all of our creative problems in the world.” Which isn’t true. It's about having enough expertise to understand: okay, well, sometimes you need an LED screen. Sometimes you need an LED screen and floor, sometimes you need a green screen and a green screen floor, and sometimes you need this kind of lighting rig or this kind of camera. That overall scope and method of evaluating projects is now, I think, the most important thing to work through with clients, because they get all this information from peers about how a peer or a competing company did something, and then they want to use that same type of approach for their project, which isn't the same kind of project and would have been better served by more formal consultation with a company like ours or some of the other experts in the industry. I'd say, from a client perspective, trying to get people to tamp down on their super overzealous “let's do what everybody else is doing” attitude and actually do a little more research with smart partners and companies before they dive into things is probably the take-home message from an advisory standpoint.