The Road To LDI2022: Q&A, Ryan Metcalfe, The Future Of Visualization

Ryan Metcalfe is managing director of Preevue LTD, a company that creates virtual reality and mixed reality visualizations used for architecture, construction, and theatre productions. At LDI2022 he will talk about LiDAR Scanning, Digital Twins, And The Future Of Visualization.

Live Design (5Qs: Ryan Metcalfe) spoke with Metcalfe in 2017, when he launched his company, and went back to see what has happened over the past five years and where he thinks visualization is headed in the next five.

Ryan Metcalfe

Live Design: What has been happening with Preevue over the past five years? Has the technology changed? Who is your client base?

Ryan Metcalfe: It’s been a whirlwind few years and reading that it’s five years since your original piece makes me feel very old! One big change is that the team at Preevue has grown considerably. I feel genuinely lucky with the group we have and I’m constantly in awe of the talents and efforts of each and every one of them.

The bulk of the work that we do has also changed a fair bit from the original idea at the core of Preevue back when I incorporated the company. For the first year or so, the focus was very much on using VR technology to visualize the set designs of productions in advance. We still do this, but it’s now a smaller part of the business, with the majority of our work now venue-focused as opposed to show-focused. We still do a fair bit of work on productions, with shows such as Harry Potter and the Cursed Child, Bat Out of Hell, and Moulin Rouge! The Musical, along with several that are typically hidden behind NDAs, but that work is rather dwarfed by our laser scanning and building of 3D CAD digital twins of venues. VR is now something that we only use as and when a particular project requires it. It’s an incredible tool for understanding a space and its sightlines in a matter of seconds, but ultimately it’s not necessary for every job.

Our client list is one that I’d probably have drawn up as a wish list had you asked me five years ago. Clients include Ambassador Theatre Group, LW Theatres, the National Theatre, the RSC, the Barbican, Chichester Festival Theatre, and the Royal Opera House. A handful of upcoming projects have clients of the “pinch-me” variety that I’m sure I’ll be able to talk about come 2030…

LD: Can you talk about your project with the National Theatre and what Preevue's role was?

RM: We were brought onboard by the National to laser scan and produce 3D digital twins of all three of their venues: the Dorfman, the Lyttelton, and the Olivier. This project involved multiple days of scanning and several months of 3D modeling to rebuild each venue. One of the key uses of this data is for lighting pre-visualization with ETC’s Augment3d, which ties in nicely with the National’s use of the Eos ecosystem in-house. Now that they have the data on file, other departments have also found uses for the 3D CAD data in many other parts of production planning.

One of my favorite anecdotes is one shared by Daniel Murfin, formerly the lighting control manager at the National, who, when forwarding some of our early screenshots of the Dorfman, was met with the response, “Why are you sending me photos of the Dorfman?” I don’t think we could get a better testimonial than that.

LD: Please talk about your work for Moulin Rouge: laser scanning, visualization, VR, and their new views-from-seats system, etc…

RM: Moulin Rouge! The Musical has been a fantastic project for us and a wonderful case study, as the production utilized practically all that was possible with the data we gathered and the assets we created for them. The first step in the process was to complete a LiDAR laser scan on-site at the Al Hirschfeld Theatre in New York back in 2019, shortly after the show’s opening. This was quite a unique scan: typically a scan of a set design would focus on just the stage area, but with Moulin Rouge! the production takes over the whole auditorium too, with scenic elements like the windmill, the elephant, elaborate drapery, and festoon lights. So in the end we scanned nearly all of the Al Hirschfeld as well as all the key scenic states of the show. The next step was to extract the show elements from the point cloud and begin adapting them to match the new show’s drawings, which we then combined with our 3D model of the West End’s Piccadilly Theatre to create a visualization file of the full show.

As Covid hit and travel became restricted, our visualization and VR files were used in the final sign-off of the design variations and seating configurations. We also ran our Sightline Analysis Tool for the production and, finally, linked the views-from-seats, which the producers had used to set ticket pricing, to the official box office system. It’s been such a fantastic project to be a part of.
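For readers curious what the point-cloud extraction step Metcalfe describes might look like in practice, here is a minimal sketch using the open-source Open3D library: load a registered venue scan, crop it to a box around the stage area, and thin it before modeling work begins. The file names, coordinates, and file format are illustrative assumptions, not details of Preevue’s actual pipeline.

```python
# Minimal point-cloud cropping sketch with Open3D; paths and bounds are placeholders.
import open3d as o3d

# Load the registered scan (real surveys are often delivered as E57/LAS;
# PLY is used here purely for illustration).
scan = o3d.io.read_point_cloud("venue_scan.ply")

# Crop to an axis-aligned box around the stage (bounds in metres, invented).
stage_box = o3d.geometry.AxisAlignedBoundingBox(
    [-12.0, -2.0, 0.0],   # min corner (x, y, z)
    [12.0, 10.0, 15.0],   # max corner (x, y, z)
)
stage_only = scan.crop(stage_box)

# Thin the cloud to a manageable density before modeling from it.
stage_only = stage_only.voxel_down_sample(voxel_size=0.01)

o3d.io.write_point_cloud("stage_elements.ply", stage_only)
print(stage_only)  # reports the number of points kept
```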

LD: What are digital twins?

RM: A digital twin is a computer-based virtual replica of a tangible, real-life object. In our case, that’s usually a 3D CAD model of a venue. We primarily build our digital twins from the LiDAR laser scan surveys we take, although sometimes we also build them from drawings, as is usually the case for upcoming new productions that don’t yet have anything built to scan. A digital twin enables those on the production to work with a virtual like-for-like replica of the space, allowing them to complete certain jobs without needing to visit in person. It also provides a digital sandbox, enabling creatives to experiment with ideas in an environment they can be confident accurately matches its real-life counterpart. Perhaps the most well-known implementation of this is lighting previsualization, which is certainly nothing new but has always had a level of cynicism surrounding it. A lot of those negative views can be attributed to occasions when designers and programmers have invested time in programming with a visualizer, only to arrive on-site to find nothing matches and their time has been wasted; the root of that problem is an incorrect 3D model of the venue. When working from a digital twin based on a 2mm-accurate laser scan survey, however, they can be confident that decisions made in the virtual world will carry over accurately to the real one.

LD: What is your Sightline Analysis Tool?

RM: Our Sightline Analysis Tool is the in-house software we use to analyze the view from, and objective value of, every seat in an auditorium. The software measures the percentage of a stage, set, or part of a set that’s visible from every seat, taking into account architectural obstructions and the heads of the people sitting in front. We can then add weighting to reflect how important certain areas of the stage or set are, which in turn affects a seat’s overall sightline value. We combine this with other factors, such as distance to the stage and any custom parameters, such as the seating preferences of a show’s demographic, to generate a score per seat. We then link this with pricing to generate an objective ticket cost per seat. As you’d expect, this roughly follows the ticket banding found in most West End and Broadway theatres, but it does often throw up some outliers. It’s a great way of verifying pricing and checking the actual percentage numbers on sightlines for a new design. It’s also used in conversations between producers and creatives – for example, a designer may want to tab-in further with a false proscenium, but the producer has data showing that doing so would drop the median sightline value from 80% to 75%.
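To make the scoring idea concrete, here is a hypothetical sketch of how a weighted visibility percentage could be blended with a distance factor and mapped onto ticket bands. The zone names, weights, 80/20 split, and band thresholds are all invented for illustration and are not Preevue’s actual algorithm.

```python
# Hypothetical per-seat sightline scoring sketch; all weights and thresholds are invented.
from dataclasses import dataclass

# Relative importance of stage zones (assumed values, not Preevue's).
ZONE_WEIGHTS = {"downstage_centre": 0.5, "downstage_sides": 0.3, "upstage": 0.2}

@dataclass
class Seat:
    seat_id: str
    visibility: dict      # fraction of each stage zone visible from the seat, 0.0-1.0
    distance_m: float     # straight-line distance from the seat to the stage

def sightline_score(seat: Seat, max_distance_m: float = 40.0) -> float:
    """Blend weighted zone visibility with a distance factor into a 0-100 score."""
    weighted_visibility = sum(
        weight * seat.visibility.get(zone, 0.0) for zone, weight in ZONE_WEIGHTS.items()
    )
    distance_factor = max(0.0, 1.0 - seat.distance_m / max_distance_m)
    # The 80/20 split between sightline and proximity is an arbitrary illustration.
    return 100.0 * (0.8 * weighted_visibility + 0.2 * distance_factor)

def price_band(score: float) -> str:
    """Map a score onto illustrative ticket bands (thresholds are placeholders)."""
    if score >= 85.0:
        return "premium"
    if score >= 70.0:
        return "band A"
    if score >= 50.0:
        return "band B"
    return "band C"

seat = Seat("STALLS-K12", {"downstage_centre": 1.0, "downstage_sides": 0.9, "upstage": 0.7}, 14.0)
score = sightline_score(seat)
print(seat.seat_id, round(score, 1), price_band(score))  # e.g. STALLS-K12 85.8 premium
```

Aggregating scores like these across a section also gives the kind of median figure Metcalfe mentions, so a design change such as tabbing-in a false proscenium can be expressed as a shift in that median.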

LD: Where do you hope to be in another five years?

RM: I have rather ambitious plans for where I’d like Preevue to be in five years. We’re a few weeks away from launching a brand-new offering that we’ve been developing for a number of years, one that brings our experience in providing visualization for the theatre industry to a completely different sector. I think it’s going to be a game-changer for us and will soon become a major second arm of the business.

I think the industry’s use of visualization will evolve a lot in the next five years; I expect to see the emergence of a production visualizer role, whose job it is to collate the information from each department into a master 3D model and visualization of the production. In five years’ time, it’s likely we’ll have the vast majority of venues in the West End, on Broadway, and on the major UK and US touring circuits scanned and modeled; we’re already well on the way. As soon as we reach that critical mass, it’ll be a lot easier and more cost-effective for more productions, if not every production, to use visualization technologies.