Trekking through the thronged aisles of the National Association of Broadcasters' annual trade show, you can't help but be astonished by the confluence of myriad content forms facing down endless options for "broadcast," or, more accurately, multi-device distribution platforms. It feels somewhat like being in the middle of a maelstrom, and quite different from my first visit to this show some 28 years ago, when there were only three major American broadcasters and their names all had three initials.
Conventional green-screen with highly detailed virtual background.
At NAB, one can find the most granular hardware or software to support production and distribution of content, whether that content is a podcast, a YouTube clip, or a feature film. Need a camera crane that mounts to a car roof (in this case a Lamborghini Urus, which is definitely the top choice of video production people everywhere)? How about really bright light fixtures that help airplanes avoid transmitter towers? All that, and much more, can be found, but I was particularly interested in the "realities"—virtual, augmented, and mixed—and how they are being presented to the media marketplace.
Virtual reality (VR) has been an element in broadcast production for some time now, mostly in the form of virtual sets where talent is shot against a green screen cyclorama and the background is filled in with a composited 3D CGI environment. While this has been an okay option for some applications, it is safe to say that the technology has not bankrupted any scenic fabricators. Clunky robotic cameras were required and the virtual sets looked like scenes from the Nintendo version of Duke Nukem.
Experimental virtual screen technology from Sony.
In 2019, virtual set systems have proliferated, and there were more than a dozen vendors making appearances on the show floor. While the end-products of these systems continue to appear a bit video-gamey, they are much more flexible and, in a few cases, there is genuine innovation going on. The significant change, from what I could discern, centers on camera tracking: the on-air camera's position and perspective are accurately tracked, and the resulting movement data drives the animation of the 3D environmental content.
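The idea behind camera tracking is straightforward: every frame, the physical studio camera's measured pose is mirrored onto the render engine's virtual camera, so the CGI background is always drawn from the real camera's point of view. A minimal Python sketch of that per-frame handoff, using entirely hypothetical names (this is not any vendor's actual API):

```python
# Illustrative sketch of camera-tracking-driven virtual sets: the tracked
# pose of the physical camera is copied onto the renderer's virtual camera
# each frame. All names here are hypothetical, not a real tracking API.

from dataclasses import dataclass

@dataclass
class CameraPose:
    """One frame of tracking data from the studio camera."""
    x: float          # position in studio space (meters)
    y: float
    z: float
    pan: float        # orientation (degrees)
    tilt: float
    roll: float
    focal_length: float  # lens zoom (mm), so the virtual FOV matches

def update_virtual_camera(scene_camera: dict, pose: CameraPose) -> dict:
    """Mirror the tracked pose into the render engine so the 3D
    background is redrawn from the physical camera's perspective."""
    scene_camera["position"] = (pose.x, pose.y, pose.z)
    scene_camera["rotation"] = (pose.pan, pose.tilt, pose.roll)
    scene_camera["focal_length"] = pose.focal_length
    return scene_camera

# Each frame: read the tracker, mirror it into the render engine.
virtual_cam: dict = {}
pose = CameraPose(x=1.2, y=0.5, z=1.8, pan=15.0, tilt=-3.0, roll=0.0,
                  focal_length=35.0)
virtual_cam = update_virtual_camera(virtual_cam, pose)
```

In a real system this loop runs at the camera's frame rate, with the tracking data arriving from hardware such as optical beacons or encoded camera heads, but the principle is the same: the virtual camera simply follows the physical one.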
disguise partnered with WorldStage and ROE to showcase extended reality (xR).
The most notable of a number of interesting demos was put together by disguise with help from ROE Creative Display and WorldStage. What made this particular demo interesting is that it replaced the typical green screen with active LED display walls and floor. The 3D environment was displayed on the LED walls, and the perspective of the content was animated by the camera movement.
This approach provides a number of advantages over the traditional method because it allows the on-air talent to see, rather than imagine, the content they are supposed to be interacting with. It also eliminates the edge conditions sometimes visible with traditional green screens. When the system is expanded to include augmented reality (AR) content (in this demo, a formula race car on the floor of the "studio"), the talent can see the AR object on the floor display and intuitively understand the physical relationship between themselves and the virtual object. In other words, there's no "walking through" the undetected virtual object because it is now easily visible. This approach has parallels in film production, where a similar system is based on large-scale projection.
Example of custom LED tiles on display.
While the disguise system remains a work-in-progress, it is fairly far along and, according to Ash Nehru, disguise founder and chief innovation officer, there are a number of development options they are exploring, including integration of lighting control parameters to eliminate glare and shadows, among other enhancements. Again, the key is the accurate tracking of the studio cameras, which are no longer confined to pedestal-mount configurations. Indeed, jibs, rails, and even handhelds are possibilities, thanks to recent advances by companies such as CAST (BlackTrax) and Stype. Of course, a very capable media server system is also required.
Apart from attempts at altering reality, there were a few other notables at this year's NAB, including the expanded presence of 8K video and even examples of 8K broadcast from Korean Broadcasting. All I can say about 8K is what I previously said about 4K: It looks better than what came before, and at some point in the future, you may want to buy a new television set.
Of course, the LED display companies were in abundance at NAB as they seem to be at every trade show nowadays. However, one area of LED to keep an eye on is the ability of the manufacturers to create custom shapes using high-resolution pixel pitches. Apparently, they can turn fairly complicated designs around a lot faster than you’d imagine, which might be a good option for live design professionals looking for something different in their video displays.
Signing off for now from my 50th-floor room at Caesars Palace, which is in reality on the 10th floor...
Throughout a 40-year career in the entertainment technology business, Josh Weisberg has experienced each of the evolutionary leaps in sound, video, and lighting technology from a seat in the front row. Combining a rare level of business management and technical engineering acumen, Weisberg has a keen understanding of the mechanics of running a technology business as well as the engineering and design chops clients rely on for all types of projects. Currently working as a technology and business consultant (having stepped down from the leadership role at Scharff Weisberg and WorldStage in 2017), Weisberg utilizes his expertise in large-screen display design as well as other event technologies for clients in the event, arts, theater, and spectacle sectors.