The reality of virtual lighting: Illuminating a new vista in TV broadcasting

I first became acquainted with virtual technology by accident. I was hired to light a traditional bluescreen shoot for ABC News, and got into a discussion about why these composites always look fake. In my opinion, this is because the background lighting is done by one person, usually a graphic artist, and the foreground lighting by someone else, usually a lighting director. These two people never discuss any kind of unifying design concept.

My conversation, as it turns out, was with Dave Satin of SMA Video. His company was about to start a virtual project for ABC/Discovery Channel called Discovery News--the first weekly broadcast in the United States using the new technology. He suggested I work on the project and see what improvements we could make to the believability of the composite.

Most people are familiar with standard bluescreen and Chromakey technology. We can all remember seeing the behind-the-scenes weather set on the news: The talent stands in front of a blue or green curtain and looks at a monitor while pointing to the curtain. An image, such as a weather map, is inserted in place of the curtain. With this traditional technology, when the camera moves, the background doesn't. This means if you move the camera in any way, the talent will appear to be Superman flying through the air, completely dissociated from the background.
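The keying itself can be sketched in a few lines. What follows is a crude difference matte of my own, meant only to illustrate the idea; the names, threshold, and math are not what an actual production keyer does internally:

```python
def blue_matte(r, g, b, threshold=0.1):
    """Return 0.0 (background) .. 1.0 (foreground) for one pixel.

    A simplified difference key: the more the blue channel dominates
    the other two channels, the more we treat the pixel as backing color.
    """
    dominance = b - max(r, g)          # how "blue" the pixel is
    alpha = 1.0 - dominance / threshold
    return min(1.0, max(0.0, alpha))   # clamp to [0, 1]

def composite(fg, bg, alpha):
    """Blend one channel of foreground over background by the matte."""
    return alpha * fg + (1.0 - alpha) * bg

# A saturated blue pixel keys out; a skin-tone pixel stays.
print(blue_matte(0.2, 0.2, 0.9))  # -> 0.0 (pure backing, replaced)
print(blue_matte(0.8, 0.6, 0.5))  # -> 1.0 (talent, kept)
```

Real hardware keyers add soft edges, spill suppression, and per-channel controls, but the basic decision is the same: how "backing-colored" is this pixel?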

Virtual technology takes this to the next level. Virtual allows you to insert a graphic, or in more advanced instances a 3D set, while still retaining the ability to fully move the camera. This is called tracking. The camera and the computer are in sync so that as one moves, the other sees and keeps up. The result is truly amazing. The possibility of moving the camera opens up this technology and finally makes it useful as a real tool.

As film uses similar technology, it's easy to wonder why television's virtual technology isn't producing images as incredible as Jurassic Park or The Matrix. It all has to do with the lighting and rendering. When films are being produced, the filmmakers often have more than a year from first shot to opening in the theatre. Television is often shot in the morning and airs that night--or more amazingly, it's live! Every light you add in the computer slows down the rendering speed. There are also several different types of rendering. Raytracing and radiosity are two rendering styles that require much more computer time but achieve realism in ways that nothing else can.

Some films are so complex that one second of virtual animation can take up to 12 days to render. In television we don't have the luxury of time. We must render in real time, and that means that every second we must make a complete render 60 times--quite a difference in comparison to film. To achieve this and still maintain quality, we can't afford to have lights that are interactive like in film. We are forced to paint all the lighting (highlights and shadows) into the set. I work very closely with the animator at this stage in the process, and have even learned several software packages to light the set in the computer myself.
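To put that time budget in perspective, here is a back-of-the-envelope comparison. The 12-day figure is the film example above; the 60 renders per second corresponds to NTSC's nominal field rate:

```python
# Back-of-the-envelope comparison of film vs. live-TV render budgets.

FIELDS_PER_SECOND = 60                 # NTSC interlaced: nominally 60 fields/sec
tv_budget_ms = 1000 / FIELDS_PER_SECOND
print(f"Real-time TV budget per render: {tv_budget_ms:.1f} ms")  # ~16.7 ms

# Film example from the text: up to 12 days for 1 second of animation.
film_seconds = 12 * 24 * 60 * 60       # 12 days expressed in seconds
print(f"Film can spend up to {film_seconds:,}x real time on that same second")
```

Roughly a million-to-one difference in render time per second of footage, which is why every interactive light has to earn its place.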

The computer we use for rendering the scenery is a Silicon Graphics Onyx 2. A company called Orad, from Israel, puts out the system that does our tracking. This technology was originally used by the military. When it was converted to its current use for television, the system needed to be reworked and made more user-friendly. As we developed Discovery News, we worked closely with Orad to achieve this.

There are other companies that make virtual systems. All are slightly different, so I will concentrate on my experiences with Orad for simplicity. We have a blue grid on the upstage wall of the studio and software called CyberSet. There is an infrared (IR) camera in each upstage corner of the studio which in turn detects an IR device on the studio camera(s).

Since traditional fresnels and ellipsoidals give off a high level of IR, the cameras can easily become confused. Lights are often mistaken for additional cameras. This can disturb the tracking, making the set jump and jitter. I had to come up with an alternative way to key the talent.

The only choice readily available was fluorescent. I chose to go with Flathead 80s from Kino Flo, using their 3200K lamps. The quality of the Kino Flo ballasts is superior and the lamps are hand-picked for consistency and color temperature.

This solved the IR problem, but I felt something was lost in the feeling of the light. The fluorescent tube has a very different quality than an incandescent source coming through a glass lens. I was unhappy with the sacrifice and decided to find another alternative.

I recently discovered that Rosco makes an IR/UV glass filter that reduces the IR output of a traditional source by more than 50%. This has allowed me to slowly begin adding back traditional fixtures into my design. I say slowly because we must make sure that we are not sacrificing the tracking, and that the filters are doing what we need. It is the best fix I have found so far, though I am still in the search for alternative solutions.

When I first began working with virtual technology, the mandate was always to deceive the eye and create reality where there is none. Roger Goodman of ABC News was the director for Discovery News. He was concerned that if the setting looked "cartoony" the broadcast would be taken less seriously. So we began the process of creating "real" virtual scenery.

We decided to avoid gimmicks. The technology was capable of having the talent walk behind a virtual object, having something rise out of the floor or spin in from any direction. But everyone felt it would not look credible. The set designer, George Allison, went through his usual process: research, sketch, model, blueprints. All of this material was then handed over to the virtual team and we began the process of building the scenery in the computer and then lighting it.

Since Discovery News was the first weekly broadcast to use virtual technology, there was nothing to compare it to. We spent the first two broadcasts pretending the set was real. The third week we were on the air, we felt secure enough to highlight the new technology we were using for the scenery. Since Discovery News covers science and technology, it was a perfect marriage.

The primary challenge in lighting anything virtual is to make the foreground and background match. The composite of these two components is critical. The primary machine used to achieve the composite is the Ultimatte. It gives a great deal of control over the final image. There are two main ways to manipulate the image to achieve a good composite.

First off, the style of lighting in the computer must be consistent with the lighting in the foreground. In a standard news setup, this is fairly easy. Most times, the anchor is at least 10' (3m) away from the set and doesn't interact with the scenery. It gets more complicated when the talent is right up next to the scenery; then, the only way to make the match work is to insert the things you usually hate. By this I mean flares, kicks, and contamination. I hate it when the talent's backlight desaturates the color on the floor, but if you want to make it look real, you have to do what it would really do. It always looks so perfect before you muck it up--but it doesn't look real! Reality is never perfect, in fact it's the little things like scratches and dents that make real things look real.

The second thing to work with when trying to achieve a good composite is the video. The most important thing during the shoot is to make sure that you and the video operator are speaking the same language. One of the advantages of virtual is that you can control the foreground and background separately. This can come back to haunt you if you're not careful, but when used properly it can be a great aid. If the set has a dominant color, as in ESPN's SportsCentury, I often ask the video op to tint the whole picture slightly toward the set color. The main word is "slightly;" a very minor shift in color can tie the whole image together.
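The "slight" tint the video operator dials in can be modeled as a small linear interpolation of every pixel toward the set's dominant color. This sketch is my own illustration of the idea, not the actual video-engineering controls; the 5% amount and the set color are made-up values:

```python
def tint_toward(pixel, set_color, amount=0.05):
    """Shift an RGB pixel slightly toward a target color.

    amount=0.05 is a 5% shift -- the "slightly" the text insists on.
    """
    return tuple(
        (1.0 - amount) * p + amount * s
        for p, s in zip(pixel, set_color)
    )

warm_set = (0.8, 0.5, 0.2)   # hypothetical dominant set color
print(tint_toward((0.5, 0.5, 0.5), warm_set))
# A neutral gray drifts to roughly (0.515, 0.5, 0.485):
# barely visible on its own, but it ties foreground to background.
```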

One distinct benefit of shooting a virtual set is the ability to get any shot in an instant. There is no need to reposition the camera or have the talent move to a different area of the set. All that is required is a simple rotation in the computer and you're magically transported to a new area. Of course, the real lighting must match this new angle. And since the scenery is ready immediately, it's rare that anyone wants to wait for the lighting to change. Television lighting is not known for its heavy use of cueing, but in the virtual shoots that I light, I use cues extensively to help keep me ahead of the next set change.

I am also using ETC Irideon AR5s. Since my grid height is only 10' (3m), these little lights fit into my light plot perfectly. These units are fabulous: they simulate having full-size Vari*Lites at a normal studio height of 20' (6m). I can refocus them from one area of the stage to another instantly, and I can change color, tinting slightly for wardrobe changes when one is used as a backlight. They are also great accent lights for real props.

There are several technical challenges in this process. The goal, of course, is not to allow them to stand in the way of achieving your final artistic vision. The cameras used in a virtual shoot are the same cameras used in a traditional shoot. The camera may or may not be fitted with a sensor head for the computer to use in tracking. Ultimately the body and lens have not changed; no changes in lighting are needed to make the talent look good. One big difference is that normal filters used in the lens of the camera can't be used. They would throw off the focus of the background and affect the tracking. So, you are pretty much on your own to make the talent look as good as if the filters were in use.

Virtual studios, by their nature, can be much smaller than traditional studios. The Discovery News set, in reality, would be more than 75' (23m) in diameter. The logistics of setting up, shooting, and storing a set that size are prohibitive. The lighting call would be three days between set, shoot, and strike. The studio we shoot in is 15' wide and 12' deep (5x4m). We come in at 7:30am and roll tape at 8:30am--a big difference. I would have used more than 300 instruments to light the set and talent if we shot a real set. I currently use 14 lights. (She-TV, another project, is a little more complex due to the extra props and furniture; for that I use 22 lights.)

The studio is very small, and heat is therefore a huge issue. Even with the air conditioning running at full capacity, it can't keep up with the heat coming from all the incandescent units. So we have made some substitutions. Instead of traditional cyclights to light the bluescreen and grid, I have elected to use Kino Flo 4x4s; they have a much more even field, which is critical, and they give off minimal heat. I now use them on all virtual shoots, even if there is no heat issue.

When I first started working on Discovery News over two and a half years ago, the technology was just beginning to be used for television. There had been a few trials of one-night specials, primarily by ABC. But no one in this country had ever attempted a weekly show that would be dependent on virtual technology. We had not developed the techniques for painting the lighting into the scenery. Most things looked OK at first glance, but as soon as you began to examine them closely, the image would fall apart. The shadows weren't convincing; the highlights didn't match and certainly didn't change with the camera angle.

The Orad people were just beginning to understand our needs for television. We spent a lot of time asking for new code to be programmed to allow us further options. For example, we couldn't do more than one transparent/translucent object. So although we could have a piece of glass looking through to something solid behind it, the rendering suffered greatly. Initially we used two global computer lights, one from above and one from below to give a small amount of interactive lighting effects. We found out rather quickly that it was better and more predictable to paint them in whenever possible. The biggest problem was always with the talent's feet--more specifically, where they hit the floor. When the talent touched anything blue, there was no way to cast a real shadow. We always ended up avoiding the full body shot because of this problem.

On Discovery News I participated hands-on in the development of the scenery. I worked extensively in a program called Lightscape, which allows you to bring in a 3D model of the set and place lights as if you were hanging and focusing them. You can assign very specific qualities to lights in terms of color and intensity. The great thing about lighting with software is that the lights never burn out, they never drop focus, and there's no such thing as full intensity--you can always make them brighter! From Lightscape I would render out each surface, creating a texture--or piece of wallpaper--that already contained all the highlights and shadows. This is very time-consuming, but can yield great results.
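The "wallpaper" approach amounts to multiplying each texel of the base texture by a precomputed light value, so the highlights and shadows are frozen into the image itself. A minimal sketch of the idea; the 2x2 grids here are made-up values, not Lightscape's actual output format:

```python
def bake_lighting(texture, lightmap):
    """Multiply a base texture by a precomputed lightmap, texel by texel.

    The result is a single image with highlights and shadows baked in,
    so the real-time renderer just maps it onto the set geometry with
    no per-frame lighting cost.
    """
    return [
        [round(min(1.0, t * l), 3) for t, l in zip(tex_row, light_row)]
        for tex_row, light_row in zip(texture, lightmap)
    ]

base = [[0.8, 0.8],      # a flat mid-tone surface
        [0.8, 0.8]]
light = [[1.2, 1.0],     # hot highlight fading down to shadow
         [0.6, 0.3]]
print(bake_lighting(base, light))
# -> [[0.96, 0.8], [0.48, 0.24]]
```

The cost, of course, is exactly the limitation described earlier: the baked lighting cannot respond to the camera angle or anything moving through the scene.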

Today, much has changed. WCBS local news in New York City airs its virtual set every day. Some networks and cable stations now use this technology on a regular basis. More opportunities are being made for virtual--we can now do multiple layers of transparencies, project a real shadow onto a virtual object, make virtual objects in front of talent appear realistic, and cast a shadow of the talent onto the floor. All of these advancements help to give the set more credibility.

Since our first attempts, we have begun using faster, though slightly less accurate, methods to create the textures. Lightscape proved to be too time-consuming for television. So now we paint most of the shadows and highlights in Adobe Photoshop. We occasionally use a 3D program like 3D Studio Max to do simple lighting and then render out the textures.

The future should prove very exciting and promising for this technology. Every new development in computers brings faster processors, which in turn allows us to achieve more complicated scenery and lighting. Faster processors allow us to maintain rendering speed while advancing the techniques we use. Ultimatte is about to debut a new version of its product that will give us more flexibility in the composite.

I doubt that we will ever be able to use exclusively interactive lighting and still render in real time. But I can hope for some breakthrough in the technology that will allow us to move away from the painting we currently use for shadow and light.

Rita Ann Kogler is a lighting designer in New York City. She can be reached at [email protected].

PROJECTS She-TV, SportsCentury, John Stossel Reporting, ABC News Saturday Night, Mr. Anatomy's Food Pyramid, TLC Legends with Bryant Gumbel, Vietnam: A Soldier's Story, Discovery News

EQUIPMENT Arri 2k baby fresnels; Chimera small and medium Quartz Banks; ETC 36-degree Source Fours, ETC Irideon AR5s; Kino Flo 4' 4Banks, Kino Flo Flathead 80s, Kino Flo Image 20s; Mole-Richardson 2k Zip softlights, Mole-Richardson 1k baby fresnels, Mole-Richardson 650W Tweenie fresnels

SOFTWARE Lightscape, Softimage, 3D Studio Max, Game Gen, Adobe Photoshop, Fractal Design Painter, Adobe Illustrator