What's Trending In Content Creation

In today’s visually driven world, digital media takes center stage as a major element of storytelling. The more compelling the content, the better the story is told, as visual media carries the communication from artist to audience, be it for the theatre, a concert, or an immersive digital experience. To get insight into how this is being done on the cutting edge, we asked five visual visionaries to share their favorite content creation software. Here’s a look at the technical wizardry found in their toolboxes.

Zachary Borovay

As a projection designer, one of the best parts of the job is getting to work with some of the most beautiful content in the world, from wonderful photographs to film and video. While it’s a digital game, I’m a film junkie. (It goes back generations in my family.) Even though film may be gasping its last breath, its digitally simulated offspring lives on in Alien Skin’s Exposure plug-in.

Today I will share some details on my top-secret weapon.

I would say that 70-80% of the images used in my designs have been touched by this plug-in. What I like about it more than any other film-simulator plug-in is that not only can you add or remove grain, saturation, tone, and so on, but it literally lists hundreds of brands and speeds of film. Looking for the accurate colors of Fuji Velvia 100? No problem. Black-and-white Polaroid? You betcha. How about a nice warm Kodak Ektachrome? It’s there too, not to mention a host of vintage and lo-fi film simulations and even some fun Technicolor Wizard Of Oz-type filters.

Dig a little deeper, and you’ll find some really interesting tools like bokeh adjustment, infrared simulation, vignettes, light-bleed effects, and really great dust, scratch, and aging filters. I remember working with an artist who was particularly skilled at aging images with Adobe Photoshop. He had developed a workflow, and the results were great. Then I showed him Exposure, and it all went out the window. It was light years faster and, in most cases, better. I showed another designer friend that he could export an After Effects movie as frames and then batch-process them through the plug-in in Photoshop. After they were digitally “aged,” he reimported them into AE, and he had a beautiful Technicolor movie.
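Exposure’s own processing is proprietary, but the batch idea behind that workflow is easy to sketch. Below is a minimal, hypothetical Python stand-in using Pillow and NumPy: it fakes a grain-and-vignette “aging” pass over an exported frame sequence. The folder names are placeholders, and this is nowhere near what Exposure actually does.

```python
from pathlib import Path

import numpy as np
from PIL import Image

def age_frame(img, grain=12.0, vignette=0.5, seed=0):
    """Add monochrome grain and a radial vignette to one frame."""
    rng = np.random.default_rng(seed)
    px = np.asarray(img.convert("RGB"), dtype=np.float32)
    h, w, _ = px.shape
    # Monochrome noise added equally to all channels, like film grain.
    px += rng.normal(0.0, grain, size=(h, w, 1))
    # Darken toward the corners.
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((xx - w / 2) / (w / 2), (yy - h / 2) / (h / 2))
    px *= (1.0 - vignette * np.clip(r - 0.4, 0.0, 1.0))[..., None]
    return Image.fromarray(np.clip(px, 0, 255).astype(np.uint8))

# Batch the whole exported sequence; vary the seed so grain moves per frame.
Path("frames_out").mkdir(exist_ok=True)
for i, path in enumerate(sorted(Path("frames_in").glob("*.png"))):
    age_frame(Image.open(path), seed=i).save(Path("frames_out") / path.name)
```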

Eva Perón (Elena Roger) and Che (Ricky Martin) in Evita, standing in front of the scrim with an image of the inn projected on it, while Magaldi (Max Von Essen) stands behind the scrim, lit to appear as if magically coming out of the photo. Projection by Zachary Borovay.

In the 2012 Broadway revival of Evita, we used the filter extensively. At the end of the opening number, “Requiem,” we travel back in time from Eva Perón’s funeral to her humble beginnings in Junín, where we see her boyfriend Magaldi performing at a tavern. We did a scrim bleed-through effect, fading from a photo to the real thing, at which point the scrim flew out and revealed the inn. We took several photos of the set with a Canon G1 X, covering it from as many angles as possible, then transformed them via Exposure, using a few of the manual settings as well. The results were pretty fantastic! When projected on the scrim, it really looked like a beat-up old photo.

I’ve tried a slew of photo-aging filters, and none does as good a job of transforming a digital photo into a vintage one.

Daniel Fine

Recently, I have been experimenting with stop-motion animation. I love the tactile experience of manipulating real-world objects, creating in-camera effects, and layering in digital special effects. My preferred tool for creating stop-motion content is DZED Systems’ Dragonframe. It is an industry-leading software used by the top pros at Aardman and Disney studios, yet Dragonframe’s power is easy to unlock. When I first launched the application, all I had to do was watch a few excellent tutorials, and within an hour, I was shooting.

Dragonframe has a clean, intuitive interface that offers a camera control and live view panel, allowing me to control the vital settings of my Canon 7D, such as f-stop, shutter speed, ISO, focus, and white balance. Dragonframe automatically records a full-resolution image (of my choosing) and a lower-resolution image of every frame shot. This is a great feature: realtime previewing of thousands of still frames is far less CPU-intensive when it runs from the lower-resolution stills.

For a three-minute clip, shot at 30 frames per second, I have 5,400 still photos. Multiply this by two, for the high-res and low-res versions, and I now have 10,800 photos. Suddenly, this is a file management nightmare, but the folks at DZED Systems have carefully thought through this issue. Dragonframe automatically saves the high-res and low-res images in separate folders, within project and scene folders that I create. It allows me to give a name to each scene and saves each still image with my given file name followed by a sequential number, all easily customizable in preferences. This well-planned file management system allows me to put the focus on content creation rather than worrying about managing tens of thousands of files.
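Dragonframe manages all of this internally, but the layout described above is easy to picture. Here is a hypothetical Python sketch of that kind of two-tier filing scheme; the folder structure and names are illustrative, not Dragonframe’s actual on-disk format.

```python
import shutil
from pathlib import Path

def file_frame(project, scene, name, index, hires_src, lores_src):
    """File one captured frame pair under project/scene, numbered sequentially."""
    for res, src in (("hires", Path(hires_src)), ("lores", Path(lores_src))):
        dest = Path(project) / scene / res
        dest.mkdir(parents=True, exist_ok=True)
        # e.g. scene_01/hires/shot_0001.jpg and scene_01/lores/shot_0001.jpg
        shutil.copy2(src, dest / f"{name}_{index:04d}{src.suffix}")

# 5,400 frame pairs for a three-minute clip at 30 fps, filed automatically:
# file_frame("myfilm", "scene_01", "shot", 1, "cam/full.jpg", "cam/small.jpg")
```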

Built into the user interface is the ability to see and reorder stills in a timeline view; this is a powerful option, as it allows me to navigate the stills in a familiar, non-linear video-editing interface. It is easy to advance a frame or go back to the previous frame shot for matching action and placement. Though my workflow is to import the still frames into Premiere Pro for more precise editing, Dragonframe’s timeline allows for in-app editing and can export edited sequences as movies or image sequences. Other useful features include: selecting the number of frames to shoot at a time, being able to log notes on each frame recorded, enabling shooting grid overlays, importing sound files, and importing mask overlays or videos/stills for in-app compositing.
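Outside Dragonframe, that same sequence-to-movie step can be reproduced with a tool like ffmpeg. A minimal sketch, assuming ffmpeg is installed and reusing the hypothetical file names from the sketch above:

```python
import subprocess

# Assemble a sequentially numbered still sequence into a movie.
# Assumes ffmpeg is on the PATH; file names match the sketch above.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",                    # play the stills back at 30 fps
    "-i", "scene_01/hires/shot_%04d.jpg",  # sequential frames
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",                 # broadest player compatibility
    "scene_01.mp4",
], check=True)
```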

Everybody’s Talkin’: The Music Of Harry Nilsson at the San Diego Rep, 2015. Projection and photo by Daniel Fine.

Dragonframe has a built-in option to shoot in stereoscopic 3D, which I look forward to testing on an upcoming project. The software allows for control over vital stereoscopic elements, such as parallax adjustment, and creates two images for every frame, one for the left eye and one for the right. It will also export still sequences as a 3D movie.
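Dragonframe generates the eye pairs itself from the camera setup; purely to illustrate what a parallax adjustment controls, here is a hypothetical Python sketch that fakes a left/right pair by shifting one flat frame horizontally.

```python
from PIL import Image, ImageChops

def fake_stereo_pair(frame, parallax_px=10):
    """Fake a left/right eye pair by shifting one flat frame horizontally.

    Real stereo comes from two camera positions; this only illustrates
    what the parallax parameter adjusts. ImageChops.offset wraps pixels
    around the edge, which is acceptable for a demonstration.
    """
    left = ImageChops.offset(frame, parallax_px // 2, 0)
    right = ImageChops.offset(frame, -(parallax_px // 2), 0)
    return left, right

frame = Image.open("scene_01/hires/shot_0001.jpg")  # hypothetical path
left, right = fake_stereo_pair(frame)
left.save("shot_0001_L.jpg")
right.save("shot_0001_R.jpg")
```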

But Dragonframe’s power doesn’t stop here. DZED Systems and affiliated third parties offer a host of hardware that extends the software. Dragonframe ships with a USB keyboard controller, a really useful tool that lets me step away from my computer and still control all the vital elements of the software while shooting. Several other hardware devices extend Dragonframe for realtime motion control equipment, including complex, multi-axis camera moves. I look forward to integrating their DMX control device into my workflow in order to program lighting cues directly within Dragonframe. For the DIY crowd, there is even a sketch that allows Dragonframe to interact with an Arduino to control motion tracking systems, cameras, and other hardware devices.

All these powerful features and the ability to easily integrate raw footage or image sequences into Adobe Premiere and After Effects make Dragonframe my essential tool when creating stop-motion content.

Mary Franck

For the last several years, I have been hooked on making realtime content with Derivative TouchDesigner. TouchDesigner is a graphic programming environment, so it’s not a traditional content-production tool. Developed by Derivative, whose founder came out of Side Effects Software, it shares DNA with Houdini. The developers’ main intent is to enable realtime content and live video performance. I wouldn’t say that realtime media will ever replace carefully crafted pre-rendered content, but it’s very complementary, and for some kinds of shows, it makes way more sense.
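TouchDesigner networks are wired together visually, but every operator is also scriptable in Python, which is where the realtime character shows. As a tiny hedged example, a CHOP Execute callback could push a live audio level straight into render parameters. The operator names here are hypothetical, not from any particular project.

```python
# Hypothetical CHOP Execute DAT callback in TouchDesigner, watching an
# audio-level CHOP. Operator names ('geo1', 'level1') are placeholders.

def onValueChange(channel, sampleIndex, val, prev):
    # Every change arrives live: spin the geometry and lift the image
    # brightness with the music, with no pre-rendering step.
    op('geo1').par.ry = val * 360.0
    op('level1').par.opacity = 0.5 + val * 0.5
    return
```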

I’ve also used TouchDesigner to build rather sophisticated custom media servers, but that’s another story. The realtime aspect comes into its own in large-scale immersive and projection-mapping productions: being able to see every adjustment in the space as you make it makes TouchDesigner unparalleled for custom large-scale production.

TouchDesigner is very flexible in terms of combining it with other tools, so for example, I have been developing workflows for making animated geometry in Autodesk Maya and then using it in realtime. Camera-matching between the two also allows for mixing pre-rendered and realtime content. Another technique I’m into right now is generating 3D data with algorithmic tools like Rhino/Grasshopper, and then using that as a data set to visualize. I make what I think of as visual instruments that allow me to improvise and have content that endlessly varies.
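As one hedged sketch of that Rhino/Grasshopper handoff: if the Grasshopper side bakes its points out as CSV rows, a Script SOP in TouchDesigner can rebuild them as live geometry. The file name and column layout here are assumptions for illustration.

```python
# Hypothetical Script SOP callback: rebuild points exported from
# Grasshopper (CSV rows of x,y,z) as live TouchDesigner point geometry.
import csv

def onCook(scriptOp):
    scriptOp.clear()
    with open('shell_points.csv') as f:  # assumed export path
        for x, y, z in csv.reader(f):
            pt = scriptOp.appendPoint()
            pt.P = (float(x), float(y), float(z))
    return
```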

Mary Franck and Kadet Kuhne in Carapace, an immersive AV performance. Photo by Sebastien Roy.

Last year, I made an immersive AV performance called Carapace along with Kadet Kuhne at the Société des Arts Technologiques. The idea for the show was that the viewer was moving into and through a series of elaborate, abstract spaces that represented the interiors of the self. The geometries of those spaces were inspired by shells and sea creatures. I rendered the full-dome show in realtime. Designing content for a 16:9 screen versus designing for a 360° dome requires completely different compositions and scales. So it was amazing to be able to adjust every element of the show during the residency in the space. Kadet took advantage of the incredible surround sound system in the dome, and we were able to match our spatializations. Even in our last rehearsals, I could fine-tune what I wanted to.

I’ve been using TouchDesigner for long enough that it feels like a native language to me, but I see people from other backgrounds pick it up and quickly realize how it can let them reimagine the ways that they present and produce content.

Matthew Ragan

Working as an interactive engineer at Obscura Digital, I spend much of my time developing software for installations and events. Recently, I worked on some new visual elements for AT&T Stadium (formerly Cowboys Stadium) in Arlington. Our team of four developers was in charge of updating the media playback system, which included a custom playback engine developed by Obscura. My contribution to the project focused on the LiveFX Board, a 130' high-resolution display made of 40 spinning LED screens with mirrored backs and strobe lights; the project was about finding ways to take advantage of this machinery in new and exciting ways.

Working alongside the other programmers, content developers, systems engineers, and producers on this project, we had an opportunity to work at the intersection of pre-rendered content and generative systems, the kind of work I find most exciting. My work here was done in Derivative TouchDesigner. This single tool gave us the flexibility to build an audio analysis engine, a realtime 3D rendering environment, a DMX control system, and a visualization tool to simulate how the actual machinery would respond in the stadium.
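Obscura’s engine itself isn’t public, but as a generic sketch of what the audio-analysis piece of such a system does, here is a NumPy band-energy split over one block of samples. The band edges are arbitrary choices, not Obscura’s.

```python
import numpy as np

def band_levels(samples, rate=44100):
    """Split one block of audio samples into low/mid/high band energies
    that visual parameters can follow. Band edges are arbitrary."""
    windowed = samples * np.hanning(len(samples))   # taper block edges
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    bands = {"low": (20, 250), "mid": (250, 2000), "high": (2000, 16000)}
    return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in bands.items()}

# e.g. feed each block of a live input through band_levels() and map the
# three energies onto brightness, motion speed, and strobe rate.
```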

Obscura routinely builds its own software to meet the challenges of working with large arrays of servers and projectors. TouchDesigner is one of our core development tools: it lets us build software that scales, runs on any number of media servers, and is modular by nature, which keeps our development process nimble. It also allows the programmer to peek into any process and see the mechanics of what’s happening in the code.

Where, then, does this fall as a content creation tool? Generative and interactive installations often require that media feel alive, that it present as something listening and reacting to participants, not fixed or looped. In this way, procedural media-making pushes the programmer/artist to think about programmatic and conceptual frameworks for generating visual landscapes with code.

Initially, that was an intimidating prospect. Over time, however, it has only become more and more thrilling. Computers are very fast at tackling repetitive tasks, and the more you begin to think procedurally about the world, the more opportunities you find to create art that might otherwise take lifetimes. Want to visualize the entire works of Shakespeare as a barcode? Sure, why not. Want to fly through a cloud of Tweets grouped by hashtag? Let’s get started. I often find that the most abstract procedural investigations help to inform the kinds of aesthetics I want to explore. This is one of the things that makes working with TouchDesigner so exciting. The only limitations are the ones you impose on yourself.
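The Shakespeare barcode really is almost that simple. Here is a toy Python version with Pillow; the text-to-hue mapping is an arbitrary choice made for this sketch.

```python
import hashlib
from PIL import Image

def text_barcode(text, stripes=800, height=200):
    """One vertical stripe per chunk of text; hue hashed from the chunk.
    The hash-to-hue mapping is an arbitrary choice for this sketch."""
    img = Image.new("HSV", (stripes, height))
    chunk = max(1, len(text) // stripes)
    for i in range(stripes):
        piece = text[i * chunk:(i + 1) * chunk].encode("utf-8")
        hue = hashlib.md5(piece).digest()[0]  # first hash byte: 0-255
        for y in range(height):
            img.putpixel((i, y), (hue, 200, 230))
    return img.convert("RGB")

text_barcode(open("shakespeare.txt").read()).save("barcode.png")
```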

Laura Frank

MTV VMAs design by Laura Frank.

Coming from a background in lighting consoles, I have a special place in my heart for Apple’s Motion. I spend most of my time in Adobe After Effects, Illustrator, and Maxon Cinema 4D, but when I need to quickly throw together some eye candy or rhythmic visual effects, I open up Motion.

The great thing about this software tool is that it has a lighting-style effects engine for pixels. I recently worked on a project using the ROE Hybrid 18mm screen. This screen has embedded spot LEDs on 150mm spacing within the lattice of the 18mm LED surface. I gave the designers delivering files to me the option of using that space or leaving it to me to fill in with lighting-style effects. Above is a shot of the outdoor stage at the MTV VMAs, where you can see the LED spots pop through a blue field on the upstage wall. In Motion, I use a tool called the Replicator that gives me many of the same “handles” I’d get on a lighting desk: parameters like rate, random seed, shape, and direction work together to quickly create activity in your composition. There is also a library of Particle Emitters that is very useful, making the tool quick and easy to learn. And Apple sells it for $50! I don’t use this tool as much anymore; as resolutions increase, I need more powerful tools to create detailed content. But in a low-res environment where you want to interact with your system through lighting-style effects, I highly recommend this software.
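Motion’s Replicator is a native tool, but those handles map cleanly onto a few lines of procedural logic. As a hypothetical illustration only (not Motion’s engine), here is a replicator-style cell grid in Python where rate and a fixed random seed drive per-cell pulses.

```python
import math
import random
from PIL import Image

def replicator_frame(t, cols=16, rows=9, cell=40, rate=2.0, seed=7):
    """Render one frame of a pulsing cell grid: same seed, same pattern."""
    rng = random.Random(seed)                  # the "random seed" handle
    phases = [[rng.random() for _ in range(cols)] for _ in range(rows)]
    img = Image.new("RGB", (cols * cell, rows * cell))
    for r in range(rows):
        for c in range(cols):
            # The "rate" handle sets how fast each cell pulses over time.
            level = 0.5 + 0.5 * math.sin(2 * math.pi * (rate * t + phases[r][c]))
            # Fill the cell with a blue level, like spots through a blue field.
            img.paste((0, 0, int(level * 255)),
                      (c * cell, r * cell, (c + 1) * cell, (r + 1) * cell))
    return img

for f in range(90):                            # three seconds at 30 fps
    replicator_frame(f / 30.0).save(f"fx_{f:04d}.png")
```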

For more, download the October issue of Live Design for free onto your iPad or iPhone from the Apple App Store, and onto your Android smartphone and tablet from Google Play. 
