I admit it. I’m in love with particles. Who isn’t? For some time, I’ve been creating with Red Giant’s Trapcode Particular After Effects plugin for rendered content and Troikatronix’s Isadora for real-time content. In the coming months, I’ll be diving into Derivative’s TouchDesigner and test-driving Notch for more real-time particle fun at upcoming VJ gigs. In the meantime, I had three lectures/demos to present, and I was looking for a quick way to include particles and liven up the usual “professor stands in front of a room and talks” routine.
I needed a real-time generator that was more of a rapid-prototyping tool—something I could easily and quickly get up and running. Alex Oliszewski introduced me to some free iPad apps that we played around with one night—you know, just the thing a couple of digital media designers do on a Friday night for fun—hooking up our iPads to a projector and a sound system and rapid-fire creating some playful particles.
This led me down the rabbit hole of downloading and purchasing every available iPad particle-generating app. The one that rose to the top of the pile was Ions 2+, created by Douglas Applewhite. It is available for iPhones, iPods, and iPads. At 99 cents, it packs a pretty powerful punch. It comes pre-loaded with templates for different types of particle generators, but the true power lies in the ability to change and save those templates or build your own from scratch. It allows control of particle count (between 10,000 and 100,000), color, size, and rotation over a particle’s life, as well as control over forces such as gravity, turbulence, and drag. Other nice features include the ability to pause and play the engine, to customize the background and how the particles blend with it, and to hide the control menu.
You can add as many particle emitters as you’d like, with the ability to change the count, speed, direction, and orientation of each emitter independently. You can also create fields that can be customized to attract or repel particles. Because it is designed for the iPad, the app has built-in touch capabilities: you can use all ten fingers as independent particle emitters or fields, or you can disable touch completely. This is where the true fun and interactivity of the app come into play.
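For readers curious what an engine like this is doing under the hood, here is a minimal sketch in Python of the core update loop: particles with a lifetime, a gravity and drag force, an emitter, and point fields that attract or repel. The class and parameter names are my own illustration, not anything from Ions 2+, and the inverse-square field model is an assumption about how such fields typically work.

```python
import math
import random

class Particle:
    def __init__(self, x, y, vx, vy, life):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy
        self.life = life  # remaining lifetime in seconds

class Field:
    """A point that attracts (strength > 0) or repels (strength < 0) particles."""
    def __init__(self, x, y, strength):
        self.x, self.y, self.strength = x, y, strength

class ParticleSystem:
    def __init__(self, gravity=(0.0, -9.8), drag=0.1):
        self.particles = []
        self.fields = []
        self.gravity = gravity
        self.drag = drag

    def emit(self, x, y, count, speed, life=2.0):
        # Spray `count` particles outward in random directions.
        for _ in range(count):
            angle = random.uniform(0.0, 2.0 * math.pi)
            self.particles.append(Particle(
                x, y, speed * math.cos(angle), speed * math.sin(angle), life))

    def update(self, dt):
        for p in self.particles:
            ax, ay = self.gravity
            # Each field adds an inverse-square pull (or push) toward its center.
            for f in self.fields:
                dx, dy = f.x - p.x, f.y - p.y
                d2 = dx * dx + dy * dy + 1e-6  # avoid division by zero
                ax += f.strength * dx / d2
                ay += f.strength * dy / d2
            # Drag opposes the current velocity.
            ax -= self.drag * p.vx
            ay -= self.drag * p.vy
            p.vx += ax * dt
            p.vy += ay * dt
            p.x += p.vx * dt
            p.y += p.vy * dt
            p.life -= dt
        # Cull particles whose lifetime has expired.
        self.particles = [p for p in self.particles if p.life > 0]
```

In a real app, each touch point would simply add an emitter (calling `emit` every frame at the finger’s position) or a `Field`, which is what makes the ten-finger interaction feel so immediate.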
OK, getting back to the three lectures…Rather than talk at the audience the entire time, I created a sort of interactive VJ rig, where I could use Ions 2+ to draw and write with particles—both as an alternative way to communicate with the audience and as a way to let them participate in the lecture by playing with it themselves. For this setup, I used a Lightning-to-HDMI dongle to get a video feed out of my iPad Pro, connected the HDMI to an AJA U-Tap HDMI capture box, and brought that video capture into Isadora to map and further manipulate. This allowed me to extend control of the images created in Ions 2+. This workflow worked out great for the lectures, and I continue to adapt and use it.
Around this time, I was gearing up to create content for a new work for piano, electronics, and full orchestra that I was designing for the Sioux City Symphony Orchestra. My concept was to have small, particle-driven white lines representing the string section that would grow and change with the music. I turned to Trapcode Particular and a few different audio-reactive tools, but I was not satisfied with the results. Everything seemed too mechanical; I was missing a more organic feel. I was also looking for a way to create quickly, rather than waiting on renders.
Now, I know what you are thinking…It was the same thing that I thought…“A 99-cent iPad app might be fine for a few lectures, but to create content for a professional gig? No way!”
Well, I decided to dive in. Since I was projecting onto a black scrim, I knew I had a very forgiving surface. I set up four custom particle looks, one for each movement of the concerto. I then hooked my iPad Pro to my computer via the AJA capture box and recorded the iPad screen so I could further manipulate the content. I put on some headphones, fired up a recording of the music, and drew/generated particles with my fingers. The result was exactly what I was looking for: particle lines that were extremely organic in their movement and quickly responsive to the music. Because the particles were manipulated in real time by my fingers, Ions 2+ became a sort of digital video content instrument, allowing me to play/create along with the music.
I imported the captured video files from Ions 2+, further manipulated them in After Effects, and composited them with other original content. I was surprised and delighted at how this app and workflow met all my artistic needs and allowed me to rapidly create particle-driven content. I will definitely be sharing this method with my students as an introduction to particle generation that gives them easy entry into this type of content creation. Sometimes the best solutions don’t come with high price tags. If only everything in digital media design cost 99 cents.
Check out the hands-on software courses at LDI2017, including:
- content creation and Isadora, taught by Alex Oliszewski
- TouchDesigner, taught by Matthew Ragan
- Notch
Daniel Fine is an assistant professor of Digital Media in Performance at The University of Iowa. He is an artist, scholar, and technologist working in immersive, responsive, mediated environments, site-specific locations and installations for interactive users, audiences and live performance. He will participate in the two-day Projection Mapping Summit on November 15 & 16 at LDI2017.