In one word: laziness. I made my first motion graphics in After Effects, but I quickly got tired of keyframes, easing, and rendering time. I looked for a way to create animations without going frame-by-frame, and eventually found NodeBox 3, created by researchers at the Sint Lucas School of Arts in Belgium. It’s a visual programming environment, one that I liken to drawing, except with functions (translate, rotate, copy, etc.) instead of pencils. I’ve been using it to create animations for about a year, though none of those were sound-sensitive. Almost all the GIFs you see on this blog (with the exception of the two most recent) were created with NodeBox.
The program I use to create sound-sensitive animations is called Processing (or P5 for short), a Java-based language and environment geared specifically toward creative applications, created by graduate students at the MIT Media Lab. Writing code was intimidating at first, but the more I use Processing, the more I want to abandon NodeBox’s arbitrary limitations entirely. To make sound-reactive visuals, I use the bundled Minim library, which provides many tools for audio analysis; the results of that analysis can then be used as variables to control various program states. In my case, these are usually the position, size, shape, etc. of geometric objects. I really enjoy using audio frequencies as variables because they strike the balance between geometric and organic form that I’ve been seeking for a long time.
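The mapping step itself is simpler than it sounds: Minim’s FFT class gives you a magnitude per frequency band each frame (via `getBand(i)` inside `draw()` in Processing), and you rescale one of those magnitudes into whatever range your shape parameter expects. Here is a minimal sketch of that idea in plain Java, so it runs outside Processing; the class and helper names are mine, and the band values stand in for a real FFT frame:

```java
public class AudioMapping {
    // Mirror of Processing's map(): rescale value from [inMin, inMax] to [outMin, outMax].
    static float map(float value, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * ((value - inMin) / (inMax - inMin));
    }

    // Turn one band magnitude (what fft.getBand(i) would return) into a circle
    // radius in pixels, clamped so loud peaks don't blow past the maximum.
    static float radiusFromBand(float magnitude, float maxMagnitude) {
        float clamped = Math.min(magnitude, maxMagnitude);
        return map(clamped, 0f, maxMagnitude, 10f, 100f);
    }

    public static void main(String[] args) {
        // Pretend these are bass-band magnitudes from three successive FFT frames.
        float[] bands = {0.0f, 0.5f, 1.0f};
        for (float b : bands) {
            System.out.println(radiusFromBand(b, 1.0f)); // 10.0, 55.0, 100.0
        }
    }
}
```

In an actual Processing sketch, the body of the loop would instead live in `draw()`, with the radius fed straight into an `ellipse()` call, which is all it takes for the geometry to pulse with the music.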