Fractal Flame was developed as the final project for the Workshops in Creative Coding class taught by Max Wogan. Shapes not only served as an introduction to iOS programming using openFrameworks, but also to Objective-C, which I used to code the main menu myself. Fractal Flame built upon these findings, but in a more artistic setting.

The project is a responsive audio-visual installation exploring the usefulness of an iOS device for interaction. The device controls which animation is displayed (generated by another oF project on a remote computer), and allows the user to create or delete animations. The device's orientation determines which musical instrument is selected (drum machine, polyphonic synth, polyphonic timestretcher, polyphonic time freezer), or stops audio altogether. Each touch layers animations on the screen and adds an extra voice to the instrument (up to eight touches are supported). In the case of the synth, the accelerometer's x, y and z axes are used as pitch, volume and spatialization modifiers. For the other instruments, touch position and duration affect parameters such as volume/spatialization, speed/pitch, probability of being triggered, etc. Each touch is represented to the user by a ball of random lines.

Apologies for the poor video quality. Also, the iOS simulator doesn't support shake gestures or accelerometer data, and allows only two simultaneous touches, so not much can be inferred from this video; the synth and the other accelerometer-driven instruments couldn't be demoed. I would like to set this up in our studio in the New Year, and perhaps merge both applications together as an audio generator for iOS.