ImOnFire (Jeremy's Final Project)
Source Code, Setup Diagram & Collected Musings
Available on extended entry.
This project came out of experiments with particle systems using the traer.physics library in Processing. Initially I was using particle systems to optimize visualizations of scale-free network topologies (don't ask), but the physics model proved so convincing in that context that I began imagining other uses for it. After a week I had made several improvements to the network example patch and turned my attention to the smoke simulation, where I found much room for improvement. By adding color and shape to the particles and by creating, manipulating, and destroying forces within the system, I was able to model a coarse flame. Applying a Gaussian blur to the result yields a rather convincing flame, but it also forces a slower framerate, which detracts from the realism.
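To give a feel for the approach, here is a minimal sketch of the flame idea (not the project's actual source, which appears below). It assumes the traer.physics 3.x API (ParticleSystem, makeParticle, tick, position/velocity vectors); the drag, lifespans, colors, and pool size are illustrative guesses, and it skips the force creation/destruction described above.

import traer.physics.*;

int N = 150;                     // size of the recycled particle pool
ParticleSystem physics;
Particle[] flame = new Particle[N];
float[] age = new float[N];
float lifespan = 60;             // frames a particle lives before being respawned

void setup() {
  size(320, 480);
  noStroke();
  physics = new ParticleSystem(0, 0.05);         // no gravity, light drag
  for (int i = 0; i < N; i++) {
    flame[i] = physics.makeParticle(1.0, 0, 0, 0);
    respawn(i);
    age[i] = random(lifespan);                   // stagger the pool so it doesn't pulse
  }
}

// Put a particle back at the base of the flame with a small upward kick.
void respawn(int i) {
  flame[i].position().set(width/2 + random(-10, 10), height - 20, 0);
  flame[i].velocity().set(random(-0.4, 0.4), random(-3, -1), 0);
  age[i] = 0;
}

void draw() {
  background(0);
  physics.tick();                                // advance the physics model one step

  for (int i = 0; i < N; i++) {
    age[i]++;
    float t = age[i] / lifespan;                 // 0 = newborn, 1 = ready to recycle
    if (t >= 1) { respawn(i); t = 0; }

    // Color and shrink each particle with age: bright yellow core fading out to red.
    fill(lerpColor(color(255, 230, 80), color(200, 0, 0, 0), t));
    float d = lerp(22, 4, t);
    ellipse(flame[i].position().x(), flame[i].position().y(), d, d);
  }

  filter(BLUR, 3);                               // Gaussian blur smears the dots into a flame
}

The final filter(BLUR) call is also where the framerate cost mentioned above comes from; it runs over the whole frame every draw cycle.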
SLIGHTLY MORE INTERESTING STUFF
With the fire model in hand I turned my attention to the user interface. I wanted the piece to require a minimum of physical engagement from viewers while inviting them into an imaginary space that they could manipulate with their bodily gestures. It was important to me that the piece require no actual physical contact, that its behaviors be quickly and intuitively understood, and that it respond to the participants as they respond to it. The last of these criteria is perhaps the most important because it necessitates a responsive space that produces nothing without participant engagement and falls dormant when the participant has left. This approach ensures the sense of discovery and play I hoped to achieve with the piece.
The most obvious method of extracting movement data from the participants was video motion capture. I had previously done motion capture in Jitter using a combination of frame differencing and color tracking. For this application, however, I wanted to do all of the video processing live inside the Processing environment (mostly for stability). This required deconstructing the underlying concepts a bit and devising some new methods for thinning the parametric data stream. In the end, I feel the implementation is fairly elegant and processor-friendly.
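The sketch below shows one bare-bones way to do this kind of live frame differencing with Processing's standard video library. The brightness threshold, the coarse sampling grid, and the single motion-magnitude-plus-centroid output are my own illustrative choices for thinning the data, not necessarily how the project does it.

import processing.video.*;

Capture video;
int[] prevFrame;                 // pixels from the previous frame
int threshold = 40;              // per-pixel brightness change that counts as motion
int step = 8;                    // sample every 8th pixel to thin the data stream

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
  prevFrame = new int[width * height];
}

void draw() {
  if (!video.available()) return;
  video.read();
  video.loadPixels();
  image(video, 0, 0);

  float motion = 0;              // total amount of change this frame
  float cx = 0, cy = 0;          // centroid of the moving pixels
  int count = 0;

  // Compare the current frame to the previous one on a coarse grid.
  for (int y = 0; y < height; y += step) {
    for (int x = 0; x < width; x += step) {
      int i = y * width + x;
      float diff = abs(brightness(video.pixels[i]) - brightness(prevFrame[i]));
      if (diff > threshold) {
        motion += diff;
        cx += x;
        cy += y;
        count++;
      }
      prevFrame[i] = video.pixels[i];
    }
  }

  // The thinned result: one motion magnitude and one centroid per frame,
  // which is roughly the amount of data a particle system needs to respond to.
  if (count > 0) {
    fill(255, 0, 0);
    ellipse(cx / count, cy / count, 20, 20);
  }
}

Keeping everything in one Processing sketch like this avoids piping tracking data in from a separate Jitter patch, which is the stability argument made above.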
In the prototype installation of the piece I was happy to watch participants actively engage with the projection. Watched from a spot on axis with the projection, the participants appear to be practicing improvised modern dance moves. This makes me giggle and may be an area of the work to exploit in future iterations.
PROCESSING SOURCE CODE