I was happy to finally get all the components I needed to make a video interactive. I used a distance sensor and a Bit Whacker microcontroller, connected to a Max/MSP patch that took input from the sensor, converted it to numbers, and used those numbers to change the opacity of the video (I had two videos layered on top of one another). The patch could also do things like change the exposure or speed of the video. Pretty cool. Andy Mattern showed me how to use the patch. It looks like this:
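The core idea of the patch, scaling a sensor reading into an opacity for the top video layer so the two layers crossfade, can be sketched in Python. The 0-1023 range is my assumption (a typical 10-bit analog reading), not something from the actual patch:

```python
def sensor_to_opacity(reading, sensor_min=0, sensor_max=1023):
    """Map a raw sensor reading to a 0.0-1.0 opacity for the top video layer.

    The 0-1023 input range is an assumption; the real Max patch scales
    whatever range the Bit Whacker actually sends over serial.
    """
    reading = max(sensor_min, min(sensor_max, reading))  # clamp stray values
    return (reading - sensor_min) / (sensor_max - sensor_min)

# Crossfade: as the top layer fades in, the bottom layer fades out,
# so a viewer walking closer gradually reveals the second video.
top = sensor_to_opacity(512)
bottom = 1.0 - top
```

In the actual piece this scaling happens inside Max/MSP; the Python version is just to show the math.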
Here's the actual file:
I used a SparkFun Bit Whacker as my microcontroller instead of our usual Arduino because, as I understand it, it speaks to Max/MSP more easily. Here's a link to how to set up the Bit Whacker.
I ended up with a new distance sensor, because the ultrasonic sensor I originally bought required a trigger pulse from the microcontroller, and by the time I figured that out I didn't have time to learn a new language to program a new microcontroller. In the presentation of the piece, the projected video changed as people walked up to it: the tree went from being a plain tree to having gold shine through it.
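For anyone curious why the ultrasonic sensor was more work: instead of outputting a distance directly, it answers a trigger pulse with an echo pulse, and the microcontroller has to time that echo and convert it to distance with the speed of sound. A rough sketch of that conversion (my own illustrative helper, not code from the piece):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # roughly 343 m/s at room temperature

def echo_to_distance_cm(echo_duration_us):
    """Convert an ultrasonic echo pulse width (in microseconds) to distance in cm.

    The echo covers the round trip out to the object and back,
    so the one-way distance is half the total travel.
    """
    return echo_duration_us * SPEED_OF_SOUND_CM_PER_US / 2
```

The analog distance sensor I switched to skips all of this: it just outputs a voltage the Bit Whacker can read directly.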