
December 23, 2009

Slit Scan Box model

Art Box.jpg

This is the model of my slit scan box. Inside I installed an i-cam to capture footage as people moved a wooden block, with a cloth pattern on its bottom, across the clear plexiglas surface. An LED light was also installed inside so that the pattern would read clearly within the enclosed space.

-Mike Ballard
fabric1.jpg

fabric3.jpg

fabric2.jpg

December 22, 2009

Independent Project: Mother Me?


One-on-One Sketch-crop8xAdjs.jpg

Sketch reflects initial design for belt

This project is a prototype interactive wearable belt. It has been designed with two focuses in mind. On the one hand, it "heralds" a future in which cloning and "petri-dish pregnancies" are the norm. A woman may choose to carry her fetus to full term in a wearable incubator (re-sized to match the growth of the fetus). In this scenario, the woman's partner may also share in the carrying to full term.

The second focus relates to women's experiences of strangers giving themselves permission to touch their pregnant stomachs. The belt has two modes of operation: non-touch and touch. In non-touch mode, a red "warning" light flashes in the capsule with the modeled fetus to discourage touching. In touch mode, a series of interactions are possible. The cast rubber part has two touch zones. Touching the container with the fetus embryo model triggers a white LED in the fetus to pulse. Depending on how the cast rubber part of the piece is touched, different sequences of blue LEDs blink or pulse.

Neutral6xS.jpg

Wearable not activated.

Alarm6xB+SS.jpg

Wearable in Don't-Touch mode

Play6x.jpg

Wearable in Touch mode


The following code implements the second focus (touch-mode) operations. Note: the code for using a Rogue Robotics sound shield to play sound clips along with the LED lights is commented out; that interface is not working at this time.

One_on_One_AllZonesNoSound.pde
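
Since the .pde file is linked above rather than pasted in, here is only a rough sketch of the touch-mode behavior just described, not the belt's actual code; the pin numbers and the simple switch-style touch sensing are placeholders for illustration.

// Hypothetical sketch of the touch-mode interactions (NOT One_on_One_AllZonesNoSound.pde).
// Pin assignments and the contact-style touch sensing are assumptions.
int fetusTouchPin = 2;               // contact under the fetus capsule (assumed wiring)
int rubberTouchPin = 3;              // contact in one cast-rubber touch zone (assumed wiring)
int whiteLedPin = 9;                 // white LED inside the fetus model (PWM pin)
int blueLedPins[] = {5, 6, 10, 11};  // blue LEDs in the rubber part (PWM-capable pins)

void setup() {
  pinMode(fetusTouchPin, INPUT);
  digitalWrite(fetusTouchPin, HIGH);   // enable internal pull-up; a touch pulls the pin low
  pinMode(rubberTouchPin, INPUT);
  digitalWrite(rubberTouchPin, HIGH);
  pinMode(whiteLedPin, OUTPUT);
  for (int i = 0; i < 4; i++) {
    pinMode(blueLedPins[i], OUTPUT);
  }
}

void loop() {
  // touching the fetus container makes the white LED pulse once
  if (digitalRead(fetusTouchPin) == LOW) {
    for (int b = 0; b <= 255; b += 5) { analogWrite(whiteLedPin, b); delay(10); }
    for (int b = 255; b >= 0; b -= 5) { analogWrite(whiteLedPin, b); delay(10); }
  }
  // touching the rubber zone steps the blue LEDs through a blink sequence
  if (digitalRead(rubberTouchPin) == LOW) {
    for (int i = 0; i < 4; i++) {
      digitalWrite(blueLedPins[i], HIGH);
      delay(100);
      digitalWrite(blueLedPins[i], LOW);
    }
  }
}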


Final Project - Alec Rippberger


This is the completed physical tree, allowing you to interact with the virtual tree through agitation.
This is the switch that converts physical interaction into the digital realm. The center ball of aluminum foil is attached to a wire connected to digital pin 4 of the Arduino. The coil of aluminum surrounding the ball is attached to a wire connected to the Arduino's ground. When the tree is agitated, the ball makes contact with the coil, completing a circuit and sending a digital signal to the Arduino.

This digital signal is then sent to a MacBook Pro using the Firmata firmware uploaded to the Arduino board. When the data reaches the computer it is converted by another program, serproxy, so that it can be interpreted by Adobe Flash. Flash then interprets the data using a library known as AS3 Glue.

The Flash ActionScript code I used:

stop();
import flash.utils.*;
import net.eriksjodin.arduino.Arduino;
import net.eriksjodin.arduino.events.ArduinoEvent;
import net.eriksjodin.arduino.events.ArduinoSysExEvent;
import flash.utils.ByteArray;
import flash.events.Event;
var a:Arduino;
var numEvents:Number=0;
var leaf:uint=0;
var frame:uint=1;
var myTimer:Timer=new Timer(200);
myTimer.addEventListener("timer", checkLeaves);
myTimer.start();

// connect to a serial proxy on port 5331
a=new Arduino("127.0.0.1",5331);

// listen for connection
a.addEventListener(Event.CONNECT,onSocketConnect);
a.addEventListener(Event.CLOSE,onSocketClose);

// listen for firmware (sent on startup)
a.addEventListener(ArduinoEvent.FIRMWARE_VERSION, onReceiveFirmwareVersion);

// listen for data
a.addEventListener(ArduinoEvent.ANALOG_DATA, onReceiveAnalogData);
a.addEventListener(ArduinoEvent.DIGITAL_DATA, onReceiveDigitalData);

//listen for sysex messages
a.addEventListener(ArduinoSysExEvent.SYSEX_MESSAGE, onReceiveSysExMessage);

// triggered when a serial socket connection has been established
function onSocketConnect(e:Object):void {
    trace("Socket connected!");
    // request the firmware version
    a.requestFirmwareVersion();
}

// triggered when a serial socket connection has been closed
function onSocketClose(e:Object):void {
    trace("Socket closed!");
}

// trace out analog data when it arrives...
function onReceiveAnalogData(e:ArduinoEvent):void {
    trace((numEvents++) + " Analog pin " + e.pin + " on port: " + e.port + " = " + e.value);
}

// trace out digital data when it arrives, and count each event as a "leaf"
// (the tree switch closes each time the physical tree is agitated)
function onReceiveDigitalData(e:ArduinoEvent):void {
    trace((numEvents++) + " Digital pin " + e.pin + " on port: " + e.port + " = " + e.value);
    leaf = leaf + 1;
}

// trace incoming sysex messages
function onReceiveSysExMessage(e:ArduinoSysExEvent):void {
    trace((numEvents++) + " Received SysExMessage. Command: " + e.data[0]);
}

// the firmware version is requested when the Arduino class has made a socket connection.
// when we receive this event we know that the Arduino has been successfully connected.
function onReceiveFirmwareVersion(e:ArduinoEvent):void {
    trace("Firmware version: " + e.value);
    if (int(e.value) != 2) {
        trace("Unexpected firmware version encountered! This version of as3glue was written for Firmata2.");
    }
    // the port value of an event can be used to determine which board the event was dispatched from
    // this is one way of dealing with multiple boards, another is to add different listener methods
    trace("Port: " + e.port);

    // do some stuff on the Arduino...
    initArduino();
}

function initArduino():void {
    trace("Initializing Arduino");

    // set a pin to output
    a.setPinMode(13, Arduino.OUTPUT);

    // set a pin to high
    a.writeDigitalPin(13, Arduino.HIGH);

    // turn on pull up on pin 4
    a.writeDigitalPin(4, Arduino.HIGH);

    // set digital pin 4 to input
    a.setPinMode(4, Arduino.INPUT);

    // enable reporting for digital pins
    a.enableDigitalPinReporting();

    // disable reporting for digital pins
    //a.disableDigitalPinReporting();

    // enable reporting for an analog pin
    a.setAnalogPinReporting(3, Arduino.ON);

    // disable reporting for an analog pin
    //a.setAnalogPinReporting(3, Arduino.OFF);

    // set a pin to PWM
    a.setPinMode(11, Arduino.PWM);

    // write to PWM (0..255)
    a.writeAnalogPin(11, 255);

    // trace out the most recently received data
    //trace("Analog pin 3 is: " + a.getAnalogData(3));
    //trace("Digital pin 4 is: " + a.getDigitalData(4));
}

// every 200 ms, compare the count of recent "leaf" events against a threshold and
// move the timeline forward (tree grows) or backward (tree shrinks), then decay the count
function checkLeaves(eventArgs:TimerEvent):void {
    if (leaf >= 10) {
        nextFrame();
        trace("increase");
        // decay the counter faster the larger it has grown
        if (leaf > 1000) {
            leaf = leaf - 100;
        }
        if (leaf > 100) {
            leaf = leaf - 10;
        }
        if (leaf > 10) {
            leaf = leaf - 1;
        }
        leaf = leaf - 1;
    } else {
        prevFrame();
        trace("decrease");
        if (leaf > 1000) {
            leaf = leaf - 100;
        }
        if (leaf > 100) {
            leaf = leaf - 10;
        }
        if (leaf > 10) {
            leaf = leaf - 1;
        }
    }
}


This code is a modified version of the "simpleIO.fla" example included in the AS3 Glue Class Package. I modified it in several ways:

  1. Added "stop();" as the first line of code in order to stop any Flash movie clips from running without input from the Arduino.

  2. Imported the Timer class so that I could use the timer functions.

  3. Created a timer that runs a specific function every 200 milliseconds.

  4. Added and defined the variables "leaf" and "frame" to track the input received from the Arduino. I later decided not to use the variable "frame" for its original purpose.

  5. Modified "onReceiveDigitalData" to add 1 to the variable "leaf".

  6. Created the "checkLeaves" function, in which the amount of digital input is checked. If the variable "leaf" is higher than a certain threshold, the function advances the timeline to the next frame and reduces "leaf". If the threshold has not been met and the frame is greater than 1, the function moves the timeline to the previous frame and reduces "leaf".



This diagram shows how the information travels from the physical tree to the digital tree.
MAIWEB.jpg

Interactive Wall

This project uses capacitive sensors to communicate with Processing via the Arduino. I am currently in the process of rewriting the code to work out bugs and such. The wall will be on display in the Rapson Hall courtyard over break and during the first week of the Spring 2010 semester.
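
Since the wall's code is still being rewritten, here is just a generic sketch of one common way to read a capacitive pad on an Arduino and stream the value over serial for Processing to pick up; the pin, the external resistor wiring, and the timeout are placeholders, not the wall's actual setup.

// Generic RC-time capacitive read (assumed wiring: the pad is tied to 5V through a
// high-value resistor and to this pin). Not the Interactive Wall's actual code.
int sensePin = 2;

long readPad(int pin) {
  pinMode(pin, OUTPUT);
  digitalWrite(pin, LOW);              // discharge the pad
  delay(1);
  pinMode(pin, INPUT);                 // release it and let it charge back up through the resistor
  long count = 0;
  while (digitalRead(pin) == LOW && count < 20000) {
    count++;                           // a hand on or near the pad raises the count
  }
  return count;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(readPad(sensePin));   // Processing reads these values off the serial port
  delay(50);
}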

Justin Berken

My Slit Scan Project (Final)











When Diane showed me the slit scan program in Processing, I almost immediately knew what I was going to do. Taking a bit of my inspiration from Maurice Benayoun's identity-stealing worms, I wanted to create a structure that could take the interaction of the subject and project it for them to see and change in real time.

My idea for making the boxes the source of interaction was that the box, as an object, is simple and universally recognized; it is something a person is not really afraid to interact with. The construction of the boxes was key to realizing my project, and it became quite frustrating in the end when I found myself held up by their construction. I eventually decided that I would have to build them myself, something I knew I was not proficient at, but it had to be done in order to get there. I had a solid plan that I was sure would work, but without the boxes complete, I had no way to test it and make sure.

The plan itself was to have three boxes with openings at the top where individuals could place and move around blocks with fabric patterns on their bottom sides. Cameras inside the boxes would capture the footage and run it through the slit scan program, which would take these patterns and their movements and elongate them through time. These images would be projected on the wall for the individuals to see and react to. Lights would also have to be placed inside the boxes, pointing up at the patterns, in order for the fabric to be seen by the cameras.

A complication arose when I realized I would have to use three separate computers to run the slit scan, because I did not want them to be part of the presentation. The solution came when I found a white display box that I could stash the computers in, as well as set my main display on top of. The boxes themselves were obviously not constructed with the most talented of hands, so I used lighting to hide or distract from their defects. As it turned out, once the boxes were completed and I had found a way to install the lights and cameras, the plan worked out pretty much exactly the way I had envisioned, which doesn't usually happen. I think this had a lot to do with the experience I gained in the previous group project, where I learned a method of work: start out simple and keep building on that concept. In the end I believe I had a successful project.

The main concept of the presentation from the start was to revolve around the ideas of identity, time, and choice. The people interacting with the project have many patterns (identities) to choose from, and each one they will affect in their own way. They make their decision and they use the boxes to form (recycle) that pattern the way they wish. As they do, their patterns mix with the patterns of those around them, and each pattern is connected to the others. These patterns are all affected and tied together by the passage of time.

Most of my other documentation for this project (sketches, blueprints, etc.) has been lost to the trash can.

Translucence and Shadows

For my independent project, I wanted to use the qualities of light and shadow and explore the properties of some translucent Duralar material. A servo running a sweep routine animates a scene composed of film strips and a plastic model tree. Multiple light sources create layers of shadows, with one layer splitting off into three and then reverting.

In this revised version of the project, I used a plain full screen constructed out of a large sheet of Duralar, three dowel rods, and some leftover presentation-board pieces for support and to create a frame.
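
The servo motion mentioned above is more or less the stock Servo-library sweep; a minimal version, assuming the servo signal wire on pin 9, looks like this.

#include <Servo.h>

Servo sceneServo;   // servo that rocks the film strips and model tree

void setup() {
  sceneServo.attach(9);                    // assumed signal pin
}

void loop() {
  for (int pos = 0; pos <= 180; pos++) {   // sweep one way...
    sceneServo.write(pos);
    delay(15);
  }
  for (int pos = 180; pos >= 0; pos--) {   // ...and back
    sceneServo.write(pos);
    delay(15);
  }
}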

P1012224.JPG

P1012228.JPG

P1012225.JPG

Robin Schwartzman-Final Project

The Lonely Tree is an interactive installation that invites people to kick off their shoes and step off of the concrete to enjoy a picnic lunch under the shade of an apple tree. Once one enters into this world, there are many surprises in store. Please watch the following video for complete documentation.








This piece was made using three Arduinos: two with WaveShields and one with a Motor Shield. To make the flowers and clovers talk, I set up a system of circuits to be connected. When the bottom of each stem (which has a grounded piece of metal on it) makes contact with the underside of its bush or dirt cluster (connected to a separate analog pin), it completes the circuit, triggering the sound. The following is the code I used to make the flowers and clovers talk: flowersigotit.cpp
And here is the code I used to make the tree talk when one steps onto the blanket: treesounds.cpp. The blanket works in the same manner as the flowers and clovers: the underside of the blanket is connected to an analog input pin, and a layer of foam squares separates the positive metal from the grounded metal underneath. When one steps on the foam, the positive and ground make contact, completing the circuit and making the tree talk.
Lastly, the following is the code I used to activate the Motor Shield. I wired a Ping ultrasonic sensor up to the Motor Shield's Arduino so that the eyes blink at different rates according to ranges of motion it detects (using ultrasonic sound): blinkyeyes.cpp
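
As a rough illustration of the contact sensing described above (not the actual code in the linked .cpp files), a stem or blanket contact wired to an analog pin can be read like this; the pin number and threshold are guesses.

// Hypothetical contact check for one stem/blanket circuit; the real playback is
// handled by the WaveShield code in flowersigotit.cpp and treesounds.cpp.
int stemPin = 0;                      // analog pin wired to one bush/dirt cluster (assumed)

void setup() {
  Serial.begin(9600);
  digitalWrite(14, HIGH);             // analog 0 is digital 14: enable its pull-up so the pad idles high
}

void loop() {
  int reading = analogRead(stemPin);
  // the grounded metal on a picked stem (or a foot on the blanket) pulls the reading toward 0
  if (reading < 100) {
    Serial.println("contact - trigger the sound clip here");
    delay(500);                       // crude debounce so one pick triggers one clip
  }
}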

I consider the piece in its current state a working prototype for a bigger, better, more durable version to come in the near future. Keep your eyes peeled!

Interactive Tree

I was happy to finally get all the components I needed to make a video interactive. I used a sensor and a Bit Whacker and connected them to a Max patch that took input from the sensor and converted it to numbers, which then changed the opacity of the video (I had two videos layered on top of one another). It could also do things like change the exposure or speed of the video. Pretty cool. Andy Mattern showed me how to use the patch. It looks like this:

maxpatch image.jpg

Here's the actual file:

bitwacker input test.pat

I used a SparkFun Bit Whacker as my microcontroller instead of our usual Arduino because I guess it speaks to Max/MSP better. Here's a link to how to set up the Bit Whacker.

I ended up using a new distance sensor, because the ultrasonic sensor I had bought required an input pulse to be programmed into it from the microcontroller, and by the time I figured that out I didn't have time to learn a new language to program a new microcontroller. In the presentation of the piece, the projected video changed as people walked up to it. The tree went from being a plain tree to having gold shine through it.

Presentation.mov


December 21, 2009

Prelude to Fly on the Wall Project

To continue with my idea, as I mentioned in class: once I figure out the proper coding, I would like to do a little miniseries using the same interaction (scrolling Flash using a potentiometer) with different themes. Themes involving small animals with different perspectives would be interesting. For example, with this project I used an insect that had a very low perspective, so using an animal like a bird, or something else that would be higher up, would be cool. Or an animation of something underground, so you would see a worm's perspective; I think you get the idea.

For the visuals, I would have preferred an actual drawn animation rather than video, because animation was originally why I was interested in using Arduino with Flash. I also think the aesthetic of animation would better connect the feel of the potentiometer handle to what the viewer looks at. Live video seemed to have a bit of a disconnect from what I intended, and I think a fabricated reality would be a nice complement to the action of the piece.

In terms of how someone interacts with the piece, I've always thought of this as a little miniseries on a website or something like that, where you would interact with it just as I presented it on my laptop in class. But when thinking about other options for viewing this, I think it would be really interesting to have it be large scale as well, where the image would be projected against a wall and the entire framing of the wall would be covered. Then, if it was in a rectangular or box-like room, when you moved the potentiometer the entire screen projection would move to different areas in the room, so it would take up an entire blank room. For example, when looking forward you would see the entire projection against the wall in front of you, and if you scrolled right, the projection would scroll over to the wall on your right. In this scenario, the tiny little potentiometer would seem very powerful, because a tiny little object/fly/bird/bug/whatever would control the motion of this huge projection in the room. Those are my hopes and dreams for this project anyway, once I figure out step number 3, that is! But if anyone has any thoughts on this, I'd love to hear them.

Final Project

Photo 33.jpg



Here is my code, painting.pde:

int photocellPin = 0;     // the photocell and 10K pulldown are connected to analog pin 0
int photocellReading;     // the analog reading from the sensor divider
int LEDpin = 11;          // LEDs connected to pin 11 (PWM pin)
int LEDbrightness;

void setup(void) {
  // We'll send debugging information via the Serial monitor
  Serial.begin(9600);
}

void loop(void) {
  photocellReading = analogRead(photocellPin);

  Serial.print("Analog reading = ");
  Serial.println(photocellReading);   // the raw analog reading

  // LED gets brighter the darker it is at the sensor,
  // so invert the reading (readings above 355 count as full light)
  photocellReading = constrain(photocellReading, 0, 355);
  photocellReading = 355 - photocellReading;
  // map the inverted 0-355 reading to a 0-50 brightness for analogWrite
  LEDbrightness = map(photocellReading, 0, 355, 0, 50);
  analogWrite(LEDpin, LEDbrightness);

  delay(100);
}

For my project I created an interactive painting. I inserted a light sensor in the painting that causes several blue LED lights to light up when there is a lack of light hitting the sensor. I imagined the painting set up in an area with natural light; it would change throughout the day as the light slowly turned to darkness, causing the painting to glow bright blue. If I were to make another one of these paintings, I would paint it with the LEDs on, so that I could better play off of the light and match the colors. I also wish I could have gotten my original idea to work, so that I could have programmed the LEDs to switch from one to the other based on the light sensor.

FINAL Blinky Glove - Bryce Davidson

It was quite a journey with the LilyPad Arduino, but it worked out in the end. I really learned a lot: how to use conductive thread, program pins, power the LilyPad, and handle the physical construction. Here is the pictorial/video history of my project's progress...

Movie 70.mov
Movie 75.mov
GlovePrototype.jpg
Movie 114.mov

and the Code

int ledPin = 10;    // LED connected to digital pin 10

// The setup() method runs once, when the sketch starts
void setup() {
  // initialize the digital pin as an output:
  pinMode(ledPin, OUTPUT);
}

// the loop() method runs over and over again,
// as long as the Arduino has power
void loop() {
  digitalWrite(ledPin, HIGH);   // set the LED on
  delay(80);                    // wait 80 milliseconds
  digitalWrite(ledPin, LOW);    // set the LED off
  delay(80);                    // wait 80 milliseconds
}

I am still waiting on the download of the video footage of my performance with the blinky LilyPad glove, but for now, I think it really worked out how I wanted it to. I had hoped to figure out the button switch, but I was still able to accomplish what I wanted with my project, which was an interaction between the digital world and the physical world. This is what I really understood interactive art to be in its essence.

To break down the concept of my performance, I started with the idea of seeing light interact with movement. I wanted to make a complete body suit full of lights and then do a dance performance in it. My first thought was to drive this suit with an accelerometer, so that the amount of light was dependent on the amount of movement. I came to realize that my understanding of the Arduino and all of its elements would have to grow quite a bit more to realize this idea. But it was for the best when I started to think about the CONTEXT of this idea. Why was I spinning around with lights on my body? Why was I breakdancing? What was this idea about? As I explored it, I came to the realization that I wanted to create a story. I wanted to display something that had a pace, a piece that had a beginning, middle, and end. I also liked the idea of opposites and extremes.

Through this exploration of context and story line, I decided on creating a piece built around an interaction between an animated character and myself as the viewer. In my final performance, the character very slowly approaches a paper clip and a light socket, and eventually, after a long, drawn-out process, the two objects are brought together. As soon as they connect, he is wildly electrocuted and sent out into the physical world, into my hand (represented by the blinking glove). At this point I explode into dance. The performance ends abruptly when the character shoots out of my hand and back onto the screen. The end. The intention of the piece was to represent a pace of excruciatingly slow build-up and waiting, followed by extreme energy and speed, and an abrupt end. I also wanted to create a representation of a viewer being forced into an imaginary digital character's experience. I think I was able to accomplish this in my piece, and I am very happy with the final product. I will post the video as soon as it is downloaded...
-Bryce

Here is an excerpt of my performance
http://mediamill.cla.umn.edu/mediamill/embed.php?subclip_id=795&live=true

Here is the clip of my performance:

bryceinteractivesmall.mov

December 17, 2009

Fly on the Flash











fly on wall video

http://mediamill.cla.umn.edu/mediamill/display/56278

Fly on the wall

fly on the table

Using Flash, Arduino, and a potentiometer.
Here you can see how the potentiometer is used to scroll through the Flash video, so that the viewer controls what they see. If anyone has questions about using Flash, I can answer animation questions, but I'm not too familiar with the technical issues of connecting Arduino with Flash. If you're interested, feel free to contact me. I'd be willing to help people out with Flash animations if they ever need it.
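
For anyone curious about the Arduino side of a setup like this, the sketch can be as simple as streaming the potentiometer reading over the serial port for Flash (via a serial proxy) to read. This is only a generic example, assuming the pot's wiper on analog pin 0, not the code actually used here.

int potPin = 0;                       // potentiometer wiper on analog pin 0 (assumed)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(potPin);   // 0-1023 as the knob turns
  Serial.println(reading);            // Flash maps this value to a scroll position in the animation
  delay(50);
}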

December 4, 2009

Final Project Progress-Robin Schwartzman

My "tree installation" has been slowly progressing throughout the semester. As of right now, I have my flower bush complete, with about 50 different flower sounds that are triggered randomly when each flower is picked. I also have completed the clover mound, which is a game of sorts. When a three leaf clover is picked, it yells at you saying something like "nope", "wrong", "not me", and "can't you count?". When you find the four leaf clover and pick it, you are awarded with cheers and applause.
Here are some images of those pieces:

flowers.jpg

clovers.jpg

As far as the coding goes, right now all of my code for the flowers and clovers is together with the background music (zip-a-dee-doo-dah) on one WaveShield. When I get multiple WaveShields, the sounds will be programmed separately so that they can start to build on each other. The code is here:

flowers:clovers.doc
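
Since the code is linked as a .doc rather than pasted in, here is only a rough sketch of how roughly fifty clips could be picked at random on a WaveShield with the Adafruit WaveHC library; the file names (FLOWER00.WAV through FLOWER49.WAV) and the fixed-interval trigger are made up for illustration.

#include <FatReader.h>
#include <SdReader.h>
#include <WaveHC.h>

SdReader card;    // the WaveShield's SD card
FatVolume vol;
FatReader root;
FatReader file;
WaveHC wave;

void setup() {
  Serial.begin(9600);
  randomSeed(analogRead(5));          // an unconnected analog pin as a noise source
  if (!card.init() || !vol.init(card) || !root.openRoot(vol)) {
    Serial.println("SD init failed");
  }
}

void playRandomFlower() {
  char name[13];
  sprintf(name, "FLOWER%02d.WAV", (int)random(50));   // hypothetical 8.3 file names
  if (wave.isplaying) wave.stop();    // cut off any clip already playing
  if (file.open(root, name) && wave.create(file)) {
    wave.play();
  }
}

void loop() {
  // in the real piece this would be called when a flower's contact circuit closes;
  // here a clip simply plays every few seconds as a demo
  playRandomFlower();
  delay(4000);
}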

I've also made significant progress on the blinking apple tree. He blinks using a system with a stepper motor, an Arduino Motor Shield, some thick aluminum wire, a sewing bobbin, and fishing line. Here are two images, one with eyes closed, the next with eyes open. You can also see the picnic blanket hanging behind the tree. I still need to wire up a system underneath the blanket (with foam and aluminum foil) so that when one steps on the blanket, the tree will also talk.

treeclosed.jpg

treeopen.jpg

Here is my code for the blinking eyes: blinkyeyes.doc
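
Again, since the file is linked rather than pasted in, here is just a guess at what rocking the eyelid bobbin back and forth with the Adafruit Motor Shield's stepper library might look like; the steps-per-revolution, port, speed, and step counts are all placeholders.

#include <AFMotor.h>

AF_Stepper eyelids(200, 2);           // 200-step motor on the shield's second stepper port (assumed)

void setup() {
  eyelids.setSpeed(30);               // rpm
}

void loop() {
  eyelids.step(50, FORWARD, DOUBLE);  // reel in the fishing line: eyes close
  delay(200);
  eyelids.step(50, BACKWARD, DOUBLE); // let the line back out: eyes open
  delay(2000);                        // pause between blinks
}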

More to come, looking forward to piecing everything together into a cohesive installation.