Sunday, October 25, 2009

Processing Maquette for my light sculpture

I am currently working on a 4x4 ft light sculpture for Danny Rozin's "Project Studio" class. The sculpture uses 40 servos to move 40 arms, each with 2 RGB LEDs at its tip. I am most interested in the way the light from the LEDs will mix together on the canvas, especially since I can control both the color of the light and its position. I uploaded a Processing sketch I made to visualize movement patterns and color sequences. This version focuses on movement patterns rather than color. Drag the mouse on the screen to change the viewing angle.
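As a rough illustration of the color mixing I am curious about, here is a minimal sketch of two RGB light sources blending additively at a point on the canvas. It is plain Java rather than a full Processing sketch, and the inverse-square falloff, positions, and colors are all stand-in assumptions, not measurements from the actual LEDs:

```java
public class ColorMix {
    // Additive contribution of one RGB light source at a canvas point,
    // scaled by a simple inverse-square falloff (a modeling assumption).
    static double[] contribution(double[] rgb, double lx, double ly,
                                 double px, double py) {
        double dx = px - lx, dy = py - ly;
        double falloff = 1.0 / (1.0 + dx * dx + dy * dy);
        return new double[] { rgb[0] * falloff, rgb[1] * falloff, rgb[2] * falloff };
    }

    // Blend several sources by summing their contributions and clamping to 255.
    static int[] blend(double[][] colors, double[][] positions, double px, double py) {
        double[] sum = new double[3];
        for (int i = 0; i < colors.length; i++) {
            double[] c = contribution(colors[i], positions[i][0], positions[i][1], px, py);
            for (int k = 0; k < 3; k++) sum[k] += c[k];
        }
        int[] out = new int[3];
        for (int k = 0; k < 3; k++) out[k] = (int) Math.min(255, Math.round(sum[k]));
        return out;
    }

    public static void main(String[] args) {
        double[][] colors = { { 255, 0, 0 }, { 0, 0, 255 } };   // one red, one blue LED
        double[][] positions = { { -1, 0 }, { 1, 0 } };          // arm tips, arbitrary units
        int[] mid = blend(colors, positions, 0, 0);              // point midway between them
        System.out.println(mid[0] + "," + mid[1] + "," + mid[2]); // red + blue mix to purple
    }
}
```

Moving an arm changes a source's position, so the same math shows how the mix on the canvas shifts as the servos sweep.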

Estrella Intersects the Plane from Matt Richard on Vimeo.

Monday, October 12, 2009

Foot Sensors for Longboarding Data

This week the goal was to make our data logging devices mobile, if not wearable. I was unable to become fully detached from my computer for the data logging. However, I was able to make the setup mobile by using a small netbook to log the data, stored in my backpack while I went for a nice ride around Washington Square Park. The values were sent from the Arduino via serial communication, and a Processing sketch listened for the values and wrote them to a CSV (comma-separated values) file. Next, I loaded the file into a program written by Dan O'Sullivan that graphs the data and allows the viewer to scroll through it. If you look closely, you will see that my right foot makes more jumps than my left foot. This is because I push my longboard with my right foot :) Moving forward, I would like to find a better sensor with a more stable and wider range of values. I found that the FSRs I used were almost maxed out even when just standing.
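The logging side of the pipeline can be sketched roughly as below, in plain Java rather than Processing. The serial line format ("left,right"), the column names, and the sample rate are my assumptions for illustration, and the actual serial port reading (which I did with Processing's Serial library) is replaced here with stand-in data:

```java
import java.io.FileWriter;
import java.io.IOException;

public class FootLogger {
    // Turn one serial line like "512,487" (left and right FSR readings)
    // into a timestamped CSV row. The two-field format is an assumption,
    // not the exact protocol from the project.
    static String toCsvRow(long millis, String serialLine) {
        String[] parts = serialLine.trim().split(",");
        return millis + "," + parts[0].trim() + "," + parts[1].trim();
    }

    public static void main(String[] args) throws IOException {
        String[] fakeLines = { "512,487", "530,902", "518,1010" }; // stand-in serial data
        try (FileWriter out = new FileWriter("footlog.csv")) {
            out.write("millis,leftFSR,rightFSR\n");
            long t = 0;
            for (String line : fakeLines) {
                out.write(toCsvRow(t, line) + "\n");
                t += 50; // pretend one reading arrives every 50 ms
            }
        }
        System.out.println(toCsvRow(100, "512,487"));
    }
}
```

Keeping a timestamp in the first column is what lets a grapher like Dan's scroll through the ride afterwards.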

Friday, October 9, 2009

Depth Time: adventures in finger painting

Sue Syn and I collaborated on a project for Dan O'Sullivan's "The Rest of You" class, in which we wore ultrasonic range finders on our wrists while we painted with our hands. The values recorded from the sensors were then used to render information captured by a camera that was recording our movements. Both the raw footage and the data-augmented video were displayed together. The result is a time-lapse record of our movements and color choices, allowing the viewer to watch the painting develop and time pass. The idea was to make the artist and viewer aware of spatial relationships that are usually taken for granted or not considered important, in the hope that a greater understanding of one's subconscious might come to light. While I was pleased with the result, I have been making strides toward a better mapping of the depth data than simple brightness value. The next iteration of this idea will push and pull the painting in 3D space based on the values from the range finder.
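The push-and-pull mapping I have in mind for the next iteration could look something like this, again in plain Java as a stand-in for a Processing sketch. The sensor range (0-1023) and the +/-100 displacement scale are placeholder numbers, not calibrated values from our setup:

```java
public class DepthMap {
    // Linear remap, equivalent in spirit to Processing's map() function.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // Map a raw range-finder reading to a z displacement for a pixel:
    // near readings push the surface toward the viewer, far readings
    // pull it away. Ranges here are illustrative assumptions.
    static float depthToZ(float raw) {
        float clamped = Math.max(0, Math.min(1023, raw));
        return map(clamped, 0, 1023, 100, -100);
    }

    public static void main(String[] args) {
        System.out.println(depthToZ(0));     // closest reading, max push
        System.out.println(depthToZ(1023));  // farthest reading, max pull
    }
}
```

Applying depthToZ per frame to the pixels under the hand would deform the painting in 3D instead of only tinting its brightness.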

Depth Time from Matt Richard on Vimeo.