[pluto-festival] :: festival images

I have (FINALLY) started to upload images from the Pluto festival in Opwijk, Belgium (October 1-4, 2009), where I exhibited my Spatialized Umbrella project.

[flickr-gallery mode="photoset" photoset="72157623309556168"]

Spatialized Umbrella v01

This is my first prototype of the Spatialized Umbrella.

The Spatialized Umbrella project offers an entirely new dimension to walking in the rain. Using light and sound spatialization, this umbrella creates an immersive, mobile, and highly personal multi-sensory environment. Range-sensing technology helps the Spatialized Umbrella react to your movement through a space.

Five speakers and LEDs are mounted inside the umbrella, around the user’s head, allowing for sound and light spatialization. The ‘raindrop’ samples play in a loop, each speaker playing its own unique raindrop, and the LEDs light up the speaker playing at that moment. The tempo of the loop is controlled by a long-range Sharp infrared range finder: the closer an object is to you, the faster the loop plays, and if an object is close enough to cross a threshold, a lightning sequence is triggered. Best part: COMPLETELY SAFE FOR USE IN THE RAIN.

This video shows an early version of the code, and I apologize for not using a microphone INSIDE the umbrella (it’s hard to hear the ‘raindrop’ sounds). New video soon.

The most time-consuming part of the project was soldering the PCB I used (I wanted it to be small enough to fit at the top of the umbrella, so the entire Arduino did not make sense). I designed my own “mapduino” circuit and used an IC socket on the PCB for the ATmega168 chip to sit in. This way I can just pop the chip out and replace it with another I have reprogrammed on an Arduino. Rigging the umbrella also took a little while.

***ALL SOUND IS MADE USING ONLY AN ARDUINO AND 8-OHM SPEAKERS:: lookup tables store values for waveshaping, which is output directly from the digital pins of the ATmega chip. See the current version of the code, which can be found HERE.

Still to do: linearize the IR data so that the tempo changes at a more even rate. When I began, I also thought about using an accelerometer to measure the direction of movement. BUT tonight I succeeded in reading data from a digital compass sensor, which can give me degrees of rotation. So if, say, the user spins the umbrella, I could have the sound/light spin around the user’s head in that direction, at that speed. This is much more interesting data than an accelerometer, in my opinion.

>> UPDATE :: Featured on HackaDay.com and ArduinoShow.com and CoolCircuit.com !!

Messa di Voce

About two weeks ago, I got the chance to see one of my teachers, Zachary Lieberman, perform at NYU. The piece, called Messa di Voce (Italian for “placing the voice”), is essentially an interactive visualizer for the human voice in real time. IR light and cameras track the performers’ locations on stage, and two projectors, side by side, project the visualizations of their respective voices. Here’s some video that I took:


Thanks to Stephanie for letting me borrow her camera!

image + sound

Completed the first assignment for Audio/Visual Systems and Machines. We were given several images and then created audio that we thought fit each image. Here are mine:


LightBox • modular controller

concept: physically and functionally modular, multi-input controllers with visual feedback and dynamic sensor-data output. Manipulating video, sound, or anything digital is possible with LightBox, and when using more than one simultaneously, group interaction and collaboration become possible, as the controllers themselves are wirelessly networked. Project post-mortem paper here.

final design:

final schematic

Here is a not-so-revealing video of one mothercube and one daughtercube (all I had the time and money to build) being run through a theremin-emulator Max patch. I have also written a sampler patch and a video controller in Jitter; I will document these soon (making sure you can see my hands) and then post the patches.

[vimeo 2800746]

***I apologize that you cannot see my hands; they are controlling the pitch of the sound based on how close they are to the cubes, very similar to how a theremin works***

Essentially, these are frosted plexiglass cubes with IR range finders and a single button (a capacitance [touch] sensor). The sensors are powered and read by an Arduino, then processed in Max/MSP/Jitter; I used the Pduino firmware for that connection. What I was also hoping to achieve was an array of LEDs sensitive to the IR (in this video, only the touch changes the color). I worked on using a TLC5940 LED driver run by a PIC16F88 microcontroller, but was never completely successful. In fact it was quite frustrating, to the point that I gave up on it for the moment.

Here are some images of the process:

mothercube, finished

daughter guts

mothercube guts

LED driver circuit

mother and daughter

mother and daughter

Left to do now: finish the wireless link between the two cubes, then more patching.