today, zach lieberman surprised our class (audio visual systems + machines) with a special appearance by his pal, daito manabe. daito has reached quasi-YouTube fame with a couple of videos. here’s one.
here is a video that i took, of daito shocking my friend nick’s arm and face, at school.
and an image of me and daito!!
but this is just one project he is involved in. while i have an affinity for electrodes on the human body, i’m not so interested in controlling or manipulating the body itself. i’m more interested in taking data (via EKG, EEG, GSR, EMG) from the body and applying it to audio and video – but i’ll post more about that later (my ars collab will be about this, and maybe thesis material also?).
the project of daito’s that i am MORE interested in currently is the one that resembles my work of the last 8 weeks. this one:
each cube has a microphone and a PIC chip. the PIC listens for specific frequencies of tones, and also for patterns of notes being played. when it hears a specific note, it might blink a specific color. when it hears a specific pattern of notes, it might glow a specific color for a length of time. daito has handed these out at shows before, and then during the show inserted a pattern of notes. the cubes then all respond from all around the room, to great effect. in essence, it is serial communication through sound. brilliant. i’ll post a vid of daito showing me the insides of the cubes, as well as zach lieberman trying on daito’s electrode stimulus system, real soon.
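i don’t know what daito’s actual firmware does, but the Goertzel algorithm is a common way to detect one specific frequency on a small chip like a PIC (it’s much cheaper than a full FFT). here’s a minimal sketch of that idea in python — the sample rate, tone length, and frequencies are my own placeholder values, not his:

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Return the signal power near target_freq using the Goertzel algorithm.

    This is the single-bin DFT trick often used on microcontrollers:
    one multiply-accumulate loop instead of a full FFT.
    """
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# a cube's firmware would run this on a short buffer and compare
# the power against a threshold to decide "i heard my note"
rate = 8000                                    # hypothetical mic sample rate
tone = [math.sin(2 * math.pi * 440 * i / rate) for i in range(256)]
heard_440 = goertzel_power(tone, rate, 440) > goertzel_power(tone, rate, 1000)
```

a pattern of notes would then just be a little state machine on top of this: advance one step each time the expected frequency is detected in order.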
concept: physically and functionally modular, multi-input controllers with visual feedback and dynamic sensor data output. manipulating video, sound, or anything digital is possible with LightBox, and when using more than one simultaneously, group interaction and collaboration is possible, as the controllers themselves are wirelessly networked. project post-mortem paper here.
Here is a not-so-revealing video of one mothercube and one daughtercube (all i had the time and money to build), being run through a theremin-emulator max patch. I have written a sampler patch and video controller in jitter, I will document these soon enough (making sure you can see my hands) and then post the patches.
***I apologize that you cannot see my hands, however they are controlling the pitch of the sound based on how close they are to the cubes — very similar to how a theremin works***
Essentially, frosted plexiglass cubes with IR range finders and one single button (a capacitance [touch] sensor). The sensors are powered and read by an arduino, and then processed in Max/MSP/Jitter. I used the PDuino firmware for that connection. What I was also hoping to achieve was an array of LEDs sensitive to the IR (you’ll see in this video, only the touch changes the color). I worked on using a TLC5940 LED driver run by a PIC16F88 microcontroller, but was never completely successful. In fact it was quite frustrating, to the point that i gave up on it for the moment.
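one fiddly detail with IR range finders is that their analog output isn’t linear in distance — the sharp-style sensors put out a voltage roughly inversely proportional to range, so you have to linearize the ADC reading before it feels right as a control. here’s a rough sketch of that conversion; the fit constant and cutoff are hypothetical calibration values, not measurements from my cubes:

```python
def ir_adc_to_cm(adc, vref=5.0, adc_max=1023):
    """Convert a 10-bit arduino ADC reading from a sharp-style IR range
    finder into an approximate distance in cm.

    Assumes the sensor's output voltage is roughly inverse to distance
    (true for the common sharp GP2D12 family). The constant 27.0 is a
    hypothetical calibration value; a real sensor needs its own fit.
    """
    v = adc * vref / adc_max
    if v <= 0.4:          # below the sensor's usable output range
        return None       # too far (or no object) to trust the reading
    return 27.0 / v
```

in my setup this step happens in the max patch after PDuino hands over the raw sensor value, but the math is the same wherever you do it.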
Here are some images of the process:
Left to do now is finish the wireless aspect, between the two cubes. Then more patching.
Plugged both of my guys in today, and had fun with stretta’s Polygomé (running on my 64) and my BeatSeq (on my 128). The audio is horrible, mostly because the mic of the tiny camera i was using (a flip) was pointed down, not at the speakers. Also, it’s a mic of unfortunate quality. I will take the audio from the line out next time, or at least record it in logic simultaneously. I’m using Polygomé to send MIDI out to logic, where i’m using the ES1 synth. It sounded nice and bass-y in front of my monitors, too bad the mic didn’t capture any of it. More Soon!!