Today I assembled my monome 40h kit! It's essentially the same as a 64, but comes in kit form: two PCBs and all the parts that go on them (ATmega chip, shift registers, FTDI serial-to-USB converter, button pads, etc.). The only thing it doesn't come with is LEDs; those you have to source yourself. I ordered mine from LEDShoppe, and so far I'm not terribly impressed with their bulb-to-bulb consistency in brightness. The color, though, is beautiful (violet). I didn't realize it while building her, but the LEDs are in fact all UV blacklight LEDs, which is kind of cool. Not really something I'm into, but once I finish the enclosure and sensors I will probably sell this monome, and I think the UV aspect will appeal to others.
It took me about 2.5 to 3 hours total, at which point I plugged her in, only to discover that one LED was burnt out and an entire row would not respond to button presses. It took me a while to desolder the dead LED and replace it, but I did, and it works. The dead row, however, I believe is a circuit problem, and I'm waiting to hear back from Brian Crabtree (inventor of the monome) to see what's up. Other than that, it works great and looks great. The hardest part of the process was soldering 64 surface-mount diodes onto the button-pad PCB; I didn't expect to have to do anything that tiny. Luckily I got a Weller soldering iron and a couple of 0.8mm tips. Nothing beats a fresh tip. Here are some more images from the process.
__________ the 40h comes with 4 analog inputs that are just waiting to be sensorized. I'm considering a capacitive touch sensor, possibly an IR rangefinder, or maybe just a potentiometer or two for some knobs. The most common addition is an accelerometer, so that the monome has tilt control; my 64 already has this, so I probably won't go that route. I still have to build an enclosure (obviously), and which sensors I choose to install will determine what the enclosure looks like. I'm excited.
About 2 weeks ago I got the chance to see one of my teachers, Zachary Lieberman, perform at NYU. The piece was Messa di Voce, which in Italian means "placing the voice." It is essentially a real-time interactive visualizer for the human voice: IR light and cameras tracked the performers' locations on stage, and two projectors side by side projected the visualizations of their respective voices. Here's some video that I took:
Thanks to Stephanie for letting me borrow her camera!
Concept: physically and functionally modular, multi-input controllers with visual feedback and dynamic sensor data output. Manipulating video, sound, or anything digital is possible with LightBox, and when using more than one simultaneously, group interaction and collaboration become possible, since the controllers themselves are wirelessly networked. Project post-mortem paper here.
Here is a not-so-revealing video of one mothercube and one daughtercube (all I had the time and money to build) being run through a theremin-emulator Max patch. I have also written a sampler patch and a video controller in Jitter; I will document these soon (making sure you can see my hands) and then post the patches.
***I apologize that you cannot see my hands; they are controlling the pitch of the sound based on how close they are to the cubes, very similar to how a theremin works***
Essentially: frosted plexiglass cubes with IR rangefinders and a single button (a capacitive touch sensor). The sensors are powered and read by an Arduino, then processed in Max/MSP/Jitter; I used the PDuino firmware for that connection. I was also hoping to have an array of LEDs respond to the IR sensors (in this video, only the touch sensor changes the color). I worked on driving them with a TLC5940 LED driver run by a PIC16F88 microcontroller, but was never completely successful. In fact it was frustrating enough that I gave up on it for the moment.
Here are some images of the process:
Left to do now: finish the wireless link between the two cubes. Then more patching.
Plugged both of my guys in today and had fun with stretta's Polygomé (running on my 64) and my BeatSeq (on my 128). The audio is horrible, mostly because the mic on the tiny camera I was using (a Flip) was pointed down, not at the speakers. It's also a mic of unfortunate quality; next time I will take the audio from the line out, or at least record it in Logic simultaneously. I'm using Polygomé to send MIDI out to Logic, where I'm using the ES1 synth. It sounded nice and bassy in front of my monitors; too bad the mic didn't capture any of it. More soon!!