For my final openFrameworks project in Zach Lieberman’s setPixel course at Parsons MFA DT, I worked with Clay Ewing and Joon Moon on a public-space computer vision and projection project. We undertook the challenge of “seeing” the infamous movable cube at Astor Place in NoHo, NYC. We devised a system which used a hacked PS3 Eye camera and cleverly diffused infrared LEDs to “see” the cube at night. The idea was to track it with openCV, and then project onto each surface of the cube as it was rotated in real-time. The following videos (and presentation) document the work we’ve accomplished so far.
[pdf-ppt-viewer href="http://jmsaavedra.com/weblog/wp-content/uploads/2009/12/ofxAlamoPrese_web.pdf" width="500" height="400"]
Here is a demo vid of the app we wrote in OF for the cube. You can see the mouse “drawing” (in green) the specific areas to be CV’d. This way we can ignore the extraneous sources of IR light (headlights, streetlights, etc) and only track the points we want to track.
Here is documentation from our second attempt at tracking the cube with the system. It was largely a fail because of generator issues, and the IR lights not being bright enough. Zach has recommended we use IR flood lights and reflective tape on the balls for the next attempt. Special thanks to Jen Cotton for documenting the night and then editing this sweet vid.
Here is a demonstration of how nicely styrofoam balls diffuse LED light, even with extremely directional superbright LEDs. Thanks to GRL for the LED Throwie precedent, which we modified slightly with this diffusion and IR light.
Thanks to Zach Lieberman for the amazing course (yet again), Jen Cotton, Zach Gage, and Graffiti Research Lab. We will return to this project in the spring when it is warmer — so look out.