[thesis prototyping] :: technology

So my new concept is geo-locative sensor tracking with a modular pack. Basically you pick and choose what you want to track (e.g. carbon monoxide, dust, noise pollution, alcohol[?], light pollution, or methane gas), and this data is recorded. Then you transfer it to an online database, where you can watch a personal log of your own data collection over time. The last step is to connect individuals to each other via the database, so you can see how your friend in Beijing experiences his/her journey from home to work every day, and relate on a new level.

The other idea is an educational tool for kids with asthma. A child reports, “i always get headaches when visiting a certain neighborhood.” After wearing the module, the child can make a connection between their immediate environment and its effect on their body: more carbon monoxide exposure on this street corner is causing your headache, and you were there at 2:15pm last wednesday.

Much much more on this later.

I began prototyping the technology. The first piece was GPS, because i am the least familiar with it of everything I will be working with.

I ordered LadyAda’s (aka Limor Fried) GPS shield kit for the arduino. It took me about 45 minutes to an hour (very easy) to put together (see images). The best part: her shield is already designed to write to an SD card! Which has been my plan all along (i’ve seen several arduino-to-SD-card projects, so I figured it would be useful).
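For the curious: the GPS module streams plain-text NMEA sentences over serial, and the sketch just has to split them into fields and convert the coordinates. Here’s a quick sketch of that parsing step (written as plain C++ so it’s easy to test off the board; the sample sentence and function names are mine, not from my actual logs):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Split a NMEA sentence (e.g. $GPGGA) into its comma-separated fields.
std::vector<std::string> nmeaFields(const std::string& sentence) {
    std::vector<std::string> fields;
    std::stringstream ss(sentence);
    std::string field;
    while (std::getline(ss, field, ',')) fields.push_back(field);
    return fields;
}

// Convert NMEA "ddmm.mmmm" latitude plus its N/S hemisphere field
// into plain decimal degrees for the log.
double nmeaLatToDegrees(const std::string& lat, const std::string& hemi) {
    double raw = std::stod(lat);             // e.g. 4042.6142 = 40 deg, 42.6142 min
    int degrees = static_cast<int>(raw / 100);
    double minutes = raw - degrees * 100;
    double result = degrees + minutes / 60.0;
    return (hemi == "S") ? -result : result;
}
```

(on the actual arduino i’d lean on a GPS parsing library rather than roll my own, but the math is the same.)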

Now that I have this working, the next step is to figure out how the sensors will be attached. I think for this round of prototyping, I will use USB ports. I actually purchased about 5 of these in China over the summer when I was there with school. I’m going to post pictures of the circuit i made on perfboard, and my first attempt at a Printed Circuit Board (PCB) version.

This weekend i attended a class at NYC Resistor, which was rad. i learned how to use EAGLE to design PCB files, which i would then send to a manufacturer who would print the green boards and send them back to me. It’s expensive, though, and chris hennelly at school convinced me to use the laser cutter and make DIY PCBs instead. Check out the results; i will have new ones tomorrow…

[thesis prototyping] :: material

i will write more on these when i have more time but i’ve made 2 material prototypes so far…

by this i mean the cloth strapping mechanism that will hold the module to the user’s body.

design v0.1 is made out of white spandex, and has a flap/pocket mechanism to hold the module (housed currently in a plastic container i bought at the container store for $2.99… more on this soon).  this took me quite some time and, as you’ll see in the photos, would make a great post on failblog.org.

design v0.2 is a black plush material that Ira helped me put together, i think in just about 30 minutes by hand.  Obviously, i could have done it in like 15 minutes if i had the right equipment… which is at my parents’ house… so that’s why i asked her to help me. obviously.

thanks ira, clay, and nick for user testing, and the sexy smiles.

[thesis prototyping] :: processing to PHP, attempt one

ok nerdzors, look out. Ira told me about a project she made that took an analog input from processing and posted it to php (and then possibly a database, not sure). i wanted to try it out for myself, so i did some searching, and of course the processing guru Daniel Shiffman had a nice tutorial on at least getting to a php script.

the php script writes to a text file that you call from processing. The php script i am using in this first draft is the same as Shiffman’s here. once you have that up and running with all the correct permissions set on the file (read+write), you can fire up processing.

I modified his example file so that the sketch loops, constantly checks the script (+ text file) online, and updates the sketch accordingly. I inadvertently created a collaborative drawing tool. Potentially, any number of computers/machines running this processing sketch could connect to the same file, so jessica, nick and i had a collaborative drawing session at 2 in the am.

Because you are just calling a script that’s online, you can actually view it at any time in a browser, and watch the coordinates get written as you create and erase them.  it would be here: http://jmsaavedra.com/projects/processingToPHP/loadstrings.php?type=load
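The round trip is really just string munging: the sketch sends each new “x,y” pair to the script, the script appends it to the text file, and every connected sketch re-splits the whole file into points each frame. A quick sketch of those two steps (plain C++ here since the logic is the same in any language; the names are mine, not from the actual sketch):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

struct Point { int x; int y; };

// Parse the text file's contents ("x,y x,y ...") back into points,
// the same split the sketch does after loading the file.
std::vector<Point> parseCoords(const std::string& data) {
    std::vector<Point> points;
    std::stringstream ss(data);
    std::string token;
    while (ss >> token) {
        size_t comma = token.find(',');
        if (comma == std::string::npos) continue;
        points.push_back({std::stoi(token.substr(0, comma)),
                          std::stoi(token.substr(comma + 1))});
    }
    return points;
}

// Append a newly drawn point, as the save request does server-side.
std::string appendCoord(const std::string& data, Point p) {
    return data + std::to_string(p.x) + "," + std::to_string(p.y) + " ";
}
```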

Here is my modified Processing sketch.

Here are some images of our late night drawing session — in the background you can see firefox showing all the coordinates that make up the drawing:

this started out as “OOPS”.. then became “BOOBS”… and then became this. Actually I think Jess did the whole thing herself. But Nick and I watched.

next up, I will hook up the Arduino to Processing and turn LEDs on and off from around the world.

SOBEaR v02 :: the responsible robot bartender


Finally, the finished prototype of SOBEaR, the responsible robot bartender.

SOBEaR is a robot friend for anyone who does not know their own limits, or has problems controlling themselves.

I’ve added a glass coaster with a glowing status light to tell you that he is on, as well as a sewn-on patch to show you where the ‘go’ button is.  When you press the “breathe + pour” button on his right foot, the status light goes solid, and the user breathes into SOBEaR’s face.  You can see the alcohol sensor above the bowtie, under his chin.  Your current blood alcohol content (BAC) is then shown on a scale from 1–6 with green, yellow, and red LEDs in SOBEaR’s chest.  Depending on how drunk you are (or aren’t), SOBEaR will pour you a drink appropriate for your current state.  In the video below, SOBEaR is pouring cranberry vodkas for my user tester.  Two servos hold the alcohol and the mixer, and with the SoftwareServo library for arduino, programming this aspect was simple.
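The logic boils down to two little mappings: sensor reading to LED level, and LED level to pour strength. Here’s roughly how that looks (plain C++ so it’s testable off the board; the thresholds and fractions are placeholders, not SOBEaR’s actual calibration):

```cpp
#include <cassert>

// Map a raw alcohol-sensor reading (analogRead range, 0-1023) onto
// SOBEaR's 1-6 LED scale. Real thresholds would come from calibrating
// the sensor against known BAC levels.
int bacLevel(int reading) {
    if (reading < 0) reading = 0;
    if (reading > 1023) reading = 1023;
    return 1 + (reading * 6) / 1024;   // integer map 0-1023 -> 1-6
}

// Decide the pour mix: returns the fraction of the drink that is
// alcohol. Sober users get a real drink; drunk users get mostly mixer.
float alcoholFraction(int level) {
    if (level >= 5) return 0.0f;       // red zone: cranberry only
    if (level >= 3) return 0.25f;      // yellow zone: weak drink
    return 0.5f;                       // green zone: normal pour
}
```

(the two servo pour times would then just be proportional to that fraction.)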

For many obvious reasons, I used a MapDuino, which is an ATmega168 chip soldered into a custom PCB circuit (started with perfboard from radiocrack), for the brains of this robot.  The alcohol sensor was super easy to implement; i got it from sparkfun via my computation studio teacher.

This robot takes the shape of an adorable plush teddy bear, because I felt it gave it a sense of trustworthiness, as if a teddy bear could ever do you wrong. Trust SOBEaR, he knows what is good for you. It was a tough decision between naming this guy “SOBEaR” or “Teddy Drunkspin” [credit goes to matt for that one!]. Other suggestions?

There are a lot more pictures in my first prototype’s post HERE.

thanks to José + Chris for drinking and filming!

Robot Bunny // Desecrated and Celebrated

On Friday, we were told to bring in any plush toy that had interesting innards that we would want to understand. Naturally, I picked out a stuffed bunny that has a switch on his foot which triggers a song to play and two motors to start up: one in the body, one for the ears. Here’s a vid of mr. Rabbit before dissection time. I accidentally pulled one wire out at the end, but I found its connection and had the “bunny” in a fully functional state. Seeing just the motors, speaker, button, and battery working completely disembodied made me feel like some great robot surgeon. Poor bunny.

The most interesting findings in this specimen were:

– the PNP/NPN circuit that reversed polarities every second or faster in order to make the ears go up and down
– the gear box inside the bunny’s head, which was quite intricate and contained at least seven differently sized cog wheels
– the fact that the PCB was made up of completely through-hole components. This definitely surprised me, as I thought surface-mount was much cheaper. Yury told us that factories are set up to do one or the other, and changing is more expensive than it would be worth.

I will definitely be salvaging both motors, as well as the switch (appears to be a small tactile one similar to what radioshack sells, only in a plastic encasement), and the 8ohm speaker that seems to be quite loud and clear for its size. Oh, and the battery holder is definitely something I can use: it screws shut and holds 3 AAA batteries. Overall, very much worth the $12 (kmart had a sale on Easter-themed plush!)

Spatialized Umbrella v01

This is my first prototype of the Spatialized Umbrella.

The Spatialized Umbrella project offers an entirely new dimension to walking in the rain. Using light and sound spatialization this umbrella creates an immersive, mobile, and highly personal multi‐sensory environment.  Range sensing technology helps the Spatialized Umbrella react to your movement through a space.

5 speakers and LEDs are mounted inside the umbrella, around the user’s head, allowing for sound and light spatialization.  The ‘raindrop’ samples play in a loop, each speaker playing its own unique raindrop. The LEDs light up the speaker playing at that moment. The tempo of the loop is controlled by a long-range Sharp Infrared range finder.  The closer an object is to you, the faster the loop plays. If an object is close enough and a threshold is reached, a lightning sequence is triggered. Best part: COMPLETELY SAFE FOR USE IN THE RAIN.
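The tempo control is just a range-to-delay mapping plus a threshold check. A rough sketch of the idea (plain C++; the distances and delays here are illustrative, not the sketch’s actual calibration):

```cpp
#include <cassert>

// Anything at or under this range triggers the lightning sequence.
const int LIGHTNING_THRESHOLD_CM = 20;

// Map an IR distance reading (cm) to the delay between raindrop
// samples: closer objects -> shorter delay -> faster loop.
int loopDelayMs(int distanceCm) {
    if (distanceCm < 20) distanceCm = 20;
    if (distanceCm > 150) distanceCm = 150;
    // linear map of 20-150 cm onto 100-600 ms
    return 100 + (distanceCm - 20) * 500 / 130;
}

bool triggerLightning(int distanceCm) {
    return distanceCm <= LIGHTNING_THRESHOLD_CM;
}
```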

This video is actually an early version of the code, and I apologize for not using a microphone INSIDE the umbrella (it’s hard to hear the ‘raindrop’ sounds). New video soon.

The most time consuming part of the project was soldering the PCB i used (i wanted it small enough to fit at the top, so the entire arduino did not make sense). I designed my own “mapduino” circuit and used an IC socket for the ATmega168 chip to sit in on the PCB. This way i can just pop the chip out and replace it with another I have reprogrammed on an Arduino. Rigging the umbrella also took a little while.

***ALL SOUND IS MADE USING ONLY AN ARDUINO AND 8OHM SPEAKERS:: lookup tables store values for waveshaping, which are output directly from the digital pins of the ATmega chip. See the current version of the code, which can be found HERE.
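For a feel of what those lookup tables look like: one cycle of a waveform is precomputed into an 8-bit table, and the chip steps through it at different rates to change pitch. A minimal sketch of a sine table (plain C++; table size and names are mine, not from my actual code):

```cpp
#include <cassert>
#include <cmath>

const int TABLE_SIZE = 256;
const double PI = 3.14159265358979323846;

// Build a one-cycle sine lookup table scaled to 8-bit values (0-255),
// the kind of table a sketch would store and step through to drive a pin.
void buildSineTable(unsigned char* table) {
    for (int i = 0; i < TABLE_SIZE; i++) {
        double phase = 2.0 * PI * i / TABLE_SIZE;
        table[i] = static_cast<unsigned char>(lround(127.5 + 127.5 * sin(phase)));
    }
}

// Step through the table; bigger step sizes skip samples = higher pitch.
unsigned char sampleAt(const unsigned char* table, unsigned int tick, unsigned int step) {
    return table[(tick * step) % TABLE_SIZE];
}
```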

still to do: linearize the IR data so that there is a more even rate of change in the tempo. When I began, I also had the thought to use an accelerometer to measure the direction of movement. BUT, I have been successful tonight in reading data from a digital compass sensor, which can give me degrees of rotation. like, say the user spins the umbrella: i could have the sound/light spin around the user’s head in that direction, at that speed. This is much more interesting data than an accelerometer, in my opinion.
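on that linearization to-do: the Sharp sensor’s output is roughly proportional to 1/distance, so equal steps of the raw reading don’t mean equal steps of distance. Inverting the curve gives an approximately linear distance value. a sketch of the idea (the constants are an illustrative curve fit, not calibrated to my actual sensor):

```cpp
#include <cassert>

// Sharp IR rangefinders output a voltage roughly proportional to
// 1/distance. Inverting that curve yields an approximately linear
// distance in cm. Constants are illustrative; real values come from
// fitting the sensor's datasheet curve.
float linearizeIR(int reading) {
    if (reading < 30) reading = 30;        // avoid blowing up near zero
    return 9462.0f / (reading - 16.92f);   // approximate distance in cm
}
```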

>> UPDATE :: Featured on HackaDay.com and ArduinoShow.com and CoolCircuit.com !!

Sharp Infrared Range Finder sensors

Click on the image to walk through a great lesson kerstin, cecilia, and i put together about Sharp Infrared range finding sensors. there’s info in there about pin connections, the 3 types of sharp IRs, and code for mapping, smoothing, and calibrating the range finder data.
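the smoothing part is basically the classic ring-buffer moving average: keep the last N readings and report their mean, so single noisy spikes get averaged out. a minimal sketch (plain C++ for easy testing; N and the struct name are mine):

```cpp
#include <cassert>

const int N = 4;   // how many readings to average over

// Ring-buffer moving average for noisy range-finder readings.
struct Smoother {
    int readings[N] = {0};
    long total = 0;
    int index = 0;

    int add(int value) {
        total -= readings[index];            // drop the oldest reading
        readings[index] = value;             // store the new one
        total += value;
        index = (index + 1) % N;
        return static_cast<int>(total / N);  // current average
    }
};
```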

HERE is the link to the page, if your resolution isn’t high enough for lightbox. and code links:


Smoothing and Mapping a tri-color LED

..and a vid. sorry, didn’t realize our hands were so out of frame when we filmed it, but you get the idea: the closer, the bluer; the farther, the redder the LED gets. this is demoing the last arduino code link i just listed.
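the crossfade itself is just arduino-style map() applied to the range reading, with red and blue driven in opposite directions. roughly this (plain C++; the 10-80cm range is illustrative, not our actual calibration):

```cpp
#include <cassert>

struct Color { int r; int g; int b; };

// Clamped linear remap, like Arduino's map() plus constrain().
int mapRange(int x, int inMin, int inMax, int outMin, int outMax) {
    if (x < inMin) x = inMin;
    if (x > inMax) x = inMax;
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Crossfade the tri-color LED: near -> blue, far -> red.
Color distanceToColor(int distanceCm) {
    int red = mapRange(distanceCm, 10, 80, 0, 255);
    return {red, 0, 255 - red};
}
```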


Luminosphere • v0.5

Finally finished!!!

Here it is in all its glory, luminosphere. You can check out my Arduino code HERE

I also had to make a product sheet, the PDF file can be downloaded HERE.  There is also an image of it, in case you don’t want to download (always a hassle, i know).

I am going to sell this item on Etsy.com; i’ll post here when it’s up. it will probably sell for $20, and my teacher, Yury, has guaranteed an AUTOMATIC ‘A’ to anyone who sells something they made for class for more than the cost of parts.

so buy my stuff. thx. bye. UPDATE: http://jmsaavedra.etsy.com (!!!)

