FriiSpray – Digital Infra-red Graffiti

Enabling Expression through Technology

FriiSpray hardware + software developments

January 24th, 2011 · Testing, how-to, software

Intro

Since starting the FriiSpray project back in 2008 we have been trying to improve both the hardware and the software – there’s always room for improvement and increased ‘playability’ and user interaction in the virtual graffiti projects we’ve used, played with and seen videos of.

We hooked up with Matthew Venn through his virtual graffiti Instructable and built several versions of his can design, with a distance sensor for ‘spray’ size and a pressure sensor for paint opacity and drippability.

mk2 can design based on Matt Venn's Instructable design

After experimenting with these new can designs we struggled to get smooth, satisfactory results from the distance and pressure sensors. Added hardware and software ‘smoothing’ techniques got us a somewhat more stable setup, but it still wasn’t ready for general use.

Nintendo Wii nunchuck internals

Cue mk3 testing – we thought it could be good to use a Wiimote and nunchuck inside a can to get decent readings for pressure and can orientation. We butchered a nunchuck to extract its board and plug it into a Wiimote. Using a Wiimote would also give us stable wireless communication over Bluetooth between the can and the computer running the virtual graffiti software.

We bobbed over to our friends at .:oomlout:. HQ to do a spot of prototyping and came up with the following test models:

MDF can prototyping

Acrylic can prototyping

Acrylic can prototyping with nunchuck internal circuit board

After trying these models out we found that they didn’t really solve our problems – so it was back to the drawing board.

Current FriiSpray system

FriiSpray mk01 setup

We use a Nintendo Wiimote to track an IR LED mounted in the nozzle of an old spray can. This gives us an XY position of the can.

We have a pressure sensor at the top of the can; we have tried both a force-sensitive resistor (FSR) and a sprung linear potentiometer.

To get an idea of can distance to wall, we have an IR distance sensor in the can.

The software was written in Flash, but has recently been converted to Java (using Processing).
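To sketch how these readings could drive the drawing, here is a hypothetical linear mapping from raw sensor values to spray parameters – pressure to opacity, distance to spray radius. The class name, ranges and mappings are illustrative assumptions, not FriiSpray’s actual code:

```java
// Hypothetical helper: map raw sensor readings onto spray parameters.
// All names and ranges are illustrative assumptions, not FriiSpray's code.
public class SprayParams {

    // Linearly map v from [inMin, inMax] to [outMin, outMax], clamped.
    public static double map(double v, double inMin, double inMax,
                             double outMin, double outMax) {
        double t = (v - inMin) / (inMax - inMin);
        t = Math.max(0.0, Math.min(1.0, t));   // clamp to the input range
        return outMin + t * (outMax - outMin);
    }

    // e.g. nozzle pressure (10-bit ADC reading) -> paint opacity (0-255)
    public static double opacity(double rawPressure) {
        return map(rawPressure, 0, 1023, 0, 255);
    }

    // e.g. can-to-wall distance (cm) -> spray radius (px): further = wider spray
    public static double radius(double distanceCm) {
        return map(distanceCm, 10, 80, 8, 120);
    }
}
```

The clamp matters in practice: noisy sensors occasionally read outside their nominal range, and without it the spray would momentarily flicker to impossible sizes.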

Aims

  • To improve the can tracking (ideally XYZ and orientation)
  • To make the software and hardware good platforms for other people to experiment with and contribute updates to.

Improve can tracking

To simulate a spray can with accuracy, we need to know:

  • Which direction it’s pointed in,
  • How far the can is from the wall / screen,
  • How far/hard the nozzle is pressed down.

At the moment we don’t know where the can is pointed because the camera in the Wiimote only sees the LED, not the spot it illuminates on the screen.

We have an approximate distance sense, but with two problems: the can needs excessive modification (holes for the distance sensor), and the sensor output is noisy (presumably solvable with better analogue or digital filtering techniques).
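One of the simplest digital filters for this kind of noisy reading is an exponential moving average. A minimal sketch (the alpha value is just an example):

```java
// Exponential moving average: a cheap digital filter for a noisy analogue
// sensor. alpha in (0,1]: higher = more responsive, lower = smoother.
public class SensorSmoother {
    private final double alpha;
    private double smoothed;
    private boolean initialised = false;

    public SensorSmoother(double alpha) {
        this.alpha = alpha;
    }

    // Feed in a raw reading, get back the smoothed estimate.
    public double update(double raw) {
        if (!initialised) {
            smoothed = raw;              // seed with the first reading
            initialised = true;
        } else {
            smoothed = alpha * raw + (1.0 - alpha) * smoothed;
        }
        return smoothed;
    }
}
```

With alpha around 0.2–0.3 the jitter settles noticeably at the cost of a little lag; a median filter over the last few readings is a good alternative when the sensor produces occasional outlier spikes rather than continuous noise.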

We have a good system to measure the nozzle pressure – a sprung linear potentiometer.

Experiments

We came up with some ideas to improve tracking:

2 cameras

We used 2 Wiimotes because we’ve got lots around; they also do blob tracking in hardware, which reduces load on the computer. We set up the 2 Wiimotes in parallel and used some triangulation ideas found here:

This led to a Processing sketch where we could track the X, Y and Z of an IR LED! Excitedly, we tried to increase the volume we could track in (our screen is ~2m wide, 4:3 aspect ratio). Unfortunately the cameras each need to be at least 2m from the screen to have overlapping views, which gives low-resolution tracking on at least one axis. We think we could get this method to work if we had higher-resolution cameras at the corners of the screen, on 2m-long sticks.
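The triangulation itself reduces to the standard parallel-stereo relation: with the two cameras a known baseline apart and the LED seen at different horizontal pixel positions in each image, depth is focal length times baseline over disparity. A minimal sketch (the figures in the test comment are illustrative, not our measured calibration):

```java
// Parallel two-camera triangulation: z = f * B / d
// f = focal length in pixels, B = baseline between cameras in metres,
// d = disparity (difference in the LED's horizontal pixel position).
public class StereoDepth {
    public static double depth(double focalPx, double baselineM,
                               double xLeftPx, double xRightPx) {
        double disparity = xLeftPx - xRightPx;
        if (disparity <= 0.0) {
            throw new IllegalArgumentException(
                "LED must appear further right in the left camera's image");
        }
        return focalPx * baselineM / disparity;
    }
}
```

The formula also shows why the working volume is cramped: close to the screen the disparity grows until one camera loses the LED, while far away a one-pixel disparity error swamps the Z estimate.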

The advantages of this method are:

  • We could use thicker screen material and get more vivid projections
  • We can track distance of the can remotely (rather than in the can)

The downsides of this method are:

  • We still can’t track can orientation; we’d need to add more LEDs.
  • Low-resolution Z tracking with the cameras behind the screen, or low-resolution XY with the cameras in front of the screen.
  • If the cameras are in front of the screen, we need more LEDs that can be seen from the sides of the can without being mistaken for 2 separate light sources.
  • If the cameras are mounted on the screen, we would need really long sticks / arms to make the trackable area big enough for the screen.

2 Wiimotes - usable tracking area vs distance from screen

2 IR LEDs

2 IR LEDs in one cap - wasn't tracked well by the Wiimote at all - interpreted as one 'point'

Test assembly with 2x IR LEDs at different axial angles

This method uses 2 LEDs that are not axially aligned: as the can moves in and out, the dots drawn on the screen move closer together or further apart. It’s a great idea, but unfortunately the LEDs don’t draw dots on the screen bright enough to be seen by the camera – only the LEDs themselves are tracked.
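The geometry behind the 2-emitter idea is simple: if the two beams diverge at a known angle, the separation of the two spots on the wall grows linearly with the can’s distance, so measuring spot separation gives distance back. A sketch with made-up example numbers (the parameter names and figures are our assumptions):

```java
// Estimate can-to-wall distance from the separation of two diverging spots.
// If the emitters start sepAtCanM apart and each beam makes tanHalf =
// tan(theta/2) with the can axis, the spot separation at distance z is
//   s = sepAtCanM + 2 * z * tanHalf   =>   z = (s - sepAtCanM) / (2 * tanHalf)
public class SpotSeparation {
    public static double distance(double spotSepM, double sepAtCanM,
                                  double tanHalf) {
        if (spotSepM <= sepAtCanM) {
            throw new IllegalArgumentException(
                "spots must be further apart than the emitters");
        }
        return (spotSepM - sepAtCanM) / (2.0 * tanHalf);
    }
}
```

In practice the separation would be measured in camera pixels and converted to metres using the known screen width (conversion omitted here); the angle of the line joining the two spots additionally gives can rotation for free.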

Testing with LASER diode

However, this led to the idea of using LASERs! We have now started to experiment with 2 IR LASERs. Our first tests are promising, and we can do a lot of optimising by using matched IR filters in front of webcams rather than Wiimotes. The Wiimote cameras are not sensitive to our LASERs, so rather than sourcing LASERs at ~940nm to suit the Wiimote, we decided to find IR filters matched to our LASERs and pair them with relatively cheap webcams.

Thoughts on making a good platform

We also want to make a system that encourages experimentation and adoption. One way we can do this is by linking the system with other open source initiatives. For example, we can move away from Nintendo’s Wiimote system to standard webcams, and use an Arduino in the spray can.

The software is already open source, so we have been concentrating on getting good enough can hardware that can be extended / modded easily.

However, if the results from using Wiimotes (in the can or elsewhere) are dramatically better than we can achieve with separate components then we will probably continue to rely on Wiimotes.

Conclusions

We have made some good progress, at least by ruling out some of the options. What looks the most exciting is using a number of LASERs (2 to start with) in the can and using a standard webcam to turn this into XY and Z data.

We’ll also use an Arduino with Bluetooth in the can to transmit pressure information and to be available as a platform for future development for us and for the FriiSpray / virtual graffiti community.
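On the software side, the can’s Arduino only needs to push small packets over the Bluetooth serial link. As a sketch, assume a hypothetical one-line-per-reading text protocol such as `P:<0-1023>` for the nozzle pressure – this format is our invention for illustration, not an established FriiSpray protocol. The computer-side parser is then trivial:

```java
// Parse a hypothetical "P:<value>" pressure line arriving over the
// Bluetooth serial link from the can's Arduino. The packet format is an
// assumption for illustration, not an established FriiSpray protocol.
public class PressurePacket {
    public static int parse(String line) {
        if (line == null || !line.startsWith("P:")) {
            throw new IllegalArgumentException("not a pressure packet: " + line);
        }
        int value = Integer.parseInt(line.substring(2).trim());
        if (value < 0 || value > 1023) {   // 10-bit Arduino ADC range
            throw new IllegalArgumentException("pressure out of range: " + value);
        }
        return value;
    }
}
```

A plain-text protocol like this is easy to debug with any serial monitor, which fits the goal of a hackable platform; a binary format would save bandwidth but raise the barrier for modders.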

Next Steps

Matt Venn working at Jam Jar HQ

We think that 2 LASERs in the can will give us simple can orientation and distance. We still won’t be able to work out tilt, but rotation and distance will give us a better simulation. A possible downside is that the LASERs may need to be bright enough to be dangerous. We are using near-IR LASERs, so they are slightly visible – but goggles and more testing are required!

We have also set up a wiki that we’ll be adding more to in the coming weeks. Get on board with your ideas, thoughts and modifications.
