We walked, stood in smelly corners, manoeuvred in and out of fortified alleyways, got shouted at from cars and stared at from a variety of directions.
It was pretty great.
Moving on from the prototypes I had in March, which simply vibrated at random, these prototypes were now loaded with code that made them responsive to their surroundings. Our task for the evening was to find out what happens when you put that mathematical analysis of data a) into the hands of people and b) into the streets.
Another development since the initial testing was that this time there were two organisms. We’re getting incrementally closer to finding out what it might be like to have that colony of them.
The protoshape I used in March was a “small-human-sized” pear/water-drop/fish/swaddled-baby kind of thing. This had seemed an ideal blank canvas upon which to project empathy, and from which to project emotions and desires.
After a bit of a false start the weekend before, where I had made one that was too darn big, I remade it a lot smaller. And therefore a lot more huggable.
For the second of the two organisms I wanted to try something different, so I added an awkward limb/tentacle thing. Whereas the drop-shaped blob had its feedback vibration motors very strategically placed along the anticipated contact areas with the carrier’s body, the plan for this one was to make something less intuitive to hold and to see how people dealt with it.
The photos above show a few of the many solutions people came up with: headwear, neckwear, waistwear and gripped in a variety of different manners. It seemed to me that people were much more inclined to experiment with different ways of holding this creature, whereas with the other … well, this next photo sums up the different modes of interaction very well, I feel:
A few next steps were identified over the course of the evening:
- People like data. I need data:
Of some urgency is the need to log the changes in the data as the blobs are carried around. This is very important for learning more about how the system works and how to tailor the different reactive responses, but people were also asking a lot whether they’d be able to see the traces of their walks.
- Reactive responses:
There’s a lot of experimenting to be done in order to devise the vocabulary of vibrations (and possibly other responses) that somehow convey a sense of rising distress as the creatures are carried through environments in which they are uncomfortable.
- Power:
With the current (no pun intended) set-up I’m limited to a maximum of two vibrating pager motors switched on at any one time. Any more than that and there isn’t enough power to drive them, so nothing happens. I need to power the motors separately from the Arduinos dealing with the data, while still having them controlled by the microcontrollers. I have been pointed towards Darlington transistors.
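As a first step towards that vocabulary of vibrations, the mapping from a sensed level of environmental distress to a pulsing pattern could be sketched roughly like this. This is plain C++ rather than the code on the prototypes, and every name, threshold and timing here is an illustrative assumption; on the Arduino side, the intensity value would feed `analogWrite()` on a PWM pin switching the Darlington transistor’s base.

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical mapping from a normalised "discomfort" reading
// (0.0 = calm, 1.0 = maximum distress) to a vibration pulse.
// Calm means slow, gentle pulses; distress means faster, stronger ones.
struct Pulse {
    uint8_t  intensity;  // PWM duty cycle, 0-255
    uint16_t on_ms;      // how long the motor runs
    uint16_t off_ms;     // pause before the next pulse
};

Pulse pulseFor(double discomfort) {
    discomfort = std::clamp(discomfort, 0.0, 1.0);
    Pulse p;
    // Intensity ramps from a barely perceptible hum to full power.
    p.intensity = static_cast<uint8_t>(60 + discomfort * 195);
    // Pulses get shorter and more frequent as distress rises:
    // a slow "breathing" rhythm when calm, an urgent buzz when not.
    p.on_ms  = static_cast<uint16_t>(400 - discomfort * 300);   // 400ms -> 100ms
    p.off_ms = static_cast<uint16_t>(1200 - discomfort * 1100); // 1200ms -> 100ms
    return p;
}
```

The useful property of a continuous mapping like this is that the creature’s agitation creeps up gradually as its surroundings worsen, rather than flipping between discrete states, which should make the rising distress easier for the carrier to feel.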
- More organisms:
I need to try more of them. More tentacles? Bigger? Smaller? Fatter? Flatter?
As well as sparking a selection of interesting interactions with some of the few other people on the streets at that time, I was very pleased by the way the colony organisms gave the testers an impetus to interact with their surroundings. Ultimately, that is what the project is all about.
Many thanks to everyone who helped get the prototypes working in a technical sense, and to everyone who experimented with and offered feedback on how they worked in a practical sense. More photos can be found in this Flickr set.