Manhattan in cross-section

For the last few weeks I’ve been walking across Manhattan from the east side to the west and back again, gathering GPS data.

West of West 81st Street

East of East 79th Street, about 50 minutes later

I’m interested in how the land use (and therefore the effect on the GPS) varies as you move inland, gradually moving into more affluent areas, then the open space of Central Park, then out to the coast again.

As I repeat the process for different streets, I also get a sense of the change in landscape as I move further North: Midtown East becomes the Upper East Side becomes East Harlem; Midtown West becomes the Upper West Side becomes Morningside Heights.

I’ve recently reached the stage where I’ve been able to get the traces printed out and it’s interesting to see the ghosts of the city in the lines.

First printout of the processed GPS data

I’m looking forward to getting back home and being able to print them out full-size to enjoy and explore in more detail.

Mapping now possible

There’s still work to be done, but over the last couple of days I’ve finally been able to replicate the traces I’d previously been making with two sat-nav-type devices, using my home-made Arduino-based system instead.

This feels good.

From here I can go anywhere.

The iPaq-in-each-hand technique

The 3-arduino-2-GPS-modules-and-a-micro-SD-logger-hidden-in-a-bag-so-as-not-to-scare-people-too-much technique

Whilst I’m using this set-up for logging, there’s painfully little feedback on whether things are working as they should. Once I return to the UK, however, I’ll be experimenting with using it to drive various visible and/or audible things and, of course, putting it inside the Colony creatures.

In the meantime though, I shall be testing, testing and more testing.

East – West

66th Street: there and back again

GPS parallelograms

Not entirely what I’d expected when I went out to test the homebrew GPS system today…

Several hours meandering around Central Park rendered as 10,000 small lines mostly aligned in one of two directions

What’s going on there, then?

Third Avenue. All of it.

Today I wanted to test various hypotheses etc etc and so I walked the entire length of Manhattan.

Battery Park - Water Street - Pearl Street - St James Place - Bowery - Cooper Square - Third Avenue

Apart from anything else it was nice to see the shifts in population and land use as I moved through the different districts.

The trace is in two parts (I stopped for a break!), with the GPS units obviously struggling on the second leg and giving much greater shifts and noisier results than on the first.

A quick screen grab of the resulting trace

This is why I like to repeat journeys several times – to filter out these one-off effects. I think I’m going to struggle to do many journeys of this size whilst the weather is so hot though. I might try some of the other avenues instead and sample some of the other districts.

…and hope the weather cools off a bit!

Logging the Queensboro Bridge

I’ve been able to evolve the circuitry I was using for the last Colony prototype to include a logging facility and (with Andrew’s help) to then convert the log file into either SVG (vector graphics) or KML (Google Earth) format.
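The conversion step itself isn’t shown in the post; as a rough illustration, here’s a minimal Python sketch that turns a list of logged fixes into a KML LineString (assuming, hypothetically, that the log reduces to one latitude/longitude pair per fix):

```python
# Hypothetical sketch only: the real log format isn't shown in the post,
# so this assumes the fixes have already been parsed into (lat, lon) pairs.

def fixes_to_kml(fixes):
    """Build a minimal KML document from (lat, lon) tuples.

    Note: KML expects each coordinate as a lon,lat[,alt] triple,
    with triples separated by whitespace.
    """
    coords = " ".join(f"{lon:.6f},{lat:.6f},0" for lat, lon in fixes)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "  <Document><Placemark><LineString>\n"
        f"    <coordinates>{coords}</coordinates>\n"
        "  </LineString></Placemark></Document>\n"
        "</kml>"
    )

# A few made-up fixes along a Manhattan street
walk = [(40.7570, -73.9543), (40.7566, -73.9570), (40.7562, -73.9601)]
print(fixes_to_kml(walk))
```

Dropping the resulting text into a `.kml` file is enough for Google Earth to draw the walk as a line; an SVG exporter would follow the same shape, just projecting the pairs to x/y instead.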

Last night was its first test run as, with various blinky lights about my person, I went for a stroll over the Queensboro Bridge.

East River and Roosevelt Island from the Queensboro Bridge

The resulting trace

As you can see from the trace above (made from a lot more lines than it appears) there’s a bit of work to be done with this system…

Colony Prototyping #2

Thursday came and saw us doing the second round of user-testing for Colony as part of the Platinum showcase.

We walked, stood in smelly corners, manoeuvred in and out of fortified alleyways, got shouted at from cars and stared at from a variety of directions.

It was pretty great.

Moving on from the prototypes I had in March, which simply vibrated at random, these prototypes were now loaded with code that made them responsive to their surroundings. Our task for the evening was to find out what that meant when you put that mathematical analysis of data a) into the hands of people and b) into the streets.

The first group of participants set out to find out how all this works.

Another development since the initial testing was that this time there were two organisms. We’re getting incrementally closer to finding out what it might be like to have that colony of them.

The protoshape I used in March was a “small human sized” pear/water-drop/fish/swaddled baby kind of thing. This had seemed to be an ideal blank canvas upon which to project empathy and from which to project emotions and desires.

Grappling with the over-sized prototype (photo: Pete Ashton)

After a bit of a false start the weekend before, where I had made one that was too darn big, I remade it a lot smaller. And therefore a lot more huggable.

For the second of the two organisms I wanted to try something different, so I added an awkward limb/tentacle thing. Whereas the drop-shaped blob had its feedback vibration motors very strategically placed along the anticipated contact areas with the carrier’s body, the plan for this one was to make something less intuitive to hold and to see how people dealt with it.

A pause at the crossroads

Sash and shoulder; lock and load

Ant probes the streets of Highgate/Digbeth

JV goes for the torso wrap

The photos above show a few of the many solutions people came up with: headwear, neckwear, waistwear and gripped in a variety of different manners. It seemed to me that people were much more inclined to experiment with different ways of holding this creature, whereas with the other … well, this next photo sums up the different modes of interaction very well, I feel:

One blob gets a hug, whilst the other is borne down the street atop a head

That's not to say that the baby-shape didn't get experimented with too...

A few next steps were identified over the course of the evening:

  • People like data. I need data:
    Of some urgency is the need to log the changes in the data as the blobs are carried around. This is very important for learning more about how the system works and how to tailor the different reactive responses, but people were also asking repeatedly whether they’d be able to see the traces of their walks.
  • Reactive responses:
    There’s a lot of experimenting to be done in order to devise the vocabulary of vibrations (and possibly other responses) that somehow convey a sense of rising distress as the creatures are carried through environments in which they are uncomfortable.
  • Powering the motors:
    With the current (no pun intended) set-up I’m limited to having a maximum of two vibrating pager motors switched on at any one time. Any more than that and there’s not enough power to drive them, and nothing happens. I need to power the motors separately from the Arduinos dealing with the data, but still have them controlled by the microcontrollers. I have been pointed towards Darlington transistors.
  • Shapes:
    I need to try more of them. More tentacles? Bigger? Smaller? Fatter? Flatter?

Ashley gets his first buzz as he begins his journey with new companion

As well as a selection of interesting interactions with some of the few other people on the streets at that time, I was very pleased at the way the colony organisms provided an impetus for the testers to interact with their surroundings. Ultimately, this is what the project’s all about.

I've no idea what they're talking about, but I'd like to think it's a discussion about the architectural qualities of the local urban environment :)

A reassuring pat on the back for an organism unhappy to find itself in a narrow alleyway with no clear view of the sky

A corner not normally stood in

Not only not enough sky, but a hefty amount of barbed wire between it and you - no wonder they're not happy

A few moments of paying attention to a rollershutter alcove

Many thanks to everyone who helped get the prototypes working in a technical sense, and to everyone who experimented with and offered feedback on how they worked in a practical sense. More photos can be found in this Flickr set.

Platinum at The Edge

The other artists from the current Platinum cohort and I will be presenting our various works in progress at a public event to be held at The Edge on the evening of Thursday 2nd June.

The artists involved come from a diverse range of backgrounds, and the work on show will range from performance through to endurance; presentation through to exploration.

Rob Jones: My Piece

An exploration into personal fears. This work will involve sharing a series of experiences.

Nikki Pugh: Prototypes for Colony

A playful device that aims to gently nudge and augment people’s perceptions of the urban landscape.

Aleks Wojtulewicz: fr36ze

An exploration of the stimulatory effects of adrenaline. The physical act is militaristic.

Sarah Farmer: Cultural Amnesia: what we lost in the fire

A research-led investigation of the cultural history of the arts in Birmingham from the perspective of recent graduates.

Lucy Nicholls: In Preparation for Death

Researching the commodification of the ritual of funeral, while exploring death within its wider social and cultural meaning.

Mark Essen: Club Hot Zeus; BAD MUSIC

John Napier and CLUB HOT ZEUS present BAD MUSIC.

The first prototypes being tested around Digbeth. This time they'll vibrate in response to their surroundings!

I aim to have two prototype organisms there as part of the development of my work in progress Colony. I’ll give a short introductory presentation and there will be a limited number of slots available for you to take the prototypes out for an exploratory walk around the streets of Cheapside.

Do come along if you’re in the area. Free entry; cash bar; Facebook event.

Making the Colony prototypes landscape aware

Another compilation of Tweets.

These last few weeks I’ve been giving the development of Colony an extra big push to get it as far along as possible ready for next week’s public event.

Having previously been able to get the tech working on a temporary breadboard set-up, the next stage was to solder everything onto a more robust base that can be used inside the organisms that get carried around. The next stage after that was to start carrying it around!

With the circuitry in a shoulder bag, one GPS receiver on my left shoulder and the other, along with a vibrating motor, in a glove on my right hand, I made myself more location-aware on the 3.75-mile walk to and from work yesterday and the day before.

The code generates a measure of difference between the two calculated positions and then buzzes out a code so I can feel the values as I walk.
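The Arduino code itself isn’t reproduced here, but the idea can be sketched in Python: compute the distance between the two fixes (the haversine formula is one standard choice) and map it to a number of buzz pulses. The 5 m banding and the cap of five pulses below are illustrative guesses, not the scheme used in the actual prototypes:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def buzz_pulses(difference_m, band_m=5, max_pulses=5):
    """Map a difference in metres to a count of vibration pulses.

    The banding (one extra pulse per 5 m, capped at five) is a
    hypothetical mapping chosen for illustration.
    """
    return min(max_pulses, 1 + int(difference_m // band_m))

# Two made-up fixes roughly a dozen metres apart on a cross street
d = haversine_m(40.76650, -73.96340, 40.76650, -73.96326)
print(round(d), "m ->", buzz_pulses(d), "pulses")
```

On the actual hardware the pulse count would then drive the pager motor on and off with short delays, which is where the felt "code" comes from.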

Initial findings are: it’s great! Super-exciting to have that extra sense that no-one else is aware of, and you can see the trends in output as you move from one type of terrain to another.

I’ll be refining the code over the next few days and also getting the circuitry into an organism ready for carrying. Do come along to The Edge on Thursday to find out how I’ve got on.

Ups and downs

Along with many other freelancers, I’ve been using the bank holiday season to concentrate on getting some work done. One of the major tasks on hand at the moment is to further develop my Colony project ready for the Platinum sharing event.

My first attempt at reading two GPS datastreams into an Arduino microcontroller over serial ran into some strange issues, so I decided to investigate I2C just to make sure I didn’t spend an incredibly long time barking up the wrong tree trying to make the serial version work.

Well, it’s been a steep learning curve and I’m truly indebted to the likes of GB, Ben and Adrian, but we now have the basics of some code working!

My coffee table currently looks like this:

3 RBBB microcontrollers, 4 breadboards, 2 GPS modules and lots of wire

Each of the GPS modules has a slave Real Bare Bones Board reading its data and sending longitude and latitude values over to the master RBBB when requested. The master RBBB then does a quick comparison between the two values to give a measure of difference.
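The wire format of that request/response isn’t described in the post; one plausible sketch (in Python, purely for illustration) has each slave answer with an 8-byte payload holding latitude and longitude as signed 32-bit integers in millionths of a degree — a common fixed-point trick on 8-bit Arduinos, which lack fast 64-bit floating point:

```python
import struct

# Hypothetical payload format: the post doesn't give the actual I2C
# protocol, so assume each slave replies with 8 bytes: latitude then
# longitude, each a little-endian signed 32-bit int in 1e-6 degrees.

def encode_fix(lat, lon):
    """Pack a fix into the 8 bytes a slave might send over I2C."""
    return struct.pack("<ii", round(lat * 1e6), round(lon * 1e6))

def decode_fix(payload):
    """Unpack the 8-byte payload back into degrees on the master."""
    lat_e6, lon_e6 = struct.unpack("<ii", payload)
    return lat_e6 / 1e6, lon_e6 / 1e6

payload = encode_fix(40.782865, -73.965355)  # a point in Central Park
print(len(payload), "bytes ->", decode_fix(payload))
```

Fixed-size integer payloads like this keep the master’s `requestFrom` call simple (it always asks for exactly eight bytes) and sidestep float-precision mismatches between the two boards.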

I left this set-up logging the difference between the two GPS positions whilst I went out for an hour or so, and this is what I returned to:

The measure of difference (y-axis) plotted against time (x-axis, very approximately in seconds)

The kit is on a table not particularly close to the window in my attic flat, so there’s GPS reception, but I wouldn’t expect it to be particularly stable. That said, the graph above struck me as being very regular somehow: not as random as I had expected.

I’m wondering if the peaks correspond to the ebb and flow of the satellites overhead. Is that likely?

edit: Scratch that; wrongheaded. What would affect one GPS module and not the other?



Copyright and permissions:

General blog contents released under a Creative Commons by-nc-sa license. Artworks and other projects copyright Nicola Pugh 2003-2026, all rights reserved.
If in doubt, ask.
The theme used on this WordPress-powered site started off life as Modern Clix, by Rodrigo Galindez.
