I’ve been asked to work with a primary school in Leicester on something that sounds remarkably like an Alternate Reality Game/cross media project/immersive experience to me. Here are a few excerpts from the brief:
…creating a memorable learning experience for Year 3 children… a real ‘Wow’ experience… allowing them to use a range of creative approaches to explore real science… memorable learning experience… a lively, enquiring mind and a love of learning… the ability to question, to argue rationally and to think for themselves… the ability to work hard and to succeed at tasks both independently and with other people… identify and solve problems, take risks… activities that are open-ended, so that the children can shape the direction of the investigation… creating a fantastical narrative which is developed by pupils, staff and artists…
So, 56 Y3 (about 7 years old) pupils with a suspended timetable for a few days so they can work/play on the themes of ‘light and shadows’ and ‘rocks and materials’. I can’t wait!
No, really, I can’t wait! I’ve been exploring a few different candidates for special objects that different parts of the narrative/exploration might hang on. I’ve just finished assembling a solar-powered spider and yesterday afternoon was spent putting RFID technology into a toy bat.
The bat currently opens different images depending on whether the left or right wing is folded across the chest. This project was initially just for my own playtime, but in the middle of sewing up the bat I realised it could be used as a simple true/false answering device to respond to questions relating either to the narrative or to the curriculum.
I’d also like to use the touchatag RFID system to construct an array of objects that have to be placed in the correct positions in order to demonstrate that a puzzle has been solved. For example, this could be the correct arrangement of the Sun, Moon and Earth to show when an eclipse happens.
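To make the idea concrete, here's a minimal sketch of how such a puzzle check might work, assuming each reader can report the UID of the tag currently sitting on it. The reader names and tag UIDs below are invented for illustration; the real touchatag setup would supply its own identifiers.

```python
# The solved arrangement: which tag must sit on which reader.
# Reader names and tag UIDs are invented for illustration.
ECLIPSE_SOLUTION = {
    "reader-left": "sun-tag-uid",
    "reader-middle": "moon-tag-uid",
    "reader-right": "earth-tag-uid",
}

def puzzle_solved(current_state, solution):
    """Return True when every reader holds exactly the expected tag."""
    return all(current_state.get(reader) == tag
               for reader, tag in solution.items())

# Example: Moon and Earth swapped, so the puzzle is not yet solved.
state = {"reader-left": "sun-tag-uid",
         "reader-middle": "earth-tag-uid",
         "reader-right": "moon-tag-uid"}
print(puzzle_solved(state, ECLIPSE_SOLUTION))  # False
```

The nice thing about keeping the solution as a simple mapping is that the same check works for any arrangement puzzle, not just the eclipse one.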
Ideally I’d like to be able to work with two separate events: when a tag is placed on the reader and when a tag is removed from the reader. However, the touchatag system currently only supports a single tag-on-reader event to trigger an action.
There have been a couple of suggestions as to how to get around this: using a program to monitor the touchatag application for increases and decreases in activity corresponding to the two events (thanks Tom), or working with the hardware API from the manufacturers of the electronics inside the reader (thanks Ted).
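The first suggestion boils down to polling: take a snapshot of which tag (if any) is on each reader, compare it with the previous snapshot, and turn the differences into tag-on and tag-off events. Here's a sketch of that diffing step, with reader and tag names invented for illustration:

```python
def diff_tag_states(previous, current):
    """Compare two snapshots of {reader: tag-or-None} and return the
    tag-on / tag-off events that happened between them."""
    events = []
    for reader in sorted(set(previous) | set(current)):
        before = previous.get(reader)
        after = current.get(reader)
        if before != after:
            if before is not None:
                events.append(("tag-off", reader, before))
            if after is not None:
                events.append(("tag-on", reader, after))
    return events

# Between two polls, a sun tag appeared on reader-1
# and the bat tag left reader-2:
events = diff_tag_states(
    {"reader-1": None, "reader-2": "bat-tag"},
    {"reader-1": "sun-tag", "reader-2": None},
)
# events == [("tag-on", "reader-1", "sun-tag"),
#            ("tag-off", "reader-2", "bat-tag")]
```

Because each event carries the reader's name, the same loop scales to four readers or more. For the second suggestion, it may help that the touchatag reader is reportedly based on the ACS ACR122, which speaks the standard PC/SC smartcard protocol; libraries such as pyscard can deliver card-inserted and card-removed callbacks per reader directly, though I haven't verified that against this particular hardware.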
Are there any programmers out there who fancy the challenge of making a touchatag app that can distinguish between a tag being placed on a reader and a tag being removed from one, and tell which reader (out of at least four) the tag is being placed on or removed from?