Physical computing is a creative framework for understanding our relation with the digital world. In practical terms, this mostly means re-discovering objects and practices that we take for granted by making things, usually some form of electronic bricolage of cardboard and code.
With the support of Prof Graeme Earl, I organised our department’s first-ever physical computing club, which ran for five three-hour sessions between December 2018 and March 2019. The sessions were open to all, and were organised as semi-structured informal gatherings loosely based on the Raspberry Pi curriculum.
The objective of the club was to learn how to rapidly prototype an interactive connected object. To do this, we learned:
- What a Raspberry Pi is and how to set one up
- How to control the Pi using Python
- How to connect basic circuitry using a breadboard
- How to put this together into an object using cardboard and tape
- How to connect our object to Twitter
This was more than a coding club: our approach was to get students interested in programming by showing them how it underpins our relation with the objects and spaces of the physical world. There was programming involved, but the main goal was to learn by making, and in this way to show the relation between intangible code and embodied experience.
This is what we made:

The initial step was to learn to control the GPIO pins of the Pi. A special thanks to Alex Hadwen-Bennett for supplying us with his excellent design for a cardboard circuit controller. Once we knew how to make the Pi communicate with the world, we used a cardboard box enclosure and learned how to program its camera to make time-lapse videos like the one at the top of this post.
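The time-lapse step can be sketched roughly like this — a minimal sketch assuming the `picamera` library that ships with Raspberry Pi OS; the duration and interval are illustrative, not the values we used:

```python
from time import sleep

def frames_needed(duration_s, interval_s):
    """How many stills a time-lapse of duration_s seconds needs
    at one frame every interval_s seconds."""
    return duration_s // interval_s

def timelapse(duration_s=600, interval_s=10):
    # picamera is imported inside the function so this file also
    # loads on machines without a Pi camera attached.
    from picamera import PiCamera
    camera = PiCamera()
    for i in range(frames_needed(duration_s, interval_s)):
        camera.capture('frame{:03d}.jpg'.format(i))
        sleep(interval_s)

if __name__ == '__main__':
    timelapse()  # ten minutes at one frame every 10 s = 60 stills
```

The numbered stills can then be stitched into a video with a tool such as ffmpeg.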
Finally, we moved our circuitry to a breadboard, and put everything together to create the prototype for a tweetcam:
The tweetcam box has a camera, a buzzer, an LED, and a button. When it’s on and the button is pressed, the buzzer beeps, the LED lights up, and a picture is taken and immediately tweeted to the test account Y0g_50th0th.
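The behaviour just described can be sketched as follows. This is a hedged reconstruction, not our exact programme: it assumes the `gpiozero`, `picamera`, and `tweepy` libraries, the pin numbers are examples to be matched to your breadboard wiring, and the Twitter credentials are placeholders:

```python
from datetime import datetime

def photo_name(now):
    """Timestamped filename so successive shots never overwrite each other."""
    return now.strftime('tweetcam-%Y%m%d-%H%M%S.jpg')

def main():
    # Hardware and Twitter libraries are imported here so the file
    # also loads on machines without a Pi attached.
    from gpiozero import Button, Buzzer, LED
    from picamera import PiCamera
    import tweepy

    # Example BCM pin numbers -- adjust to your wiring.
    button, buzzer, led = Button(2), Buzzer(17), LED(27)
    camera = PiCamera()

    # Placeholder credentials for the test account.
    auth = tweepy.OAuthHandler('CONSUMER_KEY', 'CONSUMER_SECRET')
    auth.set_access_token('ACCESS_TOKEN', 'ACCESS_SECRET')
    api = tweepy.API(auth)

    while True:
        button.wait_for_press()
        buzzer.beep(on_time=0.2, n=1)   # audible feedback: "picture taken"
        led.on()                        # visual feedback while working
        filename = photo_name(datetime.now())
        camera.capture(filename)
        api.update_with_media(filename, status='tweetcam test shot')
        led.off()

if __name__ == '__main__':
    main()
```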
This box does nothing that a modern smartphone cannot do. But in creating it ourselves, we are confronted with questions of design and interaction: why does the buzzer’s sound feedback lead us to assume the picture was indeed taken? What if the button were programmed to tweet only one out of every five times it is pressed? Or at random? How could we show this using the LED? And how do these different types of interaction affect our idea of privacy?
By changing only a few lines in our programme, we can quickly iterate over these options and test how people react to our tweetcam. And we learn some coding along the way.
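The kind of one-line variation described above can be isolated in a single decision function — a hypothetical sketch (the function name and modes are ours, not from the club code) showing how the “one in five” and “random” behaviours differ from the default:

```python
import random

def should_tweet(press_count, mode='always', chance=0.5):
    """Decide whether this button press results in a tweet.
    Swapping the mode is the one-line change that alters the
    whole interaction."""
    if mode == 'always':
        return True
    if mode == 'one_in_five':
        return press_count % 5 == 0   # only every fifth press tweets
    if mode == 'random':
        return random.random() < chance
    raise ValueError('unknown mode: ' + mode)
```

Wiring this into the button handler lets you A/B-test the interaction without touching the circuitry.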
The club itself was a prototype of sorts: we wanted to explore the ideas of learning by making and of teaching design & programming simultaneously. On a more academic note, we also wanted to advance what Philip Agre called a critical technical practice as a mode of research in our department. This is where you come in. We want to know:
- Are these ideas relevant or useful to your research? Can one of the concepts or arguments you work with be expressed as an electronic bricolage of cardboard and code?
- Is our tweetcam useful as a demonstration for any of the modules you teach?
- Can you or your students think of better uses for the Raspberry Pi?
If you have any thoughts about this, let me or Graeme know. In the meantime, if you want to see the tweetcam in action, email me or come to S3.39.
Thanks to the Department of Digital Humanities for sponsoring the gear for the club (it is of course available for anyone else to use), to the Arts & Humanities Research Institute for letting us inaugurate their new project space (and to Laura Douglas for helping us book the space), and of course, to the students who participated in the club. I hope to see you again next year!