On Friday 18 December, we hosted a workshop on ‘the future of geocomputation’ involving over 30 researchers from across the UK and Ireland. We’re still working to synthesise and write up the discussions that made up the second half of the workshop, but below are the presentations that kicked off the day. Some of the tweets from the day are embedded below, but for more see our Storify for the day or search #fogeocomp.
To set the context for the day’s discussion we argued that the future of geography is cheap – cheap hardware and software, cheap data and code, and ‘cheap’ (by which we mean simple) interaction with sophisticated geographical models.
Chris Brunsdon’s Keynote
In his opening keynote, Chris set out an ambitious agenda for geocomputation that called for a deeper understanding of geographical processes, data visualisation (with provocative images from his caricRture library) and, most challenging of all, Approximate Bayesian Computation (ABC).
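For readers unfamiliar with ABC: the idea is to approximate a posterior distribution without writing down a likelihood, by simulating data under candidate parameter values and keeping the candidates whose simulated summaries look like the observed data. The sketch below is not from Chris’s talk – it’s a minimal rejection-ABC toy in which the prior range, tolerance and summary statistic are all illustrative choices:

```python
import random

def abc_rejection(observed_mean, n_samples=20000, eps=0.05):
    """Rejection ABC: estimate the mean of a normal distribution.

    Draw candidate means from a uniform prior, simulate a dataset under
    each candidate, and keep candidates whose simulated summary statistic
    (here, the sample mean) lands within eps of the observed one.
    """
    accepted = []
    for _ in range(n_samples):
        theta = random.uniform(-5, 5)                       # draw from the prior
        sim = [random.gauss(theta, 1) for _ in range(50)]   # simulate data
        if abs(sum(sim) / len(sim) - observed_mean) < eps:  # compare summaries
            accepted.append(theta)
    return accepted

random.seed(42)
posterior = abc_rejection(observed_mean=1.0)
# the accepted values approximate the posterior for the mean
print(len(posterior), sum(posterior) / len(posterior))
```

Tightening the tolerance improves the approximation but accepts fewer candidates, so in practice the tolerance trades accuracy against computation – one reason Chris called this the most challenging item on his agenda.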
Chris highlighted some of the problems with GIS and argued that geographers need to rediscover coding for reproducibility, for flexibility and for openness in their research.
Alison’s talk rounded out Chris’ preliminaries, highlighting the role that Agent-Based Models (ABMs) could play in deepening our understanding of spatial processes by bringing aspects of the real world and human behaviour into our computational models. We see these as complementary to, rather than competing with, the more statistical aspects explored by Chris.
Alison also highlighted that, despite the large number of platforms now available for agent-based modelling and its growing use across academia, its adoption in policy-making has been limited. Alison puts this down to the great uncertainties that often remain, and argued for an increased focus on model calibration and verification.
@ajheppenstall as a result, we have ABMs in films, games, academia but not policy. Little calibration or measure of uncertainty in ABMs
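To give a flavour of what bringing behaviour into computational models looks like, here is a toy Schelling-style segregation model – not one of the models discussed at the workshop, with grid size, threshold and step count chosen purely for illustration. Agents relocate when too few of their neighbours share their type, and even mild individual preferences produce strong emergent spatial clustering:

```python
import random

def schelling(width=20, height=20, empty_frac=0.2, threshold=0.3, steps=50):
    """A minimal Schelling segregation sketch on a wrapping grid.

    An agent is unhappy if fewer than `threshold` of its occupied
    neighbours share its type; unhappy agents relocate to a random
    empty cell each step. Returns the mean same-type neighbour
    fraction, a crude segregation index.
    """
    cells = [(x, y) for x in range(width) for y in range(height)]
    grid = {c: (None if random.random() < empty_frac
                else random.choice(["A", "B"])) for c in cells}

    def neighbours(x, y):
        return [grid[((x + dx) % width, (y + dy) % height)]
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if not (dx == 0 and dy == 0)]

    def unhappy(c):
        occ = [n for n in neighbours(*c) if n is not None]
        return bool(occ) and occ.count(grid[c]) / len(occ) < threshold

    for _ in range(steps):
        movers = [c for c in cells if grid[c] is not None and unhappy(c)]
        empties = [c for c in cells if grid[c] is None]
        random.shuffle(movers)
        for c in movers:
            if not empties:
                break
            dest = empties.pop(random.randrange(len(empties)))
            grid[dest], grid[c] = grid[c], None  # relocate the agent
            empties.append(c)

    fracs = []
    for c in cells:
        if grid[c] is None:
            continue
        occ = [n for n in neighbours(*c) if n is not None]
        if occ:
            fracs.append(occ.count(grid[c]) / len(occ))
    return sum(fracs) / len(fracs)

random.seed(1)
segregation = schelling()
print(round(segregation, 2))
```

Even a toy like this illustrates the calibration problem Alison raised: the threshold and relocation rule are guesses, and the output depends heavily on them.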
Finally, Alex wrapped up with his thoughts on how we can train the next generation of students in the tools and concepts that they will need to get to grips with the issues explored by Alison and Chris.
Alex pointed out that ‘point-and-click’, button-pushing GIS classes in undergraduate degree programmes fail to teach anything about the process of data analysis and research, and called for us to move beyond ‘sleepy’ geography curricula.
@alexsingleton: most of undergraduate point-and-click stats and GIS training useless as soon as you hit a real problem. #fogeocomp
After the keynotes we broke into smaller groups to discuss Training the Next Generation, Data, Tools & Processes and Setting an Agenda for the next 10 Years. We’re still working on summarising and synthesising the discussions from these groups so look out for that soon.
Before finishing and continuing discussions over wine, Andy Evans from the University of Leeds outlined plans for the Geocomputation 2017 conference. We’re looking forward to that already but there’s more to come in 2016 first!
Today is the first day of our new Geocomputation and Spatial Analysis (GSA) pathway on our undergraduate degree. Over the summer Jon Reades, Naru Shiode and I have been developing module material and today we (well, Jon and I) finally get to use it with our students. We provide a very brief overview of the pathway on the About page of this website, but I thought today is an opportune moment to discuss it in a little more depth.
As highlighted in a recent report by the UK Economic and Social Research Council, human geography in the UK has been recognised for its conceptual innovation, but its current low levels of quantitative and technical training are of concern. Concerned by these low levels of training in quantitative methods, Ron Johnston and colleagues argued in a recent paper that the curricula of current undergraduate programmes in geography are failing to develop graduates who can “appreciate the underlying principles of quantitative analyses and their important role in the formation of an informed citizenry in data-driven, evidence-based policy societies”.
These societies are produced as digital technologies become pervasive throughout society and science. Global positioning system (GPS) technologies that allow the precise location of mobile devices on the Earth’s surface have become miniaturised and mainstream (e.g. in smart phones), generating geo-data that was not previously available. Governments and other organisations are now opening up their digital databases on schools, crime, health and other public services for re-use and investigation by others (e.g., UK OpenData). Investigation of these (often) geo-referenced and large digital datasets requires computation to ensure patterns can be identified efficiently and in a reproducible manner. Put together, as Elvin Wyly recently discussed, the multiple aspects of this ‘big data’ digital revolution create new geographies and provide new means to explore and understand geography. Although Geocomputation is an older concept, this revolution has led to a resurgence of interest in the idea.
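As a small, concrete example of the kind of computation involved – a hypothetical illustration, not anything from our module material – given two GPS fixes as latitude/longitude pairs, the haversine formula yields the great-circle distance between them:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0  # mean Earth radius (a spherical-Earth approximation)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# approximate city-centre coordinates, chosen for illustration
d = haversine_km(51.5074, -0.1278, 53.8008, -1.5491)  # London to Leeds
print(round(d, 1))
```

Wrapped in a loop over millions of GPS records, a few lines like these become exactly the kind of efficient, reproducible analysis that point-and-click tools struggle to deliver.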
Recognising this issue, and in the context of the ‘big data’ revolution and the increasingly pervasive influence of computing devices outlined above, we set out to develop the GSA pathway. The pathway will enable students to develop the skills needed to undertake independent geographical inquiry using the latest datasets and computational tools, and to understand how those tools do and can shape the geographical world. When developing a curriculum in this context, it is important to acknowledge that the aim is not to produce computer programmers with no means of thinking critically about how their tools inform or change the geographical world, but geographers who understand how new data and computational tools can be used to study geography and who have the technical skills to use them.
Geography as a discipline has often had a critical or radical streak aiming to promote social change or combat oppression (e.g., see Antipode). If our future social and geographical world is to be based in part on ‘data-driven, evidence-based policy’, as Johnston argues, then Geography students need at least the basic technical skills and understanding to contribute to driving social change in that data- and technology-driven world. A significant challenge for the GSA pathway, then, is the need for students to learn new skills such that they are empowered to employ computational techniques for data analysis.
This first module on the pathway, named simply Geocomputation, is foundational in that students will be learning skills and methods that they are unlikely to have encountered previously but which they will need if they are to understand the possibilities of (and use!) these new forms of data and technology in future. However, it is also important that, while skills are learned, the curriculum is not so narrow as to prevent curiosity about the geographical world or inhibit the geographical imagination. Consequently, we’ll be pushing students to ‘learn by doing’ and take an inquiry-based learning approach – my own experience of learning computational skills is that they are best acquired when used to work towards answering a particular question.
We’re looking forward to putting this theory into practice. We start today but hope to continue learning through the process and will post updates here when we can…
As we prepare to teach the first year of the GSA pathway, we’ve been experimenting with techniques more commonly used in software development to see if they can help us to deliver quality and integration in our new modules right from the start. This post will explore the logic of Pair Programming.
What is Pair Programming?
In the traditional software development arena applications are designed by a group of experts; they then hand a set of requirements over to a programmer who heads off to their desk to write the code that will meet those requirements. If the requirements are sufficiently well thought-out and extensive then delivering code that meets those requirements means the project is a success.
There’s just one problem: how often has anything complex been sufficiently well thought-out that individuals, working in isolation, have been able to deliver something integrated and feature-complete on the first go? Actually, there’s a second problem: it is also possible to write something that fully meets the requirements, but doesn’t meet the needs of the application or the organisation. While I work away on ‘my’ bit, I miss a major issue that was also hidden from the application’s designers because no one has an eye any longer on the ‘big picture’ of what the application is supposed to actually do.
It’s into this breach that Agile-derived pair programming steps as a way both to keep developers looking at the big picture, and to enable individuals to access the type of practical knowledge that is only formed through long or diverse experience. Sometimes called peer programming (which is a rather nice terminological link to academia), pair programming matches a ‘driver’ who focuses on the tactical aspects of task completion with an ‘observer’ or ‘navigator’ who “continuously and actively observes the driver’s work, watching for defects, thinking of alternatives, looking up resources, and considering strategic implications” (Williams et al., 2000). In other words, the driver has someone looking over their shoulder… but in a constructive way.
Issues in Pair Programming
Can it work? Programmers aren’t known for their tolerance of being supervised or managed during programming tasks, so there are a number of techniques designed to make this a more constructive experience: for instance, the pair switch roles regularly so that each person ‘drives’ for a while and then puts on the strategic thinking hat for a bit. And repeat. The constant role-switching means that both programmers have an opportunity to do both types of thinking, which builds up practical knowledge and also helps to ensure that many more possible approaches to a problem are considered.
So, by pairing old hands with novices, pair programming encourages the sharing of ‘best practice’ and yields immediate and frequent feedback during development. That said, you don’t ordinarily pair very experienced programmers with complete novices because the knowledge gap is too wide; it’s common to pair novices with intermediates, or intermediates with experts, the idea being that the more experienced person still remembers having to learn what their ‘junior’ is trying to understand, while also getting a chance to systematise their own experience through teaching.
Obviously, some level of social aptitude/sensitivity and trust is also going to be important here, but somehow many Agile firms have managed to make it work. Interestingly, developers actually report finding the process quite enjoyable, while businesses report 40% faster turnaround, more efficient code, and fewer defects (ibid.). And it has been noted that pair programming works best on challenging tasks that call for creativity and ‘sophistication’ (Lui and Chan, 2006).
Applications in Academia
These types of benefits are clearly relevant for thinking about teaching and administration in academia where there is often a poor understanding, especially amongst new hires, of the objectives of a particular task, its rationale, and the range of viable solutions. So while none of us involved in the GSA pathway would claim to be experts in either module design or programming, we thought that pairing would be useful for new module development because we could cover each others’ ‘weaknesses’ while also talking out the overall strategy of the modules themselves.
So far, the results are really promising. Two of us had to invest fully 1.5 days working together (and switch offices, since no one else can use my Kinesis Contour keyboard) to develop a week-by-week teaching plan incorporating pre-class readings, in-class concepts, and practical work. But I feel the result – more integrated, with a clear sense of which concepts need to be covered in which weeks to bring the students to the final assessment – is much better than if we’d each tried to tackle ‘our’ bit independently. We also brought more ideas and resources to bear on how we might teach each concept, and came up with what I think are really good ideas for testing student learning.
Personally, I’ve found it so productive that, where remotely practical, I’m thinking of inflicting it on every team-taught module I’m involved in. That said, like all things it’s probably best in moderation and may be most valuable during the planning stage, less so during the “I need to create my PowerPoint slides” stage.
So that’s peer programming – or, in this case, peer planning – and we’ll try to post about the other techniques and tools we’ve experimented with over the coming months.
Lui, K.M. and Chan, K.C.C. (2006), ‘Pair Programming productivity: Novice–novice vs. expert–expert’, International Journal of Human-Computer Studies, 64(9):915–925.
Williams, L., Kessler, R.R., Cunningham, W. and Jeffries, R. (2000), ‘Strengthening the Case for Pair Programming’, IEEE Software, July/August 2000, pp. 19–25.