A day in the life of an Evaluation and Research Adviser

By Henry Woodward, Evaluation and Research Adviser

This blog is often dominated by the behavioural insights and experimental trials side of What Works, but it has another side. While we’re running our interventions to promote belonging and understand how to help students increase their social capital, we’re also helping our colleagues and the sector to evaluate the programmes they already have in place. Henry Woodward documents a typical day of an evaluation and research adviser operating out of the What Works Department.

Previously I would have said that my role sat on the evaluation arm of the What Works Department, but as What Works has grown and progressed, the design and evaluation components of the behavioural insights approach have become increasingly integrated. As testament to this, my responsibilities reflect a unified departmental approach to programme delivery.

But what are these responsibilities? As the evaluation and research adviser, I help manage the outreach and student success departments’ monitoring and evaluation activities. What does that mean in practice? Perhaps the best way to reveal what it is like to work as an Evaluation and Research Adviser is to walk you through a typical day.

Training colleagues to be critical consumers of information

In the first hour or so of the day I finesse slides for a workshop I will be running for the Widening Participation Department on critically assessing claims made in articles and research papers. Our division is encouraging a shift towards evidence-based practice, but this will only work if we are able to properly assess the evidence when someone claims something ‘works’. The workshop sets out the ways in which research is commonly flawed and provides a jargon buster of frequently opaque terms, with the aim of helping practitioners spot where research over-promises and ensuring that researchers can’t hide behind jargon to push claims through unchallenged.

Supporting our student-facing teams in measuring their impact

Then it’s on to a meeting with a practitioner from our outreach team, one of the departments we support within the division. Today’s agenda focuses on designing a survey to evaluate the support we offer vulnerable students as they transition into university.

I’ve previously worked with the team to prepare a Research Protocol, which sets out the key elements of the project: what will be delivered, when and by whom, and how the project will be evaluated. Today, we’re looking at how we can measure impact through a survey. We re-familiarise ourselves with the Research Protocol and Theory of Change to establish whether any specific student knowledge (e.g. which services are available to help) or constructs (e.g. social capital) have been identified as particularly relevant targets for this project. We encourage practitioners to move away from simply asking participants whether they believe the intervention has improved their knowledge or skills, and towards directly assessing that impact with knowledge-based questionnaires (ideally piloted first).
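To give a flavour of what directly assessing impact can look like once responses come back, here is a minimal sketch in R (the language I use for analysis later in the day). The file name, column names, and coding scheme are assumptions for illustration, not the real survey.

```r
# Hypothetical sketch: scoring a knowledge-based pre/post questionnaire.
# Assumes a long-format CSV with one row per student per wave, item columns
# named item_1, item_2, ... coded 1 = correct, 0 = incorrect.
responses <- read.csv("transition_survey.csv")

# Proportion of knowledge items each respondent answered correctly
item_cols <- grep("^item_", names(responses))
responses$knowledge_score <- rowMeans(responses[, item_cols])

# Paired comparison of pre- and post-intervention scores, matched by student
pre  <- responses[responses$wave == "pre", ]
post <- responses[responses$wave == "post", ]
pre  <- pre[order(pre$student_id), ]
post <- post[order(post$student_id), ]
t.test(post$knowledge_score, pre$knowledge_score, paired = TRUE)
```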

Then on to lunch, which is never boring as Waterloo has a vast array of lunch options. This may not be completely relevant, but in case you are interested, I am currently exploring the various new vegan options, with some success, though the vegan steak bake at one well-known bakery chain tastes so suspiciously like its meat counterpart that I am now convinced the original steak bake never contained any meat anyway.

Scoping ways to break down barriers for students

Post-lunch I settle into some desk research to scope out a potential outreach trial. This is something I started last year after reading an article on how university applications are shaped by train fares: students didn’t want to apply to universities they hadn’t visited, and many students couldn’t afford to visit universities outside their local areas because train tickets were so extortionate. I am scoping out a trial to evaluate the impact of a King’s travel fund targeted at disadvantaged students to encourage them to attend open days. I am currently researching whether any quantitative analysis has been conducted on the impact of train fares on open day attendance, or even the impact of open days on subsequent application decisions. So far, I am drawing a blank, but it is encouraging to see that other universities have similar travel reimbursement schemes (and if you have any useful information, please get in touch!).

Planning new and innovative evaluation methodologies

Next I’m looking at what research methodology to use to evaluate the impact of our K+ programme, our flagship post-16 outreach programme, on not just enrolment at selective universities but also students’ social capital and self-efficacy. This year, we are trying to take advantage of the large application pool we get for K+ and build a comparator group out of unsuccessful applicants. We are providing two information, advice and guidance (IAG) events for those who fulfilled the eligibility criteria and came close to being accepted. This gives us the opportunity to assess their social capital and self-efficacy at the same time as the participants on the K+ programme. The impact of the programme on social capital and self-efficacy can then be modelled using a difference-in-differences estimator. This should also provide a robust comparator group for longer-term student outcomes: not only can we control for observable variables, but we have gone some way to reducing the latent (hidden) variable differences between the groups.
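For readers who like to see the model written down, below is a minimal difference-in-differences sketch in R; the data frame and variable names are hypothetical stand-ins for the real K+ measurements, not our actual code.

```r
# Hypothetical data frame 'kplus_waves': one row per student per survey wave.
#   kplus = 1 for K+ participants, 0 for the near-miss comparator group
#   post  = 1 for the follow-up wave, 0 for the baseline measurement
#   social_capital = the outcome score at that wave

did_fit <- lm(social_capital ~ kplus * post, data = kplus_waves)
summary(did_fit)

# The coefficient on the interaction term kplus:post is the DiD estimate:
# the change over time for participants minus the change for the comparator
# group, which nets out fixed baseline differences between the two groups.
# In practice we would also cluster standard errors by student to account
# for repeated measures.
```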

Building my own skills

My last real task of the day is working on my R coding skills. This year I have been given the freedom to design, deliver, and analyse the impact of a self-reflection tool targeted at students progressing through their first year at King’s. At this stage I am just looking at the number of students engaging with the tool, how long they are engaging for, and the effectiveness of different distribution strategies, but further down the line I will be integrating students’ responses to generate a personalised end-of-year report, as well as assessing the tool’s impact on students’ sense of belonging. The code for this analysis, and for the reproducible bespoke reports, is written in R, a language and environment for statistical computing and graphics. Although I had some exposure to this software before joining the What Works team, I was by no means ‘proficient’. I have been building my skills since I started at What Works, and now I don’t know how I ever operated without it.
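As an illustration of that early-stage analysis, the sketch below summarises engagement by distribution strategy; the log file and column names are assumptions for the example rather than the actual tool data.

```r
# Hypothetical engagement log: one row per student, with a 'channel' column
# recording how the tool was distributed (e.g. email, tutor, VLE banner),
# 'opened_tool' coded 1/0, and 'minutes_engaged' as time spent on the tool.
engagement <- read.csv("reflection_tool_logs.csv")

# Engagement rate and average time on the tool, by distribution strategy
aggregate(
  cbind(opened = opened_tool, minutes = minutes_engaged) ~ channel,
  data = engagement,
  FUN  = mean
)
```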

This evening I will be taking a break from my gruelling running schedule (I’m training for a marathon) and attending the first session of a machine learning course I have enrolled on. Machine learning and AI are increasingly being used to develop interventions and impact evaluations across the development and social impact sectors, and are currently being employed to measure outcomes we’ve previously struggled to capture and to improve both the spatial and temporal targeting of interventions. In What Works we’re all learning, and we take training seriously. Not only is King’s paying the fees, but they have blocked out a section of my week so I can practise and prepare for class to ensure I get the most out of the course.

So that’s what I do, and I take great pleasure from it. There is great satisfaction in a randomised controlled trial going out the door, seeing students interact with your intervention, and even, sometimes, getting a positive result. But helping staff who truly believe in their programmes to articulate what they are doing and why, to identify measurable goals so that they can see their progress, and to use that information to promote their efforts is just as satisfying, if not more so. The BI design work of the department may involve lots of interesting trials, such as asking students to fill out postcards and read their emails, but our department’s interventions have evolved through iterations of delivery and evaluation, and their design reflects what has been selected through the prism of what works. It is therefore crucial to be as innovative with evaluation as we are with intervention design. We’re achieving great things through robust evaluation, and I’m always happy to talk to anyone in the sector who wants advice about implementing something similar in their team.

_______________________________________________________________________

Click here to join our mailing list.
Follow us on Twitter: @KCLWhatWorks
