Bumps in the KCLxBIT road

By Maija Koponen, King’s College London

The learning we’ve gained from the KCLxBIT project hasn’t just come from our successful randomised controlled trials (RCTs). In fact, many important insights have emerged as a by-product of things that haven’t gone as smoothly as we would have hoped. KCLxBIT is the first time behavioural insights have been applied and tested through RCTs in a UK higher education context. The unique nature of the project has challenged us, and there have been a number of issues we could not foresee or prevent. We wanted to share our learning from some of these ‘bumps in the road’, as they may be useful to others embarking on a similar journey.

Data readiness

An important learning point from this project has been our need to reflect on our data readiness across King’s. In planning our trial targets, we had to carefully consider what outcome data was available to us, which in turn informed the trial designs selected.

Where data was available, it had naturally not been collected for the purpose of running an RCT, and sometimes the format and quality of the data had to be negotiated before our trials could run. A frequent issue was collecting unique identifiers (student numbers) rather than name-based data, which we needed in order to analyse the effectiveness of our nudging. Reflection on how we can improve our data collection practices continues, and our colleagues have a number of options under consideration.

Student contact data

Last September we faced the challenge of adapting the existing format of student contact data to the purposes of our text message trials. This was the first time we were texting students, and we soon discovered that there was room for improvement in the quality of the mobile phone numbers we had on record. Our colleagues have since introduced new prompts for students to update their contact information throughout the year, which had an immediate impact on the number of students we could nudge in later trials.

‘Unknown number’

During the first year of the project we learnt that calls from a campus landline to a mobile phone display as coming from an ‘unknown number’. This made students less likely to pick up the call and may have been a particular deterrent for female students, who had a lower pick-up rate. For the second year of the King’s Community Ambassador phone calls we invested in a new call platform which allows a phone number to display instead, in the hope that this would improve pick-up rates.

Necessary flexibility

We learnt early on that running RCTs requires a great deal of flexibility and resilience. Our colleagues from BIT reassured us this is the case regardless of setting, and were quick to develop workarounds when things did not go to plan.

Running an RCT in a university setting requires extensive conversations with a range of colleagues, in order to build up knowledge of the data available as well as the processes in which this can be collected and shared. The collaboration of colleagues from across King’s has been a vital element in the success of our project. In some cases even small tweaks to how we do things may have a positive impact on the student experience. It’s these insights which have made KCLxBIT valuable to us, even when trials have not produced direct results.

Email lifecycle@kcl.ac.uk to join our mailing list.
Follow us on Twitter: @KCLxBIT

Collaborating on KCLxBIT

By Anne-Marie Canning, King’s College London

The KCLxBIT project has been a two-year collaboration between the Widening Participation Department, the Policy Institute at King’s, and the Behavioural Insights Team. Partnership working is a core feature of how we make change in widening participation, so it makes sense that partnership could play a full and innovative role in improving the evidence base.

The Widening Participation practitioners brought in-depth understanding of the target audience, the university eco-system and the needs of non-traditional learners.

The Behavioural Insights Team brought their confidence and expertise with data and RCTs, alongside field experience and superlative knowledge of behavioural economics.

The Policy Institute brought political nous, connections and academic oversight.

Together our contributions were greater than the sum of their parts.

The project was administered through a weekly Skype meeting with a clear set agenda and space given to debate and iteration of ideas. We invested in an early doors away day to spend some quality time thinking over major issues in the project. Documentation was consolidated on a shared drive and a ‘query log’ (copyright Susannah Hume 2016) made sure that blockers were resolved and accountability was in place. An academic steering group oversaw our activity and included Professor Jennifer Rubin, Professor Anna Mountford-Zimdars and key members of staff from King’s College London.

A valuable spin-out from the project has been our ongoing collaboration with the Harvard University Behavioural Insights Group and their immersive field project. Last year a team of PhD and MBA candidates worked with us to help develop solutions for parental engagement in higher education access, the result of which is going to trial with a large number of schools in the coming year.

We also brought in students and recent alumni to assist with the project – using their skills, expertise and communication strengths we were able to make the project much more vibrant and relevant. Special thanks go out to the generous Ryan Wain and Maia Rowe-Sampson who acted as our near-peer What I Wish I’d Known Presidents and were the face of our bursary club.

Ultimately, what makes partnerships successful is a shared self-interest. The motivating self-interest here was the desire to help improve outcomes for widening participation learners. This was shared by everyone working on the project from the Chief Scientist of the Behavioural Insights Team to the wonderful interns who packed the letters we sent out. Our varied perspectives and diverse backgrounds made the project better and more ambitious.

E-mail lifecycle@kcl.ac.uk to join our mailing list.

Increasing attendance at the Welcome Fair

By Lucy Makinson, Behavioural Insights Team

I. The benefits of participation in student societies

Participating in extracurricular activities is an important part of being at university. It is an opportunity to meet other students, build social networks, and develop soft skills[1] which are often valued by employers and can improve labour market outcomes after graduation.[2][3] Participation can also improve outcomes within university by improving academic attitudes.[4][5]

In addition, being a part of a university club or society may enhance students’ sense of belonging within university which in turn has positive impacts for academic engagement and retention.

There is some evidence from the US that the benefits of campus participation are particularly significant for students from ethnic minority and low-income backgrounds.[6] However, a recent study from the University of Edinburgh found students from Widening Participation backgrounds are half as likely to hold positions in the university’s clubs and societies.[7] There may be several factors driving this – one is that students from low-income backgrounds are less likely to participate in extra-curriculars at school,[8] where they may have also had less extra-curricular provision,[9] with school participation being a good predictor of extra-curricular participation at university.[10] Another is that they may underestimate the benefits of participation, particularly in relation to employment outcomes.

II. The trial

The KCL Welcome Fair runs for three days at the start of the academic year and is an opportunity for students to find out about the clubs and societies available at King’s and sign up to join them. We therefore wanted to encourage first year students to attend the Welcome Fair, with the final aim of increasing sign-ups to clubs and societies, using a series of messages in the run-up to, and during, the event. Although we considered participation a positive outcome for all students, we were particularly interested in the attendance and sign-up rates of students from WP backgrounds.

Designing the messages

From the existing research on student society participation, as well as the wider behavioural literature, we hypothesised three key reasons for why a student may not attend the welcome fair.

1. Lack of belonging
Students who feel uncertain about whether they belong at university may be more likely to rest on their academic success to justify their being there, and see extra-curriculars as a distraction. They may also be worried about attending the fair alone, possibly through a perception that other students already know each other.

2. Lack of information
Students may not perceive either the social or career benefits of being part of a society. They may also be unaware of the range of societies available, and not know where to find societies which match their interests.

3. Lack of planning or priority
Students may not attend the welcome fair due to competing priorities, not feeling like going on the day, or failing to plan appropriately (for example, by not making a clear plan of when to go and making a note of it).

Based on these hypotheses we designed two different sets of messages. One focused on addressing barriers to belonging – reassuring students that it was normal to not know anyone when arriving at university and to feel overwhelmed, but presenting societies as an opportunity to meet other students to support them through the transition. The second focused on addressing informational barriers, in particular emphasising the positive impacts of society participation on employability as well as emphasising their social benefits. Both sets of messages also drew on the extensive literature surrounding planning prompts, and encouraged students to make a specific plan with a link to the Citymapper app to help them plan their route.


Figure 1: Example messages relating to belonging and employability

Trial design

The Welcome Fair ran from Friday 23rd to Sunday 25th September 2016. Ahead of this we randomly allocated[11] all first year students[12] to one of three groups.

The first (control) group did not receive any text messages about the Welcome Fair, although they would have received information about it through the existing communications with first year students.

The second (belonging) and third (employability) groups each received three text messages: the first message was sent the Monday before the Welcome Fair, 19th September; the second on Friday 23rd, the morning of the first day; and the final message was sent on Saturday 24th, the morning of the second day.

The messages for the ‘belonging’ group all focused on reducing perceived barriers associated with belonging, whilst those in the ‘employability’ group received messages emphasising the employment benefits of societies and addressing other informational failures.
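The allocation described here (random assignment to three arms, stratified by gender and a binary ACORN indicator, as noted in footnote 11) can be sketched as follows. This is an illustrative sketch, not the team’s actual code; the field names `gender`, `acorn_wp` and `id` are assumptions.

```python
import random

def stratified_allocate(students, arms=("control", "belonging", "employability"), seed=42):
    """Randomly allocate students to trial arms within strata.

    Stratifying by gender and a binary ACORN indicator keeps the three
    groups balanced on those characteristics.
    """
    rng = random.Random(seed)
    # Group students into strata keyed on (gender, acorn_wp)
    strata = {}
    for s in students:
        strata.setdefault((s["gender"], s["acorn_wp"]), []).append(s)
    allocation = {}
    for members in strata.values():
        rng.shuffle(members)
        # Deal shuffled members round-robin into the arms within each stratum,
        # so arm sizes differ by at most one per stratum
        for i, s in enumerate(members):
            allocation[s["id"]] = arms[i % len(arms)]
    return allocation
```

Randomising within strata rather than across the whole cohort is what guarantees the even representation mentioned in the footnote.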

The primary outcome measure for this trial was whether a student attended the Welcome Fair. Students entering the Welcome Fair swiped their KCL ID card in order to enter,[13] and this data was recorded by the King’s College London Student Union (KCLSU) and subsequently shared with BIT for analysis.

The secondary outcome measure was whether a student had signed up for at least one society, either at the Welcome Fair or later in the year (data was collected in July 2017). Again, this data was provided by KCLSU who record all sign-ups to student societies throughout the year.

The results

We found that the messages did make students significantly more likely to attend the Welcome Fair, although the most effective message appears to depend on student characteristics.

Looking across all students, the messages centred on belonging had the greatest impact on attendance, resulting in a 6 per cent increase. Employability messages, in contrast, increased attendance by just over 5 per cent, and the increase was not significant at conventional levels.

However, this trend was not true across groups. We conducted separate analyses for students in ACORN groups 1-3, which we used to define non-WP students; students in ACORN groups 4 and 5, which we used to define WP students; and students without an ACORN grouping within KCL’s record (many of these will be international students).

Amongst non-WP students, the employability messages were the most effective and increased attendance by 9 per cent, whilst the belonging messages only increased attendance by 3 per cent which was not significant at conventional levels.

Amongst WP students, the belonging message appears to be more effective, indicating an increase in attendance of nearly 5 per cent, whilst WP students in the employability group in fact had lower attendance than those in the control group (who had not received any messages at all). It is worth noting that these changes are not significant at conventional levels, though this will be partly due to the smaller sample of WP students.


Figure 2: Differences in Welcome Fair attendance by message group and ACORN

Looking at sign-up rates to societies, we find that the employability messages increased sign-ups to societies but the belonging messages did not. This is true both at an aggregate level[14] and for non-WP students specifically. This is interesting given that attendance was increased to a greater extent by the belonging messages. Particularly notable is that for WP students, although no sign-up effects were significant, the increase in sign-ups relative to the control group was almost seven times larger in the employability group than in the belonging group, despite the belonging messages appearing to be more effective for attendance.

Figure 3: Sign-up rates for student societies by message group

III. What we have learned

This trial has several important lessons for approaches to student engagement. Firstly, it demonstrates that text messages to students can be an effective mechanism for increasing engagement, both for one-off events (such as the Welcome Fair) and for harder to shift outcomes such as participation in clubs and societies.

However, it also demonstrates the importance of good design and rigorous testing when creating engagement initiatives. Not only did responses to the messages differ by group (meaning the results of this trial can be used to improve the targeting of messages in future), but the fact that the employability messages may have decreased attendance for WP students is evidence that texts in and of themselves do not always increase a prompted behaviour.

There are also more speculative conclusions which can be drawn from this work, which would benefit from further research. One is that messages addressing student belonging may be particularly effective for students from WP backgrounds – a hypothesis that we will test in some of our later trials to be shared through this blog. Another is that there may be deeper barriers to student participation in societies which require a wider structural change, but that student motivations for attending the Welcome Fair may make them more or less likely to overcome these barriers. For example, it has been suggested that societies themselves may be intimidating to some students, particularly those from WP backgrounds. It is possible that whilst belonging messages encourage Welcome Fair attendance, those looking to societies for reassurance are intimidated on arrival and subsequently do not sign up, whereas those who are looking to societies for long-term benefits such as employability are less affected by societies which may appear intimidating.

This is just one possible explanation for why the messages appear to affect attendance and society sign-up differently; there are many more possibilities to be considered, which may have different implications for deeper interventions.

_________________________________________________________________

Footnotes

[1] Clark, G., Marsden, R., Whyatt, J.D., Thompson, L. and Walker, M., 2015. ‘It’s everything else you do…’: Alumni views on extracurricular activities and employability. Active Learning in Higher Education, 16(2), pp.133-147.

[2] Andrews, J., & Higson, H. (2008). Graduate employability, ‘soft skills’ versus ‘hard’ business knowledge: A European study. Higher Education in Europe, 33(4), 411-422.

[3] Stuart, M., Lido, C., Morgan, J., Solomon, L. and May, S., 2011. The impact of engagement with extracurricular activities on the student experience and graduate outcomes for widening participation populations. Active Learning in Higher Education, 12(3), pp.203-215.

[4] Fredricks J.A., and Eccles J.S. (2006). Is extracurricular participation associated with beneficial outcomes? Concurrent and longitudinal relations. Developmental Psychology, 42(4): 698–713.

[5] Darling, N. (2005). Participation in extracurricular activities and adolescent adjustment: cross-sectional and longitudinal findings. Journal of Youth and Adolescence, 34(5): 493– 505.

[6] Kuh, G.D., Cruce, T.M., Shoup, R., Kinzie, J. and Gonyea, R.M., 2008. Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education, 79(5), pp.540-563.

[7] Sandford-Ward, M. (2016). EUSA study shows need for widening participation in University societies. [online] The Student Newspaper. Available at: http://www.studentnewspaper.org/eusa-study-shows-need-for-widening-participation-in-university-societies/ [Accessed 16 Aug. 2016].

[8] White, A.M. and Gager, C.T., 2007. Idle hands and empty pockets? Youth involvement in extracurricular activities, social capital, and economic status. Youth & Society.

[9] Demos. (2015). Learning by Doing. [online] Available at: http://www.demos.co.uk/project/learning-by-doing/ [Accessed 16 Aug. 2016].

[10] Berk, L. E., & Goebel, B. L. (1987). Patterns of extracurricular participation from high school to college. American Journal of Education, 468-485.

[11] Randomisation was stratified by gender and ACORN status (coded as a binary: ACORN 1-3 vs ACORN 4-5) to ensure even representation across the three groups.

[12] At the start of the year all students were sent an introductory message, in which they were told that they would receive messages from Kings Tips and given the opportunity to opt out. Any students who opted out were excluded from all further text trials.

[13] There were two cases in which a card was not swiped and attendance was not recorded. Signs informed students that their attendance data would be collected if they swiped their card, but that they could show their ID to a security guard if they did not want this data to be collected. In addition, students who forgot their card, or hadn’t yet collected it, could enter the Welcome Fair but their attendance was not recorded.

[14] At an aggregate level, the increase in sign-ups due to employability messages was significant at the 10% level, but not at the 5% level.

The What I Wish I’d Known Diaries and Other Programme Content

By Maija Koponen, King’s College London

The What I Wish I’d Known programme is one of our two randomised controlled trials focused on enhancing students’ sense of social belonging. The aims of the trial were detailed in last week’s blog post.

What I Wish I’d Known ran from December 2016 to June 2017 and involved 470 first year King’s Living Bursary recipient students. The intervention focused on making students aware of a range of support services available to them at King’s, and messages throughout the programme laid a heavy emphasis on the fact that it is entirely normal to need help and support during your time at university.

Programme presidents

The majority of the programme messaging was sent in the name of our two What I Wish I’d Known presidents. One of our presidents was a King’s alumnus, and the other a current King’s PhD student, who had also completed her undergraduate degree at King’s. These near-peer presidents were invited to represent the programme due to their own WP student backgrounds, and their willingness to share their stories, experience and advice, which was important in bringing the programme to life. We used their images in communications, as behavioural insights research suggests that people’s engagement with a message is increased if the text is accompanied by an image of a person’s face.[1]

Special social events

The programme was kicked off in December by an email from the presidents. The email introduced the programme and invited students to a free dinner at Pizza Express. The pizza event aimed to provide students – and in particular those who live at home – a means to build their networks and bolster their feeling of belonging and social support. A further aim was to provide students with a positive student experience right before the Christmas break, which is a tricky milestone in the student journey, and a difficult time in terms of student retention. Various behavioural insights were built into the event, including activities that encouraged students to exchange contact details.


Figure 1: WIWIK diary cover

What I Wish I’d Known diaries

Right before the Christmas break, all students on the programme received a What I Wish I’d Known programme diary in the mail. In the first year of the KCLxBIT project, we found out that students often do not use any type of diary to plan their studies and other term time activities. So our first aim was simply to provide them with a diary. The inside covers of the diaries included a collection of tips from 2nd and 3rd year students, so the diary also worked as an additional way to encourage students to engage with the opportunities available to them as King’s students.


Figure 2: What I Wish I’d Known diary inner leaf

Furthermore, research from the USA has shown that sending students a gift from the university – branded or otherwise – helps emphasise students’ connection to the organisation and increases their sense of belonging.[2] This in turn has a direct positive effect on the student’s institutional commitment and significant indirect effects on intentions to persist and actual persistence.

Behaviourally inspired text messages and e-mails

In the spring term, students received fortnightly communications from What I Wish I’d Known in the form of text messages and e-mails. Communications centred on a different theme each month, and included information on available support, unique opportunities, and video testimonials from fellow students. Topics included promoting study skills support, information about financial services, and encouraging take-up of summer study abroad opportunities. All messages were sent in the name of one of the programme presidents, and were formulated based on the principles of BIT’s EAST framework.


Figure 3: Example text messages from the programme

Programme feedback and outcomes

Preliminary qualitative feedback has indicated that students found the programme content useful, and the diaries and invitations to the social events in particular were well received. An interesting point of feedback from a number of students was that even if they did not attend the socials, being invited in the first place was appreciated. Further analysis of the feedback is underway. Results from the full quantitative analysis of the What I Wish I’d Known programme randomised controlled trial will be available in the coming months, once students have completed their re-enrolment for the 2017-18 academic year.

Follow @KCLxBIT on Twitter for our updates, or contact lifecycle@kcl.ac.uk to receive our latest blog posts via e-mail.

____________________________________________________________________

References

[1] Behavioural Insights Team (2013). Applying behavioural insights to charitable giving. Cabinet Office.

[2] Hausmann, L. R., Ye, F., Schofield, J. W., & Woods, R. L. (2009). Sense of belonging and persistence in White and African American first-year students. Research in Higher Education,50(7), 649-669.

Introducing the What I Wish I’d Known Programme

By Maija Koponen, King’s College London

A central focus throughout our work on KCLxBIT has been to find ways to apply behavioural insights to enhance the support we provide for widening participation (WP) learners. Throughout planning our interventions we were acutely aware that the kind of disadvantage experienced by WP students is multi-layered and complex, and unlikely to be fixed by simple text message interventions. This led us to develop the What I Wish I’d Known programme.


Figure 1: Programme Logo

What I Wish I’d Known is a wrap-around programme of support for first year King’s students who are recipients of the King’s Living Bursary (KLB). The KLB is awarded to students based on an assessment of their household income levels and in this trial recipient status was our indicator of WP status. The programme is designed to enhance the sense of belonging experienced by WP students at King’s and equip them with resources to help them make the most out of their student experience, and succeed academically. During the programme 2nd and 3rd year students share their tips about what they wish they had known in first year with the programme participants.

Students on the programme are also pointed towards extracurricular events or alternative study options and offered opportunities to build their networks, with the aim of supplementing the informal advice they may be missing. The aim of the trial is to test whether such a cohesive offer of support leads to improved attendance, attainment, and general engagement with university.

Rationale for the programme

In the student journey mapping workshops we carried out as part of the first year of the KCLxBIT project, we found evidence that first year widening participation students are less likely to attend their January exams, less likely to apply for some forms of study abroad, and perform, on average, less well in their first sets of exams. The ‘What I Wish I’d Known’ programme was designed based on insights from the literature relating to attainment and attendance for widening participation students and the type of support that may benefit them.

Research from the USA has suggested that the lower sense of belonging at university felt by students from disadvantaged backgrounds may be contributing to their higher rates of dropout and lower attainment.[1] [2] Based on this evidence, social belonging is at the heart of the ‘What I Wish I’d Known’ intervention.

The programme also seeks to address potential differences in the social and cultural capital of students from different backgrounds, which are widely acknowledged to play a role in shaping the experiences of students at university and to account for some of the differences between low SES students and their wealthier counterparts.

Cultural capital refers to the general cultural background, knowledge, experiences, disposition, and skills that students use to navigate an educational setting.[3] Research has shown that students from higher SES backgrounds receive more preparation for university life and what to expect than WP students. Many are given advice by parents or family members who have often been to university themselves and are more likely to have been prepared by their schools.[4] In contrast, students from disadvantaged backgrounds are less likely to have parents that attended university, and on arrival will need to devise their own strategies of engagement[5] which may contribute to the reduced participation with university activity.

Social capital, meanwhile, is defined as social connections and networks that an individual can draw on to aid progression.[6] While children of middle- or upper-class families have a variety of networks available to them, low income individuals and racial minorities often lack the networks to provide the most up-to-date and accurate information about educational opportunities.[7] Given that these groups are also more likely to take up paid employment or remain living at home, it is unsurprising that – in addition to lacking the initial networks – they are also less likely to participate in non-academic activities and spend fewer evenings per week socialising during their time in university.[8]

Students are expected to benefit from the experience of the programme membership itself, as well as the targeted package of support that it entails. We hope that involving students in the What I Wish I’d Known programme will help to level the playing field for WP students.

Analysing the impact

Over the past year, What I Wish I’d Known has been run as a randomised controlled trial, involving half of the total first year KLB recipient population. In order to ensure a robust trial, students who were not in the trial group were not informed about the programme.

Analysis of the trial outcomes will not distinguish between the impacts of the different programme components, but will look at their overall impact on a range of outcomes for first year WP students. These include data on attainment, re-enrolment and indicators of social belonging at King’s. Full analysis of the impact will be possible after the enrolment period for the 2017-18 academic year. We will be publishing results in this blog when they are available.

Remember you can follow us @KCLxBIT for blog update reminders, or join our mailing list by emailing lifecycle@kcl.ac.uk.

____________________________________________________________________

References

[1] Walton, G. M., & Cohen, G. L. (2011). A brief social-belonging intervention improves academic and health outcomes of minority students. Science, 331(6023), 1447-1451.

[2] Vincent Tinto, Leaving College: Rethinking the Causes and Cures of Student Attrition (University of Chicago Press, 1993).

[3] Lamont, M. & Lareau, A. “Cultural capital: allusions, gaps and glissandos in recent theoretical developments,” Sociological Theory, vol. 6, pp. 153–168, 1988.

[4] Forsyth, A and Furlong, A (2003) Losing out? Socioeconomic disadvantage and experience in further and higher education Policy Press/JRF, Bristol

[5] Forsyth, A and Furlong, A (2003) Losing out? Socioeconomic disadvantage and experience in further and higher education Policy Press/JRF, Bristol

[6] Stuart, Mary et al (2009). The Impact of Social Identity and Cultural Capital on Different Ethnic Student Groups at University

[7] Simon, J. & Ainsworth, J. W. (2012). Race and Socioeconomic Status Differences in Study Abroad Participation: The Role of Habitus, Social Networks, and Cultural Capital

[8] Forsyth, A and Furlong, A (2003) Losing out? Socioeconomic disadvantage and experience in further and higher education Policy Press/JRF, Bristol

 

The KCLxBIT Panel Study: our approach to recruitment and retention

By Lucy Makinson, Behavioural Insights Team

Several properties of a survey panel are necessary to ensure that the data drawn from it can be meaningfully interpreted.

1. Sample size
The panel must be sufficiently large to draw reliable conclusions about the broader population. The larger the sample size, the smaller the margin of error.[1] It’s also important to remember that if you want to look at the responses of specific subgroups (in this case, students from a widening participation background), the sample size of those subgroups is also important.

2. Representativeness
Representativeness is how well the people on a survey panel reflect the overall population. A panel is unrepresentative if some groups are more or less prevalent in the panel than in the overall population. This is a problem if the views of the groups which are over- or under-represented are different from the population as a whole. Representativeness can be tested across observable, but not unobservable, characteristics.[2]

3. Retention
In the case of panel surveys, retention is of key importance. High attrition impacts survey sample size but, often more importantly, representativeness. This is because those leaving the panel are likely to be different from those staying on it – for example, they may be less attached to the university – and therefore the views of those who are less attached to the university will become underrepresented in later waves of the survey.

As well as developing recruitment strategies which deliver on all three criteria, it is important to recognise the potential trade-offs between them when selecting strategies. For example, focusing on a particular type of incentive to increase sample size can affect representativeness, and the scale of these trade-offs is often not apparent at the outset.

Our recruitment and retention strategy

Roughly 4,500 first year students enrol at King’s each year. We set out to recruit a minimum of 700 students to our panel, which would give an initial margin of error of 3.4%,[3] and still imply just a 5% margin of error if we had an attrition rate of 50%. We sought a representative panel in all respects other than WP status.[4] We wanted WP students to make up roughly 30% of the sample so that we could make meaningful comparisons between WP and non-WP responses.
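As a rough check on these figures, the margin of error for a proportion can be computed with the usual normal approximation plus a finite population correction. The sketch below is illustrative only (it assumes a worst-case proportion of 0.5 and a 95% confidence level; the function name is ours, not the project's):

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Margin of error for a sample proportion at 95% confidence,
    with a finite population correction for small populations."""
    se = math.sqrt(p * (1 - p) / n)                       # standard error
    fpc = math.sqrt((population - n) / (population - 1))  # correction
    return z * se * fpc

# 700 panellists drawn from ~4,500 first years -> roughly 3.4%
print(round(margin_of_error(700, 4500) * 100, 1))
# 50% attrition leaves 350 respondents -> roughly 5%
print(round(margin_of_error(350, 4500) * 100, 1))
```

Without the finite population correction the estimates would be slightly more conservative; with it, the numbers reproduce the 3.4% and 5% quoted above.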

Here we detail some of the core elements of our recruitment and retention strategy. However, it is important to note that the selection of recruitment strategies and the exact form of their implementation is context-dependent.

1. We recruited in waves


Figure 1: Initial recruitment procedure

Recruiting in waves allowed us to target specific groups with lower response rates to ensure a balanced sample.

On 30th September 2016, 1,000 potential participants were selected using stratified random sampling, stratified across gender, ethnicity and department, with widening participation students deliberately oversampled to make up 300 of the 1,000 recipients.

These 1,000 were sent a text offering them a chance to participate in the first survey on 4th October. The message also informed them that they would receive an email with the link should they prefer to complete the survey online, with emails being sent that evening.

A few days later the composition of the panel was reviewed, and another round of invitations was sent if the panel had not reached 700 members. Invitations in this round oversampled demographics with lower sign-up rates in the first round, to achieve a balanced final panel. There were to be up to three recruitment rounds. The procedure is detailed in full in Figure 1.
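The sampling step above can be sketched as stratified sampling with a fixed widening-participation (WP) oversample. This is a simplified illustration; the field names and the proportional-allocation rule are our assumptions, not the project's actual procedure:

```python
import random

def stratified_sample(students, n_total, n_wp, seed=0):
    """Invite n_total students: a fixed oversample of n_wp WP students,
    plus the remainder drawn proportionally from strata defined by
    gender x ethnicity x department."""
    rng = random.Random(seed)
    wp = [s for s in students if s["wp"]]
    non_wp = [s for s in students if not s["wp"]]
    invited = rng.sample(wp, n_wp)
    # group the remaining students into strata
    strata = {}
    for s in non_wp:
        key = (s["gender"], s["ethnicity"], s["department"])
        strata.setdefault(key, []).append(s)
    # allocate the rest of the invitations proportionally to stratum size
    n_rest = n_total - n_wp
    for members in strata.values():
        k = round(n_rest * len(members) / len(non_wp))
        invited += rng.sample(members, min(k, len(members)))
    return invited
```

Rounding can make the final count drift by a student or two when there are many strata; a production implementation would reconcile the totals.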

2. We asked for upfront commitment

Individuals like to be consistent and explicit commitments can have a strong influence on future behaviour,[5] including in the context of mail-response surveys.[6] To support retention through the panel waves we told all potential panel members that there would be six surveys through the year, stated the time commitment required, and emphasised the importance of responses to every survey.

This information was presented at the start of the first survey, and respondents had to make an explicit commitment to answering all six waves in order to continue to the survey. Only on completion of this first survey would they be counted as a member of the panel.

In asking for an upfront commitment we prioritised retention through the survey above the potential negative impacts on initial sign-ups.

3. We emphasised the relevance of the study

Individuals are more likely to respond to surveys when the topic is of interest to them;[7] highlighting the personal relevance and significance of a survey is therefore a useful tool for increasing response rates. However, it could also make the panel less representative if the messages focus only on areas of interest to particular subgroups of students.

In designing messages we focused on a few ways in which the survey might be of interest to students: the focus on understanding their personal experience, the impact the survey would have on the shape of student support at King’s, and the survey’s position as a significant piece of research.

By using a range of possible points of interest, but keeping each one broad, we aimed to maximise the recruitment effects whilst minimising the associated bias.

4. We used incentives

Incentives significantly increase participation in surveys.[8] That might not be ground-breaking news, but there is a lot to the design of good incentives. For example, whilst lottery incentives are effective in many contexts, there is inconclusive evidence in the context of online panel studies[9],[10] and recent work suggests guaranteed incentives are more effective.[11]

The type of incentive is also important. Incentives can bias the panel composition if they offer something which is of greater value to certain groups. In addition, incentives must not significantly change the students’ university experience – for this reason we did not use incentives which were connected to King’s (such as food or drink tokens for student cafes).

For this survey, participants received a voucher for Marks & Spencer following the completion of each wave. This was framed as a ‘thank you’ rather than a payment, to reduce potential crowding-out of intrinsic motivations for survey completion.

5. We sent (lots of) reminders, and personalised them

Students were sent up to four reminders for each wave of the survey, using both SMS messages and email. Reminders drew on a wide range of behavioural approaches, including those used in the initial recruitment. For example, they often reminded panel members of the commitment they had made at the start of the survey, and reiterated the potential impact of the survey.

In later waves the reminders were adapted based on the response pattern of the panel member. Those who had missed a recent wave could receive a message inviting them to return, whilst persistent responders were thanked for their commitment.
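The adaptive reminders amount to a simple rule over each panellist's response history. A minimal sketch, with template names that are ours rather than the project's:

```python
def reminder_template(history):
    """Choose a reminder variant from a panellist's response history,
    a list of booleans (one per wave so far; True = responded)."""
    if history and all(history):
        return "thanks"      # persistent responders: thank them for their commitment
    if history and not history[-1]:
        return "come_back"   # missed the most recent wave: invite them to return
    return "standard"        # default: restate their commitment and the survey's impact
```

In practice the real messages also varied by channel (SMS vs email) and reminder number, but the branching logic is this simple.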

Our final sample

We recruited 769 first year students to be part of the panel. The panel was representative of the student population on all observable characteristics other than gender, with females slightly more prevalent in the panel than in the first year student population at King’s. However, this imbalance was minimised by our wave-based recruitment, which heavily targeted male students in the later rounds.

Neither the slight underrepresentation of the Social Sciences and Public Policy faculty, nor the slight overrepresentation of Life Sciences and Medicine students was statistically significant. However, the final analysis of the panel will weight the data to ensure fair representation in responses.
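Weighting of this kind is typically done by post-stratification: each group's responses are weighted by the ratio of its population share to its panel share. A minimal sketch (the group labels and counts below are hypothetical, not the actual panel figures):

```python
def poststratification_weights(panel_counts, population_counts):
    """weight_g = (population share of group g) / (panel share of group g).
    Applying these weights makes the panel's weighted group shares
    match the population's."""
    n_panel = sum(panel_counts.values())
    n_pop = sum(population_counts.values())
    return {
        g: (population_counts[g] / n_pop) / (panel_counts[g] / n_panel)
        for g in panel_counts
    }

# e.g. an overrepresented group receives a weight below 1
w = poststratification_weights({"F": 500, "M": 269}, {"F": 2400, "M": 2100})
```

The weighted responses then sum back to the panel size while reproducing the population's group shares.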


Figure 2: Panel representativeness


Figure 3: Panel retention across the waves

We also achieved a reasonably high retention rate throughout our panel, with over 60% of panel members responding to our final survey. This was particularly notable as the final wave took place a considerable period of time after the conclusion of the academic year, when it is often harder to engage students.

We will be making more details of the survey content available in October, but this will only be available through our mailing list.

If you would like to be added to our mailing list contact lifecycle@kcl.ac.uk.

Remember you can also follow us on Twitter @KCLxBIT to receive alerts of new blog posts.


Footnotes

[1] The margin of error specifies how close to the views of the population you can expect the survey responses to be. A 3% margin of error says that responses to the survey will be within 3% of the responses you would get by asking everyone in the population. This will be true 95% of the time, if the confidence level is 95%.

[2] Observable characteristics are those we have information on, such as gender or subject studied. There are lots of possible unobserved variables which may affect responses but for which we don’t have data, for example how many other students the respondent knew when they arrived at KCL.

[3] Using a 95% confidence interval.

[4] We defined WP students as those with an ACORN consumer classification of 4 or 5.

[5] Lokhorst, A. M., Werner, C., Staats, H., van Dijk, E., & Gale, J. L. (2011). Commitment and behavior change: A meta-analysis and critical review of commitment-making strategies in environmental research. Environment and Behavior, 0013916511411477.

[6] Hinrichs, J. R. (1975). Effects of sampling, follow-up letters, and commitment to participation on mail attitude survey response. Journal of Applied Psychology, 60(2), 249.

[7] Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68(1), 2-31.

[8] Booker, C. L., Harding, S., & Benzeval, M. (2011). A systematic review of the effect of retention methods in population-based cohort studies. BMC Public Health, 11(1), 249.

[9] Porter, S. R., & Whitcomb, M. E. (2003). The impact of lottery incentives on student survey response rates. Research in Higher Education, 44(4), 389–407.

[10] Heerwegh, D. (2006). An investigation of the effects of lotteries on web survey response rates. Field Methods, 18(2), 205–220.

[11] Stevenson, J., Dykema, J., Cyffka, K., Klein, L. and Goldrick-Rab, S. (2012). What are the Odds? Lotteries versus Cash Incentives. Response Rates, Cost and Data Quality for a Web Survey of Low-Income Former and Current College Students. Presented at The American Association for Public Opinion Research (AAPOR) 67th Annual Conference, May 18th 2012.

Introducing the KCLxBIT Panel Survey

By Lucy Makinson, Behavioural Insights Team

Why did we run a panel survey?
Our understanding of the student experience is incomplete. At present the sector relies heavily on annual national-level student surveys, such as the National Student Survey (NSS) and the Student Experience Survey (SES). These are primarily designed to compare universities, and focus on the availability and quality of university provision (such as whether the teaching provided is of a high standard) more than on how students experience that provision and feel about university generally.1

To supplement these we can turn to a handful of one-off surveys, such as the Higher Education Academy (HEA) research into the first-year experience of university. However, as our recall of events can vary from the way we experienced them (see, for example, the Peak-End rule), we can take this research even further by asking people how they are experiencing something as the event takes place.

This is particularly important when considering wellbeing. Questions around overall wellbeing portray a very different picture from the sum of short-term happiness measures taken at multiple points in time, and even accurate assessments of wellbeing for the whole academic year will not capture the same information as short-term assessments.

For these reasons, tracking the experiences of a fixed cohort of students at regular points throughout the academic year will greatly change the nature, level of detail, and accuracy of the information we are able to collect.

Objectives of the survey
The KCLxBIT panel survey has two key objectives:

  1. To improve our understanding of how student engagement and wellbeing fluctuate over the course of the year through the use of regular measures of wellbeing and student activities;
  2. To identify differences (and similarities) between the university experiences of different groups of students, particularly Widening Participation and ‘traditional’ students. We focused on factors which have been shown to relate to student attrition or attainment.

Content of the panel study
The panel study consisted of six waves. Questions varied across the waves but focused primarily on three key areas:

  • Wellbeing (current emotions),
  • What the student was doing (e.g. how long they were spending in the library), and
  • A range of measures found to predict student retention and attainment.

A selection of other questions were also included in specific waves. Some have been drawn from other surveys, such as the NSS, and provide useful comparators. Others have been exploratory questions, often with implications for the design of future support and of particular interest to KCLWP.

Panel survey overview

Figure 1: An overview of the timing and content of the panel survey waves.

 

The following presents some of the key sources for questions used through the panel.

  • Wellbeing
    In every wave we asked 21 questions to track student wellbeing. These were drawn from a range of sources, representing different aspects of wellbeing, including:

    • The Scale of Positive and Negative Experience (SPANE), which measures the balance of positive and negative emotions experienced in the preceding two weeks. It focuses on the frequency of emotions rather than their intensity, which is found to correlate better with life satisfaction.
    • The HEPI/HEA Student Academic Experience Survey (SAES), which is more short-term focused than the SPANE and allows us to compare responses to those gathered in the national SAES.
    • The Perceived Stress Scale (PSS), which is the most widely used instrument for measuring the psychological perception of stress. This was of general interest from a student wellbeing perspective, and also because stress has been found to predict student persistence.
  • Student activities

In order to understand how students were engaging with the university and their peers, and the role of any external obligations, we adapted eight questions from the US-based National Survey of Student Engagement (NSSE) relating to participation in certain activities, such as co-curricular activities and working on coursework with other students. In the first wave we asked students how often they participate in the eight activities, and in subsequent waves students were asked how regularly they were participating. An additional question relating to the use of the King’s online learning platform, KEATS, was introduced from the second wave.

  • Factors relating to student retention

The literature on student retention focuses on two models to explain why students persist at or drop out of university: Tinto’s theory of student departure (1975, 1987), otherwise known as the Student Integration Model, and Bean’s model of student attrition (1980, 1983), the Student Attrition Model. We drew on two papers which used surveys to evaluate these models, Pascarella & Terenzini (1980) and Cabrera et al. (1992), and adapted specific subscales found to predict student retention, including: peer-group interactions, external support, interactions with faculty, and academic and intellectual development.

The results
Over the next few months we will be releasing select results from the panel survey on this blog, along with more detail on our experience of running the survey.
We will also be making more details of the survey content available in October, but this will only be available through our mailing list.
If you would like to be added to our mailing list, contact lifecycle@kcl.ac.uk.
Remember to follow us on Twitter @KCLxBIT to receive alerts of new blog posts.

__________________________________________________________________________

Footnotes

1. The wellbeing questions in the Student Academic Experience Survey (SAES) are a notable exception.

Five trials to increase student engagement through text messages

By Maija Koponen, King’s College London

Universities offer a range of services and activities designed to support students and enrich their university experience, but in many instances widening participation students use these provisions less.

The KCLxBIT project has explored whether behaviourally inspired messages might offer a way of increasing student engagement, with a particular interest in the effect these messages have on widening participation learners.

Our trials have tested both whether receiving a message will increase the likelihood a student will engage with the services, and also whether the type of message received will produce differential outcomes in behaviour for different student groups.

Between September 2016 and February 2017 we carried out five large-scale message trials, each involving around 4,000 first year students. Our aim was to increase engagement with the following services:

  1. Student union Welcome Fair
  2. Study abroad
  3. Compass student advice services
  4. KLaSS online study resources
  5. King’s Connect mentoring platform

All trials included text messages, and two were a combination of texts and e-mails. The trials incorporated a range of behavioural insights and were informed by the Behavioural Insight Team’s EAST framework.

All messages were personalised, meaning we addressed students by their first name – a key EAST framework principle.

The trials were carried out in collaboration with:

  • King’s College London Students Union (KCLSU)
  • King’s Study Abroad
  • The Compass (King’s cross-campus support service)
  • KEATS (King’s primary online learning environment) & IT services
  • King’s Alumni Relations team

The trials were designed as follows.

1) KCLSU Welcome Fair

The student union’s Welcome Fair takes place at King’s in September as part of Welcome Week, and is a key moment for new students to find out about, and sign up to, student societies.

We wanted to test whether messages focused on the employability benefits of societies, or messages focused on the social belonging aspect of societies, would be more effective at encouraging students to attend the Welcome Fair and sign up for societies.

The belonging messages addressed the fact that many students worry about making friends at university, but that clubs and societies are a great way of meeting new people. The employability message, meanwhile, emphasised the value placed on societies or clubs by future employers.

Each trial arm received a total of three text messages, with examples given below.

  • Control: [No messages]

  • Employability: Hi Kate. Build your skills & networks by joining a society or club. Employers value these experiences. Explore Welcome Fair today or tomorrow @ Barbican Centre and see what’s on offer. #link

  • Belonging: Hi Kate, lots of students are concerned about making friends in their first few weeks at uni. Don’t worry! There is a society or club for everyone. Find yours at Welcome Fair @ Barbican Centre today & tomo: #link

2) Study Abroad applications

Students from widening participation backgrounds are less likely to apply to study abroad, which is often associated with more positive labour market outcomes, including a higher employment rate and higher salaries.

A previous trial, in the first year of this project, increased the number of students attending an information session about studying abroad. This time we wanted to test the relative strength of messages around the benefits or perceived barriers of studying abroad on both attendance at the King’s Study Abroad Fair and subsequent applications to study abroad opportunities.

As we had already established the positive impact of text messages on information session attendance, the control group for this trial received three basic messages with information about the Study Abroad Fair. The remaining first year students received four messages either focused on dispelling the potential barriers students might perceive in taking up these opportunities, or emphasising the benefits gained from studying abroad.

One group received a mixture of these messages in case a combination of the two turned out to be most effective. To control for ordering effects, the messages alternated between benefits and barriers-focused messages, with half the group receiving a message on benefits first and the other half receiving a message on barriers first.
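A counterbalanced split like this can be implemented by shuffling the arm's students and giving the first half one ordering. The sketch below is hypothetical (the project's actual assignment code is not public):

```python
import random

def counterbalance(student_ids, seed=0):
    """Split one trial arm so half receive the benefits message first
    and half the barriers message first, at random."""
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {sid: ("benefits_first" if i < half else "barriers_first")
            for i, sid in enumerate(ids)}
```

Because the split is random, any ordering effect averages out across the arm rather than biasing the comparison.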

  • Control (3 messages):
    Hi Kate,
    King’s offers lots of ways to study abroad.
    More info @ the Study Abroad Fair.
    Tues 11-2 @ Great Hall, Strand campus
    #link

  • Benefits (4 messages):
    Hi Kate,
    Studying abroad is an incredible opportunity to travel the world, experience a different culture, and make lifelong friends.
    “What I loved was the atmosphere, and the people were so welcoming.”
    More info @ the Study Abroad Fair.
    Tues 11-2 @ Great Hall, Strand campus
    #link

  • Barriers (4 messages):
    Hi Kate,
    Lots of students worry about the cost of studying abroad, but for King’s students it is often cheaper. For example, for one semester abroad you pay at least £3000 LESS in tuition fees for the year.
    More info @ the Study Abroad Fair.
    Tues 11-2 @ Great Hall, Strand campus
    #link

  • Benefits + Barriers (4 messages): A combination of messages from the ‘Benefits’ and ‘Barriers’ arms

3) Engagement with the Compass

The Compass is the King’s student advice service, providing information and support on academic, personal and financial issues, and with service desks located at all King’s libraries.

This trial tested whether portraying a need for support as a normal part of the university experience would be effective in encouraging students to seek advice and guidance from the Compass staff, beyond the impact of basic information about the service.

Students in the two trial arms received one text message and one e-mail. Messages in the ‘factual’ trial arm simply provided students with information about the Compass services. The ‘belonging’ arm had the same content, but reassured students that it is normal to struggle in the first term at university and suggested the services provided by the Compass as a way of accessing support.

  • Control: [No messages]

  • Factual: Hi Kate. The Compass team provide information and support on everything from academic to personal and financial challenges. Find out more: #link

  • Belonging:
    Hi Kate,
    You’ve now been part of the King’s community for a term, and first year students have told us it’s good to have some extra support at this time of year.
    The Compass team provide information and support on everything from academic to personal and financial challenges. Find out more: #link

4) Sign-ups on KLaSS

KLaSS is an online study skills hub, available to all King’s students, where they can access modules to support them in their studies.

For this trial we aimed to encourage students to sign up to modules over their winter break, in the run-up to January exams.

One trial arm received information about KLaSS alongside an encouragement to make a plan for when they were going to spend some time exploring the resources available. The key behavioural insight here is that you are more likely to complete an action if you plan when you will do it.

The other treatment arm received this same message but it included additional content highlighting that many study skills at university are new to students, and that the KLaSS modules would provide additional support.

  • Control: [No messages]

  • Planning:
    Hi Kate. Boost your academic performance over the holidays with King’s Learning & Skills Service (KLaSS). It can help with a range of key study skills.
    We’ve sent you an email with more info or sign up now: #link

  • Planning + Belonging:
    Hi Kate,
    Lots of King’s 1st years find adapting to university study takes time.
    Boost your academic performance over the holidays with King’s Learning & Skills Service (KLaSS). It can help with a range of key study skills.
    We’ve sent you an email with more info or sign up now: #link

5) Sign-ups on King’s Connect

Our final engagement trial focused on increasing sign-ups to King’s Connect, an online platform where students can connect with King’s alumni.

The trial arms received information about the platform via two text messages. For one trial arm these messages just provided information about King’s Connect, whilst another arm also emphasised that King’s Connect is a unique opportunity for King’s students, and added a bit of loss aversion for good measure (“don’t miss out!”).

  • Control: [No messages]

  • Factual:
    Hi Kate.
    King’s Connect lets you contact 1800 King’s alumni to build mentoring relationships. They can provide support to you through your studies and help you think through questions from module choices to summer plans.
    Sign up here: #link

  • Factual + King’s Opportunity:
    Hi Kate,
    King’s has alumni all over the world working in incredible jobs. As our student you have a unique opportunity to speak to them and learn from their experiences.
    King’s Connect lets you contact 1800 King’s alumni to build mentoring relationships. They can provide support to you through your studies and help you think through questions from module choices to summer plans.
    Don’t miss out, sign up here: #link

The results
What we found fascinating about this project is the valuable insights these nudge trials provide, even when (or perhaps especially when) the trials don’t produce the kind of results we were expecting.

In the spirit of full disclosure, while we have had some important successes, not all trials produced the outcomes we had hoped for. This underlines how important it is to properly test new interventions.

We will be publishing our results on this blog over the coming months, and will be discussing the main takeaway points from all trials – as each of them has definitely given us plenty of food for thought.

Don’t forget that you can follow us on Twitter @KCLxBIT to receive alerts of new blog posts.

Or, if you would like to be included in our mailing list, contact lifecycle@kcl.ac.uk

Thinking behaviourally about higher education

By Susannah Hume, Behavioural Insights Team

Small things can have a big impact on our lives. A transport museum visit might inspire you to become a train driver, while watching CSI might spark the idea of a career in forensics. Psychologists have known this for a while: the way we interact with the world is influenced by many contextual factors, noticed and unnoticed.

However, when we’re designing policies, we often don’t consider these factors. We assume that the people we’re trying to influence are ‘rational’ in the economic sense: they weigh up the pros and cons of all their options before choosing the one which, on balance, is best for them. And if the choices made don’t align with what we expect, we look to the big levers for solutions: regulation, funding and fees, and information.

We tell the young person who wants to be a train driver to apply their skills to an engineering degree instead, because they can get a bursary and they’ll earn more at the other end; or the young person who’s applied to a forensic science course that studying chemistry will give them more flexibility.

And nothing changes.

What we’ve assumed is an information gap is actually something else: the problem lies not only in the information itself, but in the way it was given, and often the context in which it was received. This matters particularly for widening participation efforts, because that context will often depend significantly on a student’s background.

Behavioural insights and BIT

Whilst our behaviour may not seem rational to an outside observer, these ‘irrationalities’ can be remarkably consistent across individuals. Often they result from rules of thumb (‘heuristics’) which we use to simplify the complicated decisions we face. The work of Daniel Kahneman and Amos Tversky catalogued and quantified some of these systematic deviations from the predictions of standard microeconomic models, while the work of Thaler and Sunstein, which Anne-Marie has already written about, began bridging the gap between the academic insights and their practical applications.

The Behavioural Insights Team, which started life in 2010 inside No. 10 Downing Street, was the world’s first government institution dedicated to systematically applying these insights from the behavioural sciences to improve public policy. We’re now a social purpose company with offices around the world, working in almost every policy area, and with 20 governments worldwide. Our approach has two pillars:

  1. Thinking differently about how people interact with public services; and
  2. Raising the standard of evaluation applied to policy or service changes, be they big or small. Our CEO, David Halpern, is also the government’s chief advisor on the What Works programme.

A case study on university aspirations

The need to both think differently and test interventions was illustrated by a project we ran with the Somerset Challenge in 2014, investigating ways to raise university aspirations among sixth form students in the county. Working with a collection of secondary schools in Somerset, we ran a study to test three interventions:

  • providing young people with information about the costs and benefits of attending university;
  • providing the same information to their parents; and
  • giving students a short talk from a former student from their area who went to university.

At this point you might pause and think about which of the three approaches above you would expect to be successful, and whether any might not have been. Once you’ve fixed your prediction in mind, read on!

We ran this study as a Randomised Controlled Trial (RCT – a topic for a forthcoming blog post), which meant that any difference in aspiration we observed following the interventions could be attributed to the interventions themselves.
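Random assignment is what licenses that causal claim: shuffling the units and dealing them across arms makes the groups comparable in expectation, so outcome differences reflect the interventions. A minimal sketch (illustrative only, not the trial's actual code):

```python
import random

def randomise(units, arms, seed=42):
    """Assign each unit to a trial arm uniformly at random,
    with arm sizes as balanced as possible."""
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)
    # deal shuffled units across the arms round-robin
    return {u: arms[i % len(arms)] for i, u in enumerate(shuffled)}
```

In school-based trials like this one, randomisation is often done at the class or school level instead, but the principle is the same.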

Firstly, the talk significantly increased students’ interest in university and their likelihood of applying (well done if you picked that!). Further analysis revealed that this was driven by the belief that attending university would result in better friends and a more interesting life, but these students also recalled key elements of the financial information.

However, providing parents with information cards had no effect on students’ interest in attending university, while giving the same information to students actually made them less interested in attending.

If you picked that–and you haven’t already read the report–very well done. We certainly didn’t develop these cards with the expectation that they would discourage young people from university; in fact, we thought that perhaps because of the recent tuition fee increase, there might be a genuine information gap about the benefits of university that needed correcting.

However, what the results show is that the how and who of delivery may matter as much as, if not more than, the content. The inspirational speaker, who was from the same background as the students he was speaking to, was able to address perceived social and identity-related barriers to university through a medium the students could relate to, and provided the financial information within that context. The cards, which addressed only financial barriers, could not do that.

What’s next?

Since that first trial, we’ve also published the results from another RCT, where we increased applications and acceptances by disadvantaged young people to highly selective universities, at a cost of just £45 per additional student who accepted a place. You can read more about that project here.

We’ll be writing more about the behavioural insights approach, the simple ways that we can tweak systems to help people persist and succeed in higher education, and the very exciting results from our collaboration with King’s College London over the coming months–please subscribe to the RSS feed or follow our Twitter account to be the first to hear about updates.

The beginning of KCLxBIT

By Anne-Marie Canning, King’s College London

At King’s College London we’ve always known that helping widening participation students to ‘get in’ is not enough but needs to be coupled with a focus on helping those students to ‘get on’ too. That’s why we have a full lifecycle approach to widening participation:

Full lifecycle

In recent years the Office for Fair Access has become increasingly concerned about student success as well as student access, and rightly so. We know that the student experience at university is highly stratified, and the consequences of this have been expertly detailed by my colleague and friend, Dr Anna Mountford-Zimdars, in her 2015 report for HEFCE, ‘Causes of differential outcomes’. This stratification affects social belonging, outcomes and labour market progression. If we’re serious about enhancing social mobility trajectories we need a joined-up approach to student access and success.

An abundance of opportunities

I often think about the student experience as something like a jewellery box. The undergraduate experience is abundant with opportunities. I imagine some learners reaching into that jewellery box and taking out internships, study abroad experiences, research projects, societies and many more ‘high return’ co-curricular activities. We know that these activities are highly valued by employers and have a positive impact on student belonging and satisfaction. ‘Who does what’ at university became a preoccupying issue for me after dipping into the datasets and observing distinct patterns of participation. These ‘game changing’ experiences were concentrated in sections of the student body and I hypothesised that we could shift this by helping students to connect with the right opportunity at the right time.

Always carry books around with you

In June 2015 I had been reading Nudge by Thaler and Sunstein. I’m not sure why I picked up the book, but I was having a great time reading it when I met with Professor Jonathan Grant, Director of the Policy Institute, for an introductory meeting. I explained to Jonathan that I had been thinking about how behavioural economics could help to improve the student experience for widening participation learners at King’s. Jonathan kindly offered to introduce me to the Behavioural Insights Team (BIT) at the Cabinet Office. Fast forward a few weeks and I was presenting the challenges of full lifecycle widening participation to David Halpern and Raj Chande (CEO and head of education at BIT respectively), and explaining how I thought behavioural insights could play a role in improving student outcomes and experiences. Interesting work on the application of behavioural insights to the university experience was already being led by Professor Ben Castleman (mastermind of the Obama ‘Better Make Room’ campaign and author of ‘The 160 Character Solution’) at the University of Virginia, and Professor Philip Oreopoulos (author of ‘Behavioural Economics of Education: Progress and Possibilities’) at the University of Toronto. Following my conversation with the team at BIT, they agreed that this could be an interesting collaboration, and so we commenced the first ever project looking at the application of ‘nudge’ in a UK university context.

A two-year pilot project

The project was considered by the ethics office at King’s College London, and we moved into a two-year pilot programme overseen by an expert advisory board. Student journey workshops and panel surveys have given us rich insights into the lived student experience. What is so exciting about these models of exploration is the way in which they asked students about themselves rather than about the institution. This represented a significant departure from the traditional higher education satisfaction survey model and allowed us to work out what matters to students and when we might best offer an intervention.

We have delivered a range of complex and simple randomised control trials within the King’s College London ecosystem, using institutional datasets to measure efficacy. First year students have benefitted from a range of programmes and pointers to help them make the most of their time at the university. Acrobatic analysis and the application of machine learning techniques have helped us to understand the impact of our interventions on different student groups and begin to reshape how we structure the student experience at King’s.

Our blog will detail the tribulations and triumphs of our experimental approach. We look forward to sharing our results and lessons learned in the hope that others can take encouragement from our work and adopt behaviourally inspired methods in their own contexts. By bringing new ideas together with old problems, I believe we can make faster progress in helping students make the most of their talents and opportunities.