Contextual Admissions: A-Level Playing Field

By Chiamaka Nwosu, Data & Research Analyst, and Yasemin Genç, Research Associate

While it is a university’s prerogative to recruit the best talent, it has been acknowledged that prior attainment isn’t necessarily the best reflection of potential for all students, and that fair access does not necessarily mean treating everyone the same. Contextual admissions aim to remedy this; however, there is limited insight into how students who receive a contextual offer (which may include reduced grades) fare once admitted. In this blog we discuss contextual admissions and the merits of certain evaluative approaches.

Contextual Admissions help disadvantaged students reach their potential

Disadvantaged young people are eight times less likely to enter a high-tariff university than their most advantaged peers[i]. This is often due to their lower levels of attainment combined with the high academic entry requirements of selective universities[ii].

Imagine a situation where a climber burdened with weights reaches the summit shortly after their partner, who climbed freely. Would you dismiss the weighted climber’s ability, tell them that they are not as good as the other climber, or that they are no longer allowed to climb? No. Instead, you would consider their potential: perhaps they are in fact the better climber; how might they have done if they too had climbed freely?

The purpose of contextual admissions is to identify the ‘weights’ a student bears, to take them into account, and to provide a mechanism for recognising each student’s true potential. Contextual admissions policies give these students allowances, such as reduced grade offers, so that their specific circumstances do not prevent them from entering selective degrees.

Contextual Admissions in Practice: The example of King’s

King’s introduced contextual admissions in 2014. Though eligibility criteria have expanded since then, the guiding principle – that applicants should be considered as a whole rather than letting attainment data alone speak for them – remains.

If an applicant lives in a disadvantaged area (e.g. as measured by ACORN or POLAR4[iii]), our admissions team “flags” them for contextual consideration. To provide a more nuanced picture, our WP team combines these area-based indicators with individual-level data (e.g. parent’s occupation, disability, or participation in one of our flagship outreach programmes, K+) and pays special attention to applicants’ personal statements and references. Where more background information is still needed, we invite candidates to interview to assess their contextual eligibility[iv]. Once an applicant is identified as contextual, they are often given a non-standard offer (e.g. an offer with a reduced tariff or, where that is not appropriate, an offer to study an alternative programme).
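For illustration only, a flagging rule of this kind might be sketched in code as follows. Every field name, criterion and threshold here is a hypothetical assumption rather than King’s actual policy, which also relies on human judgement of personal statements, references and interviews.

```python
# A highly simplified sketch of contextual "flagging" logic. All field
# names and criteria are hypothetical illustrations, not King's policy.
from dataclasses import dataclass

@dataclass
class Applicant:
    polar4_quintile: int          # 1 = lowest HE-participation areas
    acorn_disadvantaged: bool     # area-based deprivation indicator
    parent_occupation_lower: bool # individual-level indicator
    has_disability: bool
    completed_kplus: bool         # flagship outreach programme

def flag_contextual(a: Applicant) -> bool:
    """Flag an applicant for contextual consideration if an area-based
    indicator fires, or if any individual-level indicator does."""
    area_flag = a.polar4_quintile == 1 or a.acorn_disadvantaged
    individual_flag = (a.parent_occupation_lower or a.has_disability
                       or a.completed_kplus)
    return area_flag or individual_flag

# Example: a K+ participant from a POLAR4 quintile-3 area is still flagged.
print(flag_contextual(Applicant(3, False, False, False, True)))  # True
```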

Evaluation is important to overcome uncertainty and reluctance

In Scotland, the national Widening Access Commission ensures that Scottish universities do not evaluate a student’s grades or other attributes without first understanding the student’s circumstances[v]. In England, this is not yet the case.

English universities are encouraged[vi] to use contextual admissions processes; however, this is at each institution’s discretion. Many universities (especially selective ones) are reluctant[vii] to implement contextual admissions due to the (presumed) uncertainty of the effect they have on students (e.g. are they being set up to fail?) and on universities (e.g. does lowering grades mean lowering standards?). Evaluation is important to overcome this uncertainty and reluctance, to further encourage the implementation of contextual admissions, and to provide opportunities to those who deserve them.

Access to quality data plays a crucial role

Access to data, and the quality of the data available, plays a crucial role here. Higher education institutions hold several years of student cohort data, including granular individual-level data on student demographics and attainment (pre- and post-HE entry), as well as compound proxies of disadvantage (e.g. IMD, ACORN). This array of historical data makes it possible to conduct a rich assessment of attainment by group.

However, a descriptive approach on its own would not account for confounding factors, making it extremely difficult to isolate the effects of contextual admissions on student attainment and overall wellbeing. For instance, there may be unobserved characteristics within groups that correlate with the outcome of interest. Historical data also comes with limitations due to changes in practice over time, such as modifications to entry criteria or to the classification of certain groups of students. Additionally, differences in policies across institutions mean our findings cannot necessarily be generalised.

Finding Out What Works

When good-quality historical data is available, quasi-experimental methods (i.e. non-randomised, before-after intervention studies) are the best approach, because they allow us to estimate the causal impact of a programme where randomisation is difficult or impossible – as with being considered for contextual admissions. Quasi-experimental methods account for some common issues that would otherwise arise when using standard regression models alone, such as:

  • Ecological Fallacy: controls that rely on area-level measures (e.g. ACORN and POLAR) tell us little about an individual student’s disadvantage; particularly in London, low-income students can come from high-income postcodes and vice versa.
  • Uncaptured disadvantage: there are likely additional sources of disadvantage that correlate with the outcomes but that the formal measures in the model do not capture.
  • Incompleteness: contextual factors are not recorded for all students – specifically for students who meet their standard offer, even if they come from a POLAR quintile 1 neighbourhood. Because the disadvantage masked by POLAR is likely a strong predictor of attainment and completion, this omits an important variable.

Choosing an appropriate mode of evaluation: RDD or DID?

Regression Discontinuity Designs (RDD) and the Difference-in-Differences (DID) method are two examples of quasi-experimental research designs. An RDD exploits a cutoff that determines assignment to an intervention in order to estimate its causal impact, while DID estimates causal effects by comparing how outcomes change over time between two distinct groups. There are benefits and limitations to either, and the decision on which method to use is largely driven by the data.

An RDD allows us to compare similar students on opposite sides of a certain threshold. For instance, a potential student who narrowly misses a contextual offer should not, after controlling for confounding factors, be significantly different from another student who just meets the criteria and gains a place. Therefore, we can assume that any difference in outcomes may be attributed to the treatment, i.e. the contextual offer.
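As an illustration, here is a minimal sharp-RDD sketch in Python using statsmodels on simulated data. The running variable (distance from a grade threshold), the cutoff at zero and the true jump of 0.5 are all hypothetical assumptions for the example, not King’s data.

```python
# Minimal sharp-RDD sketch on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Running variable: distance (in grade points) from the hypothetical
# contextual-offer threshold. Students at or above 0 receive the offer.
grade_distance = rng.normal(0, 2, n)
offer = (grade_distance >= 0).astype(int)

# Outcome: smooth in the running variable, plus a discontinuous jump
# of 0.5 at the cutoff (the assumed "treatment effect").
outcome = 1.0 + 0.3 * grade_distance + 0.5 * offer + rng.normal(0, 1, n)

df = pd.DataFrame({"grade_distance": grade_distance,
                   "offer": offer,
                   "outcome": outcome})

# Restrict to a narrow bandwidth around the cutoff so students on
# either side are comparable, then allow separate slopes on each side.
# The coefficient on `offer` estimates the local treatment effect.
bandwidth = 1.5
local = df[df["grade_distance"].abs() <= bandwidth]
model = smf.ols("outcome ~ offer + grade_distance + offer:grade_distance",
                data=local).fit()
print(model.params["offer"])  # should recover roughly 0.5
```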

The DID method, on the other hand, relies on the existence of two groups and multiple time periods. It compares the trend in the outcome variable between students in the group of contextual offer holders and their statistical counterparts in the comparison group of non-offer holders, before and after an intervention. The difference between these two differences is the estimated effect of the intervention.
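Below is a minimal two-group, two-period DID sketch in the same vein, again on simulated data; the variable names (treated, post, outcome) and the true effect of 0.4 are illustrative assumptions. The coefficient on the interaction term recovers the difference-in-differences: (Y_treated,post − Y_treated,pre) − (Y_control,post − Y_control,pre).

```python
# Minimal two-group, two-period DID sketch on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000

treated = rng.integers(0, 2, n)   # 1 = contextual offer holders
post = rng.integers(0, 2, n)      # 1 = after the intervention

# Outcome with a group effect, a time effect, and an assumed true
# DID effect of 0.4 on the treated group in the post period.
outcome = (0.2 * treated + 0.3 * post + 0.4 * treated * post
           + rng.normal(0, 1, n))

df = pd.DataFrame({"treated": treated, "post": post, "outcome": outcome})

# The interaction coefficient is the difference-in-differences estimate.
model = smf.ols("outcome ~ treated + post + treated:post", data=df).fit()
print(model.params["treated:post"])  # should recover roughly 0.4
```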

At King’s we’re investigating contextual students’ attainment, conversion and retention

Provided that all assumptions of either quasi-experimental method are met, we can accurately estimate a ‘what if’ scenario, i.e. the change in an individual’s observed outcome over time due to the treatment received – in this case, the effect of receiving a contextual offer on the student’s attainment, conversion and retention.

We can also explore the effects of individual indicators, e.g. individual-level factors such as household income compared with area-based data like ACORN, to determine which work best together to form a basket of evidence-based indicators.
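As a sketch of how such a comparison might proceed, the snippet below fits nested regression models on simulated data and compares their fit. The column names (acorn, income, attainment) and the data-generating process are hypothetical assumptions for illustration only.

```python
# Illustrative comparison of an area-level indicator (ACORN-style
# category) vs an individual-level one (household income) as predictors
# of attainment. All names and data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1500
income = rng.normal(30, 10, n)    # household income (in £1,000s)
acorn = rng.integers(1, 6, n)     # area-based category, 1-5
attainment = 50 + 0.4 * income - 0.5 * acorn + rng.normal(0, 8, n)

df = pd.DataFrame({"attainment": attainment,
                   "income": income,
                   "acorn": acorn})

# Fit nested models and compare adjusted R-squared: if the individual
# indicator adds explanatory power over the area measure alone, that
# supports including it in the basket of indicators.
for label, formula in [("area only", "attainment ~ C(acorn)"),
                       ("individual only", "attainment ~ income"),
                       ("combined", "attainment ~ C(acorn) + income")]:
    fit = smf.ols(formula, data=df).fit()
    print(f"{label:15s} adj R2 = {fit.rsquared_adj:.3f}")
```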

This kind of quasi-experimental analysis has the potential to identify which contextualised admission policies do (or don’t) work, and for which students. We will also be able to identify areas that need to be strengthened to ensure that effective practices continue, and that students with potential who may have been held back academically get the opportunity to succeed.

What Next? Co-operation and results-sharing to influence policies

Every applicant wants to know where they stand when applying to university. They want the surety which comes with clear equations: doing x, plus y, equals success. Universities, for effectiveness and consistency, would like that too. We hope to make a first step in identifying this for King’s with our analysis, to ensure a level playing field for students with the potential to succeed.

Given the variety of policies across the sector, our results may only be valid for King’s. If you are interested in learning more about our approach, or would like to co-operate and/or share your own results or methods, feel free to contact us at whatworks@kcl.ac.uk.

_______________________________________________________________________

Click here to join our mailing list.
Follow us on Twitter: @KCLWhatWorks

[i] UCAS (2019). 2010 End of Cycle Report. Cheltenham: UCAS. Available at: https://www.ucas.com/file/311296/download?token=p1nWONan

[ii] Boliver, V., Gorard, S., & Siddiqui, N. (2019). Using contextual data to widen access to higher education. Perspectives: Policy and Practice in Higher Education, 1-7.

[iii] ACORN, POLAR, and IMD are recognised measures of disadvantage based on area-level averages.

[iv] King’s does this to overcome the reliance on crude measures that can result in false positives and lack validity.

[v] Scottish Government (2016). A Blueprint for Fairness: Final Report of the Commission on Widening Access. [Online]. Available at: https://www.gov.scot/publications/blueprint-fairness-final-report-commission-widening-access/

[vi] Office for Fair Access (2013). Guidance on Preparing Access Agreements for 2014–2015. Bristol: OFFA.

[vii] Adnett, N., McCaig, C., Slack, K., & Bowers‐Brown, T. (2011). Achieving ‘Transparency, Consistency and Fairness’ in English Higher Education Admissions: Progress since Schwartz? Higher Education Quarterly, 65(1), 12-33.

 
