Instructor: Dr. Alice Collier
Module: Chemistry and Chemistry with Biomedicine MSci & BSc Integrated laboratory module 1 & 2 (level 4 NMS)
Assessment activity: using exemplars in workshops with student-negotiated criteria
Why did you introduce the use of exemplars?
The laboratory component of the Chemistry BSc and MSci programmes aims to develop scientific writing skills through multiple cycles of assessing written assignments (laboratory notes, laboratory reports and literature reviews) and providing feedback. A common assessment rubric is used across all 3 years in order to provide consistency and the opportunity for students to act on feedback from previous assignments.
It is commonly accepted in Higher Education theory that, to be effective, feedback must be clear and provide a bridge between students’ prior knowledge and where they need to be in relation to the standards set (Hattie and Timperley, 2007). In practice, due to the need for transparency of marking processes, tutors often use the formal language of assessment criteria in their written feedback. This can leave learners feeling confused by academic terminology and unable to use the feedback provided (Winstone, 2017).
One way of addressing this problem is to develop a shared understanding of assessment criteria through in-class discussion and the use of exemplars (Rust, Price and O’Donovan, 2003). Winstone et al. (2017) report that engaging students with marking criteria has the following benefits:
- Enhanced learners’ self-reported awareness of learning objectives
- Increased undergraduates’ appreciation of the assessment process and the expectations placed upon them
- Increased accuracy in self-assessment
- Learners reported being more likely to consult marking criteria when completing subsequent work
Orsmond (2004) also argues that such activities give students some ownership of the assessment process and help them develop as autonomous learners.
How did you set up the exemplar activity?
In the module, we already ran a first-term workshop on writing laboratory notes, but this was less effective than hoped. We therefore adapted the workshop to include more exemplars, more discussion and an exercise on class-negotiated criteria. The new workshop proceeds as follows:

- Students are provided with written details of the content of laboratory notes, together with exemplars, and are asked to read these before the face-to-face session.
- In small groups, students discuss the exemplars and comment on strengths and weaknesses.
- The tutor leads a whole-class discussion of strengths and weaknesses to reach a class consensus.
- In small groups, students reflect on the discussion and produce their own descriptors (excellent, good, satisfactory and weak) for individual criteria.
- The tutor leads a whole-class discussion and a consensus on the descriptors is reached.
- Faculty/College marking criteria are given to students and the tutor leads a class discussion of how the students’ descriptors match the existing Faculty criteria.
This format was repeated in the second term with laboratory reports, and students were asked to apply their marking criteria by providing peer feedback on the first of the three laboratory reports they submitted.
What benefits did you see?
In addition to the benefits for students, the discussions gave me key insights into how students interpret marking criteria and where they concentrate their efforts in assessments. Where students voiced misplaced interpretations or priorities, these could be addressed immediately, with the whole class present, before the students tackled any assignments. The discussions also provided ideas and inspiration for wording feedback comments so that students could understand where they had gone wrong.
Students were positive about the peer-assessment exercise in the free-text comments of the end-of-year module evaluations, and there was an increase in the proportion of students agreeing with the statement ‘assessment criteria were made clear in advance’.
The most encouraging evidence came from student engagement with the interactive coversheets used for laboratory reports. The coversheet asks students to reflect on the feedback they have received, but also gives them the opportunity to request feedback, using the marking criteria to help them make their requests specific. These coversheets have been used for the last three years for the second and third laboratory reports submitted by students. The table below shows the percentage of students in each year group who requested feedback, and then what percentage of those requests were detailed and specific (as opposed to simply naming sections of the report, e.g. introduction, results, discussion). Completion of the coversheet is optional, and there is clear variation across cohorts in the percentage of students who engage with it at all. Of those who do engage, the percentage of students requesting detailed feedback does appear to increase following the introduction of the revised workshop. Anecdotally, the number of students using the weekly drop-in sessions to request help before submitting their assignment also increased.
| Request for feedback | 2015–16 | 2016–17 | 2017–18 (after delivery of revised workshops) |
|---|---|---|---|
| 2nd laboratory report: total % of cohort requesting feedback | 70% | 26% | 44% |
| 2nd laboratory report: % of requests that were specific | 28% | 22% | 35% |
| 3rd laboratory report: total % of cohort requesting feedback | 70% | 15% | 32% |
| 3rd laboratory report: % of requests that were specific | 26% | 15% | 44% |
This points to students taking a more active role in the assessment process, and the enhanced use of the coversheets has enriched the feedback dialogue between tutors and students and opened new avenues for it.
What challenges did you encounter and how did you address them?
Time: The adapted workshop required no extra face-to-face contact time and very little extra preparation, as the exemplars and resources were already available. Transcribing the class-derived criteria takes a small amount of time after the workshop; alternatively, if a visualiser is used and ideas are collated with pen and paper, or entered directly into a projected document, they can be uploaded to KEATS as scans/PDFs immediately after the workshop.
Different cohort groups: The first-year chemistry cohort is currently between 80 and 100 students. The cohort is split into two groups and the workshop described is run in duplicate. This produces two versions of the co-created criteria, which may differ in subtle ways, making it difficult to use the student criteria directly for marking. This is why the decision was made to include, as the final stage, the activity of fitting the student criteria to the College criteria.
Student preparation: Some students did not complete the pre-reading of the exemplars. In these cases, students were either asked to return the next day for the duplicate workshop, or given 30 minutes during the group discussions to read the exemplars quietly and make their own individual assessment of strengths and weaknesses before joining the class discussion.
What advice would you give to colleagues who are thinking of using exemplars?
A good choice of exemplars is essential to eliciting productive points for discussion. Take some time to choose exemplars (both good and average) that highlight the criteria you feel are most important. That way, most students will spot these important criteria without you having to lead them to the ‘answers’.