Student-authored exam questions: a cross-institutional student-centred approach to summative assessment preparation

UCL students collaborating to write exam questions in Antonio’s course. Reproduced with the permission of the students featured.

Instructors: David Sheard (KCL) and Antonio d’Alfonso del Sordo (UCL)

Email: David.1.sheard@kcl.ac.uk and a.dalfonsodelsordo@ucl.ac.uk

Module: Level 4, BSc Mathematics 4CCM111A Calculus I (Mathematical Sciences, NMES, KCL) and BSc Earth Sciences MATH0049 Mathematics for Engineers 1 (Dept of Mathematics, UCL)

Assessment activity: In this case study, we present two versions of the same type of activity, which took place on two modules at two similar institutions: asking students to write exam-style questions, and then to solve and assess each other’s work.

Why did you design this activity?

Two big problems we face as educators when it comes to exam/revision teaching are 1) preparing students sufficiently for their exam without teaching to the exam; and 2) addressing the wide disparity among our students when it comes to revision/exam technique and related aspects of the so-called hidden curriculum.

By replacing traditionally taught revision sessions with sessions in which students write their own exam questions, we oblige students to reflect on the course content; on their understanding of how different topics are related; on the learning outcomes (whether stated explicitly or understood implicitly); and on how these elements are assessed.

Students are sometimes lured into counter-productive revision strategies, or find that strategies which worked in the past (e.g. in school exams) are no longer useful for university exams. Asking students to reflect on the purpose and method of their assessment, and then to put this into practice, may help them identify which revision and exam techniques are most helpful.

Students writing exam-style questions is not a completely novel idea and has been tried in several disciplines, usually with multiple choice questions (MCQs). The University of Auckland in New Zealand has an online platform, PeerWise, where questions written by students in a wide range of subjects (including Mathematics) have been collected [PeerWise]. The use of student-authored MCQs has been investigated [Hutchinson 2013], particularly in the contexts of biology [McLeod 1996], [Sibomana 2020], and [Jobs 2013], and medicine [Craft 2017] and [Palmer 2006]. They have also been used as a tool in continuous assessment for learning [Davis 2013]. Some research on student-generated MCQs outside of biology and medicine has also been done, for example in accounting [Geiger 2020] and business studies [Fellenz 2004] and [Sharp 2007]. Doyle and Buckley have quantitatively analysed the effect of students creating MCQs on their learning, and found a statistically significant impact on students’ end-of-module assessment marks [Doyle 2019]. To the best of our knowledge, little research has been carried out on student-authored long-form exam questions, or on this kind of activity specifically in mathematics teaching.

Our activities aimed to cover the full range of the exam setting-and-sitting process, from deciding what and how to assess, through creating, sitting, marking, and giving feedback on assessment questions. This allowed us to engage students with a wide range of active learning approaches. The overall design was informed by DARTs (directed activities related to texts) [Verster 2008], a strategy sometimes used when teaching English as a foreign language, an area in which Antonio is a specialist. In particular, DARTs can be used to develop reading comprehension exercises for students, by students.

How did you introduce the activity to students?

Writing exam questions is a very complex task which brings together disparate skills and knowledge. In order to prepare students adequately for this, we have found that it is important to lead in by encouraging students to reflect on the principles of good revision and exam technique. We approached this in different ways, e.g. using interactive polling tools for larger classes, or think-pair-share for smaller classes.

We then introduced general principles of exam question writing; in particular, striking a balance between familiar questions and unseen material. To put this into practice, students were shown past exam questions and, through discussion with their peers, asked to identify features such as command words (show, state, hence, etc.), the relationship with the learning outcomes, the hypotheses/assumptions given, and the form required for the final answer. We also gave them an unseen exam-style question, and asked them to simulate what they would do in the first five minutes of an exam in terms of planning their approach to the question.

How did you set up the formative activity? 

We implemented this activity on two similar first-year calculus modules: one at King’s for mathematicians, and the other at UCL for Earth scientists. Both versions of the activity had three phases: 1) Lead-in: exam question structure and exam technique; 2) Production: guided design of exam questions and mark scheme in groups; and 3) Feedback: group solving and peer marking of each others’ questions.

In David’s version, run at King’s with a cohort of around 300 students, these phases were spread over three one-hour sessions in the last three weeks of term (alongside other teaching activities), with each session run in four parallel sets. In practice, attendance at these sessions was quite low. In Antonio’s version at UCL, the three phases were consolidated into a single two-hour session with about 30 students present.

Both versions took place in the Autumn term of the academic year 2023-24, in mathematics courses for first-year students. In each course the content was similar, consolidating and extending what students had previously learned at A-level. Throughout these activities, we collaborated to plan the sessions and fed forward feedback from each other’s sessions to anticipate and mitigate challenges as they arose.

We have outlined the Lead-in phase in the previous section. In the Production phase, students moved on to generating ideas for their own exam questions. In one case, we discussed the possibility of using ChatGPT to come up with the unseen, most challenging parts of a question, although we made clear that ChatGPT does not always return a correct question or answer, so care is required.

Here we outline the seven-stage process (covering the Production and Feedback phases) used in the single two-hour version of this activity. The class was divided into (at least three) groups of 3 to 4 students. Each group had a team leader who randomly picked a piece of paper indicating two topics from different parts of the course that they should assess in their question. The team leaders were chosen to be students who had done well on the coursework, to ensure that each group had a range of ability levels.

  • Stage 1: Plan. Students were given 10 mins to plan the content of the subparts of the question and the learning outcomes they wanted to assess – students were asked to write these out explicitly.
  • Stage 2: Write the questions. Students had 20 mins to formulate their questions, with help from the lecture notes and problem sets. The more time that can be allowed for this the better; even 20 minutes is quite restrictive.
  • Stage 3: Worked solution/mark scheme. Students had 10 minutes to write out solutions. This can be done alongside the previous step. It is important, however, that students write questions and answers on different pieces of paper for the following stages. Students also need plenty of time for this step.
  • Stage 4: Solve another group’s question. Each group gave their question to another group to solve in 10 mins. This was the stage where students were most comfortable, although in some cases they spotted mistakes or ambiguities in the questions and were allowed to ask the group who wrote the question for clarification.
  • Stage 5: Mark allocation. A third group was given the question together with the mark scheme and agreed on how to allocate the marks for that question (before seeing the second group’s answers). This stage, along with the next one, should take roughly 10 mins.
  • Stage 6: The marking. The third group marked the second group’s answer and provided that group with feedback. This was probably the most enjoyable part for students, as they put themselves in the role of examiner.
  • Stage 7: Session feedback. At the end of the session, students were asked to complete a questionnaire with concept check questions, questions about their experience of the activity, and open-ended feedback questions.

If time permits, the second and third groups can also give feedback to the first group about the level and appropriateness of the question as an exam question. To motivate the students, a prize (consisting of a calculus textbook and some chocolate) was offered for the best question. This helped the students develop some healthy competitiveness.

David’s version of this activity at King’s was broadly similar, except spread over three sessions. The lead-in was in the first session, stages 1-3 in the second, and stages 4 and 6 in the third. One difference was that the mark allocation (stage 5) was combined with stage 3. Stage 7 was incorporated at the end of each session as an exit-ticket.

In this version of the activity, the quality of the questions written by students was extremely variable, because students were given less scaffolding in terms of the topic and structure of the questions they should write, and the group size was larger (4-6). To ensure that the third session worked, the three best questions were selected after the second session and neatly typed up – each group was assigned one of these to solve and then another to mark.

How did you and the students check their learning?

The easiest and most direct way to see how students had performed was to look at the questions they wrote. It turned out to be very easy to judge at a glance which groups had done well or poorly. The poor questions often had only a single part or, if they had more than one, parts which did not assess distinct skills, and they were often either trivial or mathematically impossible. Good questions, on the other hand, had several parts which assessed different skills at a reasonable level of difficulty.

When it comes to the lead-in, because the teaching was discursive it was possible to obtain immediate feedback on students’ understanding. In the final phase, students gave each other feedback on their questions and answers, allowing them to learn from one another and to appraise their own learning. We also ran exit tickets/feedback surveys at the end of each session. These gave us, as instructors, an insight into how students felt about their learning, but surprisingly (to us) they were also a way for students to reflect on their learning.

An important component of these activities is collaborative creation. How this went could, of course, be judged indirectly by the quality of the output: the questions. We could also observe this aspect directly as we walked among the groups during stages 1-3 and discussed question ideas with the students. Here it was easy to identify instances where some students either took over their groups or did not engage meaningfully with the activity.

How do students benefit from this activity?

The clearest benefit that these activities are designed to provide is to better prepare students for their exams. There are several facets of exam preparation which these activities touch on, so it is worth taking a moment to look at them.

  • By asking students to write exam questions, we test their knowledge of how different parts of the course content relate to each other; we make them reflect on the learning outcomes and the possible ways of assessing them; and we ask them to evaluate what can reasonably be done under exam conditions. All of these aspects can be quite difficult to target using more standard teaching activities, as one student pointed out in the feedback: “[this activity gave me a] unique understanding of exam style questions one wouldn’t gain with any other activity”.
  • Our students arrive at university with an already developed set of revision and exam techniques, which sometimes conflict with the way university assessments judge their learning. By authoring exam questions, students develop their metacognition about what different types of exam question assess, and how they assess the learning outcomes. It is, itself, a revision technique they can choose to repeat with their peers outside of the classroom. It also provides them with a new set of exam-style questions to practise with, as well as with a way to generate more.
  • From the feedback questionnaire responses of the Earth Science students, it was clear they had gained better empathy with the examiner/marker in terms of the types of skills assessed by certain questions – “[this activity showed me] what can let me get marks” – as well as emotionally with the exam setter, with one student saying they now understood “how hard it is to make a perfect question”.

There are some benefits which are less direct.

  • Students collaborate in groups, so they develop their team working skills, and can pool their knowledge and experience. One student said the activity had unlocked their “forgotten knowledge”: things they once knew were recalled to them by their peers.
  • Written communication, particularly as it relates to the accurate use of subject specific terminology, is a key part of writing clear questions and mark schemes, which this activity allows students to practise.
  • It was gratifying to witness evidence of learners’ awareness of many of the other benefits emerging spontaneously through the feedback questionnaires. In this way, the feedback is not only valuable to us as instructors, but is also a vehicle for students’ own reflection on their learning.

What challenges did you encounter and how did you address them?

  • Time: Students find every part of the activity challenging, and so it is important to give them plenty of time. We devoted two or three hours of in-class time, which is a significant commitment. Even so, students said that they would have liked more time in the case of the two-hour session.
  • Clarity and nature of instructions: For the feedback phase (solving and marking questions) to work, it is essential that the production phase results in questions and mark schemes of acceptable quality. Without clear instructions, templates, and teacher input, we found that students often write very poor questions (sometimes impossible ones) with incomplete, incorrect, or non-existent mark schemes. This hobbled the next phase, and often meant the authors did not realise that they had written an impossible question.
  • Format of the sessions: When the activity was spread over three sessions, a large number of students did not understand the purpose of what they were doing, or how their work related to previous and future sessions. Without strong student buy-in, many reported perceiving the activity as a waste of their time. This was not an issue when the activity was run as a single session, where students were very engaged and motivated throughout.
  • Managing time and teamwork: There were several instances where a team of 4-5 students working for 30 minutes would have only a single-sentence question to show for their efforts.
  • Accessibility: While we did not observe any problems in our sessions, it is possible that some students, for example some neurodiverse students, may struggle with being thrown immediately into group work. To mitigate this, one could allow 1-2 minutes at the start of stage 1 for individual thought. Note, however, that collaboration is essential, so it should not be curtailed too much.

What advice do you have for colleagues trying this?

If you want to run an activity like this, very careful planning is essential. Students need precise instructions, continuous guidance, and thorough scaffolding throughout. Writing questions is difficult even for lecturers, and we must remember that we are asking students to do it before they have even finished the course, never mind revised it. Some key things to think about are the following:

  • Clear instructions are essential, and the way this activity is introduced and motivated throughout is very important for students’ buy-in.
  • Constructive feedback from a strong-performing student was that a template with key command words would help students avoid spending too much time thinking about the structure of the question. An example could be: “(a) State/Give the definition of… (b) Hence, show that… (c) Find/Determine…” (see the sketch after this list).
  • Allow students plenty of time to think and write. If a full question is too ambitious, one could think of doing only two parts (a) and (b). Sometimes less is more.
  • Assigning specific topics to each group can speed up the planning process and focus students’ attention on the question writing.
  • Monitoring is key, particularly when it comes to time management and ensuring they complete all parts of the activity: writing the question, mark scheme, and linking their question to the learning outcomes/key skills.
  • Where practical, try to assign students to groups so that the ability levels are mixed.
  • Writing exam questions is very difficult for students, so lower your expectations about the quality of the questions they will write. If there is a break between the Production and Feedback phases, you can select the better questions and address any minor issues whilst maintaining the authenticity of the students’ original questions.
  • Because it is challenging, students may not always enjoy this kind of activity – but this does not mean they do not benefit from it. Clearly explaining the rationale throughout the activity keeps them motivated because they understand what they stand to gain.
  • Some of the essential skills involved can be built into the course so that students are prepared when they do this activity for the first time. One example could be including some peer marking earlier in the term, or asking students to compare and assess some sample responses to an exam question.
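To make the template suggestion above concrete, the following is a minimal LaTeX sketch of how such a scaffold might be typed up and handed to each group. The command-word structure follows the student’s example; the topic labels, mark allocations, and prompts are our own illustrative assumptions rather than material from either module.

% A minimal, illustrative question skeleton; topics and marks are placeholders.
\documentclass{article}

\begin{document}

\section*{Question skeleton: Topic A (differentiation) and Topic B (series)}

\begin{enumerate}
  \item[(a)] \textbf{State} the definition of \dots\ (2 marks)
  \item[(b)] \textbf{Hence, show that} \dots\ (4 marks)
  \item[(c)] \textbf{Find/Determine} \dots, justifying your answer. (4 marks)
\end{enumerate}

% Prompts for the authoring group (cf. stages 1 and 3): which learning
% outcome does each part assess, and what does the mark scheme reward?
% The worked solution and mark scheme go on a separate sheet.

\end{document}

Groups would then replace each \dots with content drawn from their two assigned topics, keeping the worked solution and mark scheme on a separate sheet as in stages 1 and 3.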

References

[Craft 2017] Craft, J. A., Christensen, M., Shaw, N., Bakon, N. (2017) “Nursing students collaborating to develop multiple-choice exam revision questions: A student engagement study”. Nurse Education Today, Volume 59, pp. 6-11.

[Davis 2013] Davis, T. A. (2013) “Connecting Students to Content: Student-Generated Questions”. Bioscene: Journal of College Biology Teaching, Volume 39(2), pp. 32-34.

[Doyle 2019] Doyle, E., Buckley, P. (2022) “The impact of co-creation: an analysis of the effectiveness of student authored multiple choice questions on achievement of learning outcomes”. Interactive Learning Environments, Volume 30(9), pp. 1726-1735.

[Fellenz 2004] Fellenz, M. R. (2004) “Using assessment to support higher level learning: the multiple choice item development assignment”. Assessment & Evaluation in Higher Education, Volume 29(6).

[Geiger 2020] Geiger, M. A., Middleton, M. M., Tahseen, M. (2021) “Assessing the Benefit of Student Self-Generated Multiple-Choice Questions on Examination Performance”. Issues in Accounting Education, Volume 36(2), pp. 1-20.

[Hutchinson 2013] Hutchinson, D., Wells, J. G. (2013) “An Inquiry into the Effectiveness of Student Generated MCQs as a Method of Assessment to Improve Teaching and Learning”. Creative Education, Volume 4, pp. 117-125.

[Jobs 2013] Jobs, A., Twesten, C., Göbel, A., Bonnemeier, H., Lehnert, H., Weitz, G. (2013) “Question-writing as a learning tool for students – outcomes from curricular exams”. BMC Medical Education, Volume 13(89).

[McLeod 1996] McLeod, P. J., Snell, L. (1996) “Student-generated MCQs”. Medical Teacher, Volume 18(1), pp. 23-25.

[Palmer 2006] Palmer, E., Devitt, P. (2006) “Constructing multiple choice questions as a method for learning”. Annals of the Academy of Medicine, Singapore, Volume 35(9), pp. 604-608.

[PeerWise] PeerWise, University of Auckland. Accessed 11th February 2024. peerwise.cs.auckland.ac.nz.

[Sharp 2007] Sharp, A., Sutherland, A. (2007) “Learning Gains… ‘My (ARS)’: The impact of student empowerment using Audience Response Systems Technology on Knowledge Construction, Student Engagement and Assessment”. In Assessment design for learner responsibility.

[Sibomana 2020] Sibomana, I., Karenzi, I. D., Niyongombwa, I., Byiringiro, J. C., Gashegu, J., Ntirenganya, F. (2020) “Use of Student-Generated Multiple Choice Questions to Enhance Team-Based Learning of Anatomy at the University of Rwanda”. Advances in Medical Education and Practice, Volume 3(11), pp. 825-832.

[Verster 2008] Verster, C. “Interacting with texts – Directed activities related to texts (DARTs)”. British Council. Accessed 10th February 2024. https://www.teachingenglish.org.uk/professional-development/teachers/managing-resources/articles/interacting-texts-directed-activities

