This article has been divided into two parts. Part 1 discusses the background and training required for implementing Kaltura, as well as the reasons and methods for using it. Part 2 discusses the benefits of having a university video platform and strategies for student engagement. Continue reading “Part 2: Kaltura at the University of Padua”
This article has been divided into two parts. Part 1 discusses the background and training required for implementing Kaltura, as well as the reasons and methods for using it. Part 2 discusses the benefits of having a university video platform and strategies for student engagement. Continue reading “Part 1: Kaltura at the University of Padua”
With the move to fully online teaching, it soon became apparent that the most advanced KEATS (Moodle) training session, KEATS 3: Personalising the Learning Experience, was not appropriate for synchronous delivery. The session was redesigned as a completely flipped session, but attendees would often miss the pre-work instructions. We explored Microsoft Power Automate to automate instructional emails, but the tool’s uses proved more far-reaching than initially anticipated. Continue reading “Using Automation to Facilitate Flipped Learning”
When the Covid-19 pandemic put us fully online, colleagues in King’s Academy needed to expand our repertoire with a range of evolving technologies. Since we lead educational development programmes and sessions, we strive to demonstrate intrepid, successful designs which make best use of our learning environments. In the foreseeable future those environments would be digital. This post gives a rationale for carving out regular time to test things out together, followed by details about how we set this up to be low-maintenance. Continue reading “Tech Test Thursdays for Digital Capabilities”
Pedagogical research provides a clear rationale for asking students to provide feedback to their peers on formative assignments. By giving feedback, students better understand the demands of the task, internalise our marking criteria and have an opportunity to benchmark their own work against that of their peers. Additionally, feedback provided by peers is often more understandable than feedback provided by a lecturer and can be timelier, particularly with large cohorts of students.
This is what motivated us to introduce peer feedback, initially on a third year module with approximately 90-110 students. Later, rather daringly, we attempted it on a first year module of nearly 600 (now nearly 700) students. In both cases the students provided feedback, not a mark, on a written piece of work: an essay and a lab report, respectively.
One of us had previously introduced peer feedback on draft essays in the third year module, distributing anonymised essays for feedback via email (two essays per student). The students could then modify their draft, which was re-submitted for tutor marking. This had been well received and very successful, but it was not sustainable as student numbers increased. We therefore decided to find a way to do this automatically in KEATS. But how?
The obvious choice of tool seemed to be Turnitin PeerMark, which had been developed precisely for peer feedback and marking. Sadly, PeerMark turned out to have a bug and did not work for us, particularly with student numbers of over 200: random essays were not allocated to any reviewer, and this had to be corrected manually. We therefore decided, rather in a hurry, to move to the Workshop activity in KEATS, which could perform the same task.
We only had two or three days to make the transition, because the deadline for submission had already been given to the students. This made it impossible to ask for CTEL support. An extremely well-designed, publicly available wiki from UCL was our only guide. It helped considerably, even though the Moodle version used by UCL has (or at least had) slightly different functionality from the King’s version, KEATS.
Workshop organises the activity in phases, as shown in the screenshot below. One switches between phases simply by clicking on them. Only three areas of Workshop administration, indicated by the arrows, need attention.
The “edit settings” area is fairly self-explanatory, but it is very important to note that it includes text boxes for describing the task in general, the submission and the provision of feedback. These need to be completed very carefully and clearly.
An example of a general description of the task is provided below. It is essential in the general instructions to tell the students to save their work frequently, as the system does not auto-save.
It is also important to note that Workshop, in contrast to PeerMark, does not allow the students to provide inline comments.
However, it is possible to get around this limitation by asking students to download the allocated assignment, comment offline, and re-upload the reviewed work. We asked only third year students to do this, on a voluntary basis. In this case it is essential to provide instructions in the “edit settings” section and to ask students to submit their work as a Word file; otherwise this detailed feedback is impossible, unless they have software to annotate PDFs.
To help the students provide useful feedback, we found it beneficial to set up a few questions that they need to answer. This ensures some consistency in the type and quantity of feedback provided. This is done in “edit assessment forms.”
The final element to set up is “allocate submissions.” We found that the scheduled allocation, shown below, worked well. We set the number of reviews to two per submission (and therefore two per student).
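As an illustration of what such a scheduled allocation has to achieve, here is a minimal Python sketch of our own (not Moodle’s actual algorithm): students are placed in a randomly shuffled circle and each reviews the next two students’ submissions, so everyone gives and receives exactly two reviews and nobody is allocated their own work.

```python
import random

def allocate_reviews(students, reviews_per_student=2):
    """Sketch of a scheduled peer-review allocation: shuffle the cohort
    into a circle, then have each student review the submissions of the
    next N students. Every student gives and receives exactly N reviews,
    and no one reviews their own work (requires len(students) > N)."""
    order = students[:]
    random.shuffle(order)
    n = len(order)
    return {
        reviewer: [order[(i + k) % n] for k in range(1, reviews_per_student + 1)]
        for i, reviewer in enumerate(order)
    }

# Small example: each value lists the two authors that student will review
alloc = allocate_reviews(["Ana", "Ben", "Cai", "Dev", "Eve"])
```

Moodle’s own scheduled allocation offers more options (e.g. allocating within groups), but the circular scheme above is the simplest way to guarantee an even workload.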
Providing feedback on two lab reports or essays, and thus receiving two sets of feedback, ensured that our students received enough feedback whilst not having to review too many assignments. A single review is inadvisable because standards vary; with two, most students will receive at least one useful set of feedback. Although the pedagogical literature suggests that giving feedback is even more useful than receiving it, it is still very demoralising to receive only poor-quality feedback!
Once the setup phase is completed, one can move to the submission phase. It is important to remember to switch phase, or submission will not open on the date you set up in workshop administration. The move from submission to assessment phase can be automatic or manual.
Once the assessment phase has finished, it is important to move manually to the evaluation and then the closed phase. Failing to do so means that the students can no longer give feedback but cannot yet access the feedback received.
It is very easy to forget to transition between phases, which happened to us a couple of times. Luckily the students are very prompt in complaining, so one can fix this rapidly.
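The workflow described above is essentially a one-way sequence of phases, some of which must be advanced manually. As a purely illustrative sketch (the helper class is ours, not part of Moodle):

```python
# Workshop's phases form a one-way sequence; some transitions
# (e.g. assessment -> grading evaluation) must be made manually.
PHASES = ["setup", "submission", "assessment", "grading evaluation", "closed"]

class WorkshopPhase:
    def __init__(self):
        self.index = 0  # every Workshop starts in the setup phase

    @property
    def current(self):
        return PHASES[self.index]

    def advance(self):
        """Move to the next phase. Forgetting this step after assessment
        is the failure mode described above: students can no longer give
        feedback, but cannot yet see the feedback they received."""
        if self.index < len(PHASES) - 1:
            self.index += 1
        return self.current
```

The point of the sketch is simply that the system never advances past the assessment phase on its own: the module organiser has to call the equivalent of `advance()` at the right moments.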
We have now run Workshop for peer feedback for two years, for a total of over 1,200 students, with no major disaster. Mistakes are easy to rectify. This year 84% (580 students) of the first year class submitted a formative report. Of those, 83% provided peer feedback (97 students did not). This means that 5.5% of students did not receive any feedback from their two peers; feedback for these was provided by one of us (CK). The percentage of students taking part was lower for the third year class (75%, 62 out of 83), but all students who submitted a formative essay draft provided peer feedback. The percentage of third year students taking part has, however, increased dramatically, from 30% (30 out of 99) when we first introduced peer feedback to the current 75%. We can now tell the students that, in the past, those taking part in peer feedback achieved a significantly higher mark in the summative essay; this has probably driven the very substantial increase in participation.
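The figures above are internally consistent; a quick check of the arithmetic (the raw counts are taken from the paragraph, the variable names are ours):

```python
# First year: 580 students submitted a formative report;
# 97 of them did not go on to provide peer feedback.
fy_submitted, fy_no_review = 580, 97
fy_reviewed = fy_submitted - fy_no_review          # 483 students
print(round(100 * fy_reviewed / fy_submitted))     # -> 83 (%)

# Third year: 62 of 83 students took part this year,
# up from 30 of 99 when peer feedback was first introduced.
print(round(100 * 62 / 83))                        # -> 75 (%)
print(round(100 * 30 / 99))                        # -> 30 (%)
```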
When asked in a questionnaire whether they found the exercise useful, the overwhelming majority of third year students rated the experience 6-10 on a 0-10 scale, as shown below. We have not yet formally sought the opinion of first year students; anecdotally, their experience is more mixed, with some of them enjoying the opportunity and others finding the varying quality of the feedback they receive from their peers dissatisfying. However, we like to think that the benefit of this exercise may be reaped later, by preparing students in their first year at university for a broader range and style of assessments and by beginning to train them in the highly relevant skills of providing feedback and reviewing others’ work.
The main technical issue to consider in relation to the use of Workshop seems to be that of anonymity.
First year students are less worried about anonymity, since with a cohort of 600+ they are extremely unlikely to know the students they are assigned to. However, anonymity is required by the third year students, who otherwise can find it challenging to provide completely honest feedback.
The UCL Moodle allows module organisers to make the exercise anonymous by changing users’ permissions. KEATS, by contrast, is set up so that the students can see each other’s names, and the module organiser cannot change that: it needs to be done by the Faculty’s learning technologists each time, so it is important to remember to ask in good time.
In conclusion, it is a little scary to use Workshop for the first time. There are many settings to consider, and until the allocation is done one does feel somewhat anxious. But if everything is set up carefully, the system seems robust.
It is essential, though, to provide very clear instructions for the students within the Workshop itself, including deadlines and a warning to remember to save their work often.
We found it also essential to provide a session explaining the rationale of the exercise and the procedure. For the third year module, in which the feedback provided by the students is more thorough, we hold a tutorial before the formative essay submission. In this tutorial we go over the marking criteria together, and the students are shown good examples of peer feedback from previous years. This has ensured that, from the second year in which peer feedback ran, the standard of feedback provided has been more consistent and of a higher quality.
Whilst third year students tend to be more experienced with the learning environment at university, most of our first year students are used to assessments at school, where they are rarely asked to provide feedback on each other’s work. As expected, they often react to this exercise with considerable anxiety. To give these students a clear framework around the exercise, we run ‘How to write a lab report’ sessions and provide complementary materials on KEATS early in the semester, in which we not only give clear guidelines for what is expected in the lab report, but also explain the rationale and benefits of participating in the peer feedback exercise.
For the first time this year we also offered debrief ‘Lab report clinic/Q&A’ sessions after the peer feedback exercise, in which common strengths and mistakes were discussed using anonymised lab reports from previous years’ students. Students attending these sessions responded positively and found them ‘very helpful.’
Remember to switch phase at the appropriate time!
Written by Clemens Kiecker & Isabella Gavazzi
Clemens is a developmental neuroanatomist and a Senior Lecturer in Neuroscience Education. He is the module lead of the second year core Neuroscience module 5BBA2081 and, together with Isabella, one of the leads of the Common Year One module 4BBY1030 Cell Biology and Neuroscience. He is the Education Lead of the IoPPN’s School of Neuroscience and a member of the College’s Education Strategy Steering Group.
Isabella is a Neuroscientist and a Senior Lecturer in Neuroscience Education. She is the module lead of the third year module 6BBYN306 Research project in Neuroscience and of the second year module 5BBL0205 Social Impact of the Biosciences. She is the deputy lead of the Common Year One module 4BBY1030 Cell Biology and Neuroscience (with Clemens), of the third year module 6BBYN302 Perspective of Pain and Nervous System Disorders (with Anna Battaglia) and of two further third year project modules, 6BBL0360 and 6BBL0361 (with Giovanni Mann). She is also Senior Tutor for Neuroscience and has a keen interest in assessment in general and peer feedback in particular. She was awarded a Master’s in Academic Practice at King’s with a dissertation on implementing peer review to support learning.
King’s Summer Programmes delivers pre-university and undergraduate-level summer school courses to students from around the world, as well as creating study abroad experiences for year-round King’s students. The emergency deployment of online provision in 2020 posed particular challenges in providing our style of educational experience, and the effective use of Microsoft Teams, alongside a great deal of hard work and dedication from tutors and professional services colleagues, helped to deliver it; in this blog I’m going to focus on Teams. Continue reading “Bringing Teams together for Summer”
In March 2020 we found ourselves needing to transition to online education as the primary means of delivering our teaching at King’s. Whilst using online resources for teaching was not new to us (we have, after all, been using KEATS, Blackboard and Lecture Capture for many years to support education), it was a challenge for many of us to become accustomed to the new way of preparing and delivering education to our students. Hundreds of academics at King’s had to adapt to using Teams, chunking their lectures and uploading material in new formats to KEATS; and that was before we even considered developing an online curriculum. Continue reading “Tutorials for Educators”
Mahara e-portfolios are used across the Faculty of Arts and Humanities (A&H) to broaden the assessment diet, support practical and situated learning and build digital literacies. Increasingly, teaching in A&H lends itself to the use of Mahara as both a site for and a product of student learning. This post will be of interest to colleagues exploring and developing the use of e-portfolios in their own areas. Continue reading “Using Mahara e-portfolio across Arts and Humanities”
Last year, a group of neuroscience students led by Mattia Veronese sought to make current highlights of neuroscience research more accessible to the scientific community. Continue reading “Big Bright Brain: a neuroscience video initiative”
King’s Academic Skills for Learning was launched in September 2019 to provide students at King’s with resources to help them develop their academic skills. Students (and staff) can self-enrol via the KEATS (Moodle) dashboard. Continue reading “King’s Academic Skills for Learning on KEATS”