
Facilitating Group Assessments on KEATS

Introduction

Increasingly, modules are diversifying their assessment types and branching out into group assessments instead of essays and other more traditional assignments. As Professional Services, we want to support this in our schools and departments wherever possible, without adding further administrative burden or creating scope for error. The goal was to facilitate students submitting assessments in groups whilst also enabling module leads and students to get the most out of the other functions in KEATS which groups allow, such as group forums, restricting content by group, and creating marking groups for markers. In other words, we wanted a smooth experience for students submitting their assessments, whilst enabling module leads and contributors to utilise the full extent of KEATS and maximise student engagement.

The Challenge

Previously, students would be members of only one group within a module page to keep things “simpler”, which unfortunately restricted how students and staff could engage with KEATS. Alternatively, we kept manual records of student groups and submissions, which added administrative burden. Neither of these solutions was the ideal we were looking for, so we decided to turn our attention to the underused groupings feature on KEATS. We knew from experience that groupings must be set up exactly right, or students may be prevented from submitting or may submit in the wrong group. The challenge, therefore, was to learn how to configure the settings on KEATS so that groupings interact successfully with different activities. Whilst working out how groupings operate was an investment of time and resources at the beginning, I was hopeful that it would streamline things going forward and create a better student and staff experience.

The process commenced with an initial consultation with the SSPP TEL team. I learned that a “grouping” functions as a group of groups, and that different activities on KEATS can be set up to pull groups from specific groupings. For example, a module may contain several presentation groups, which can all be included in an umbrella “group presentation” grouping. The Moodle assignment activity associated with the group presentation can then be set up to pull groups directly from that grouping, thereby avoiding the confusion of potentially pulling groups from the seminar groups or forum groups instead. The team set me up with a sandbox on KEATS, effectively a practice area, where I could set up the journey that a student would take.
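
For readers who prefer to script this kind of set-up rather than click through the interface, the same structure can also be built with Moodle’s standard web services API. The sketch below is purely illustrative (everything described in this post was done through the KEATS web interface): the site URL, token, course ID and group names are placeholders, and it assumes a web service token with the core_group_* functions enabled.

```python
# Illustrative sketch only: building "groups inside a grouping" via Moodle's
# web services REST API. The URL, token, IDs and names are all placeholders.
import requests

MOODLE_URL = "https://keats.example.ac.uk"  # hypothetical site URL
TOKEN = "your-webservice-token"             # placeholder token
COURSE_ID = 123                             # placeholder course id


def call(wsfunction, params):
    """Call one Moodle web service function over REST and return its JSON."""
    base = {
        "wstoken": TOKEN,
        "wsfunction": wsfunction,
        "moodlewsrestformat": "json",
    }
    r = requests.post(f"{MOODLE_URL}/webservice/rest/server.php",
                      data={**base, **params})
    r.raise_for_status()
    return r.json()


# 1. Create the individual presentation groups.
groups = call("core_group_create_groups", {
    "groups[0][courseid]": COURSE_ID,
    "groups[0][name]": "Presentation Group A",
    "groups[0][description]": "",
    "groups[1][courseid]": COURSE_ID,
    "groups[1][name]": "Presentation Group B",
    "groups[1][description]": "",
})

# 2. Create the umbrella grouping that the assignment will pull groups from.
grouping = call("core_group_create_groupings", {
    "groupings[0][courseid]": COURSE_ID,
    "groupings[0][name]": "Group Presentation",
    "groupings[0][description]": "",
})[0]

# 3. Place each group inside the grouping.
assignments = {}
for i, group in enumerate(groups):
    assignments[f"assignments[{i}][groupingid]"] = grouping["id"]
    assignments[f"assignments[{i}][groupid]"] = group["id"]
call("core_group_assign_grouping", assignments)
```

The assignment activity itself would still be pointed at the “Group Presentation” grouping in its settings, exactly as described above.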

Screenshot of Groupings in KEATS

The three main things I wished to test were:

  • signing up to a group via the Group Choice activity,
  • being added to other groups in different groupings, and
  • submitting as part of a group, to see if all ran smoothly.

I replicated the settings on this practice page, following SSPP TEL guidance on groups and groupings, and I enlisted my very patient and enthusiastic Professional Services colleagues from ECS to be enrolled as students on the practice page and take the student journey. When colleagues submitted their mock assessments, the groups were pulled from the correct assessment grouping. As I had selected the Moodle assignment setting that allows one student to upload on behalf of all their group members, when one group member submitted, the other members of their group were also populated with the group’s submission. Testing showed everything went to plan and there was no scrambling of submissions. I was in the clear to replicate these settings throughout the module pages with group assessments.

The Outcomes / Lessons Learnt

Outside of the specific details of how to technically set up the various activities, I learned to think holistically about KEATS pages and to always consider how one setting or action might affect and interact with other activities or areas of the page. It also underscored how crucial the initial set-up of activities and course pages is, so going forward I’d always advise allowing plenty of time and, of course, meticulously checking these settings in advance of busy periods such as submission deadlines.

Conclusion / Recommendations

In conclusion, I’d strongly advise my colleagues not to be scared of groups and groupings! Yes, they require careful set-up, but once up and running, they enable us to utilise a variety of activities and functions on KEATS, increasing interaction and minimising administration in the long term whilst maximising engagement for students.

Written by Joanne Devenney

Senior Programme Officer
School of Education, Communication and Society


Giving Recorded Audio Feedback is Easy!

Introduction

In the Law Faculty, we trialled providing recorded audio feedback for formative assessments in Contract Law. The result? Of the 91 students who had received either recorded audio feedback or written feedback for their assessments, 66% preferred recorded audio feedback. Three of the four staff members who responded to our survey also preferred providing recorded audio feedback.

Why?

The NSS score for “Assessment and Feedback” at KCL remains stubbornly low at 68%. Given the growing body of literature extolling the virtues of recorded audio feedback, we decided to see whether it works for Law. We also wanted to investigate its effect on students with specific learning difficulties.

How?

The Technology

The platform we used for providing feedback was the Assignment Tool in KEATS (Moodle). Turnitin also has an audio feedback tool, but it only allows three minutes of feedback to be recorded, which felt unnecessarily restrictive.

The Law School Professional Services team set up the Assignment Tool submission link on KEATS. They also created a drop-down menu which allowed for grouping of the students by tutorial group. The Law TEL team created bespoke guidance for the teaching team on using the chosen technologies.

Image of KEATS assignment submission.

The student submission process was almost identical to the standard Turnitin submission process, and we doubt many students even noticed the difference.

We used Kaltura Express Capture to provide the feedback. This was easy to use (although tutors reported that recording the feedback involved rather a lot of “click throughs”).

Screenshot of assignment inbox with Kaltura recorded feedback.

The Feedback Process

In our instructions to students, we requested that they number the paragraphs in their submission to make it easier for us to refer verbally to specific points in their answers.

In our training for markers, we suggested that the feedback should be between 3 and 5 minutes long. We suggested the following structure to markers:

  1. Start with some general positive feedback about the answer.
  2. Identify some specific areas for improvement. It is really important that you explain how they can improve rather than just pointing out what has gone wrong.
  3. Identify specific things which they have done well. (These could obviously be intertwined with the specific areas for improvement above).
  4. Please finish on some really positive things about the answer. Whilst it is important for students to have some specific ways in which they can improve, students should come away from this experience feeling encouraged.
  5. In terms of the language used, when pointing out some of the negative things, please do not use the second person, e.g. “You did not explain this area very clearly.” Instead, use language such as: “It would have been better if the answer had explained this area more clearly.” It is preferable not to pin the negative aspects on students personally.
Screenshot of recording audio feedback video tutorial.

The Student Survey

Students completed three pieces of formative work for Contract Law. Students either received written feedback, recorded audio feedback or a combination of recorded audio feedback with a few comments on the paper. We only surveyed those students who had received recorded audio feedback or recorded audio feedback with some comments written on the paper for at least one formative assessment (i.e. we did not survey those students who had only received written feedback).

We were interested in finding out about the experience of students with a specific learning difficulty. Consequently, in our survey we asked whether students had been diagnosed with or suspected they may have: dyslexia, dyscalculia, dyspraxia, dysgraphia or attention deficit hyperactivity disorder.

Student Survey Results

Recorded Audio Feedback versus Written Feedback:

  • 91 students (out of a cohort of approximately 250) who had received recorded audio feedback for at least one assessment, and who had not experienced the combination of recorded audio feedback and written feedback, responded.
  • 60 preferred recorded audio feedback.
  • 31 preferred written feedback.

5 of these students had been diagnosed with, or suspected they had, a specific learning difficulty. Of these 5, 4 preferred recorded audio feedback and 1 preferred written feedback.

The combination of Recorded Audio Feedback and Written Feedback versus either on its own:

16 students received a combination of recorded audio feedback and written feedback for at least one assessment.

  • 10 preferred the combination.
  • 1 preferred recorded audio feedback.
  • 5 preferred written feedback.

5 of these students had been diagnosed with, or suspected they had, a specific learning difficulty. Of these 5, 2 preferred the combination and 3 preferred written feedback.

Staff Survey

Four members of staff (out of a team of seven colleagues with marking responsibilities on the module who provided recorded audio feedback) responded to the survey. Of those, 3 of the 4 preferred providing recorded audio feedback and 1 preferred providing written feedback. None of the 4 members of staff who responded had provided a combination of recorded audio feedback with a few comments on the paper.

Technology Gripes

Whilst the tech served the desired purpose, there are a number of ways in which it could be streamlined. If the following issues with the technology were resolved, the process would be much easier for staff:

  • Leaving audio feedback with Kaltura Express Capture requires an irritating number of clicks.
  • You cannot pause Kaltura Express Capture (e.g. if you lose your train of thought).
  • Captions are not switched on automatically. This means that each tutor has to click on each piece of feedback to turn them on.
  • The Assignment Tool does not allow the feedback to be set in advance for automatic release on a certain date. Although it can all be released in one batch, you have to release it manually by clicking a button (not a big issue, but you need to put a note in the diary to remember!)
  • Obtaining viewing analytics is not as streamlined as Turnitin’s audio feedback feature and requires manually going through each video on Kaltura.

The Outcomes / Lessons Learnt

When given the choice between recorded audio feedback and written feedback, the majority of students preferred recorded audio feedback. The five most common reasons they gave were:

  1. I felt that it was more personal.
  2. I felt that the explanations were easier to understand.
  3. It was helpful to hear the tone of my tutor’s voice.
  4. The volume (i.e. amount) of feedback was greater.
  5. The process of receiving recorded audio feedback is more active than receiving written feedback.

In terms of students with specific learning difficulties, the majority of these students also preferred audio feedback to written feedback.

When given the choice between recorded audio feedback, written feedback, or recorded audio feedback with some comments written on the paper, students preferred the combination. (Please note, however, that the sample of students who were given the combination was very small, and only one tutor provided this method of feedback.)

However, students reporting specific learning difficulties did not favour combined audio and written feedback. Given the option, they preferred to return to the familiarity of purely written feedback. It may be that the combination of both feedback formats made it more challenging for students with specific learning difficulties to interpret the key take-home message(s) from the feedback. Thus, the combined approach, rather than clarifying or supporting the comments made, actually muddied the waters.

When it came to staff, the most common reasons why the three staff members preferred providing recorded audio feedback to written feedback were as follows:

  1. I found it faster to provide recorded audio feedback.
  2. I felt that it was more personal.
  3. I felt that my explanations were easier to understand.
  4. I liked the fact that the students could hear the tone of my voice.
  5. I thought that the process of students receiving recorded audio feedback was more active than receiving written feedback.

Conclusion / Recommendations

The commentary from our students and their clear preference for audio feedback supports current pedagogical research on the benefits of recorded audio feedback over traditional written feedback, such as the perception amongst our students that the feedback experience is richer and more personal. We know from Caruthers et al. (2015) that this makes students more likely to engage with their feedback. The perception that the feedback was both greater in volume and easier to understand also supports Chaing’s (2009) claim that audio feedback provides greater depth in comparison to uncontextualised written comments.

Though the cohort of students in our data set who declared a specific learning difficulty was admittedly small, it was still interesting to see that these students experienced recorded audio feedback more positively than their past feedback experiences. It is clear, however, that for these students in particular, a mixed approach to feedback (i.e. providing a combination of both audio and written feedback) should be avoided.

Overall, colleagues’ experience of providing audio feedback was positive. Once familiar with the recording process, and given clear support on how to format their audio feedback (including the appropriate length, to ensure consistency amongst markers), colleagues were able to leave (according to our students) greater volumes of clear, individualised, meaningful feedback. We would therefore recommend that colleagues who have not yet provided audio feedback to their students try the medium, as our experience has been, overall, extremely positive from both a student and staff perspective.

Useful Links

Audio Feedback

About The Authors

Caroline van Hensbergen is a Senior Lecturer in Law (Education) at the Dickson Poon School of Law, King’s College London.

Dr Michelle Johnson is a Lecturer in Law (Education) and Faculty Inclusive Education Lead at the Dickson Poon School of Law, King’s College London.


Accepting Multiple Assignment Attempts

KEATS (Moodle) allows assignment submissions in many ways – this is a record of how a simple question became an extended investigation.

Academic Staff Requirements

“Can I check what my students have previously uploaded?”

An academic colleague had used Blackboard (another Virtual Learning Environment) before coming to the Faculty of Natural, Mathematical & Engineering Sciences (NMES) at King’s. He asked if KEATS, our Moodle instance, could behave like Blackboard and allow students to submit multiple attempts to a programming assignment any time they want.

After a follow-up call with the academic colleague, it became clear that the aim was to be able to access anything students had uploaded prior to their final submission, as the latter might contain a wrong or broken file, and to grade with reference to a previous submission (program code or an essay draft).

He had the following requirements:

  • Notification emails to both staff and students when a file is uploaded successfully
  • Students to be able to submit as often as they want
  • Marker to be able to review all uploaded attempts, in order to:
    • award marks if earlier submitted program code worked but a later submission introduced bugs that broke the program, and
    • monitor the development of the code, make comments, compare changes, and help prevent collusion

This investigation looks at practical solutions to administering programming assignments as well as non-programming ones such as essays.

Background: Assessments on Blackboard (VLE)

In a Blackboard assignment, students are required to click a Submit button for the markers to access their work. If multiple attempts are allowed, students can submit further attempts at any time; these are stored as Attempt 1, Attempt 2 and so on, and are available for the markers to view. This way, staff can review previous submissions; however, they cannot access drafts.

Screenshot of a Blackboard assignment allowing multiple attempts.

KEATS: Moodle Assignment

The first step was to investigate the options and settings in Moodle Assignment, which is the tool that was already used by most colleagues for similar assignments.

With our current default settings, students can make changes to their uploaded files as much as they want, and the submission is finalised only at the assignment deadline. Although instructors can see the latest uploaded files (the draft) even before the deadline, files removed or replaced by students are no longer accessible to staff. This means only one version is ever accessible to markers.

Multiple submissions can be enabled with the “Require students to click the submit button” setting, allowing staff to review previous attempts, as on Blackboard. Feedback can be left on each attempt. However, students cannot freely submit new attempts, because staff need to manually grant additional attempts to each student. Submissions are time-stamped and can be reviewed by students and markers, but students only receive notification emails after grading, whereas markers can receive notifications for submissions. Our problem was not yet resolved.
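
As an aside for the technically minded, the attempt records stored this way can also be read programmatically. The sketch below was not part of our investigation and is purely illustrative: it calls Moodle’s mod_assign_get_submissions web service function, and the site URL, token and assignment instance ID are placeholders.

```python
# Illustrative sketch only: listing the stored submission records (each of
# which carries an attempt number) for one Moodle assignment via the
# web services REST API. The URL, token and ID below are placeholders.
import requests

MOODLE_URL = "https://keats.example.ac.uk"  # hypothetical site URL
TOKEN = "your-webservice-token"             # placeholder token
ASSIGNMENT_ID = 42                          # placeholder assignment instance id

resp = requests.get(
    f"{MOODLE_URL}/webservice/rest/server.php",
    params={
        "wstoken": TOKEN,
        "wsfunction": "mod_assign_get_submissions",
        "moodlewsrestformat": "json",
        "assignmentids[0]": ASSIGNMENT_ID,
    },
)
resp.raise_for_status()

# Print one line per stored submission record with its attempt number.
for assignment in resp.json().get("assignments", []):
    for submission in assignment.get("submissions", []):
        print(submission["userid"],
              "attempt", submission["attemptnumber"],
              submission["status"])
```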

Screenshot of a marker accessing unsubmitted drafts and leaving feedback.
Screenshot of a student reviewing feedback for different attempts.
Screenshot of a student reviewing feedback for different attempts (cropped version with highlight).

KEATS: Moodle Quiz

We then considered Moodle Quiz, which some departments at King’s already use to collect scanned exam scripts: a Quiz containing an Essay-type question that allows file upload.

Screenshot of quiz attempt.

While exams usually allow only a single attempt, Moodle Quiz can be set to allow multiple attempts (Grade > Attempts allowed). The “Enforced delay between attempts” setting (from seconds to weeks) under “Extra restrictions on attempts” may be used to avoid spamming of attempts. Students can submit new attempts as often as needed, because no staff intervention is required. The drawback is that there are no submission notification emails, but the quiz summary screen should indicate to the student that the file has been submitted. The Quiz attempts page allows markers to easily review previous attempts and leave feedback on each one. It is also possible to download all submissions, as in Moodle Assignment. This was recommended to the academic colleague as an interim solution while we continued the investigation.
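
For completeness, quiz attempts can also be listed programmatically, much as with the assignment activity above. Again, this is only an illustrative sketch: it calls Moodle’s mod_quiz_get_user_attempts web service function, and the site URL, token, quiz ID and user ID are placeholders.

```python
# Illustrative sketch only: listing one student's stored quiz attempts via
# Moodle's web services REST API. The URL, token and IDs are placeholders.
import requests

MOODLE_URL = "https://keats.example.ac.uk"  # hypothetical site URL
TOKEN = "your-webservice-token"             # placeholder token

resp = requests.get(
    f"{MOODLE_URL}/webservice/rest/server.php",
    params={
        "wstoken": TOKEN,
        "wsfunction": "mod_quiz_get_user_attempts",
        "moodlewsrestformat": "json",
        "quizid": 99,     # placeholder quiz instance id
        "userid": 1234,   # placeholder student id
        "status": "all",  # include finished and in-progress attempts
    },
)
resp.raise_for_status()

# Print one line per attempt with its number and state.
for attempt in resp.json().get("attempts", []):
    print("attempt", attempt["attempt"], attempt["state"])
```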

Possible Policy Concerns

Regarding unlimited re-submissions, Quality Assurance colleagues reminded us that students may challenge (i) a perceived inequality in opportunities to get feedback, or (ii) subconscious bias based on previous submissions. Good communication with students and a structured schedule or arrangements should help manage expectations on both sides.

Turnitin Assignment and Other Assessment Options

Although the Moodle Quiz appeared to be a solution, we also considered other tools, some of which are readily integrated with KEATS at King’s:

Turnitin assignment allows multiple submissions as an option, but re-submissions will overwrite previously uploaded files. Alternatively, if it is set to a multi-part assignment, each part will be considered mandatory. However, the workflow for Turnitin assignment is not optimal for programming assignments.

Turnitin’s Gradescope offers Multi-Version Assignments for certain assignment types. It is available on KEATS for the Faculty of Natural, Mathematical, and Engineering Sciences (NMES). However, its programming assignment does not support assignment versioning yet.

Edit history is available in Moodle Wiki and OU Wiki, while Moodle Forum, Open Forum, Padlet and OU Blog allow continuous participation and interaction between students. These tools could be useful for group programming or other social collaborative learning projects; they are not a direct replacement for an individual programming assignment but an alternative mode of assessment.

Portfolios: Mahara has Timeline (version tracking) as an experimental feature. This may be suitable for essays but not for programming assignments.

Tracking Changes

Tracking changes is an important feature for showing development in programming assignments or essays, and cloud platforms (OneDrive, Google Drive, GitHub) can host files and track changes. When used for assignments, students can submit a share link to allow instructors to access and assess their work and see how it evolved over time. The disadvantage of this option is that grading is less integrated with Moodle. Some cloud platforms offer a file request feature through which students can submit their files to a single location.
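
As a rough illustration of how a marker might review that evolution when the shared location is a Git repository, the sketch below clones a (hypothetical) student repository and prints one line per commit, oldest first. It assumes Git is installed and that the student has shared a URL the marker can access; the URL is a placeholder.

```python
# Illustrative sketch only: clone a shared student repository and print its
# commit history (short hash, date, subject), one commit per line.
import subprocess
import tempfile

REPO_URL = "https://github.com/example-student/assignment1.git"  # placeholder

with tempfile.TemporaryDirectory() as workdir:
    subprocess.run(["git", "clone", "--quiet", REPO_URL, workdir], check=True)
    log = subprocess.run(
        ["git", "-C", workdir, "log", "--reverse",
         "--pretty=format:%h %ad %s", "--date=short"],
        check=True, capture_output=True, text=True,
    )
    print(log.stdout)
```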

Programming Assignments

Industries such as software development use Git as standard, with all changes tracked. GitHub offers GitHub Classroom, which can be used with different VLEs including Moodle, but it is not readily integrated with KEATS and requires setup. There may also be privacy concerns, as students need to link their own accounts.

The Outcomes / Lessons learnt

  • A simple question from an academic colleague can lead to the exploration of various existing options as well as new tools and solutions.
  • Different options are available on KEATS, each with pros and cons.
  • Existing tools, possible solutions, policies, and other considerations all come into play.

Conclusion / Recommendations

KEATS Quiz matched the case requirements and was recommended to the academic colleague. The rollout went smoothly: our colleague mentioned there were no complaints from students and that they are happy with the recommended solution. It is relatively easy to set up and straightforward for students to submit to. Clear step-by-step instructions for staff and students should be enough, but trialling the setup with a formative assignment would also help.

Depending on the task or subject nature, other tools may work better for different kinds of tasks. TEL colleagues are always there to help!



Written by Antonio Cheung

Antonio is a Senior TEL Officer at the Faculty of Natural, Mathematical and Engineering Sciences (NMES).

September 2023


KEATS Similarity Checker Project

Overview of project

Between July 2022 and February 2023, the SSPP TEL team conducted a pilot project to improve the student experience when submitting assignments by creating a special area for students to check the plagiarism/similarity score of their assignments. The goal of the pilot was to make it easier for students with Mitigating Circumstances and the Programme Office staff to manage the process of submitting assignments to KEATS.

Any student who is not subject to Mitigating Circumstances can submit a draft and/or re-upload their submission as many times as they wish up to the assessment’s original due date. Many students use this opportunity to submit a draft to check their similarity score before they make their final submission. At the moment, due to technical limitations within KEATS/Turnitin, students who are granted an extension to an assessment via the Mitigating Circumstances process cannot submit a draft to check their similarity score; they are only allowed to submit once, and after the original due date for the assignment passes they no longer have the option to upload their final version.

This is particularly problematic for students who have submitted a draft (sometimes long before the original due date) and then realise they need to apply for Mitigating Circumstances: as they are not able to delete the draft themselves, the draft is considered their final submission and their MC claim may be rejected on the basis that they have already made a submission. In some departments, PS staff sometimes agree to submit and/or delete a draft for a student, but this is time consuming, not consistently applied, and relies too much on PS staff being available and inclined to help outside their normal duties; it is also not sustainable given the very high number of MC claims we currently process.

First Steps

The departments of Geography and Global Health and Social Medicine in the Faculty of SSPP took part in the initial pilot project for their re-sit and dissertation students, and the Similarity Checker (SC) area was created and placed on their Handbook pages on KEATS. Accompanying it were a video and a PDF explaining to students how to use the SC, as well as a warning text to reinforce that this did not count as a submission and would not be checked by staff.

Feedback from this small cohort of students led to some revisions and changes to the SC, the most notable of which concerned the language used. We had used the words “test area”, meaning to check or trial something, but students for whom English was not their native language found this confusing and took “test” to mean an exam. The wording was therefore changed from “test submission area” and “test area” to “Similarity Checker” and “practice area” respectively.

Once we were happy with the revisions, the SC was then rolled out to the rest of the School of Global Affairs, to War Studies, and to Education, Communication and Society. All Similarity Checker areas have the same layout, wording and instructions, for parity across all the Schools. Communications for staff and students were also created by Soshana, and these were used by departments to make students and academic staff aware of the SC.

Layout

The Similarity Checker is made up of several parts: an introductory text explaining what it is for and how to use it, and a disclaimer that nothing submitted there would ever be moved or assessed. An explainer video and PDF instructions were added so that accessibility and inclusive design were adhered to, and so that all students would be able to clearly understand the functionality.

Screenshot of the home screen of the similarity checker.
Screenshot of the geography similarity checker.

The submission areas were divided by level and surname. There is no functional necessity for this, but it aims to prevent Turnitin from becoming overloaded by all students in one department trying to access it at the same time. If students submit in the wrong area, there is no effect on their score or submission.

Screenshot of the different Turnitin Submissions.

Student Feedback

A survey was created by Soshana and shared with all participating Schools, attracting almost 100 responses. Feedback was generally positive, with students highlighting how the SC improved their experience and confirming that it constitutes an equalising factor for students with extensions. Overall, 90% of respondents had used the SC, 93% found it useful, and 16% used it in the context of an assessment extension (Mitigating Circumstances). There was also some negative feedback from students who did not find it particularly beneficial, mainly due to the long turnaround time for their score after their third submission, and the fact that their score changed repeatedly when uploading a new draft of the same work, depending on how close the assessment due date was. These concerns will be addressed, and responses will be provided in future communications.

Overview of survey respondents.
Respondents’ usage by level of study.
Respondents’ use of the Similarity Checker.

Conclusion and next steps

The pilot project was a successful start to improving the experience of students and staff using KEATS and Turnitin during the submission period. It was initially intended to improve the experience of those with Mitigating Circumstances, but we can see that many students without extensions are also using it to check their work.

Next steps will include rolling this out further to other Schools or Departments so that all students in SSPP can access it. Some Departments have their own versions, which we would like to replace with this more modern iteration of the Similarity Checker.

As next steps, the TEL team would like to address some of the points students raised as part of the feedback process, and create a communications plan to ensure the Similarity Checker is promoted to students at all relevant points of the academic year.

An all-Faculty position should also be drawn up on how to deal with cases where a student submits their paper to the Similarity Checker instead of to their module page.


Written by Leanne Kelly

Leanne is the Digital Education Manager for the Faculty of Social Science and Public Policy (SSPP) at King’s College London. She is responsible for a wide range of digital education processes within the Faculty, including instructional design, accessibility, training, innovation and developing new online programmes.

She has a background in publishing and eLearning, and is passionate about using technology to improve the learning experience and make it more accessible to all. She is interested in developing new ways of working, scaling projects and reusing content in new ways, and making online learning an enjoyable process for all.

Written by Soshana Fearn


Soshana is the Senior Postgraduate Programme Officer for the Department of Geography (SSPP) at King’s College London. She delivers the day-to-day administration of taught postgraduate programmes (Masters), offers comprehensive and authoritative advice and support for all staff and students in respect of programme regulations and curriculum choices, services the relevant boards and committees, and oversees the processing of Mitigating Circumstances requests.

She has a background in project coordination and is dedicated to improving the experience of both students and staff through the development and implementation of streamlined, innovative solutions, including projects related to institutional processes, policymaking and technology-enhanced learning resources.