Evaluation and Reflection, Technologies

Analysing Student Engagement with KEATS and Academic Performance

Summary

In this post, we describe the research presented at the King’s Education Conference 2024, which focused on the impact of student engagement with KEATS on academic performance.

The Story

Motivation

Virtual Learning Environments (VLEs) like KEATS offer students convenient and flexible access to course materials. Engaging with these platforms requires a deliberate investment of time and effort from students, with the potential reward of significantly enhanced academic performance. Understanding this relationship is therefore essential for creating effective online learning environments that promote student success. Our study aimed to investigate how the total number of hours students spend on KEATS correlates with their final exam grades.

Study Context

Our research focused on a one-year mathematics foundation module at King’s Foundation during the 2022-2023 academic year. This module followed a blended learning approach, combining online and face-to-face components. The study involved 238 students, aged 17 to 20, from different nationalities and educational backgrounds. The students were required to spend 2 hours each week on KEATS, engaging with instructional videos, practice problems, and exercises, while also attending weekly seminars and live lectures in person.

Data Collection

We analysed two main datasets: KEATS engagement data and student performance data. The KEATS engagement data included total time spent online, and the performance data consisted of grades from various summative assessments, including the final exam.

The Outcomes

Exploratory Data Analysis

We first used descriptive statistics and visualizations to assess the data. The scatter plot below suggested a linear relationship between time spent on KEATS and final exam grade, indicating that students who spent more time on KEATS generally achieved higher grades. Interestingly, the plot also revealed an empty area in the bottom right corner, showing that no students with high engagement had low final exam grades.

Figure plotting Total Hours Online vs Final Exam Grade

Identifying Patterns

Next, we categorized engagement into quartiles and found that students with higher engagement levels consistently achieved better final exam grades, as shown in the boxplots below. The analysis strongly suggests that lower engagement with KEATS is associated with lower grades and that this pattern is unlikely to be coincidental.

Figure Showing Final Exam Grades by Engagement Level
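The quartile-based categorisation can be reproduced with pandas. Below is a minimal sketch on synthetic stand-in data (the study's dataset is not public, so the numbers here are illustrative only, loosely mirroring the cohort size and variables described above):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 238  # cohort size from the study

# Synthetic stand-in data: total hours online and final exam grades
hours = rng.gamma(shape=4.0, scale=15.0, size=n)
grades = np.clip(40 + 0.2 * hours + rng.normal(0, 12, size=n), 0, 100)

df = pd.DataFrame({"hours_online": hours, "final_grade": grades})

# Categorise engagement into quartiles (Q1 = lowest, Q4 = highest)
df["engagement"] = pd.qcut(df["hours_online"], q=4, labels=["Q1", "Q2", "Q3", "Q4"])

# Median final grade per engagement level, as summarised by the boxplots
medians = df.groupby("engagement", observed=True)["final_grade"].median()
print(medians)
```

On real data, `pd.qcut` guarantees roughly equal-sized engagement groups, which is what makes the boxplot comparison across levels meaningful.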

Correlation and Regression Analysis

Finally, the Pearson correlation coefficient revealed a moderate positive relationship between KEATS engagement and final exam grades (r=0.33). To further investigate this relationship, we used multiple linear regression models. The best model, which included all summative tests and total hours online, explained 39% of the variance in final exam grades. This model suggests that each additional hour spent on KEATS was associated with an increase of approximately 0.2 percentage points in the final exam grade.
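The correlation and regression steps can be sketched as follows, again on synthetic stand-in data. The study itself used statsmodels, but plain numpy suffices to illustrate the computation; the fitted slope here plays the role of the roughly 0.2 percentage points per hour reported above, and none of the numbers below are the study's actual estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 238

# Synthetic stand-in data mirroring the study's variables
hours = rng.gamma(4.0, 15.0, n)                    # total hours on KEATS
grade = 40 + 0.2 * hours + rng.normal(0, 12, n)    # final exam grade (%)

# Pearson correlation coefficient between engagement and grade
r = np.corrcoef(hours, grade)[0, 1]

# Simple OLS fit: grade = intercept + slope * hours
X = np.column_stack([np.ones(n), hours])
(intercept, slope), *_ = np.linalg.lstsq(X, grade, rcond=None)

print(f"r = {r:.2f}, slope = {slope:.2f} grade points per hour")
```

The full model in the study also included the summative test scores as predictors, which is why it explains more variance (39%) than hours online would alone.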

Conclusion & Recommendations

Our study indicates that greater engagement with KEATS is associated with better academic outcomes. The results of this analysis can inform the development of targeted strategies to support all students, promoting a more inclusive and effective learning environment.

Educational Implications

Based on our findings, we recommend the following actions:

  • Encourage active participation in KEATS to enhance academic performance. Tools like the KEATS Course Participation Report can send targeted messages and reminders to boost engagement.
  • Identify students needing additional support by analysing KEATS data and implementing timely interventions.
  • Monitor KEATS usage and assessment results regularly to track student progress and tailor pedagogical approaches.

We hope this blog provides valuable insights into the importance of student engagement with KEATS. For any questions or further discussion, please feel free to contact me at eleonora.pinto_de_moura@kcl.ac.uk

Technical Note

The data analysis was conducted in Python, using libraries such as pandas for data manipulation, matplotlib and seaborn for data visualization, and statsmodels for statistical modelling.

Useful Links

Intended Audience

Educators, Academic Researchers, Learning Technologists and Administrators.


About the Author

Image of Eleonora Pinto de Moura
My name is Eleonora Pinto de Moura, and I’m a Lecturer in Mathematics Education at King’s Foundations. My research interests include educational technology, data-driven teaching strategies, and enhancing student engagement through innovative digital learning tools.


Facilitating Group Assessments on KEATS

Introduction

Increasingly, modules are diversifying their assessment types and branching out into group assessments instead of essays and other more traditional assignments. As Professional Services, we want to support this in our schools and departments wherever possible, without adding further administrative burden or creating scope for error. The goal was to give students a smooth experience when submitting assessments in groups, whilst also enabling module leads and contributors to get the most out of the other functions in KEATS which groups allow, such as group forums, restricting content by group, and creating marking groups for markers, thereby maximising student engagement.

The Challenge

Previously, students would be a member of only one group within a module page to keep things “simpler”, which unfortunately restricted how students and staff could engage with KEATS. Alternatively, we kept manual records of student groups and submissions, which added administrative burden. Neither of these solutions was the ideal we were looking for, so we decided to turn our attention to the underused groupings feature on KEATS. We knew from experience that groupings must be configured exactly right, or students may be prevented from submitting or may submit under the wrong group. The challenge, therefore, was to learn how to set up groupings on KEATS so that they interact successfully with different activities. Whilst working out how groupings operate was an investment of time and resources at the beginning, I was hopeful that it would streamline things going forward and create a better student and staff experience.

The process commenced with an initial consultation with the SSPP TEL team. I learned that a “grouping” functions as a group of groups, and that different activities on KEATS can be set up to pull groups from specific groupings. For example, there may be several presentation groups within a module, which could all be included in an umbrella group presentation grouping. The Moodle assignment activity associated with this group presentation could then be set up to pull groups directly from the group presentation grouping, thereby avoiding the confusion of potentially pulling the groups from the seminar groups or forum groups instead. The team set me up with a sandbox on KEATS, effectively a practice area, where I could set up the journey that a student would take.

Screenshot of Groupings in KEATS

The three main things I wished to test were:

  • signing up to a group via the Group Choice Activity,
  • being added to other groups in different groupings, and
  • submitting as part of a group, to see if all ran smoothly.

I replicated the settings on this practice page, following SSPP TEL guidance on groups and groupings, and I enlisted my very patient and enthusiastic Professional Services colleagues from ECS to be enrolled as students on the practice page, asking them to take the journey of the students. When colleagues submitted their mock assessments, the groups were pulled from the correct assessment groups grouping. As I had selected the setting on the Moodle assignment activity that allows one student to upload on behalf of all their group members, when one group member submitted, the submission was also populated for the correct fellow group members. Testing showed everything going to plan, with no scrambling of submissions. I was in the clear to replicate these settings throughout the module pages where there were group assessments.

The Outcomes / Lessons Learnt

Outside of the specific details of how to technically set up the various activities, I learned to think holistically about KEATS pages and always consider how one setting or action might affect and interact with other activities or areas of the page. It also underscored how crucial the initial set-up of activities and course pages is; going forward, I’d always advise allowing plenty of time and, of course, meticulous checking of these settings in advance of busy times such as submission deadlines.

Conclusion / Recommendations

In conclusion, I’d strongly advise my colleagues not to be scared of groupings and groups! Yes, they require careful set-up, but once up and running, they enable us to utilise a variety of activities and functions on KEATS, increasing interaction and minimising administration in the long term whilst maximising engagement for students.

Written By Joanne Devenney

Senior Programme Officer
School of Education, Communication and Society


Giving Recorded Audio Feedback is Easy!

Introduction

In the Law Faculty, we trialled providing recorded audio feedback for formative assessments in Contract Law. The result? Of the 91 students who had received either recorded audio feedback or written feedback for their assessments, 66% preferred recorded audio feedback. Three of the four staff members who responded to our survey also preferred providing recorded audio feedback.

Why?

The NSS score for “Assessment and Feedback” at KCL remains stubbornly low at 68%. With the growing body of literature extolling the virtues of recorded audio feedback, we decided to see if it works for Law. We also wanted to investigate the effect it has on students with specific learning difficulties.

How?

The Technology

The platform we used for providing feedback was the Assignment tool in KEATS (Moodle). Turnitin also has an audio feedback tool, but it only allows three minutes of feedback to be recorded, which felt unnecessarily restrictive.

The Law School Professional Services team set up the Assignment Tool submission link on KEATS. They also created a drop-down menu which allowed for grouping of the students by tutorial group. The Law TEL team created bespoke guidance for the teaching team on using the chosen technologies.

Image of KEATS example assignment
Image of KEATS Assignment Submission.

The student submission process was almost identical to the standard Turnitin submission process and we doubt whether many of the students even noticed the difference.

We used Kaltura Express Capture to provide the feedback. This was easy to use, although tutors reported that recording the feedback required rather a lot of “click-throughs”.

Screenshot of assignment inbox with Kaltura recorded feedback.

The Feedback Process

In our instructions to students, we requested that they number the paragraphs in their submission to make it easier for us to refer verbally to specific points in their answers.

In our training for markers, we suggested that the feedback should be between 3 and 5 minutes long. We suggested the following structure to markers:

  1. Start with some general positive feedback about the answer.
  2. Identify some specific areas for improvement. It is really important that you explain how they can improve rather than just pointing out what has gone wrong.
  3. Identify specific things which they have done well. (These could obviously be intertwined with the specific areas for improvement above).
  4. Please finish on some really positive things about the answer. Whilst it is important for students to have some specific ways in which they can improve, students should come away from this experience feeling encouraged.
  5. In terms of the language which should be used, when you are pointing out some of the negative things, please do not use the second person e.g. “You did not explain this area very clearly.” Instead, use language such as: “It would have been better if the answer explained this area more clearly.” It is preferable not to pin the negative aspects on students personally.
Screenshot of recording audio feedback video tutorial.

The Student Survey

Students completed three pieces of formative work for Contract Law. Students either received written feedback, recorded audio feedback or a combination of recorded audio feedback with a few comments on the paper. We only surveyed those students who had received recorded audio feedback or recorded audio feedback with some comments written on the paper for at least one formative assessment (i.e. we did not survey those students who had only received written feedback).

We were interested in finding out about the experience of students with a specific learning difficulty. Consequently, in our survey we asked whether students had been diagnosed with or suspected they may have: dyslexia, dyscalculia, dyspraxia, dysgraphia or attention deficit hyperactivity disorder.

Student Survey Results

Recorded Audio Feedback versus Written Feedback:

  • 91 students (out of a cohort of approximately 250) who had received recorded audio feedback for at least one assessment, and who had not experienced the combination of recorded audio and written feedback, responded.
  • 60 preferred recorded audio feedback.
  • 31 preferred written feedback.

5 of these students had been diagnosed with, or suspected they had, a specific learning difficulty. Of these 5, 4 preferred recorded audio feedback and 1 preferred written feedback.
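The headline 66% figure quoted in the Introduction follows directly from these counts; a few lines of Python make the arithmetic explicit:

```python
# Preference counts from the survey results above
audio_pref, written_pref = 60, 31
total = audio_pref + written_pref   # 91 respondents

audio_share = round(100 * audio_pref / total)
print(f"{audio_share}% of {total} respondents preferred recorded audio feedback")
```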

Recorded Audio Feedback versus a combination of Recorded Audio Feedback and Written Feedback or Written Feedback on its own:

16 students received a combination of recorded audio feedback and written feedback for at least 1 assessment.

  • 10 preferred the combination.
  • 1 preferred recorded audio feedback.
  • 5 preferred written feedback.

5 of these students had been diagnosed with, or suspected they had, a specific learning difficulty. Of these 5, 2 preferred the combination and 3 preferred written feedback.

Staff Survey

4 members of staff (out of a team of 7 colleagues with marking responsibilities on the module who provided recorded audio feedback) responded to the survey. Of those, 3 of the 4 preferred providing recorded audio feedback and 1 preferred providing written feedback. None of the 4 members of staff who responded to the survey had provided a combination of recorded audio feedback with a few comments on the paper.

Technology Gripes

Whilst the tech served the desired purpose, there are a number of ways in which it could be streamlined. If the following issues with the technology were resolved, the process would be much easier for staff:

  • Leaving audio feedback with Kaltura Express Capture requires an irritating number of clicks.
  • You cannot pause Kaltura Express Capture (e.g. if you lose your train of thought.)
  • Captions are not switched on automatically. This means that each tutor has to click on each piece of feedback to turn them on.
  • The Assignment tool does not allow the feedback to be set in advance to be released automatically on a certain date. Although it can all be released in one batch, you have to release it manually by clicking a button (not a big issue, but you need to put a note in the diary to remember!)
  • Obtaining viewing analytics is not as streamlined as Turnitin’s audio feedback feature and requires manually going through each video on Kaltura.

The Outcomes / Lessons Learnt

When given the choice between recorded audio feedback and written feedback, the majority of students preferred recorded audio feedback. The five most common reasons they gave were:

  1. I felt that it was more personal.
  2. I felt that the explanations were easier to understand.
  3. It was helpful to hear the tone of my tutor’s voice.
  4. The volume (i.e. amount) of feedback was greater.
  5. The process of receiving recorded audio feedback is more active than receiving written feedback.

In terms of students with specific learning difficulties, the majority of these students also preferred audio feedback to written feedback.

When given the choice between recorded audio feedback, written feedback or recorded audio feedback with some comments written on the paper, students preferred the combination. (Please note, however, that the sample of students who were given the combination was very small and it was only 1 tutor who provided this method of feedback.)
However, students reporting specific learning difficulties did not favour combined audio and written feedback. Given the option, these students preferred to return to the familiarity of purely written feedback. It may be that the combination of both feedback formats made it more challenging for students with specific learning difficulties to interpret the key take-home message(s) from the feedback. Thus, the combined approach, rather than clarifying or supporting the comments made, actually muddied the waters.

When it came to staff, the most common reasons why the three staff members preferred providing recorded audio feedback to written feedback were as follows:

  1. I found it faster to provide recorded audio feedback.
  2. I felt that it was more personal.
  3. I felt that my explanations were easier to understand.
  4. I liked the fact that the students could hear the tone of my voice.
  5. I thought that the process of students receiving recorded audio feedback was more active than receiving written feedback.

Conclusion / Recommendations

The commentary from our students and their clear preference for audio feedback supports current pedagogical research on the benefits of recorded audio feedback over traditional written feedback, such as the perception amongst our students that the feedback experience is richer and more personal. We know from Caruthers et al (2015) that this leads students to be more likely to engage with their feedback. The perception that the feedback was both greater in volume and easier to understand also supports Chaing’s (2009) claim that audio feedback provides greater depth in comparison to uncontextualized written feedback comments.

Though the cohort of students in our data set who declared a specific learning difficulty was admittedly small, it was still interesting to see that these students experienced recorded audio feedback more positively than their past feedback experiences. It is clear, though, that for these students in particular, a mixed approach to feedback (i.e. providing a combination of both audio and written feedback) should be avoided.

Overall, colleagues’ experience of providing audio feedback was positive and once familiar with the recording process and given clear support on how to format their audio feedback (including the appropriate length for such feedback to ensure consistency amongst markers), colleagues were able to leave (according to our students) greater volumes of clear, individualized, meaningful feedback. We would, therefore, recommend that other colleagues who have not yet provided audio feedback to their students try the medium as our experience has been, overall, extremely positive from both a student and staff perspective.

Useful Links

Audio Feedback

About The Authors

Caroline van Hensbergen is a Senior Lecturer in Law (Education) at the Dickson Poon School of Law, King’s College London.

Dr Michelle Johnson is a Lecturer in Law (Education) and Faculty Inclusive Education Lead at the Dickson Poon School of Law, King’s College London.

Evaluation and Reflection, News and Events, Pedagogy, Technologies

Exploring the Digital Frontier: Revolutionizing Feedback Delivery with Excel and VBA Macros Part 2

Part 2: My experience and reflections

Part 1 of this blog can be found here: https://blogs.kcl.ac.uk/digitaleducation/exploring-the-digital-frontier-revolutionizing-feedback-delivery-with-excel-and-vba-macros-part-1/

Icons of Microsoft 365 apps

I embarked on this journey a few years ago when I led a large course with more than 700 students and a team of 15 markers. It was challenging for an early-career lecturer to manage the administrative tasks and coordinate the whole team: standardization, moderation, and uploading/creating feedback documents. The demanding nature of the courses allows for no mistakes or human errors. Working through the clunky and often user-unfriendly interface of Moodle/Turnitin is another difficult obstacle: it requires an internet connection to do the marking, and, as other colleagues would agree, we often work better offline. Traditional methods were often time-consuming and inconsistent, resulting in delayed feedback that left students wanting more. The technology-driven solution aimed to address these challenges head-on. When I joined KCL in September 2022, I faced the same problems and was made aware of different initiatives to improve the feedback and assessment process at KBS. I put together my own system over the years and implemented the process during the January 2023 marking season. Over the Spring Term 2023, I refined the process with feedback from colleagues who shared my excitement and interest. In June 2023, I presented at the festival to share the practices and implementation strategies for an innovative automation system.

The process involved harnessing the power of Microsoft Excel and VBA Macros within Microsoft Word. These technologies allowed us to streamline and automate feedback delivery. Imagine: no more laborious hours spent typing feedback comments, and no human errors in exporting and uploading the feedback documents to KEATS/Turnitin! Instead, we could focus on providing students with valuable insights to help them excel.

Screenshot of Digital Skills Hub

Challenges Faced

Of course, no transformative journey is without its challenges. Some educators were initially resistant to change, finding the prospect of learning VBA Macros daunting. Additionally, ensuring the new system was compatible with various devices and platforms presented a technical hurdle. As I mentioned in the guidance (available on my SharePoint), the set-up and troubleshooting at the beginning can be quite a challenge, particularly for colleagues using macOS (less so for Windows users). Compatibility issues were addressed through rigorous testing and continuous monitoring of system performance. Clear communication with your marking team is also needed to make sure everyone is on the same page with the new system.

But I promise it’s worth the effort, and subsequent use will be much smoother sailing. And from a marker’s perspective, it is much less work than working through the traditional channels.

The journey from traditional feedback systems to an automated approach using Excel and VBA Macros has been nothing short of transformative. It’s a testament to the power of technology in education, where innovative solutions can overcome challenges and improve the overall learning experience.

As we continue this path of exploration and adaptation, the future of feedback delivery looks brighter than ever, with improved student satisfaction and educational outcomes. I hope that wider adoption of the process can help deliver more insightful and time-effective feedback to our students, thereby addressing the burning issues identified in the student surveys, as well as improving the quality of feedback and the student experience, as identified in King’s Vision 2029 and the TEF framework.

Screenshot of TEF award

It takes time and communication with colleagues to identify compatibility issues and resolve them. So far, the method has been used by six Economics courses at KBS and two at the University of Glasgow; colleagues from Marketing and Adult Nursing have expressed their interest in using it in their courses.
It is definitely not perfect, and I am very much looking forward to feedback, comments, and of course successful implementations by colleagues.

The blog discusses a transformative journey in education, initiated during The Festival of Technology 2022 at KCL. It explores the adoption of Excel and VBA Macros within Microsoft Word to revolutionize feedback delivery. The main reasons for this change were to enhance feedback quality and efficiency, addressing challenges like resistance to change and compatibility issues. Through workshops, ongoing support, and rigorous testing, the adoption of technology resulted in a more efficient, user-friendly, and collaborative feedback system, empowering educators and improving the overall learning experience.

I would like to thank KBS colleagues, Jack Fosten, Dragos Radu, and Chahna Gonsalves for their encouragement, important suggestions and feedback as well as allowing me to pilot the process in their modules. I also thank various colleagues across other faculties for providing feedback and suggestions as well as identifying compatibility issues (with solutions).

For additional resources, including the workshop slides and a detailed guide with relevant codes and FAQs, please refer to the SharePoint folder linked here.

I am a Lecturer in Economics at the Department of Economics, King’s Business School, King’s College London. I am also an academic supervisor at the Institute of Finance and Technology, University College London, and a chief examiner for the University of London International Programme (Econometrics). Before joining King’s, I lectured and conducted research at the London School of Economics as an LSE Fellow in Economics, and at the University of Warwick as a postdoctoral fellow in Economics. I completed my PhD in Economics at the University of Nottingham in 2018.

I have lectured courses in econometrics and macroeconomics at King’s, LSE, and Warwick, and led seminars (tutorials) in various courses at Nottingham. Since March 2023, I have been the GTA Lead at King’s Business School.


Exploring the Digital Frontier: Revolutionizing Feedback Delivery with Excel and VBA Macros Part 1

Part 1: The practical guide

In today’s rapidly evolving digital age, the need for efficient and effective systems in education is more pronounced than ever. Traditional platforms like Moodle and Turnitin have served us well, but as educators, we must acknowledge their limitations in providing timely, user-friendly, and collaborative feedback on assignments, exams, and dissertations.

This tutorial aims to be your guiding light towards a better, more streamlined approach to feedback delivery. Drawing upon my workshop presented during The Festival of Technology 2022 at KCL, where I shared practical insights and implementation strategies for this automation system, we’ll delve into the fascinating world of Excel and VBA Macros within Microsoft Word. This comprehensive resource builds upon the principles discussed in that workshop.

By embarking on this journey, you’ll equip yourself with the skills and knowledge to revolutionize your approach to feedback giving. Here’s what you stand to gain:

1. **Efficiency:** Say goodbye to the laborious and time-consuming task of manually providing feedback using KEATS (Moodle/Turnitin). With Excel and VBA Macros, you’ll learn how to automate the process, saving valuable time that can be redirected towards more meaningful feedback and interactions with your students.

 

Screenshot illustrating the purpose of the document
Picture 1: Our Aims and Objectives

2. **User-friendliness:** Discover how to create user-friendly feedback documents for yourself, the marking team, and your students. Your feedback system will become intuitive and accessible, ensuring that learners can easily understand and act upon your comments in a nicely formatted feedback document.

A screenshot showing the step-by-step summary for collecting marking and feedback
Picture 2: A summary of steps

3. **Collaboration:** Break free from the constraints of limited collaboration within traditional systems. The method will allow a marking team to efficiently collaborate and moderate, making feedback delivery a seamless and cooperative effort.

Screenshot of marking folder contents
Picture 3: What the marking folder looks like. It is shareable with the marking team.

4. **Comprehensive Feedback:** Dive into the world of detailed and constructive feedback. You’ll gain the expertise to provide tailored insights that empower students to excel in their academic pursuits.

Screenshot of excel file showing comments
Picture 4: What a short comment looks like. Totally customizable.

This tutorial isn’t just about learning a new tool; it’s about transforming your approach to education. By mastering Excel and VBA Macros for feedback delivery, you’ll become a more effective educator, making a lasting impact on your students. The system will:
– Enhance your teaching methods, creating a more engaging and supportive learning environment.
– Free up time previously spent on administrative tasks or wrestling with Turnitin/KEATS for more meaningful activities, such as preparing feedback comments and communicating with your team and students.
– For repeated courses/assessments, let you build a bank of model comments for later use, as well as a record of common mistakes and suggestions for improvement to communicate to students.

Screenshot of excel sheet
Picture 5: What the end product of a long feedback document looks like. Totally customizable.
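The post’s own implementation uses Excel and VBA inside Microsoft Word. Purely to illustrate the underlying mail-merge idea in a language-agnostic way, here is a minimal Python sketch that expands a marker’s shorthand codes from a spreadsheet into one formatted feedback document per student. Everything here (the column names, the comment bank, the sample rows) is a hypothetical stand-in, not the author’s actual code:

```python
import csv
import io

# Hypothetical comment bank keyed by short codes a marker types into the sheet
COMMENT_BANK = {
    "STRUCT": "Consider signposting your argument more clearly between sections.",
    "REFS": "Engage more closely with the core readings when supporting claims.",
    "GOOD_ANALYSIS": "The analytical sections are well developed and clearly argued.",
}

# Stand-in for the shared marking spreadsheet (one row per student)
marking_sheet = io.StringIO(
    "student_id,mark,codes\n"
    "A001,72,GOOD_ANALYSIS;REFS\n"
    "A002,58,STRUCT;REFS\n"
)

def build_feedback(row):
    """Expand a marker's shorthand codes into a formatted feedback document."""
    comments = [COMMENT_BANK[code] for code in row["codes"].split(";")]
    body = "\n".join(f"- {c}" for c in comments)
    return f"Student {row['student_id']} | Mark: {row['mark']}\n{body}\n"

# One generated feedback document per student
documents = {row["student_id"]: build_feedback(row)
             for row in csv.DictReader(marking_sheet)}
print(documents["A001"])
```

In the real workflow the same expansion is done by VBA macros writing into Word documents, which also gives the nicely formatted output shown in the pictures; the comment-bank idea is what enables the reusable “bank of model comments” mentioned above.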

Education is a dynamic field, and keeping pace with technological advancements is essential. The automation possibilities offered by Excel and VBA Macros are not just practical but also intriguing. Discovering how to harness these tools to optimize your feedback process can be genuinely exciting.

Screenshot displaying cautions with using excel
Picture 6: A few cautions.

For additional resources, including the workshop slides and a detailed guide with relevant codes and FAQs, please refer to the SharePoint folder linked here. This tutorial serves as a bridge between the insights shared during the workshop and the practical implementation of an automated feedback system. It’s an opportunity to further explore and master these valuable techniques, all while enhancing the overall learning experience for students. Join us as we embark on this transformative journey together.

Part 2 of this blog can be found here: https://blogs.kcl.ac.uk/digitaleducation/exploring-the-digital-frontier-revolutionizing-feedback-delivery-with-excel-and-vba-macros-part-2/


Accepting Multiple Assignment Attempts

KEATS (Moodle) allows assignment submissions in many ways – this is a record of how a simple question became an extended investigation.

Academic Staff Requirements

“Can I check what my students have previously uploaded?”

An academic colleague had used Blackboard (another Virtual Learning Environment) before coming to the Faculty of Natural, Mathematical & Engineering Sciences (NMES) at King’s. He asked if KEATS, our Moodle instance, could behave like Blackboard and allow students to submit multiple attempts to a programming assignment any time they want.

After a follow-up call with the academic colleague, it became clear that the aim was to be able to access anything students had uploaded prior to their final submission, as the latter might contain a wrong or broken file, and grade with reference to a previous submission (programme code or essay draft).

He had the following requirements:

  • Notification emails to both staff and students when a file is uploaded successfully
  • Students to be able to submit as often as they want
  • Marker to be able to review all uploaded attempts to:
    • be able to award marks if an earlier submitted programme code worked fine but a later submission introduced bugs breaking the programme, and
    • monitor the programming code development and make comments, compare changes, and to prevent collusion

This investigation looks at practical solutions to administering programming assignments as well as non-programming ones such as essays.

Background: Assessments on Blackboard (VLE)

In a Blackboard assignment, students are required to click a Submit button for the markers to access their work. If multiple attempts are allowed, students can submit further attempts at any time; these are stored as Attempt 1, Attempt 2 and so on, and are available for the markers to view. This way, staff can review previous submissions; however, they cannot access drafts.

Screenshot of a Blackboard assignment allowing multiple attempts
A Blackboard assignment allowing multiple attempts

KEATS: Moodle Assignment

The first step was to investigate the options and settings in Moodle Assignment, which is the tool that was already used by most colleagues for similar assignments.

With our current default settings, students can change their uploaded files as often as they want, and submission is finalised only at the assignment deadline. Although instructors can see the latest uploaded files (drafts) even before the deadline, files removed or replaced by students are no longer accessible to staff. This means only one version is accessible to markers.

Multiple submissions can be enabled with the “Require student to click the Submit button” setting, letting staff review previous attempts as on Blackboard. Feedback can be left on each attempt. However, students cannot freely submit new attempts, because staff must manually grant additional attempts to each student. Submissions are time-stamped and can be reviewed by students and markers, but students only receive notification emails after grading, whereas markers can receive notifications for submissions. Our problem was not yet resolved.

Screenshot of a marker accessing unsubmitted drafts and leaving feedback
Marker accessing unsubmitted drafts and leaving feedback
Screenshot of a student reviewing feedback for different attempts
Student reviewing feedback for different attempts.
Screenshot of a student reviewing feedback for different attempts (cropped version with highlight).
Student reviewing feedback for different attempts (cropped version with highlight).

KEATS: Moodle Quiz

We then considered Moodle Quiz, which some departments at King’s already use to collect scanned exam scripts: a Quiz containing an Essay-type question that allows file upload.

Screenshot of quiz attempt.

While exams usually allow only a single attempt, Moodle Quiz can be set to allow multiple attempts (Grade > Attempts allowed). The “Enforced delay between attempts” setting (from seconds to weeks) under “Extra restrictions on attempts” may be used to avoid spamming of attempts. Students can submit new attempts as often as needed because no staff intervention is required. The drawback is that there are no submission notification emails, but the quiz summary screen should indicate to the student that the file has been submitted. The Quiz attempts page allows markers to easily review previous attempts and leave feedback on each attempt. It is also possible to download all submissions, as in Moodle Assignment. This was recommended to the academic colleague as an interim solution while we continued the investigation.
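Conceptually, the enforced-delay rule boils down to a simple time comparison. The sketch below is purely illustrative (the one-hour value and function names are made up; the real check happens inside Moodle Quiz):

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative one-hour value; in Moodle this is configured under
# "Extra restrictions on attempts" and can range from seconds to weeks.
ENFORCED_DELAY = timedelta(hours=1)

def can_start_new_attempt(last_submitted: Optional[datetime],
                          now: datetime) -> bool:
    """A new attempt is allowed once the enforced delay has elapsed."""
    return last_submitted is None or now - last_submitted >= ENFORCED_DELAY

t0 = datetime(2023, 9, 1, 10, 0)
print(can_start_new_attempt(None, t0))                        # first attempt
print(can_start_new_attempt(t0, t0 + timedelta(minutes=30)))  # too soon
print(can_start_new_attempt(t0, t0 + timedelta(hours=2)))     # allowed again
```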

Possible Policy Concerns

Regarding unlimited re-submissions, Quality Assurance colleagues reminded us that students may challenge (i) a perceived inequality in opportunities to receive feedback, or (ii) subconscious bias based on previous submissions. Good communication with students and a structured schedule or arrangement should help manage expectations on both sides.

Turnitin Assignment and Other Assessment Options

Although the Moodle Quiz appeared to be a solution, we also considered other tools, some of which are readily integrated with KEATS at King’s:

Turnitin assignment allows multiple submissions as an option, but re-submissions will overwrite previously uploaded files. Alternatively, if it is set to a multi-part assignment, each part will be considered mandatory. However, the workflow for Turnitin assignment is not optimal for programming assignments.

Turnitin’s Gradescope offers Multi-Version Assignments for certain assignment types. It is available on KEATS for the Faculty of Natural, Mathematical, and Engineering Sciences (NMES). However, its programming assignment does not support assignment versioning yet.

Edit history is available for Moodle Wiki and OU Wiki, whereas Moodle Forum, Open Forum, Padlet and OU Blog allow continuous participation and interaction between students. These tools could be useful for group programming or other social collaborative learning projects; they are not a direct replacement for an individual programming assignment but an alternative mode of assessment.

Portfolios: Mahara has Timeline (version tracking) as an experimental feature. This may be suitable for essays but not for programming assignments.

Tracking Changes

Tracking changes is an important feature for showing development in programming assignments or essays, and cloud platforms (OneDrive, Google Drive, GitHub) can host files and track changes. When used for assignments, students can submit a share link to allow instructors to access and assess their work and how it evolved over time. The disadvantage of this option is that the grading experience is less integrated with Moodle. Some cloud platforms offer a file request feature where students can submit their files to a single location.

Programming Assignments

Git is the industry standard for version control in software development, and all changes are tracked. GitHub offers GitHub Classroom, which can be used with different VLEs including Moodle, but it is not readily integrated with KEATS and requires setup. There may also be privacy concerns, as students need to link their own accounts.
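As a hypothetical sketch (file names and commit messages invented), this is what a Git-tracked submission history gives a marker: every attempt is a commit that can be listed and diffed.

```python
# Hypothetical demo: with Git, every submission is a commit, so a
# marker can review each attempt and see what changed between them.
import os
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args):
    """Run a git command inside the demo repository and return stdout."""
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

git("init", "-q")
git("config", "user.email", "student@example.com")
git("config", "user.name", "Student")

# Two "submissions": a first working version, then a refactor.
for version, msg in [("v1", "First working version"),
                     ("v2, refactored", "Refactor")]:
    with open(os.path.join(repo, "solution.py"), "w") as f:
        f.write(f'print("{version}")\n')
    git("add", "solution.py")
    git("commit", "-qm", msg)

print(git("log", "--oneline"))                             # every attempt, in order
print(git("diff", "HEAD~1", "HEAD", "--", "solution.py"))  # what changed between attempts
```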

The Outcomes / Lessons learnt

  • This showcases how a simple question from an academic colleague can lead to the exploration of various existing options as well as new tools and solutions.
  • Different options are available on KEATS with their pros and cons.
  • Existing tools, possible solutions, policies, and other considerations come into play.

Conclusion / Recommendations

KEATS Quiz matches the case requirements and was recommended to the academic colleague. The rollout went smoothly; our colleague reported no complaints from students, who are happy with the recommended solution. It is relatively easy to set up and straightforward for students to submit to. Clear step-by-step instructions for staff and students should be enough, but trialling this with a formative assignment would also help.

Depending on the task or subject nature, other tools may work better for different kinds of tasks. TEL colleagues are always there to help!


Useful Links


Written by Antonio Cheung

Antonio is a Senior TEL Officer at the Faculty of Natural, Mathematical and Engineering Sciences (NMES).

September 2023

Evaluation and Reflection

KEATS Similarity Checker Project

Overview of project

Between July 2022 and February 2023, the SSPP TEL team conducted a pilot project to improve the student experience when submitting assignments by creating a special area for students to check the plagiarism/similarity score of their assignments. The goal of the pilot was to make it easier for students with Mitigating Circumstances and the Programme Office staff to manage the process of submitting assignments to KEATS.

Any student who is not subject to Mitigating Circumstances can submit a draft and/or reupload their submission as many times as they wish up to the assessment’s original due date. Many students use this opportunity to submit a draft to check their similarity score before they make their final submission. At the moment, due to technical limitations within KEATS/Turnitin, students who are granted an extension to an assessment via the Mitigating Circumstances process cannot submit a draft to check their similarity score; they are only allowed to submit once, and after the due date for the assignment passes they no longer have the option to upload their final version.

This is particularly problematic for students who have submitted a draft (sometimes long before the original due date) and then realise they need to apply for Mitigating Circumstances: as they are not able to delete the draft themselves, it will be considered their final submission and their MC claim may be rejected on the basis that they have already made a submission. In some departments, PS staff sometimes agree to submit and/or delete a draft for a student, but this is time-consuming, not consistently applied, and relies too heavily on PS staff being available and inclined to help outside their normal duties; it is also not sustainable given the very high number of MC claims we currently process.

First Steps

The departments of Geography and Global Health and Social Medicine in the Faculty of SSPP took part in the initial pilot project for their re-sit and dissertation students, and the Similarity Checker (SM) area was created and placed on their Handbook pages on KEATS. Accompanying it was a video and PDF to explain to students how to use the SM, as well as a warning text to reinforce the idea that this did not count as a submission and would not be checked by staff.

Feedback from this small cohort of students led to some revisions and changes to the SM, the most notable of which was around the language used. We had used the words “test area”, meaning to check or trial something, but students for whom English was not their native language found this confusing and equated “test” to mean exam. This was revised and the wording was changed from “test submission area” and “test area” to “Similarity Checker” and “practice area” respectively.

Once we were happy with the revisions, the SM was then rolled out to the rest of the School of Global Affairs, War Studies, and Education, Communication and Society. All Similarity Checker areas have the same layout, same wording and same instructions for parity across all the Schools. Communications for staff and students were also created by Soshana and these were used by Departments to make students and academic staff aware of the existence of the SM.

Layout

The Similarity Checker is made up of several parts: an introductory text explaining what it is for, how to use it, and a disclaimer that nothing submitted there would ever be moved or assessed. An explainer video and PDF instructions were added to ensure accessibility and inclusive design were adhered to, so that all students would be able to clearly understand the functionality.

Screenshot of the home screen of the similarity checker.
Screenshot of the geography similarity checker.

The submission areas were divided by level and surname. There is no functional necessity for this, but it aims to prevent Turnitin from being overloaded by all students in one department trying to access it at the same time. If students submit in the wrong area, there is no effect on their score or submission.
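The split itself is simple to express. The sketch below uses invented ranges (the pilot’s actual areas were set up manually in KEATS, divided by level and surname):

```python
# Hypothetical routing of a student to a submission area by level and
# surname initial; the real areas and ranges were configured in KEATS.
def submission_area(level: str, surname: str) -> str:
    half = "A-L" if surname[:1].upper() <= "L" else "M-Z"
    return f"Level {level}, surnames {half}"

print(submission_area("7", "Fearn"))  # Level 7, surnames A-L
print(submission_area("7", "Smith"))  # Level 7, surnames M-Z
```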

Screenshot of the different Turnitin Submissions.
Screenshot of the different Turnitin Submissions.

Student Feedback

A survey was created by Soshana and shared with all participating Schools, attracting almost 100 responses. Feedback was generally positive, with students highlighting how the SM improved their experience and confirming that it constitutes an equalising factor for students with extensions. Overall, 90% of respondents had used the SM, 93% found it useful, and 16% used it in the context of an assessment extension (mitigating circumstances). There was also some negative feedback from students who did not find it particularly beneficial, mainly due to the long turnaround time for a score after their third submission, and the fact that their score changed repeatedly when uploading a new draft of the same work, depending on how close the assessment due date was. These concerns will be addressed, and responses will be provided in future communications.

Overview of survey respondents.
Overview of survey respondents.
Respondents usage by level of study.
Respondents usage by level of study.
Respondents use of the Similarity Checker.
Respondents use of the Similarity Checker.

Conclusion and next steps

The pilot project was a successful start to improving the experience of students and staff using KEATS and Turnitin during their submission period. This was initially to improve the experience of those with Mitigating Circumstances, but we can see that many students without extensions are also using it to check their work.

Next steps will include rolling this out further to other Schools or Departments so that all students in SSPP can access it. Some Departments have their own versions, which we would like to replace with this more modern iteration of the Similarity Checker.

The TEL team would also like to address some of the points that students raised as part of the feedback process, and create a communications plan to ensure the Similarity Checker is communicated to students at all relevant points of the academic year.

An all-Faculty position should also be agreed on how to handle cases where a student submits their paper to the Similarity Checker instead of their module page.


Written by Leanne Kelly

Leanne is the Digital Education Manager for the Faculty of Social Science and Public Policy (SSPP) at King’s College London. She is responsible for a wide range of digital education processes within the Faculty, including instructional design, accessibility, training, innovation and developing new online programmes.

She has a background in publishing and eLearning, and is passionate about using technology to improve the learning experience and make it more accessible to all. She is interested in developing new ways of working, scaling projects and reusing content in new ways, and making online learning an enjoyable process for all.

Written by Soshana Fearn


Soshana is the Senior Postgraduate Programme Officer for the Department of Geography (SSPP) at King’s College London. She delivers the day-to-day administration of taught postgraduate programmes (Masters), offers comprehensive and authoritative advice and support for all staff and students in respect of programme regulations and curriculum choices, services the relevant boards and committees, and oversees the processing of Mitigating Circumstances requests.

She has a background in project coordination and is dedicated to improving the experience of both students and staff through the development and implementation of streamlined, innovative solutions, including projects related to institutional processes, policymaking and technology-enhanced learning resources.


 

Evaluation and Reflection, Technologies

Introducing CMALT programme at King’s College – Part 1

In September 2021 King’s launched its first CMALT (Certified Membership of the Association for Learning Technology) programme cohort aimed at helping 15 colleagues put together an evidence-based portfolio in order to gain CMALT accreditation.

What is CMALT?
Certified Membership of the Association for Learning Technology (CMALT) is the learning technologist’s ‘kitemark.’ This professional certification (and membership) recognises your expertise and experience in your field. Benefits to candidates come in the form of reflection on their professional practice, mentoring from experienced colleagues and peer-to-peer support. CMALT also increasingly appears on TEL job specifications, so gaining CMALT (and with it post-nominal letters which you are then allowed to use) provides both CPD and career-development opportunities. You join an established community of practice, are invited to ALT meetings and events, and can view and contribute to publications.

The CMALT Accreditation Framework provides pathways to peer-assessed accreditation for a cross-section of learning technology-focused professionals, educators and administrators in the UK and internationally. Accreditation is achieved by the successful submission of a reflective, online portfolio, which evidences skills and experience in learning technology across four core areas and a specialist area. There are three different pathways to choose from to best match an individual’s experience: Associate Certified Member, Certified Member and Senior Certified Member.

First steps:
•  Having joined King’s in 2018, I quickly realised I was one of only a few colleagues in the institution who held CMALT accreditation. Given the size of King’s and the number of TEL colleagues, I wanted to see if I could support colleagues in gaining CMALT recognition.
•  I originally attempted to launch a pilot programme in late February 2019 with a small group of colleagues from my team in CTEL; however, the Covid pandemic hit and efforts were prioritised elsewhere.
•  In early 2021, I came together with three other colleagues (David Reid Matthews, Danielle Johnston and Fariha Choi) to resurrect the CMALT programme. We formed the CMALT planning team to create a year-long programme and agreed to become mentors to the colleagues taking part.
•  We successfully bid for funding from the Students and Education Directorate (SED) for up to 20 places on the new programme (see CMALT registration fees for more info).
•  We began sharing information about the new programme and asked interested parties to complete a show-of-interest form (Google Form).
•  In the summer of 2021, we invited all interested colleagues to a one-hour online CMALT Information Session (MS Teams) to provide further details about the accreditation and what the programme entailed. Of the 25 who attended, 15 signed up for the 2021/22 CMALT programme, with the remainder either deferring to the following academic year or withdrawing their interest.

Make-up of the first cohort:

Pie chart showing the distribution of the 15 members who attended the programme: Faculty TEL had the highest with 8, followed by CTEL with 5 and Other with 2.

The programme schedule
•  The programme started in September 2021, with each month focusing on a section of the portfolio, delivered purely online via MS Teams. We left two months (December and June) free of meetings to allow colleagues to catch up.

Here is the full schedule of our CMALT programme:

For more detail please refer to Part 2 of this blog:

https://blogs.kcl.ac.uk/digitaleducation/?p=1511&preview=true

 

Written by Sultan Wadud and David Reid Matthews

Wadud works as a Learning Technologist, Faculty Liaison at CTEL, working closely with Academic Faculties and Departments to support and drive the implementation of the King’s Technology Enhanced Learning ‘Transformation in Digital Learning’ strategy.
Wadud supports the management and delivery of multiple projects aimed at both the development of academics’ pedagogic understanding and the practice of technology enhanced learning.
Wadud is the product owner for Kaltura and one of the leads for the CMALT programme at King’s. In addition to this Wadud oversees the Digital Education Blog.

David is the TEL Manager for Arts & Humanities and joined King’s in 2018. He leads a team of learning technologists supporting a large and complex faculty, providing mainly 2nd line support, training and troubleshooting on our core, recognised and recommended TEL tools. David has worked in learning technology since 2011, having previously (and improbably) been a Lecturer in Theatre Studies. His particular interests are in legislation and policy around TEL, as well as IT Service Management and Delivery. David is one of the leads for the CMALT programme at King’s.

Evaluation and Reflection, Pedagogy, Technologies

Introducing CMALT programme at King’s College – part 2

This is Part 2 of Introducing CMALT programme at King’s College (Read part 1)

Resources and interactions
• We created a Moodle site to host all the information relating to CMALT accreditation and provided resource links, session recordings and presentations for colleagues to refer to or to catch up on anything they had missed.

KEATS page for CMALT programme

• In addition to the Moodle area we set up a Microsoft Teams site to allow us to send general announcements, plan for meetings and private areas for mentor groups:


Example of Teams announcement to Cohort 1:

Mentor support
• Whilst each meeting had an opportunity for colleagues to have shared contact time with their mentors, additional mentor support was provided on an ad hoc and individual basis. In Cohort 1, some colleagues utilised this consistently throughout the programme whilst a few left it to the end to seek help.
• 93% of Cohort 1 either strongly agreed or agreed their mentors facilitated appropriate discussion and reflection throughout the programme:

• 12 out of the 15 colleagues took the opportunity and connected with their mentors outside of the monthly Teams meetings.

Cohort 1 completion
• Overall, we had 14 submissions to ALT with one colleague deciding to re-join the programme with Cohort 3.
• We received feedback from all colleagues who took part in the programme, the majority of it positive. Nearly all colleagues fed back that the frequency (monthly) and length (one hour) of the sessions were just right.
• The majority utilised the Moodle areas during their time on the programme.
• “Being a part of a cohort was great and enabled me to work collaboratively/share ideas with others on this project. However, starting very early on in the process without the pressure of fixed deadlines meant I probably took it too easy, so having deadlines for (formative) feedback in the 6 months run-up to our submission date would have been helpful”
• We took this feedback on board and introduced two draft deadlines: sections 1-2 by January and sections 2-3 by late March. In addition, we encouraged colleagues not to leave seeking help to the end, and to keep in regular contact with their mentors.
• Moodle discussion board – apart from two posts, we noticed the majority of interactions were taking place in our Teams areas. For Cohort 2 we decided to remove the Moodle discussion board and replace it with one in the Teams area.

Cohort 2 and beyond
• In 2022 we expanded the programme for Cohort 2 to include all three pathways of CMALT which resulted in 22 signups (x18 CMALT, x3 Associate CMALT and x1 Senior CMALT).
• If funding is provided for a third cohort, we will offer the Senior CMALT pathway to all, as for Cohort 2 it was only available to the mentors.
• King’s has recently applied to do CMALT accreditation in-house, which we hope will allow us to provide quicker assessment and feedback turnaround.
• ALT requires CMALT holders to refresh their portfolios three years after obtaining accreditation. This is something the planning team anticipates offering to the first cohort in 2026.
• The long-term aspiration of the CMALT programme at King’s is for it to become self-seeding after the first few years. We have already had one Cohort 1 candidate who has become a Cohort 2 mentor, and, in future, we expect CMALT holders to move up the CMALT pathways once they gain more experience as well as come back to mentor and support the next generation.

Written by Sultan Wadud and David Reid Matthews


Evaluation and Reflection, Technologies, Uncategorized

Using Microsoft Power Platform to Support Staff Development

This blog post is a follow-up to the post Using Automation to Facilitate Flipped Learning. 

Due to the success of using Microsoft Power Automate to support flipped learning for an advanced KEATS training session, the Power Automate process was rolled out to all training sessions offered by the Centre for Technology Enhanced Learning (CTEL), and other elements of the Microsoft Power Platform have been introduced. This blog post will outline the processes, successes, and challenges.

Rollout of Power Automate Process 

Once the decision was made by the CTEL CPD Chair to roll out the Power Automate process to all CTEL training sessions, we needed to decide how this would work practically. The established process, or flow, used for the KEATS: Personalising the Learning Experience training session was built specifically for that session and was the only one that included pre-session work. We worked around this by creating individual flows for each of the sessions run by CTEL, removing and adding steps as appropriate. The Assistant Learning Technologist for the team and I worked closely with the session leads, customising and creating the flows and encouraging personalisation to better reflect their sessions, such as adding attachments and editing email text.

Another risk was that the flows all referenced the same Microsoft Excel spreadsheet hosted on a SharePoint site; with multiple people accessing and changing the data, flows could be affected and send out emails at the wrong time. We addressed this by creating individual spreadsheets and pointing each flow to the relevant spreadsheet for its session, which also allowed for further customisation by session leads if desired.

Flows are usually triggered 2 working days before the session is scheduled, but this can vary if the pre-session work required will take more time to complete. The core template automatically completes 5 core steps when triggered: 

  1. Creates a Microsoft Teams Meeting and invites all attendees to the meeting. 
  2. Sends an email containing further information about the session if needed. This is on a 5-minute delay to allow for any manual intervention should any mistakes be made in step 1. 
  3. Sends an email to participants as soon as the scheduled session is finished containing links to further resources and a request for feedback to be left via the Microsoft Form. 
  4. Sends an email to participants 2 working days after the session has been completed, asking for feedback if it hasn’t been left.
  5. Sends an email 10 working days after the session to participants to ask if they have attempted any content that was covered in the session and if they have any success stories to share or need any further support. 

The above template is the core, and session leads can add further steps relevant to their own flows. Working days are calculated with formulas in the Excel spreadsheet and are used to increase the response rate, rather than sending emails out over a weekend or bank holiday, when they are easily ignored.
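As a rough illustration of what those spreadsheet formulas compute (the function name and example dates below are made up; the real calculation lives in Excel):

```python
from datetime import date, timedelta

def add_working_days(start: date, offset: int, holidays=()) -> date:
    """Step `offset` working days from `start`, skipping weekends and
    any supplied bank holidays (similar in spirit to Excel's WORKDAY)."""
    day, remaining = start, abs(offset)
    step = timedelta(days=1 if offset >= 0 else -1)
    while remaining > 0:
        day += step
        if day.weekday() < 5 and day not in holidays:
            remaining -= 1
    return day

# A session ending on Friday 1 July 2022: the "2 working days later"
# follow-up email lands on the Tuesday, not over the weekend.
print(add_working_days(date(2022, 7, 1), 2))  # 2022-07-05
```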

Displaying Feedback in Power BI 

The Power Automate process outlined above is fairly simple in structure and aims: it sends out emails at predetermined times based on the date and time of the relevant session, calculated in the Microsoft Excel spreadsheet. Each email sent to attendees once the session has been completed contains a link to CTEL’s feedback form. Once feedback is submitted, it is collected, anonymised, and stored. I designed a Power BI report to display the quantitative and qualitative data submitted, to show the impact of the sessions and to assist each session lead with making changes based on the free-text submissions.

Figure 1: The Power BI report for feedback submitted for CTEL training sessions, organised by session title, calendar month, and faculty. The panels show which workshops received the most feedback, feedback volume by faculty, and feedback per calendar month. 373 pieces of feedback were left for 2021/22.

After organising this data and gaining experience in Power BI, I was able to link it to attendance data extracted from SkillsForge, to gain insights into our historical CPD attendance and how it relates to our feedback submissions. One advantage of this was being able to see whether the flows had an impact on gathering feedback for our sessions.
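Linking the two datasets is essentially a key-based join on the session. Here is a minimal pure-Python sketch with invented records (the real report joins exported SkillsForge and feedback tables inside Power BI):

```python
# Invented stand-ins for SkillsForge attendance rows and anonymised
# feedback submissions, joined on a shared session title.
attendance = [
    {"session": "Intro to KEATS", "faculty": "NMES"},
    {"session": "Intro to KEATS", "faculty": "SSPP"},
    {"session": "Kaltura Basics", "faculty": "IoPPN"},
]
feedback = [{"session": "Intro to KEATS", "rating": 5}]

# Index feedback by session, then attach it to each attendance row
# (a left join: rows without matching feedback keep None).
ratings_by_session = {}
for fb in feedback:
    ratings_by_session.setdefault(fb["session"], []).append(fb["rating"])

joined = [
    {**row, "ratings": ratings_by_session.get(row["session"])}
    for row in attendance
]
matched = sum(row["ratings"] is not None for row in joined)
print(f"{matched}/{len(joined)} sign-ups matched to feedback")  # 2/3
```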

Figure 2: The Power BI report for attendance at CTEL training sessions, organised by calendar month, attendance type by session, and faculty. The 2021/22 academic year saw 1243 total sign-ups across 166 sessions.

Key findings revealed that attendance for 2020/21 was notably high, with 2209 members of King’s staff signing up for a session offered by CTEL and 324 feedback submissions, so around 14.7% of attendees left feedback. Attendance dropped for 2021/22, with 1247 members of King’s staff signing up for a session. This was to be expected, as face-to-face teaching increased and fewer modules were delivered fully online, but feedback submissions rose slightly to 373, with around 30% of attendees submitting feedback. This is a positive result: although attendance fell by over 40%, the total amount of feedback submitted increased and the percentage rate doubled. However, we need to be aware of several caveats with this data.
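For transparency, the percentages quoted above follow directly from the raw counts:

```python
# Feedback-rate arithmetic behind the figures above.
signups_2021, feedback_2021 = 2209, 324   # 2020/21
signups_2022, feedback_2022 = 1247, 373   # 2021/22

rate_2021 = feedback_2021 / signups_2021 * 100
rate_2022 = feedback_2022 / signups_2022 * 100
print(f"{rate_2021:.1f}% -> {rate_2022:.1f}%")  # 14.7% -> 29.9%
```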

Figure 3: Bookings against feedback submissions for the academic years 2020/21 and 2021/22, displayed as a stacked bar graph: 3456 bookings and 697 feedback submissions, a rate of 20.17%.
  • As previously mentioned, teaching in the academic year 2020/21 was delivered fully online, while 2021/22 saw a gradual increase in face-to-face teaching from January onwards, so the data isn’t fully comparable due to a significant change in circumstances.
  • CTEL ran a total of 29 ‘Breakout Rooms in Microsoft Teams Meetings’ training sessions in 2020/21 with 861 sign-ups, which massively increases attendance data for that academic year. 719 sign-ups occurred in September alone. Breakout rooms were a highly desirable feature of Microsoft Teams Meetings, but the functionality was not robust enough to be rolled out en masse, which may have impacted feedback submissions.
  • As the digital capabilities of King’s staff increased over the months of the pandemic and demands on staff members’ time reduced, staff may have had more time to engage with and submit feedback.
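The response rates quoted above follow directly from the sign-up and submission counts, and can be checked with a few lines of arithmetic:

```python
# Reproducing the response rates reported for each academic year.
signups_2021, feedback_2021 = 2209, 324   # 2020/21
signups_2122, feedback_2122 = 1247, 373   # 2021/22

rate_2021 = feedback_2021 / signups_2021 * 100   # ~14.7%
rate_2122 = feedback_2122 / signups_2122 * 100   # ~29.9%, roughly double

# Combined figures, matching the totals shown in Figure 3.
total_rate = (feedback_2021 + feedback_2122) / (signups_2021 + signups_2122) * 100

print(f"{rate_2021:.1f}% -> {rate_2122:.1f}% (overall {total_rate:.2f}%)")
```

This also confirms the overall 20.17% response rate across 3456 bookings and 697 submissions shown in Figure 3.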

During the academic year 2020/21, CTEL joined other departments across King’s to offer a full suite of training opportunities in delivering teaching online. During that time, a generic feedback form was sent to attendees which was mainly concerned with joining instructions and Teams Meeting links, so we cannot see detailed responses to the questions usually asked on the CTEL feedback form. Based on the 373 feedback responses submitted in 2021/22:

  • ~94% agreed or strongly agreed that they would recommend a CTEL training session to a colleague.
  • ~95% agreed or strongly agreed that the session they attended will have a positive impact on their teaching.
  • ~95% agreed or strongly agreed that taking the session was worth their time.
Figure 4 shows workshop feedback for 2021/22. The top-left graph presents subject-matter understanding before attending the course, and the top-right graph presents understanding after attending, each measured on a scale of novice, basic, proficient, and advanced. The bottom graph presents levels of agreement with a set of statements, ranging from strongly disagree to strongly agree.
Figure 4: Data displayed for 2021/22 quantitative questions.

The above is very positive, as it demonstrates the impact that CTEL’s training sessions are having across the King’s community. The feedback form has not been changed for the academic year 2022/23, so the data can easily be compared in the future.

Based on an overview of the data, the Microsoft Power Automate process appears to be working well: the feedback response rate has held steady even as attendance figures have dropped, and Power BI has been a very useful tool for displaying and filtering feedback data. Session leads have fed back that the Power BI report is beneficial and gives greater insight into the feedback for their sessions, and the CTEL CPD Chair has passed on positive feedback regarding the overall attendance data visualisations. I am currently working on an additional Power BI report that will use row-level security to allow Technology Enhanced Learning Managers across the university to see attendance figures for their own faculties, and to gauge demand for or promote the courses CTEL offers at strategic points in the year.
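In Power BI, row-level security is defined as a role filter in the data model; as a rough illustration of what it achieves, each viewer only ever sees the rows for their own faculty. The sketch below simulates that behaviour in pandas — the viewer emails, faculty names, and figures are all hypothetical:

```python
# Illustrative simulation of row-level security: each Technology Enhanced
# Learning Manager sees only attendance rows for their own faculty.
# All names and numbers below are made up for the sketch.
import pandas as pd

attendance = pd.DataFrame({
    "workshop": ["Breakout Rooms", "KEATS Basics", "Breakout Rooms"],
    "faculty": ["NMES", "IoPPN", "IoPPN"],
    "signups": [40, 25, 30],
})

# Hypothetical mapping of report viewers to the faculty they may see.
viewer_faculty = {
    "manager.nmes@kcl.ac.uk": "NMES",
    "manager.ioppn@kcl.ac.uk": "IoPPN",
}

def visible_rows(user: str) -> pd.DataFrame:
    """Return only the attendance rows the given viewer is allowed to see."""
    return attendance[attendance["faculty"] == viewer_faculty[user]]

# Each manager's total reflects only their faculty's sign-ups.
print(visible_rows("manager.nmes@kcl.ac.uk")["signups"].sum())   # 40
print(visible_rows("manager.ioppn@kcl.ac.uk")["signups"].sum())  # 55
```

The actual Power BI implementation applies the equivalent filter centrally in the report, so one published report can serve every faculty without duplicating data.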

I am pleased that the automated process works and has helped free up time for CTEL staff and helped increase our feedback response rate, but I am dissatisfied that the session leads need to access two pieces of software (Microsoft Excel and Microsoft Power Automate) to get this to work. I am currently investigating whether I can achieve the same results with a Power App (another Microsoft Power Platform application) to improve usability and increase satisfaction.

Written by Dave Busson-Crowe

Dave Busson-Crowe is a Learning Technologist at the Centre for Technology Enhanced Learning and has been involved with Learning Technology in some capacity for approximately 6 years.

He has a keen interest in the use of artificial intelligence in education.