Evaluation and Reflection, News and Events, Pedagogy, Technologies

Exploring the Digital Frontier: Revolutionizing Feedback Delivery with Excel and VBA Macros Part 2

Part 2: My experience and reflections

Part 1 of this blog can be found here: https://blogs.kcl.ac.uk/digitaleducation/exploring-the-digital-frontier-revolutionizing-feedback-delivery-with-excel-and-vba-macros-part-1/

Icons of Microsoft 365 apps

I embarked on this journey a few years ago when I led a large course with more than 700 students and a team of 15 markers. It was challenging for an early-career lecturer to manage the administrative tasks and collaborate with the whole team: standardization, moderation, and creating and uploading feedback documents. The demanding nature of the course left no room for mistakes or human error. Working through the clunky and often user-unfriendly interface of Moodle/Turnitin was another difficult obstacle: marking required an internet connection, and, as many colleagues would agree, we often work better offline. Traditional methods were often time-consuming and inconsistent, resulting in delayed feedback that left students wanting more. The technology-driven solution aimed to address these challenges head-on. When I joined KCL in September 2022, I faced the same problems and was made aware of different initiatives at KBS to improve the feedback and assessment process. I developed my own system over the years and implemented the process during the January 2023 marking season. Over the Spring Term 2023, I refined the process with feedback from colleagues who shared my excitement and interest. In June 2023, I presented at the Festival of Technology to share the practices and implementation strategies for an innovative automation system.

The process involved harnessing the power of Microsoft Excel and VBA Macros within Microsoft Word. These technologies allowed us to streamline and automate feedback delivery. Imagine: no more laborious hours spent typing feedback comments, and no human errors in exporting and uploading the feedback documents to KEATS/Turnitin! Instead, we could focus on providing students with valuable insights to help them excel.
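The actual system is built with VBA Macros in Word driven by an Excel workbook (see the SharePoint guide for the real code). Purely as an illustration of the underlying mail-merge idea, here is a minimal Python sketch: each spreadsheet row becomes one formatted feedback document. The file names, column headings, and output format below are invented for illustration and do not reflect the real workbook.

```python
# Illustrative sketch only: expand one spreadsheet row per student into a
# formatted feedback file. Column names and template are assumptions, not
# the actual Excel/VBA workbook layout described in this post.
import csv
import io
from pathlib import Path

TEMPLATE = (
    "Feedback for {sid}\n"
    "Mark: {mark}\n"
    "Comments:\n{comments}\n"
)

def generate_feedback(csv_text: str, out_dir: Path) -> list[Path]:
    """Write one feedback document per spreadsheet row; return the paths."""
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        path = out_dir / f"{row['student_id']}_feedback.txt"
        path.write_text(TEMPLATE.format(
            sid=row["student_id"],
            mark=row["mark"],
            comments=row["comments"],
        ))
        written.append(path)
    return written

if __name__ == "__main__":
    sample = (
        "student_id,mark,comments\n"
        "A123,72,Clear argument; tighten the literature review.\n"
        "A124,58,Good data work; the conclusion needs development.\n"
    )
    files = generate_feedback(sample, Path("feedback_out"))
    print(f"Wrote {len(files)} feedback files")
```

In the real system, the same loop lives in a Word VBA macro that reads the marking workbook and produces fully formatted Word documents ready for upload, which is where the time saving and the elimination of copy-paste errors come from.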

Screenshot of Digital Skills Hub

**Challenges Faced:**

Of course, no transformative journey is without its challenges. Some educators were initially resistant to change, finding the prospect of learning VBA Macros daunting. Additionally, ensuring the new system was compatible with various devices and platforms presented a technical hurdle. As I mentioned in the guidance (see my SharePoint folder), the set-up and troubleshooting at the beginning can be quite a challenge, particularly for colleagues using macOS (less so for Windows users). Compatibility issues were addressed through rigorous testing and continuous monitoring of system performance. Clear communication with your marking team is also needed to make sure everyone is on the same page with the new system.

But I promise it’s worth the effort, and subsequent use will be much smoother sailing. And from a marker’s perspective, it is much less work than working through the traditional channels.

The journey from traditional feedback systems to an automated approach using Excel and VBA Macros has been nothing short of transformative. It’s a testament to the power of technology in education, where innovative solutions can overcome challenges and improve the overall learning experience.

As we continue this path of exploration and adaptation, the future of feedback delivery looks brighter than ever, pointing towards improved student satisfaction and educational outcomes. I hope that a wider adoption of the process could help deliver more insightful and time-effective feedback to our students, thereby addressing the burning issues identified in student surveys, as well as improving the quality of feedback and the student experience, as identified in King’s Vision 2029 and the TEF framework.

Screenshot of TEF award

It takes time and communication with colleagues to identify compatibility issues and resolve them. So far, the method has been used by six Economics courses at KBS and two at the University of Glasgow; colleagues from Marketing and Adult Nursing have also expressed interest in using it in their courses.
It is definitely not perfect, and I am very much looking forward to feedback, comments, and of course successful implementations by colleagues.

The blog discusses a transformative journey in education, initiated during The Festival of Technology 2022 at KCL. It explores the adoption of Excel and VBA Macros within Microsoft Word to revolutionize feedback delivery. The main reasons for this change were to enhance feedback quality and efficiency, addressing challenges like resistance to change and compatibility issues. Through workshops, ongoing support, and rigorous testing, the adoption of technology resulted in a more efficient, user-friendly, and collaborative feedback system, empowering educators and improving the overall learning experience.

I would like to thank KBS colleagues, Jack Fosten, Dragos Radu, and Chahna Gonsalves for their encouragement, important suggestions and feedback as well as allowing me to pilot the process in their modules. I also thank various colleagues across other faculties for providing feedback and suggestions as well as identifying compatibility issues (with solutions).

For additional resources, including the workshop slides and a detailed guide with relevant codes and FAQs, please refer to the SharePoint folder linked here.

I am a Lecturer in Economics at the Department of Economics, King’s Business School, King’s College London. I am also an academic supervisor at the Institute of Finance and Technology, University College London, and a chief examiner for the University of London International Programme (Econometrics). Before joining King’s, I lectured and conducted research at the London School of Economics as an LSE Fellow in Economics, and at the University of Warwick as a postdoctoral fellow (in Economics). I completed my PhD in Economics at the University of Nottingham in 2018.

I have lectured courses in econometrics and macroeconomics at King’s, LSE, and Warwick, and led seminars (tutorials) in various courses at Nottingham. Since March 2023, I have been the GTA Lead at King’s Business School.

Evaluation and Reflection, News and Events, Pedagogy, Technologies

Exploring the Digital Frontier: Revolutionizing Feedback Delivery with Excel and VBA Macros Part 1

Part 1: The practical guide

In today’s rapidly evolving digital age, the need for efficient and effective systems in education is more pronounced than ever. Traditional platforms like Moodle and Turnitin have served us well, but as educators, we must acknowledge their limitations in providing timely, user-friendly, and collaborative feedback on assignments, exams, and dissertations.

This tutorial aims to be your guiding light towards a better, more streamlined approach to feedback delivery. Drawing upon my workshop presented during The Festival of Technology 2022 at KCL, where I shared practical insights and implementation strategies for this automation system, we’ll delve into the fascinating world of Excel and VBA Macros within Microsoft Word. This comprehensive resource builds upon the principles discussed in that workshop.

By embarking on this journey, you’ll equip yourself with the skills and knowledge to revolutionize your approach to feedback giving. Here’s what you stand to gain:

1. **Efficiency:** Say goodbye to the laborious and time-consuming task of manually providing feedback using KEATs (Moodle/Turnitin). With Excel and VBA Macros, you’ll learn how to automate the process, saving valuable time that can be redirected towards more meaningful feedback and interactions with your students.

 

Screenshot illustrating the purpose of the document
Picture 1: Our Aims and Objectives

2. **User-friendliness:** Discover how to create user-friendly feedback documents for yourself, the marking team, and your students. Your feedback system will become intuitive and accessible, ensuring that learners can easily understand and act upon your comments with a nicely formatted feedback document.

A screenshot showing the step by step summary for collecting marking and feedback
Picture 2: A summary of steps

3. **Collaboration:** Break free from the constraints of limited collaboration within traditional systems. The method will allow a marking team to efficiently collaborate and moderate, making feedback delivery a seamless and cooperative effort.

Screenshot of marking folder contents
Picture 3: What the marking folder looks like. It is shareable with the marking team

4. **Comprehensive Feedback:** Dive into the world of detailed and constructive feedback. You’ll gain the expertise to provide tailored insights that empower students to excel in their academic pursuits.

Screenshot of excel file showing comments
Picture 4: What a short comment looks like. Totally customizable.

This tutorial isn’t just about learning a new tool; it’s about transforming your approach to education. By mastering Excel and VBA Macros for feedback delivery, you’ll become a more effective educator, making a lasting impact on your students. The system will:
– Enhance your teaching methods, creating a more engaging and supportive learning environment.
– Free up time previously spent on administrative tasks or dealing with Turnitin/KEATS for more meaningful activities, such as preparing feedback comments and communicating with your team and students.
– For repeated courses/assessments, you can prepare a bank of model comments for later use, as well as a record of common mistakes and suggestions for improvement to communicate to students.
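To make the comment-bank idea above concrete, here is a minimal sketch (in Python rather than the VBA the actual system uses) of how short marker codes entered in a spreadsheet column could be expanded into full model comments. The codes and comment text are invented for illustration.

```python
# Hypothetical sketch of a "bank of model comments": markers enter short
# codes in the marking spreadsheet and each code expands to a full comment.
# The codes and comment text below are invented for illustration.
COMMENT_BANK = {
    "REF": "Referencing needs attention: follow the Harvard style guide.",
    "STR": "The essay structure is clear and easy to follow.",
    "ARG": "Develop the central argument further with supporting evidence.",
}

def expand_codes(codes: str) -> str:
    """Turn a semicolon-separated list of codes into full feedback text."""
    lines = []
    for code in codes.split(";"):
        code = code.strip().upper()
        if code:
            # Fall back to showing the raw code so unknown entries are
            # visible to the marker rather than silently dropped.
            lines.append(COMMENT_BANK.get(code, f"[unknown code: {code}]"))
    return "\n".join(lines)

print(expand_codes("str; arg"))
```

Because the bank lives in one shared file, the whole marking team draws on the same phrasing, which also helps with standardization and moderation across markers.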

Screenshot of excel sheet
Picture 5: What the end product of a long feedback document looks like. Totally customizable.

Education is a dynamic field, and keeping pace with technological advancements is essential. The automation possibilities offered by Excel and VBA Macros are not just practical but also intriguing. Discovering how to harness these tools to optimize your feedback process can be genuinely exciting.

Screenshot displaying cautions with using excel
Picture 6: A few cautions?

For additional resources, including the workshop slides and a detailed guide with relevant codes and FAQs, please refer to the SharePoint folder linked here. This tutorial serves as a bridge between the insights shared during the workshop and the practical implementation of an automated feedback system. It’s an opportunity to further explore and master these valuable techniques, all while enhancing the overall learning experience for students. Join us as we embark on this transformative journey together.

Part 2 of this blog can be found here: https://blogs.kcl.ac.uk/digitaleducation/exploring-the-digital-frontier-revolutionizing-feedback-delivery-with-excel-and-vba-macros-part-2/

Evaluation and Reflection, Technologies

Accepting Multiple Assignment Attempts

KEATS (Moodle) allows assignment submissions in many ways – this is a record of how a simple question became an extended investigation.

Academic Staff Requirements

“Can I check what my students have previously uploaded?”

An academic colleague had used Blackboard (another Virtual Learning Environment) before coming to the Faculty of Natural, Mathematical & Engineering Sciences (NMES) at King’s. He asked if KEATS, our Moodle instance, could behave like Blackboard and allow students to submit multiple attempts to a programming assignment any time they want.

After a follow-up call with the academic colleague, it became clear that the aim was to be able to access anything students had uploaded prior to their final submission, as the latter might contain a wrong or broken file, and grade with reference to a previous submission (programme code or essay draft).

He had the following requirements:

  • Notification emails to both staff and students when a file is uploaded successfully
  • Students to be able to submit as often as they want
  • Marker to be able to review all uploaded attempts to:
    • be able to award marks if an earlier submitted programme code worked fine but a later submission introduced bugs breaking the programme, and
    • monitor the programming code development and make comments, compare changes, and to prevent collusion

This investigation looks at practical solutions to administering programming assignments as well as non-programming ones such as essays.

Background: Assessments on Blackboard (VLE)

In a Blackboard assignment, students are required to click a Submit button for the markers to access their work. If multiple attempts are allowed, students can submit further attempts at any time; these are stored as Attempt 1, Attempt 2 and so on, and are available for the markers to view. This way, staff can review previous submissions; however, they cannot access drafts.

Screenshot of a Blackboard assignment that allows multiple attempts
Blackboard assignment allows multiple attempts

KEATS: Moodle Assignment

The first step was to investigate the options and settings in Moodle Assignment, which is the tool that was already used by most colleagues for similar assignments.

With our current default settings, students can make changes to their uploaded files as much as they want, and submission is finalised only at the assignment deadline. Although instructors can see the latest uploaded files (draft) even before the deadline, files removed/replaced by students will no longer be accessible to staff. This means only one version is accessible to markers.

Multiple submissions can be enabled with the “Require student to click the Submit button” setting, allowing staff to review previous attempts, as on Blackboard. Feedback can be left on each attempt. However, students cannot freely submit new attempts because staff need to manually grant additional attempts to each student. Submissions are time-stamped and can be reviewed by students and markers, but students only receive notification emails after grading, whereas markers can receive notifications for submissions. This did not fully resolve our problem.

Screenshot of a marker accessing unsubmitted drafts and leaving feedback
Marker accessing unsubmitted drafts and leaving feedback
Screenshot of a student reviewing feedback for different attempts
Student reviewing feedback for different attempts.
Screenshot of a student reviewing feedback for different attempts (cropped version with highlight).
Student reviewing feedback for different attempts (cropped version with highlight).

KEATS: Moodle Quiz

We then considered Moodle Quiz, which some departments at King’s already use to collect scanned exam scripts: a Quiz containing an Essay-type question that allows file upload.

Screenshot of quiz attempt.

While exams usually allow only a single attempt, Moodle Quiz can be set to allow multiple attempts (Grade > Attempts allowed). The “Enforced delay between attempts” setting (from seconds to weeks) under “Extra restrictions on attempts” may be used to avoid spamming of attempts. Students can submit new attempts as often as needed because no staff intervention is required. The drawback is that there are no submission notification emails, but the quiz summary screen should indicate to the student that the file has been submitted. The Quiz attempts page allows markers to easily review previous attempts and leave feedback on each attempt. It is also possible to download all submissions, as in Moodle Assignment. This was recommended to the academic colleague as an interim solution while we continued the investigation.

Possible Policy Concerns

Regarding unlimited re-submissions, Quality Assurance colleagues reminded us that students may challenge (i) a perceived inequality in opportunities to get feedback, or (ii) subconscious bias based on previous submissions. Good communication with students and a structured schedule or arrangements should improve expectations from both sides.

Turnitin Assignment and Other Assessment Options

Although the Moodle Quiz appeared to be a solution, we also considered other tools, some of which are readily integrated with KEATS at King’s:

Turnitin assignment allows multiple submissions as an option, but re-submissions will overwrite previously uploaded files. Alternatively, if it is set to a multi-part assignment, each part will be considered mandatory. However, the workflow for Turnitin assignment is not optimal for programming assignments.

Turnitin’s Gradescope offers Multi-Version Assignments for certain assignment types. It is available on KEATS for the Faculty of Natural, Mathematical, and Engineering Sciences (NMES). However, its programming assignment does not support assignment versioning yet.

Edit history is available for Moodle Wiki and OU Wiki; whereas Moodle Forum, Open Forum, Padlet and OU Blog allow continuous participation and interaction between students. These tools could be useful for group programming or other social collaborative learning projects, which is not a direct replacement for an individual programming assignment but an alternative mode of assessment.

Portfolios: Mahara has Timeline (version tracking) as an experimental feature. This may be suitable for essays but not for programming assignments.

Tracking Changes

Tracking changes is an important feature to show development in programming assignments or essays, and cloud platforms (OneDrive, Google Drive, GitHub) can host files and track changes. When used for assignments, students can submit a Share link to allow instructors to access and assess their work and how it evolved over time. The disadvantage of this option is that the grading experience is less integrated with Moodle. Some cloud platforms offer a File request feature whereby students can submit their files to a single location.

Programming Assignments

Industries such as software development use Git as a standard and all changes are tracked. GitHub offers GitHub Classroom, and it can be used with different VLEs including Moodle, but it is not readily integrated with KEATS and requires setup. There may be privacy concerns as students need to link their own accounts.

The Outcomes / Lessons learnt

  • This showcases how a simple question from an academic colleague can lead to the exploration of various existing options as well as new tools and solutions.
  • Different options are available on KEATS with their pros and cons.
  • Existing tools, possible solutions, policies, and other considerations come into play.

Conclusion / Recommendations

KEATS Quiz matches the case requirements and was recommended to the academic colleague. The rollout went smoothly: our colleague mentioned there were no complaints from students, and they were happy with the recommended solution. It is relatively easy to set up and straightforward for students to submit to. Clear step-by-step instructions for staff and students should be enough, but trialling this with a formative assignment would also help.

Depending on the task or subject nature, other tools may work better for different kinds of tasks. TEL colleagues are always there to help!


Useful Links


Written by Antonio Cheung

Antonio is a Senior TEL Officer at the Faculty of Natural, Mathematical and Engineering Sciences (NMES).

September 2023

Evaluation and Reflection

KEATS Similarity Checker Project

Overview of project

Between July 2022 and February 2023, the SSPP TEL team conducted a pilot project to improve the student experience when submitting assignments by creating a special area for students to check the plagiarism/similarity score of their assignments. The goal of the pilot was to make it easier for students with Mitigating Circumstances and the Programme Office staff to manage the process of submitting assignments to KEATS.

Any student who is not subject to Mitigating Circumstances can submit a draft and/or reupload their submission as many times as they wish up to the assessment’s original due date. Many students use this opportunity to submit a draft to check their similarity score before they make their final submission. At the moment, due to technical limitations within KEATS/Turnitin, students who are granted an extension to an assessment via the Mitigating Circumstances process cannot submit a draft to check their similarity score; they are only allowed to submit once, and after the due date for the assignment passes they no longer have the option to upload their final version.

This is particularly problematic for students who have submitted a draft (sometimes long before the original due date) and then realise they need to apply for Mitigating Circumstances: as they are not able to delete the draft themselves, this draft will be considered their final submission and their MC claim may be rejected on the basis that they have already made a submission. In some departments, PS staff sometimes agree to submit and/or delete a draft for a student, but this is time-consuming, not consistently applied, and relies too much on PS staff being available and inclined to help outside of their normal duties; it is also not sustainable given the very high number of MC claims we process at the moment.

First Steps

The departments of Geography and Global Health and Social Medicine in the Faculty of SSPP took part in the initial pilot project for their re-sit and dissertation students, and the Similarity Checker area was created and placed on their Handbook pages on KEATS. Accompanying it was a video and PDF to explain to students how to use it, as well as a warning text to reinforce the idea that this did not count as a submission and would not be checked by staff.

Feedback from this small cohort of students led to some revisions and changes to the Similarity Checker, the most notable of which was around the language used. We had used the words “test area”, meaning to check or trial something, but students for whom English was not their native language found this confusing and equated “test” to mean exam. This was revised and the wording was changed from “test submission area” and “test area” to “Similarity Checker” and “practice area” respectively.

Once we were happy with the revisions, the Similarity Checker was then rolled out to the rest of the School of Global Affairs, War Studies, and Education, Communication and Society. All Similarity Checker areas have the same layout, wording and instructions for parity across all the Schools. Communications for staff and students were also created by Soshana, and these were used by Departments to make students and academic staff aware of the existence of the Similarity Checker.

Layout

The Similarity Checker is made up of several parts: an introductory text explaining what it is for, how to use it, and a disclaimer that nothing submitted there would ever be moved or assessed. An explainer video and PDF instructions were added to ensure accessibility and inclusive design were adhered to, so that all students would be able to clearly understand the functionality.

Screenshot of the home screen of the similarity checker.
Screenshot of the geography similarity checker.

The submission areas were divided by level and surname. There is no functional necessity for this, but it aims to prevent Turnitin from becoming overloaded when all students in a department try to access it at the same time. If students submit in the wrong area, there is no effect on their score or submission.

Screenshot of the different Turnitin Submissions.
Screenshot of the different Turnitin Submissions.

Student Feedback

A survey was created by Soshana and shared with all participating Schools, receiving almost 100 responses. Feedback was generally positive, with students highlighting how the Similarity Checker improved their experience and confirming that it constitutes an equalising factor for students with extensions. Overall, 90% of respondents had used the Similarity Checker, 93% found it useful, and 16% used it in the context of an assessment extension (mitigating circumstances). There was also some negative feedback from students who did not find it particularly beneficial, mainly due to the long turnaround time for their score after their third submission, as well as the fact that their score changed repeatedly when uploading a new draft of the same work, depending on how close the assessment due date was. These concerns will be addressed, and elements of response will be provided in future communications.

Overview of survey respondents.
Overview of survey respondents.
Respondents usage by level of study.
Respondents usage by level of study.
Respondents use of the Similarity Checker.
Respondents use of the Similarity Checker.

Conclusion and next steps

The pilot project was a successful start to improving the experience of students and staff using KEATS and Turnitin during their submission period. This was initially to improve the experience of those with Mitigating Circumstances, but we can see that many students without extensions are also using it to check their work.

Next steps will include rolling this out further to other Schools or Departments so that all students in SSPP can access it. Some Departments have their own versions, which we would like to replace with this more modern iteration of the Similarity Checker.

As next steps, the TEL team would like to address some of the points that the students raised as part of the feedback process, and create a communications plan to ensure this is being communicated to students at all relevant points of the academic year.

An all-Faculty policy should also be drawn up on how to deal with cases where a student submits their paper to the Similarity Checker instead of their module page.


Written by Leanne Kelly

Leanne is the Digital Education Manager for the Faculty of Social Science and Public Policy (SSPP) at King’s College London. She is responsible for a wide range of digital education processes within the Faculty including instructional design, accessibility, training, innovation and developing new online programmes.

She has a background in publishing and eLearning, and is passionate about using technology to improve the learning experience and make it more accessible to all. She is interested in developing new ways of working, scaling projects and reusing content in new ways, and making online learning an enjoyable process for all.

Written by Soshana Fearn


Soshana is the Senior Postgraduate Programme Officer for the Department of Geography (SSPP) at King’s College London. She delivers the day-to-day administration of taught postgraduate programmes (Masters), offers comprehensive and authoritative advice and support for all staff and students in respect of programme regulations and curriculum choices, services the relevant boards and committees, and oversees the processing of Mitigating Circumstances requests.

She has a background in project coordination and is dedicated to improving the experience of both students and staff through the development and implementation of streamlined, innovative solutions, including projects related to institutional processes, policymaking and technology-enhanced learning resources.


 

Evaluation and Reflection, Technologies

Introducing CMALT programme at King’s College – Part 1

In September 2021 King’s launched its first CMALT (Certified Membership of the Association for Learning Technology) programme cohort aimed at helping 15 colleagues put together an evidence-based portfolio in order to gain CMALT accreditation.

What is CMALT?
Certified Membership of the Association for Learning Technology (CMALT) is the learning technologist’s ‘kitemark’. This professional certification (and membership) recognises your expertise and experience in your field. Benefits to candidates include reflection on their professional practice, mentoring from experienced colleagues and peer-to-peer support. CMALT also increasingly appears on TEL job specifications, so gaining CMALT (and with it post-nominal letters which you are then allowed to use) provides both CPD and career-development opportunities. You join an established community of practice, are invited to ALT meetings and events, and can view and contribute to publications.

The CMALT Accreditation Framework provides pathways to peer-assessed accreditation for a cross-section of learning technology-focused professionals, educators and administrators in the UK and internationally. Accreditation is achieved by the successful submission of a reflective, online portfolio, which evidences skills and experience in learning technology across four core areas and a specialist area. There are three different pathways to choose from to best match an individual’s experience: Associate Certified Member, Certified Member and Senior Certified Member.

First steps:
•  Having joined King’s in 2018, I quickly realised I was one of only a few colleagues in the institution who held CMALT accreditation. Given the size of King’s and the number of TEL colleagues, I wanted to see if I could support colleagues in gaining CMALT recognition.
•  I originally attempted to launch a pilot programme in late February 2019 with a small group of colleagues from my team in CTEL; however, the Covid pandemic hit and efforts were prioritised elsewhere.
•  In early 2021, I came together with three other colleagues (David Reid Matthews, Danielle Johnston and Fariha Choi) to resurrect the CMALT programme. We formed the CMALT planning team to create a year-long programme and agreed to become mentors to the colleagues taking part.
•  We successfully bid for funding from the Students and Education Directorate (SED) for up to 20 places on the new programme (see CMALT registration fees for more info).
•  We began sharing information about the new programme and asked interested parties to complete a show of interest form (Google Form).
•  In the summer of 2021, we invited all interested colleagues to a one-hour online (MS Teams) CMALT Information Session to provide further details about the accreditation and what the programme entailed. Of the 25 who attended, 15 signed up for the 2021/22 CMALT programme, with the remainder either deferring to the following academic year or withdrawing their interest.

Make-up of the first cohort:

Pie chart showing the distribution of the 15 programme members: Faculty TEL 8, CTEL 5, Other 2.

The programme schedule
•  The programme would start in September 2021 with each month focusing on a section of the portfolio, delivered purely online via MS Teams. We provided two months (December and June) when no meetings took place to allow colleagues to catch up.

Here is the full schedule of our CMALT programme:

For more detail please refer to Part 2 of this blog:

https://blogs.kcl.ac.uk/digitaleducation/?p=1511&preview=true

 

Written by Sultan Wadud and David Reid Matthews

Wadud works as a Learning Technologist, Faculty Liaison at CTEL, working closely with Academic Faculties and Departments to support and drive the implementation of the King’s Technology Enhanced Learning ‘Transformation in Digital Learning’ strategy.
Wadud supports the management and delivery of multiple projects aimed at both the development of academics’ pedagogic understanding and the practice of technology enhanced learning.
Wadud is the product owner for Kaltura and one of the leads for the CMALT programme at King’s. In addition, Wadud oversees the Digital Education Blog.

David is the TEL Manager for Arts & Humanities and joined King’s in 2018. He leads a team of learning technologists supporting a large and complex faculty, providing mainly 2nd line support, training and troubleshooting on our core, recognised and recommended TEL tools. David has worked in learning technology since 2011, having previously (and improbably) been a Lecturer in Theatre Studies. His particular interests are in legislation and policy around TEL, as well as IT Service Management and Delivery. David is one of the leads for the CMALT programme at King’s.

Evaluation and Reflection, Pedagogy, Technologies

Introducing CMALT programme at King’s College – part 2

This is Part 2 of Introducing CMALT programme at King’s College (Read part 1)

Resources and interactions
• We created a Moodle site to host all the information relating to CMALT accreditation, providing resource links, session recordings and presentations so that colleagues could refer back to or catch up on anything they had missed.

KEATS page for CMALT programme

• In addition to the Moodle area, we set up a Microsoft Teams site, allowing us to send general announcements, plan meetings and host private areas for mentor groups:


Example of Teams announcement to Cohort 1:

Mentor support
• Whilst each meeting had an opportunity for colleagues to have shared contact time with their mentors, additional mentor support was provided on an ad hoc and individual basis. In Cohort 1, some colleagues utilised this consistently throughout the programme whilst a few left it to the end to seek help.
• 93% of Cohort 1 either strongly agreed or agreed their mentors facilitated appropriate discussion and reflection throughout the programme:

• 12 out of the 15 colleagues took the opportunity and connected with their mentors outside of the monthly Teams meetings.

Cohort 1 completion
• Overall, we had 14 submissions to ALT with one colleague deciding to re-join the programme with Cohort 3.
• We received feedback from all colleagues who took part in the programme, with the majority offering positive feedback. Nearly all colleagues fed back that the frequency (monthly) and length (one hour) of the sessions were just right.
• The majority utilised the Moodle areas during their time on the programme.
• “Being a part of a cohort was great and enabled me to work collaboratively/share ideas with others on this project. However, starting very early on in the process without the pressure of fixed deadlines meant I probably took it too easy, so having deadlines for (formative) feedback in the 6 months run-up to our submission date would have been helpful”
• We took this feedback on board and introduced two draft deadlines: sections 1-2 by January and sections 2-3 by late March. In addition, we encouraged colleagues not to leave it to the end to seek help, and to have regular contact with their mentors.
• Moodle discussion board – apart from two posts, the majority of interactions were taking place in our Teams areas. For Cohort 2 we decided to remove the Moodle discussion board and replace it with one in the Teams area.

Cohort 2 and beyond
• In 2022 we expanded the programme for Cohort 2 to include all three pathways of CMALT which resulted in 22 signups (x18 CMALT, x3 Associate CMALT and x1 Senior CMALT).
• If funding is provided for a third cohort, we will offer the Senior CMALT pathway more widely, as for Cohort 2 it was only available to the mentors.
• King’s has recently applied to run CMALT accreditation in-house, which we hope will allow us to provide quicker assessment and feedback turnaround.
• ALT requires CMALT holders to refresh their portfolios after three years of obtaining accreditation. This is something the planning team is anticipating offering to the first cohort in 2026.
• The long-term aspiration of the CMALT programme at King’s is for it to become self-seeding after the first few years. We have already had one Cohort 1 candidate who has become a Cohort 2 mentor, and, in future, we expect CMALT holders to move up the CMALT pathways once they gain more experience as well as come back to mentor and support the next generation.

Written by Sultan Wadud and David Reid Matthews


Evaluation and Reflection, Technologies, Uncategorized

Using Microsoft Power Platform to Support Staff Development

This blog post is a follow-up to the post Using Automation to Facilitate Flipped Learning. 

Due to the success of using Microsoft Power Automate to support flipped learning for an advanced KEATS training session, the Power Automate process was rolled out to all training sessions offered by the Centre for Technology Enhanced Learning (CTEL), and other elements of the Microsoft Power Platform have been introduced. This blog post will outline the processes, successes, and challenges.

Rollout of Power Automate Process 

Once the decision was made by the CTEL CPD Chair to roll out the Power Automate process to all CTEL training sessions, we needed to decide how this would work practically. The established process, or flow, used for the KEATS: Personalising the Learning Experience training session was built specifically for that session and was the only one that included pre-session work. We navigated around this by creating individual flows for each of the sessions run by CTEL and removing and adding steps as appropriate. The Assistant Learning Technologist for the team and I worked closely with the session leads, customising and creating the flows and encouraging personalisation of their flows to better reflect their sessions, such as adding attachments and editing email text.

Another risk was that the flows would reference the same Microsoft Excel spreadsheet hosted on a SharePoint site, and with multiple people accessing and changing the data, flows could be affected and send out emails at the wrong time. We navigated around this by creating individual spreadsheets and pointing the flows to the relevant spreadsheets for each session, which allowed for further customisation from session leads if desired. 

Flows are usually triggered 2 working days before the session is scheduled, though this can vary if the pre-session work requires more time to complete. The core template automatically completes 5 steps when triggered:

  1. Creates a Microsoft Teams Meeting and invites all attendees to the meeting. 
  2. Sends an email containing further information about the session if needed. This is on a 5-minute delay to allow for any manual intervention should any mistakes be made in step 1. 
  3. Sends an email to participants as soon as the scheduled session is finished containing links to further resources and a request for feedback to be left via the Microsoft Form. 
  4. Sends an email to participants 2 working days after the session has been completed, asking for feedback if none has been left.
  5. Sends an email 10 working days after the session to participants to ask if they have attempted any content that was covered in the session and if they have any success stories to share or need any further support. 

The above template is the core, and session leads can add further steps to their own flows as relevant. Working days are calculated by formulas in the Excel spreadsheet and are used to increase the response rate, rather than sending emails out over a weekend or bank holiday, when they can easily be ignored.
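In our setup the working-day arithmetic lives in Excel formulas, but the logic itself is simple enough to sketch. As an illustration only (the bank-holiday dates below are hypothetical placeholders, not the list our spreadsheet uses), the offset calculation looks like this in Python:

```python
from datetime import date, timedelta

# Hypothetical bank-holiday list; in practice these come from the spreadsheet.
BANK_HOLIDAYS = {date(2022, 12, 26), date(2022, 12, 27), date(2023, 1, 2)}

def add_working_days(start: date, offset: int) -> date:
    """Step forward (or back, for a negative offset) by whole working days,
    skipping weekends and bank holidays."""
    step = 1 if offset >= 0 else -1
    remaining = abs(offset)
    current = start
    while remaining > 0:
        current += timedelta(days=step)
        # Count the day only if it is Mon-Fri and not a bank holiday.
        if current.weekday() < 5 and current not in BANK_HOLIDAYS:
            remaining -= 1
    return current

# The trigger point 2 working days before a Monday session falls on the
# previous Thursday, since the weekend is skipped.
session = date(2023, 1, 9)  # a Monday
print(add_working_days(session, -2))  # 2023-01-05
```

The same pattern gives the post-session emails their 2- and 10-working-day delays by passing positive offsets.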

Displaying Feedback in Power BI 

The Power Automate process outlined above is fairly simple in terms of its structure and aims; it sends out emails at predetermined times based on the date and time of the relevant session, which is calculated in the Microsoft Excel spreadsheet. Each email sent to attendees once the session has been completed contains a link to CTEL’s feedback form. Once feedback is submitted, it is collected, anonymised, and stored. I designed a Power BI report to visualise the quantitative and qualitative data submitted, demonstrating the impact of the sessions and assisting each session lead in making changes using the free-text submissions.

Overview of CTEL feedback, organised by session title, calendar month and faculty. Top left: workshops ranked by volume of feedback received. Top right: volume of feedback by faculty. Bottom left: average and peak feedback per calendar month. Bottom right: drill-down by workshop date, with overall feedback of 373 out of 746.
Figure 1: The Power BI report for feedback submitted for CTEL training sessions. Data is organised by session title, calendar month, and by faculty. 373 pieces of feedback were left for 2021/22.

After organising this data and gaining experience in Power BI, I was able to link it to attendance data extracted from SkillsForge, gaining insights into our historical CPD attendance and how it relates to our feedback submissions. One advantage of this was being able to see whether the flows had an impact on gathering feedback for our sessions.

Figure 2 presents attendance at CTEL training sessions, organised by calendar month, attendance type and faculty. The panels show: an overview of attendance for the 2021/22 academic year, with 1243 sign-ups out of 2486 across 166 sessions; sign-ups by workshop over time, per month; sign-ups by faculty, ranging from 0 to 200+; and attendance types by session for each workshop.
Figure 2: The Power BI report for attendance at CTEL training sessions. Data is organised by calendar month, attendance type by sessions, and by faculty.

Key findings revealed that attendance for 2020/21 was significantly high, with 2209 members of King’s staff signing up for a session offered by CTEL and 324 feedback submissions, meaning around 14.7% of attendees left feedback. Attendance dropped for 2021/22, with 1247 members of King’s staff signing up for a session. This is to be expected, as we saw an increase in face-to-face teaching and fewer modules delivering fully online teaching; however, feedback submissions increased slightly to 373, with around 30% of attendees submitting feedback. This is a positive result: although attendance figures fell by over 40%, the total amount of feedback submitted increased and the response rate roughly doubled. However, we need to be aware of several caveats with this data.
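The response-rate figures above are simple ratios of feedback submissions to sign-ups; a quick sanity check, using only the counts quoted in the text, confirms them and the combined total shown in Figure 3:

```python
# Feedback response rates per academic year, using the sign-up and
# submission counts quoted in the text.
years = {
    "2020/21": {"signups": 2209, "feedback": 324},
    "2021/22": {"signups": 1247, "feedback": 373},
}

for year, d in years.items():
    rate = d["feedback"] / d["signups"] * 100
    print(f"{year}: {rate:.1f}% of attendees left feedback")
# 2020/21: 14.7%; 2021/22: 29.9% — the response rate roughly doubled.

# The combined figures match the stacked totals in Figure 3.
total_signups = sum(d["signups"] for d in years.values())    # 3456
total_feedback = sum(d["feedback"] for d in years.values())  # 697
print(f"Combined: {total_feedback / total_signups * 100:.2f}%")  # 20.17%
```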

Figure 3 shows a stacked bar graph of bookings against feedback submissions for the academic years 2020/21 and 2021/22: 3456 bookings and 697 feedback submissions, a rate of 20.17%.
Figure 3: Data displayed in a stacked bar graph of bookings against feedback submissions for the academic years 2020/21 and 2021/22.
  • As previously mentioned, the teaching in the academic year 2020/21 was delivered fully online. 2021/22 saw a gradual increase in face-to-face teaching from January onwards, so this data isn’t 100% comparable due to a significant change in circumstances.
  • CTEL ran a total of 29 ‘Breakout Rooms in Microsoft Teams Meetings’ training sessions in 2020/21 with 861 sign-ups, which massively increases attendance data for that academic year. 719 sign-ups occurred in September alone. Breakout rooms were a highly desirable feature of Microsoft Teams Meetings, but the functionality was not robust enough to be rolled out en masse, which may have impacted feedback submissions.
  • As the digital capabilities of King’s staff increased over the months of the pandemic and demands on staff members’ time reduced, staff may have had more time to engage with and submit feedback.

During the academic year 2020/21, CTEL joined other departments across King’s to offer a full suite of training opportunities in delivering teaching online. During that time, a generic feedback form was sent to attendees, which was mainly concerned with joining instructions and Teams Meeting links, so we cannot see detailed responses to the questions usually asked on the CTEL feedback form. Based on the 373 feedback responses submitted in 2021/22:

  • ~94% agreed or strongly agreed that they would recommend a CTEL training session to a colleague.
  • ~95% agreed or strongly agreed that the session they attended will have a positive impact on their teaching.
  • ~95% agreed or strongly agreed that taking the session was worth their time.
Figure 4 shows feedback on workshops for 2021/22. Top left: subject matter understanding before attending the course, measured from novice, basic and proficient to advanced. Top right: subject matter understanding after attending the course, on the same scale. Bottom: agreement with a set of statements, ranging from strongly disagree to strongly agree.
Figure 4: Data displayed for 2021/22 quantitative questions.

The above is very positive, as it demonstrates the impact that CTEL’s training sessions are having across the King’s community. The feedback form has not been changed for the academic year 2022/23, so data can be easily compared in future.

Based on an overview of the data, the Microsoft Power Automate process appears to be working well, as the feedback response rate has improved even as attendance figures have dropped, and Power BI has been a very useful tool for displaying and filtering feedback data. Session leads have fed back that the Power BI report is beneficial and allows greater insight into the feedback for their sessions, and the CTEL CPD Chair has passed on positive feedback regarding the overall attendance data visualisations. I am currently working on an additional Power BI report that will utilise row-level security to allow Technology Enhanced Learning Managers across the university to see attendance figures for their own faculties and tailor demand or promote courses that CTEL offers at strategic points in the year.

I am pleased that the automated process works and has helped free up time for CTEL staff and helped increase our feedback response rate, but I am dissatisfied that the session leads need to access two pieces of software (Microsoft Excel and Microsoft Power Automate) to get this to work. I am currently investigating whether I can achieve the same results with a Power App (another Microsoft Power Platform application) to improve usability and increase satisfaction.

Written by Dave Busson-Crowe

Dave Busson-Crowe is a Learning Technologist at the Centre for Technology Enhanced Learning and has been involved with Learning Technology in some capacity for approximately 6 years.

He has a keen interest in the use of artificial intelligence in education.

Evaluation and Reflection

King’s Language Centre Approach to Digital Education Accessibility

A holistic approach was adopted by the King’s Language Centre (LC) to comply with the legal requirements of accessibility. The Senior Leadership Team (SLT) has been instrumental in driving the change not only with the academics, but also with members of the Professional Service team (PSS). The approach combined top-down and bottom-up strategies that have been successful in improving all our digital resources, including KEATS pages, educational materials, and templates for essential documentation.  Continue reading “King’s Language Centre Approach to Digital Education Accessibility”

Evaluation and Reflection

Part 2: Embedding Digital Accessibility to the heart of everyday TEL work

This article has been divided into two parts. Part 1 discusses how the SSPP TEL team works to boost staff understanding of the digital accessibility baseline. Part 2 provides an overview of the TEL training sessions available on the subject.


Part 2: TEL training sessions

The TEL Hub digital accessibility section is aimed at helping staff access the information they need to meet the basic requirements of the digital accessibility baseline, from a more administrative perspective. Continue reading “Part 2: Embedding Digital Accessibility to the heart of everyday TEL work”

Evaluation and Reflection

Part 1: Embedding Digital Accessibility to the heart of everyday TEL work

This article has been divided into two parts. Part 1 discusses how the SSPP TEL team works to boost staff understanding of the digital accessibility baseline. Part 2 provides an overview of the TEL training sessions available on the subject.


Allyship:

“the state of being an ally (= a person who helps and supports somebody) to a particular group of people that you yourself do not belong to, in order to help ensure their basic rights and ability to be happy and successful in society”- Oxford learner’s dictionary

What does digital accessibility mean to me? It means being able to embrace a platform for its intended use, because the people building it know how to create a great user experience – anticipating, rather than reacting to the needs of our diverse learning community. Being an ally. Continue reading “Part 1: Embedding Digital Accessibility to the heart of everyday TEL work”