Tests and polls

Hand holding a phone displaying the PollEverywhere response system

What is it?

Tests, sometimes known as quizzes, are usually no-stakes or low-stakes opportunities for learning (i.e. students work with knowledge as well as recalling what they already know) which can take place within a lecture or large seminar setting. Polls ask students for their perspectives and can surface different viewpoints, alternatives or dilemmas. The vast majority of students have a networked mobile device (phone, tablet, laptop) they can use during sessions or share with somebody who doesn’t have one. King’s offers every staff member an uncapped licence for PollEverywhere, a student response system allowing different types of question and response. The educator poses questions to test knowledge or poll opinion, using closed multiple-choice questions or open responses. After students have responded, the software visualises the responses for the educator to display so that they can give feedback or start a discussion. The software can also generate configurable reports which can be downloaded and analysed.

This guide is about in-session tests and polls, typically with immediate feedback given to the whole cohort in person. For guidance on online quizzes which students complete between sessions, see the ‘Assessment for Learning at King’s‘ resource.

Why do it?

Some students – for example, students for whom English is not a first language – struggle to organise their responses to questions quickly enough to voice them, and find that somebody else has got there first. The wait time built into tests and polls enables more equitable participation and gives valuable feedback to the educator about knowledge or views across the cohort.

Considering tests in particular, evidence about the effects of response systems on student attainment is not clear-cut. This is not surprising given the diversity of contexts and variable question design; pedagogy, learning strategy and prior knowledge each mediate the effect of response systems. That said, many researchers have observed positive effects. Anderson and colleagues (2018) found a positive impact for poorly performing students on more challenging quantitative courses, while Shapiro and colleagues (2017) found that both factual and conceptual clicker questions boosted longer-term factual retention in large lecture theatre settings, particularly in students lacking motivation or deep learning strategies. Use of response systems has also been shown to increase engagement and attention (Heaslip and colleagues, 2014; Lane and Harris, 2015).

Before a learning event, testing can be helpful to activate prior knowledge. Afterwards it can help consolidate learning and weave it into learning from the more distant past. Pooja K. Agarwal (professor of Liberal Arts and cognitive scientist at Berklee College of Music) briefly sets out three reasons why one approach, retrieval practice, advances learning across the disciplines – namely ‘use it or lose it’, desirable difficulties and metacognition.

Testing also functions as feedback for you as educator. It helps ascertain whether your teaching strategies have been successful and provides an occasion to respond to misconceptions. While you could ask students if they understand so far, and look for a critical mass of nods, the least confident students will be reluctant to admit that they are struggling. Testing can let you know whether your students are ready to proceed to the next segment of a lecture.

The digital form allows for rapid processing and visualisation of student responses, giving the educator immediate feedback about their teaching strategies and promoting students’ metacognition.

Classroom response technologies are often deployed in ‘peer instruction‘ activities.

How to set it up

Before the session

  1. Prepare the questions and their order (see Considerations for question types). PollEverywhere question types include multiple choice, open ended, ranking, and clickable images.
  2. Prepare the in-person feedback or discussion prompts to elicit students’ thinking or reasons. Consider what you might say depending on the spread of student responses.

During the session

  1. At the beginning of the session ask students to enable their devices and navigate to the page where the questions will display.
  2. At the appropriate time, activate and introduce the first question.
  3. Give students time to respond (they may need a little more time for the first and second questions to get into the swing of things).
  4. Display the responses as visualised by the software, e.g. bar chart or scatter plot.
  5. Respond with feedback, or ask students to explain their responses.

After the session

  • Analyse the report and consider how to respond to any patterns which reveal misconceptions or confusion.
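As an illustrative sketch of that analysis step: the downloadable reports can be opened in a spreadsheet, or tallied with a short script. The column names and CSV layout below are assumptions for the example, not PollEverywhere’s actual export format, which varies by question type and export settings.

```python
import csv
import io
from collections import Counter

# Hypothetical CSV export of responses to one multiple-choice question.
# Real PollEverywhere reports differ; the column names here are assumptions.
sample_report = """\
participant,response
Student 1,Mitosis
Student 2,Meiosis
Student 3,Mitosis
Student 4,Mitosis
"""

def tally_responses(report_text):
    """Count how many students chose each option."""
    reader = csv.DictReader(io.StringIO(report_text))
    return Counter(row["response"] for row in reader)

# Show the spread of answers, most popular first, to spot misconceptions.
counts = tally_responses(sample_report)
for option, n in counts.most_common():
    print(f"{option}: {n}")
```

A skewed tally towards a plausible distractor (here, a quarter of students choosing ‘Meiosis’) is the kind of pattern worth revisiting at the start of the next session.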

Considerations

Testing

Types of questions may be:

  • Retrieval or recall. These check comprehension and are useful across disciplines.
  • Conceptual understanding. These ask students to match, classify or select the best explanation or definition for a concept. To surface and respond to misunderstandings, educators can author distractors based on common misconceptions.
  • Application. These ask students to select the best alternative for a given situation.
  • Critical thinking. These ask students to consider relationships between concepts within a given context and select a well-reasoned answer rather than an objectively correct answer. There is often a subsequent discussion about their reasons.

Well-designed multiple choice questions (MCQs) can support advanced learning. Pooja Agarwal draws on educational research to address some of the reservations about using them as a form of learning (rather than only to measure learning). There’s an art to writing good multiple choice questions, summarised with examples in Cynthia Brame’s guidance. Below, Elizabeth Bjork succinctly explains the principles underpinning good MCQs. Her main recommendation is to spend time designing the questions.

There is nothing intrinsic to technologies which will bring about improvements in students’ learning – design decisions and practice are key. For example, Amy Shapiro and colleagues (2017) have reservations about the power of student response systems to boost conceptual understanding in large lecture theatre settings. Their findings support Eric Mazur’s insistence that it is “multiple strategies for involving students in a meaningful way with the course material” rather than technological innovation alone which brings about the benefits. See our guide on Mazur’s ‘Peer instruction‘ approach for one such strategy.

If you are testing students, plan to space the tests over several sessions rather than concentrating them all in a single revision session at the end.

Even if only a small number of students get a question wrong and you decide to move on without revising the concept, do take a few moments to direct those students to a good explanation of the concept – perhaps some previous lecture notes or a textbook section. Remind students where they can find support so that nobody feels abandoned or beyond help.

Polling

Polling questions may take the form of collecting students’ perspectives on a question, or monitoring (e.g. “Have you completed a draft of your essay?”, “How many hours did you spend on the project?”).

PollEverywhere at King’s

Responses on PollEverywhere are anonymous by default, but you can set questions to prompt students to give their name before they respond. The name displays only for the educator, not for other students. To get an unlimited PollEverywhere licence (more than 40 respondents and a customised, easy web address to give your respondents), first create an account and then contact IT to ask for it to be upgraded. CTEL support PollEverywhere at King’s – see ‘How to create an account‘.

Flashcards (online)

Flashcards have a closed question with one correct answer on the front, and the correct answer on the back. Despite seeming most suited to foundational knowledge (recall and comprehension), as Elizabeth Bjork indicates above, it is possible to ask questions which test application, integration and evaluation too. At King’s, you can download Anki (a highly regarded open-source, media-friendly flashcard app) from the Software Centre.

Examples and resources

On PollEverywhere:

Cynthia Brame demonstrates multiple choice questions with biology examples.

References

Anderson, S., Goss, A., Inglis, M., Kaplan, A., Samarbakhsh, L., Toffanin, M., 2018. Do clickers work for students with poorer grades and in harder courses? Journal of Further and Higher Education 42, 797–807. https://doi.org/10.1080/0309877X.2017.1323188

Heaslip, G., Donovan, P., Cullen, J.G., 2014. Student response systems and learner engagement in large classes. Active Learning in Higher Education 15, 11–24. https://doi.org/10.1177/1469787413514648.

Lane, E., Harris, S., 2015. A New Tool for Measuring Student Behavioral Engagement in Large University Classes. Journal of College Science Teaching 44, 83–91. http://www.cwsei.ubc.ca/SEI_research/files/Geo_Ocean/Lane-Harris_Meas-Engagement_JCST2015.pdf.

Shapiro, A.M., Sims-Knight, J., O’Rielly, G.V., Capaldo, P., Pedlow, T., Gordon, L., Monteiro, K., 2017. Clickers can promote fact retention but impede conceptual understanding: The effect of the interaction between clicker use and pedagogy on learning. Computers & Education 111, 44–59. https://doi.org/10.1016/j.compedu.2017.03.017.
