The Student Voice Blog

Insights and resources to support better data analysis in education

Student Response Systems in Large Active-Learning Classrooms

By David Griffin

Often in large university lecture theatres, only a handful of students actively participate by asking and answering questions. According to Caserta et al. (2021), this lack of engagement from the majority, particularly in the early years of university tuition, is associated with a range of negative outcomes including exam failure and course dropout. One broad family of approaches that tackles these problems through increased student engagement is termed 'active learning'. It includes tactics such as class and small-group discussions, flipped classrooms, and posing questions to a class and polling student responses.

Polling student responses to questions using portable Student Response Systems (SRS; often known simply as 'clickers') has been employed by educators for more than two decades. In one of the most common uses of these systems, the lecturer poses a multiple-choice question to the class, each student answers using their own SRS, and the group results are displayed graphically for the class to see and discuss. The widespread ownership of smartphones, along with the accessibility of polling software, means most university students nowadays have easy access to their own SRS device at all times.

However, Caserta et al. (2021) recognized the challenge of introducing such active learning techniques in engineering lectures due to the cognitively taxing problems often being addressed. They also wished to tackle a lack of motivation to learn and engage among their students. They decided to test the usefulness of introducing class polling through SRS in a second-year Thermodynamics course within their university's Chemical Engineering programme. The Thermodynamics course normally included three minor assessments given throughout the academic year. In order to pass the course overall, students were required to pass either all three minor assessments or a final exam covering the entire year's work.

For three different class cohorts between 2014 and 2017, students were asked to complete SRS quizzes 1-2 weeks before each of the three minor assessments. The format of each quiz was as follows:

  • Students were asked to answer 10 multiple-choice questions using free SRS software on their laptop or smartphone.
  • They were allowed to interact with other students and refer to notes as they wished; in practice, however, the time constraints of the quiz limited this.
  • Each question was presented for between 30 and 60 seconds only.
  • The total quiz time including setup and login time was 20-30 minutes.
  • The breakdown of answers for each question was displayed to the class as a histogram after the completion of each quiz. While each student's individual result was recorded, students remained anonymous when results were publicly displayed. This was done to increase their level of comfort and reduce feelings of peer pressure.
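
The anonymous tallying step above can be sketched in a few lines. This is a minimal illustration, not the authors' actual software: the response data are hypothetical, and only the answer choices are kept, never student identities.

```python
from collections import Counter

def tally_responses(responses):
    """Aggregate anonymous answer choices into counts per option."""
    return Counter(responses)

def text_histogram(counts, options="ABCD"):
    """Render a simple text histogram of the class-wide breakdown."""
    total = sum(counts.values()) or 1  # avoid division by zero
    lines = []
    for opt in options:
        n = counts.get(opt, 0)
        lines.append(f"{opt}: {'#' * n} ({100 * n / total:.0f}%)")
    return "\n".join(lines)

# Hypothetical responses to one multiple-choice question.
responses = ["A", "C", "C", "B", "C", "D", "C", "A"]
print(text_histogram(tally_responses(responses)))
```

Because only the aggregated counts are displayed, the class sees the overall breakdown while each individual result stays private, mirroring the anonymity described above.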

To assess student perceptions of SRS use in the classroom, students were asked to complete a questionnaire after the quizzes. Three metrics were also used to determine the effect of SRS use. These were:

  • Student withdrawal from the course
  • Student completion of the course
  • Assessment scores

The results for the three cohorts that took part in the SRS quizzes (the SRS Group) were compared to a control group consisting of the three cohorts that immediately preceded them in the years 2011-2014 (the Control Group).

The authors saw mixed results in this work. The main findings can be summarised as follows:

  • There was no significant improvement in student dropout rates in the SRS Group when compared with the Control Group.
  • A small but statistically non-significant improvement was observed in the number of students passing the minor assessments (and therefore completing the course) in the SRS Group.
  • There was no significant change in assessment scores between the two groups.
  • Most students expressed a positive perception of the use of SRS.
  • Quiz results strongly correlated with students' grades in the subsequent assessments.

This work provided some interesting findings. This form of active learning, when introduced alone, failed to significantly improve student retention rates or grades. However, the strong correlation between quiz results and assessment grades led the authors to conclude that using SRS can provide lecturers with advance information on how their students are progressing, both collectively and as individuals. This may help lecturers tailor their teaching during the course and focus on specific topics or students if a quiz indicates an area of weakness. This more personalised approach to teaching, the authors suggest, may be of particular benefit during the current SARS-CoV-2 pandemic, during which students and lecturers have fewer traditional one-on-one interactions. Individual quiz results may also provide each student with a measure of their own progress before a formal assessment takes place, allowing them an opportunity to redouble their efforts if needed.
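
To make the correlation finding concrete, the relationship between quiz scores and assessment grades can be quantified with a standard Pearson correlation coefficient. The sketch below uses invented per-student numbers purely for illustration; the paper's actual data and analysis method may differ.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical quiz scores (out of 10) and the grades those same
# students later earned on the corresponding minor assessment.
quiz_scores = [3, 5, 6, 7, 8, 9]
assessment_grades = [40, 55, 58, 70, 75, 88]
r = pearson_r(quiz_scores, assessment_grades)
print(f"r = {r:.2f}")  # a value near 1 indicates a strong positive correlation
```

An r close to 1, as reported in the study, is what justifies treating quiz results as an early-warning signal for how students will fare in the upcoming assessment.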

In conclusion, the results of this work encourage the inclusion of SRS-based polling in cognitively demanding lectures, providing both educator and learner with a form of ‘real-time’ feedback prior to formal assessments.


Caserta, S., Tomaiuolo, G., Guido, S., 2021. Use of a smartphone-based student response system in large active-learning chemical engineering thermodynamics classrooms. Educ. Chem. Eng. 36, 46-52.
DOI: 10.1016/j.ece.2021.02.003