Student Response Systems in Large Active-Learning Classrooms

By David Griffin

Updated Mar 27, 2026

In large university lecture theatres, it is often the same small group of students who ask questions and answer them. That matters because low engagement, especially in the early years of higher education, is associated with negative outcomes such as exam failure and course dropout (Caserta et al., 2021). One broad family of approaches used to address this problem is active learning in STEM classrooms. This includes class and small-group discussions, flipped classrooms, in-class questioning, and polling student responses.

Polling student responses with portable Student Response Systems (SRS), often known simply as ‘clickers’, has been part of university teaching for more than two decades. A common format is straightforward: the lecturer poses a multiple-choice question, students submit their answers, and the class sees the results displayed visually for discussion. Because most students now carry smartphones and can easily access polling software, SRS is a practical way to make participation easier in large classes, alongside other active-learning formats such as flipped classrooms.

Caserta et al. (2021) focused on a harder test of that idea. They recognised that engineering lectures often deal with cognitively demanding problems, which can make active-learning techniques more difficult to introduce well. They also wanted to address low motivation and engagement among their students. To do so, they tested class polling through SRS in a second-year Thermodynamics course in a Chemical Engineering programme. The course included three minor assessments across the academic year, and students had to pass either all three of those assessments or a final exam covering the full year of work.

For three cohorts between 2014 and 2017, students completed SRS quizzes one to two weeks before each of the three minor assessments. The format of each quiz was as follows (a short sketch of the tallying step appears after the list):

  • Students answered 10 multiple-choice questions using free SRS software on a laptop or smartphone.
  • They could interact with other students and refer to notes, although the quiz time limits reduced how much they could do this.
  • Each question was shown for 30 to 60 seconds.
  • Total quiz time, including setup and login, was 20 to 30 minutes.
  • After each quiz, the class saw the distribution of answers as a histogram. Individual results were recorded, but public displays remained anonymous to increase comfort and reduce peer pressure.
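
To make the mechanics concrete, here is a minimal sketch of that tallying step in Python. The student IDs, answer options, and function names are invented for illustration; the study used free off-the-shelf SRS software rather than custom code.

```python
from collections import Counter

def tally_responses(responses):
    """Aggregate answer counts for the class histogram.

    `responses` maps a student ID to the chosen option. Individual
    results stay recorded for the lecturer, but only the anonymous
    aggregate is displayed to the class.
    """
    return Counter(responses.values())

def show_histogram(counts, options=("A", "B", "C", "D")):
    """Print a simple text histogram of the answer distribution."""
    total = sum(counts.values()) or 1
    for option in options:
        n = counts.get(option, 0)
        bar = "#" * round(40 * n / total)
        print(f"{option}: {bar} {n} ({100 * n / total:.0f}%)")

# Hypothetical responses to one 30-60 second question.
responses = {"s01": "B", "s02": "B", "s03": "A", "s04": "C", "s05": "B"}
show_histogram(tally_responses(responses))
```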

To understand how students experienced the quizzes, the researchers also asked them to complete a questionnaire. They then measured the effect of SRS through three practical metrics:

  • Student withdrawal from the course
  • Student completion of the course
  • Assessment scores

The results for the three cohorts that took part in the SRS quizzes (the SRS Group) were compared with those of the three cohorts that immediately preceded them, from 2011 to 2014 (the Control Group).
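
As an illustration of how such a comparison might be computed, the sketch below derives the three metrics for each group. The record fields and numbers are hypothetical; the paper does not publish analysis code.

```python
def cohort_metrics(students):
    """Compute the three outcome metrics for one group of cohorts.

    `students` is a list of dicts with hypothetical fields:
    `withdrew` (bool), `completed` (bool), and `scores` (a list of
    minor-assessment marks).
    """
    n = len(students)
    all_scores = [s for st in students for s in st["scores"]]
    return {
        "withdrawal_rate": sum(st["withdrew"] for st in students) / n,
        "completion_rate": sum(st["completed"] for st in students) / n,
        "mean_score": sum(all_scores) / len(all_scores),
    }

# Invented records for illustration only.
srs_group = [
    {"withdrew": False, "completed": True, "scores": [62, 70, 58]},
    {"withdrew": True, "completed": False, "scores": [41]},
]
control_group = [
    {"withdrew": False, "completed": True, "scores": [60, 66, 55]},
    {"withdrew": True, "completed": False, "scores": [38]},
]
print("SRS:    ", cohort_metrics(srs_group))
print("Control:", cohort_metrics(control_group))
```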

The findings were mixed, but they still offer a useful takeaway for lecturers deciding whether SRS is worth the classroom time. The main findings can be summarised as follows:

  • There was no significant improvement in student dropout rates in the SRS Group when compared with the Control Group.
  • A small improvement that was not statistically significant was observed in the number of students passing the minor assessments and therefore completing the course in the SRS Group.
  • There was no significant change in assessment scores between the two groups.
  • Most students expressed a positive perception of the use of SRS.
  • Quiz results strongly correlated with student grades in the subsequent assessments.

This study is most useful when read as a guide to what SRS can and cannot do on its own. Used in isolation, this form of active learning did not significantly improve retention or grades. However, the strong correlation between quiz results and assessment grades suggests that SRS can give lecturers earlier visibility into how students are progressing, both collectively and individually. That matters in large classes because it gives teaching staff a chance to adjust their teaching, revisit difficult topics, or support specific students before a formal assessment exposes the problem. The authors also argued that this more personalised approach was particularly valuable during the COVID-19 pandemic, when students and lecturers had fewer traditional one-to-one interactions. For students, individual quiz results can also act as an early signal of likely performance, giving them time to increase their effort before an assessment.
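
That early-warning use is easy to picture in code. Below is a minimal sketch of computing the quiz-assessment correlation and flagging students for follow-up; the scores, the threshold, and the flagging rule are all hypothetical illustrations, not the authors' analysis.

```python
from statistics import correlation  # Pearson r, Python 3.10+

# Hypothetical per-student quiz scores (out of 10) and the marks the
# same students later obtained on the corresponding minor assessment.
quiz = [9, 7, 4, 8, 3, 6, 10, 5]
assessment = [78, 65, 41, 70, 35, 55, 85, 48]

print(f"Pearson r = {correlation(quiz, assessment):.2f}")

# Flag students whose quiz score suggests they may struggle, so the
# lecturer can intervene before the formal assessment. The threshold
# is an arbitrary illustrative choice.
THRESHOLD = 5
at_risk = [i for i, q in enumerate(quiz) if q < THRESHOLD]
print("Students to follow up with (by index):", at_risk)
```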

In conclusion, this work supports the use of SRS-based polling in cognitively demanding lectures, not as a guaranteed way to raise grades, but as a practical source of near real-time feedback before formal assessments. That early feedback can help both educators and students respond sooner.

FAQ

Q: How do students perceive the impact of SRS on their learning autonomy and confidence?

A: Students generally seem to respond well to SRS because it gives them a low-pressure way to take part in large lectures. The anonymity of clickers or smartphone apps can make it easier for students to reveal what they understand, or where they are confused, without the social pressure of speaking up in front of the whole room. That can strengthen confidence, give students faster feedback on their understanding, and create a greater sense of ownership over their learning. In that sense, SRS also creates a practical route for student voice in teaching environments where many students would otherwise stay silent.

Q: What role does text analysis play in evaluating the effectiveness of SRS quizzes and improving them over time?

A: Text analysis can strengthen this kind of work when educators collect open-ended student feedback alongside quiz data. Analysing that feedback can reveal recurring misconceptions, common pain points, and topics that students find especially helpful or confusing. Those insights can then be used to improve quiz design, refine teaching, and target support more effectively. In practice, text analysis of student feedback adds depth to the numeric quiz results by showing not just where students are struggling, but why.
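
As a minimal sketch of the idea, the snippet below counts recurring topic keywords in open-ended comments. The comments and keyword list are invented, and a production pipeline would use a proper taxonomy and sentiment model rather than bare keyword matching.

```python
from collections import Counter

# Hypothetical open-ended comments collected alongside quiz data.
comments = [
    "The entropy questions were confusing and too fast.",
    "Loved the instant histogram, but entropy still confuses me.",
    "More time per question please; the pacing felt rushed.",
]

# Illustrative topic keywords; a real taxonomy would be far richer.
topics = ["entropy", "pacing", "time", "histogram", "confus"]

counts = Counter()
for comment in comments:
    text = comment.lower()
    for topic in topics:
        if topic in text:
            counts[topic] += 1

# Recurring themes, most frequent first, to feed back into quiz design.
for topic, n in counts.most_common():
    print(f"{topic}: mentioned in {n} comment(s)")
```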

Q: Can the integration of SRS and active learning techniques be optimized based on student feedback and text analysis to improve engagement and learning outcomes?

A: Yes, student feedback and text analysis can help educators use SRS more effectively. When teaching teams look beyond response rates and scores to the language students use in comments, they can make better decisions about question difficulty, pacing, collaboration, and which topics need more explanation. That creates a more responsive, student-centred learning environment in which active-learning techniques are adjusted using real evidence from the cohort. Over time, that should make SRS activities more engaging and more useful for learning.

References:

Caserta, S., Tomaiuolo, G., Guido, S., 2021. Use of a smartphone-based student response system in large active-learning chemical engineering thermodynamics classrooms. Educ. Chem. Eng. 36, 46–52. DOI: 10.1016/j.ece.2021.02.003

