Student Voice

Student Response Systems in Large Active-Learning Classrooms

By David Griffin

Often in large university lecture theatres, only a handful of students actively participate by asking and answering questions. According to Caserta et al. (2021), the lack of engagement from the majority, particularly in the early years of university tuition, is associated with a range of negative outcomes including exam failure and course dropout. One broad family of approaches for tackling these problems through increased student engagement is termed ‘active learning’. It includes tactics such as class and small-group discussions, flipped classrooms, and posing questions to a class and polling student responses.

Polling student responses to questions using portable Student Response Systems (SRS; often known simply as ‘clickers’) has been employed by educators for more than two decades. One of the most common uses is for the lecturer to pose a multiple-choice question to the class; each student answers using their own SRS, and the group results are displayed graphically for the class to see and discuss. The widespread ownership of smartphones, along with freely available software, means that most university students now have easy access to their own SRS device at all times.

However, Caserta et al. (2021) recognized the challenge of introducing such active-learning techniques in engineering lectures, where the problems being addressed are often cognitively taxing. They also wished to tackle their students’ lack of motivation to learn and engage. They decided to test the usefulness of introducing class polling through SRS in a second-year Thermodynamics course within their university’s Chemical Engineering programme. The course normally included three minor assessments given throughout the academic year. To pass the course overall, students were required to pass either all three minor assessments or a final exam covering the entire year’s work.

For three different class cohorts between 2014 and 2017, students were asked to complete SRS quizzes 1-2 weeks before each of the three minor assessments. The format of each quiz was as follows:

  • Students were asked to answer 10 multiple-choice questions using free SRS software on their laptop or smartphone.
  • They were allowed to interact with other students and refer to notes as they wished; however, the time constraints of the quiz limited this.
  • Each question was presented for between 30 and 60 seconds only.
  • The total quiz time, including setup and login, was 20-30 minutes.
  • The breakdown of answers for each question was displayed to the class as a histogram after the completion of each quiz. While each student’s individual result was recorded, students remained anonymous when results were publicly displayed; this was done to increase their level of comfort and reduce feelings of peer pressure (a minimal sketch of this tally-and-display step follows this list).
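
As a purely illustrative aside, the sketch below shows in Python how such a tally-and-display step might work. It is not the authors’ software: the student IDs, answers, and the show_breakdown helper are invented, and the point is simply that individual results are recorded privately while only aggregate counts are shown to the class.

    from collections import Counter

    # Hypothetical sketch, not the authors' software: tally anonymous
    # multiple-choice answers for one question and print the class-wide
    # breakdown. Individual results stay on record but are never shown.
    responses = {  # student_id -> chosen option (invented data)
        "s01": "B", "s02": "B", "s03": "A", "s04": "C",
        "s05": "B", "s06": "D", "s07": "B", "s08": "A",
    }

    def show_breakdown(responses, options="ABCD"):
        """Print an anonymised text histogram of answers."""
        counts = Counter(responses.values())
        for option in options:
            n = counts.get(option, 0)
            print(f"{option}: {'#' * n} ({n})")

    show_breakdown(responses)  # only aggregate counts reach the class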

To assess student perceptions of SRS use in the classroom, students were asked to complete a questionnaire after the quizzes. Three metrics were also used to determine the effect of SRS use. These were:

  • Student withdrawal from the course
  • Student completion of the course
  • Assessment scores

The results for the three cohorts which took part in the SRS quizzes (the SRS Group) were compared with a control group consisting of the three cohorts which immediately preceded them, in the years 2011-2014 (the Control Group).

The authors saw mixed results in this work. The main findings can be summarised as follows:

  • There was no significant improvement in student dropout rates in the SRS Group when compared with the Control Group.
  • A small but statistically non-significant improvement was observed in the number of SRS Group students passing the minor assessments and therefore completing the course.
  • There was no significant change in assessment scores between the two groups.
  • Most students expressed a positive perception of the use of SRS.
  • Quiz results strongly correlated with student grades in the subsequent assessments.

This work provided some interesting findings. This form of active learning, when introduced alone, failed to significantly improve student retention rates or grades. However, the strong correlation between quiz results and assessment grades led the authors to conclude that using SRS can give lecturers advance information on how their students are progressing, both collectively and as individuals. This may help lecturers tailor their teaching during the course and focus on specific topics or students if a quiz indicates an area of weakness. This more personalised approach to teaching, the authors suggest, may be of particular benefit during the current SARS-CoV-2 pandemic, during which students and lecturers have fewer traditional one-on-one interactions. Individual quiz results may also give each student a measure of their own progress before a formal assessment takes place, allowing them an opportunity to redouble their efforts if needed.
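
As a rough illustration of how that ‘advance information’ might be used, the sketch below computes the correlation between quiz scores and subsequent assessment grades and flags students who may need follow-up. The figures and the flagging threshold are invented; only the idea of a strong quiz-to-assessment correlation comes from the paper.

    from math import sqrt

    # Invented data for illustration: quiz scores (out of 10 questions)
    # and grades in the assessment given 1-2 weeks later (out of 100).
    quiz_scores = [9, 4, 7, 3, 8, 6, 5, 10]
    exam_grades = [85, 42, 70, 35, 78, 61, 50, 92]

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    print(f"quiz-to-assessment correlation: r = {pearson(quiz_scores, exam_grades):.2f}")

    # Flag students whose quiz score suggests they may struggle in the
    # upcoming assessment (the threshold of 5 is an arbitrary choice).
    at_risk = [i for i, score in enumerate(quiz_scores) if score < 5]
    print("students to follow up with:", at_risk)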

In conclusion, the results of this work encourage the inclusion of SRS-based polling in cognitively demanding lectures, providing both educator and learner with a form of ‘real-time’ feedback prior to formal assessments.

FAQ

Q: How do students perceive the impact of SRS on their learning autonomy and confidence?

A: Students generally view the impact of SRS on their learning autonomy and confidence positively. By allowing them to participate actively in large lecture settings behind the anonymity of clickers or smartphone apps, SRS can help students feel more at ease expressing their understanding and opinions on the subject matter. Seeing their responses alongside their peers’ in real time provides immediate feedback on their comprehension, which in turn can boost confidence. The opportunity to engage directly with lecture content through SRS quizzes also encourages a sense of ownership over their learning, enhancing autonomy. Student voice is crucial here: SRS gives students a platform to express their understanding and misconceptions, which the scale and pace of a traditional lecture often make impossible.

Q: What role does text analysis play in evaluating the effectiveness of SRS quizzes and improving them over time?

A: Text analysis can play a significant role in evaluating the effectiveness of SRS quizzes and in improving them over time. By analysing the text of student responses and feedback collected through these systems, educators can identify patterns in misconceptions, common areas of difficulty, and topics that generate high engagement. This analysis can inform the refinement of quiz questions to better target learning objectives and address student needs. Furthermore, text analysis of open-ended feedback can provide insight into student perceptions of the quizzes, including what they find helpful or challenging, guiding instructors in tailoring their teaching methods and content delivery to enhance learning outcomes. The integration of student voice through text analysis thus becomes a powerful tool for continuous improvement in teaching and learning strategies.
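
A very simple form of such text analysis is keyword-frequency counting over open-ended feedback. The sketch below is a hypothetical example, with invented comments and a hand-picked stop-word list, showing how recurring themes (such as complaints about time per question) could be surfaced.

    import re
    from collections import Counter

    # Invented student comments on the SRS quizzes.
    feedback = [
        "The quizzes were too fast, not enough time per question",
        "Seeing the histogram helped me check my understanding",
        "More time per question would help, but the quizzes are useful",
    ]
    stop_words = {"the", "a", "to", "me", "my", "per", "are", "were",
                  "would", "but", "not", "and", "too", "more"}

    # Tokenise, drop stop words, and count what remains.
    words = Counter(
        word
        for comment in feedback
        for word in re.findall(r"[a-z']+", comment.lower())
        if word not in stop_words
    )
    print(words.most_common(5))  # recurring themes, e.g. "time", "question"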

Q: Can the integration of SRS and active learning techniques be optimized based on student feedback and text analysis to improve engagement and learning outcomes?

A: Yes, the integration of SRS and active learning techniques can indeed be optimized based on student feedback and text analysis to enhance engagement and learning outcomes. By actively incorporating student voice into the educational process, educators can use the insights gained from text analysis of feedback and quiz responses to fine-tune their teaching approaches and the design of SRS quizzes. This could involve adjusting the difficulty level of questions, introducing more collaborative elements into quizzes, or focusing on topics that require further clarification as indicated by text analysis results. Additionally, this approach allows for a more responsive and student-centred learning environment, where teaching methods evolve in response to real-time data on student needs and preferences. Ultimately, leveraging student feedback and text analysis can lead to more effective engagement with course material, improved understanding, and higher student satisfaction with the learning experience.

References:

Caserta, S., Tomaiuolo, G., Guido, S., 2021. Use of a smartphone-based student response system in large active-learning chemical engineering thermodynamics classrooms. Educ. Chem. Eng. 36, 46-52. https://doi.org/10.1016/j.ece.2021.02.003
