Student voices in evaluation - motivations and perceptions

Published Jan 29, 2024 · Updated Feb 28, 2026

Students are far more likely to complete teaching evaluations when they believe someone will read them and act on them. When participation drops, results become harder to trust and easier to ignore (see non-response bias in student evaluations). In UK higher education, student evaluations of teaching (SETs) remain a key tool for assessing teaching quality and course design. Drawing on a recent scoping review, this post explores what motivates students to take part and what institutions can do to increase response rates and usefulness.

The Critical Lens of Student Voice

At the heart of our discussion is the concept of "student voice": the idea that students are not just passive recipients of education, but active contributors to the learning environment. By exploring student motivations and perceptions, we tap into a rich source of insights that can inform educational practices and policies. The scoping review by Sullivan et al. (2024) suggests that understanding student voice through text analysis of SET responses can help refine the teaching and learning experience.

Practical takeaway: combine quantitative scores with what students write, and use text analysis to surface recurring themes you can act on.
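As a minimal sketch of what "surfacing recurring themes" can mean in practice, the snippet below counts non-stopword terms across open-text comments. The comments, stopword list, and function name are illustrative, not drawn from the review; real pipelines would use proper NLP tooling rather than raw token counts.

```python
from collections import Counter
import re

# Hypothetical open-text SET comments; a real pipeline would load these
# from the survey platform's export.
comments = [
    "Feedback on assignments was slow, but the lectures were engaging.",
    "More timely feedback on coursework would help.",
    "Lectures were engaging; the reading list felt outdated.",
]

# Toy stopword list for the example only.
STOPWORDS = {"the", "on", "was", "but", "were", "more", "would",
             "help", "felt", "a", "an", "and", "is", "to", "of"}

def recurring_terms(texts, top_n=5):
    """Count non-stopword tokens across comments to surface recurring themes."""
    tokens = []
    for text in texts:
        tokens += [w for w in re.findall(r"[a-z']+", text.lower())
                   if w not in STOPWORDS]
    return Counter(tokens).most_common(top_n)

print(recurring_terms(comments))
# "feedback", "lectures" and "engaging" each appear twice, so they
# rise to the top — a crude but readable starting point for themes.
```

Even this toy version shows why pairing scores with text matters: a mean rating of 3.8 says little, while "feedback" recurring across comments points to a specific, actionable issue.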

The Value Placed on SETs by Students

The research highlights a key theme: the value students place on SETs is closely linked to whether they believe the surveys carry real weight in institutional decisions. Many students recognise SETs as important for improving teaching quality and course content, which suggests a willingness to contribute constructively to the academic environment. However, a recurring point is the need for clearer communication about how SET feedback is used, which underscores the importance of making the process transparent and the outcomes visible.

Practical takeaway: be explicit about why SETs matter, and show students how their feedback leads to decisions and change.

Actionable Feedback: The Catalyst for Engagement

Students are far more inclined to participate in SETs when they believe their feedback leads to concrete changes. This finding underscores the need for institutions not only to act on SET feedback, but also to communicate those actions back to the student body. By closing the feedback loop, universities can foster a culture of continuous improvement and deepen student engagement with the process.

Practical takeaway: share visible "you said, we did" updates so students can see the impact of their comments.

The Role of Confidentiality and Anonymity

Confidentiality and anonymity emerge as critical factors influencing student participation in SETs. While these elements encourage candid feedback, they can also increase the risk of non-constructive criticism. Balancing the need for honest feedback with the protection of teaching staff from undue criticism is a delicate challenge, which points to a need for clear guidelines on constructive feedback and mechanisms for handling inappropriate comments.

Practical takeaway: reassure students about anonymity, set expectations for constructive feedback, and handle abusive language consistently.
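One way to "handle abusive language consistently" is to route flagged comments to a human moderator before they reach teaching staff. The blocklist and function below are purely illustrative assumptions; a real deployment would follow an institution-approved moderation policy rather than a toy word list.

```python
import re

# Illustrative blocklist for the example only; not a moderation policy.
FLAG_TERMS = {"idiot", "useless", "stupid"}

def flag_for_review(comment):
    """Return True if a comment contains terms that should be routed to a
    human moderator rather than sent straight to teaching staff."""
    words = set(re.findall(r"[a-z']+", comment.lower()))
    return bool(words & FLAG_TERMS)

comments = [
    "The seminars were useless and the lecturer is an idiot.",
    "Seminar tasks could be better scaffolded.",
]
flagged = [c for c in comments if flag_for_review(c)]
print(f"{len(flagged)} of {len(comments)} comments routed for review")
```

The design point is the routing, not the word list: students keep anonymity, staff are shielded from abuse, and constructive criticism still gets through untouched.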

Incentivising Participation

The effectiveness of incentives in boosting SET completion rates is a theme with mixed implications. While certain incentives can motivate students, they can also raise ethical questions about the quality of feedback gathered under those conditions. Institutions must consider the types of incentives offered carefully, ensuring they enhance engagement without compromising the integrity of the feedback.

Practical takeaway: if incentives are used, keep them light and frame participation around improving the student experience, not the reward.

Survey Design and Timing: Keys to Enhanced Participation

Optimising the design and timing of SETs is paramount for maximising student participation. Simplified survey interfaces and thoughtful scheduling can reduce barriers to completion. By releasing SETs during less busy periods, and ensuring they are straightforward and user-friendly, universities can encourage more students to share their insights.

Practical takeaway: reduce friction by keeping surveys short, mobile-friendly, and timed away from peak assessment periods.

Towards a Future of Enhanced Student Engagement

This exploration of student motivations, perceptions, and opinions on SETs points to a practical route to more meaningful student engagement in higher education. By harnessing the power of student voice and using text analysis to understand and act on SET feedback, institutions can create a more responsive and inclusive educational environment.

The journey to enhancing SET effectiveness and participation is ongoing, but with a commitment to understanding and addressing student concerns, UK higher education can nurture a culture of shared responsibility and continuous improvement in teaching and learning. In doing so, we can elevate the quality of education and affirm the critical role of student voices in shaping the future of academia.

FAQ

Q: How can institutions effectively incorporate student voice into curriculum design and policy-making beyond SETs?

A: Incorporating student voice into curriculum design and policy-making requires a multi-faceted approach that goes beyond traditional SETs. Institutions can create student advisory panels that regularly contribute to discussions on curriculum development and institutional policies. These panels help ensure student experiences and needs inform decision-making. Institutions can also use social media and online forums as informal platforms for gathering student opinions and suggestions, which can provide timely insight alongside formal feedback mechanisms. Focus groups and interviews can add deeper context, allowing for more nuanced changes in curriculum and policy. The key is to actively seek, value, and act on student voice in all its forms, ensuring education remains dynamic and student-centred.

Q: What are the ethical considerations and privacy implications of using text analysis for interpreting open-ended responses in SETs?

A: The use of text analysis for interpreting open-ended responses in SETs raises significant ethical considerations and privacy implications. The primary concern is protecting anonymity and confidentiality, particularly when personal or sensitive information could be disclosed. Institutions need robust data protection measures to secure feedback and protect identities (see our student comment analysis governance checklist). This includes anonymising data before analysis and ensuring any tools used comply with relevant data protection requirements. Ethical considerations also extend to how insights are used. Findings should be applied constructively to enhance teaching and learning, rather than to penalise or stigmatise individual educators. Transparency with students about how feedback will be analysed and used is an essential part of ethical practice, which helps foster trust and encourages honest communication.

Q: How can universities ensure that the insights gained from text analysis of SET responses lead to actionable changes that are communicated back to students?

A: A structured feedback loop is essential if insights from text analysis are to lead to actionable change. That means analysing the data, agreeing a clear plan for improvements, and assigning ownership and timelines. Once changes are made, it is crucial to communicate these back to students so they can see their feedback has a tangible impact. This can happen through email updates, announcements on digital platforms, or presentations at student forums. Highlighting specific changes made in response to feedback can reinforce the importance of student voice in shaping the educational experience. Universities can also involve students in evaluating the changes, creating a continuous cycle of feedback and improvement. This transparent and inclusive approach can enhance the student experience and foster mutual respect and collaboration between students and faculty.
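The "agreed plan with ownership and timelines" described above can be sketched as a simple record per action. The schema below is an illustrative assumption, not a standard; the point is that each theme gets an owner, a deadline, and a flag for whether it has been communicated back to students.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FeedbackAction:
    theme: str                  # recurring theme surfaced from SET comments
    action: str                 # agreed improvement
    owner: str                  # who is accountable
    due: date                   # target date for the change
    communicated: bool = False  # has the change been reported to students?

# Hypothetical example entry.
loop = [
    FeedbackAction("slow assignment feedback",
                   "publish marking turnaround targets",
                   "Programme Director", date(2024, 9, 1)),
]

outstanding = [a for a in loop if not a.communicated]
print(f"{len(outstanding)} action(s) not yet communicated to students")
```

Tracking the `communicated` flag explicitly is what closes the loop: an action that was taken but never announced is, from the students' perspective, indistinguishable from no action at all.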

Reference

[Source] Daniel Sullivan, Richard Lakeman, Debbie Massey, Dima Nasrawi, Marion Tower & Megan Lee (2024) Student motivations, perceptions and opinions of participating in student evaluation of teaching surveys: a scoping review, Assessment & Evaluation in Higher Education, 49:2, 178-189. DOI: 10.1080/02602938.2023.2199486

