Student feedback only works when universities show what changed

Updated May 01, 2026

Most universities do not lack student feedback channels. They lack a convincing answer when students ask what changed. That is why Alison Gibb, Gail Angus, and Karen Clancey's Student Engagement in Higher Education Journal paper, "Enhancing Student Voice: Developing the Student Feedback Loop, a practice-based case study", matters. At Student Voice AI, we see the same issue across course evaluations, committee minutes, and survey comments: student voice becomes more credible when institutions can show a visible route from comment to action, not just another request for feedback.

Context and research question

The paper examines a practical problem at the Adam Smith Business School, University of Glasgow. Persistent weakness in the NSS Student Voice section, combined with lower engagement in course evaluations and Student-Staff Liaison Committees after the pandemic, suggested that the issue was not simply collecting feedback. The harder problem was making feedback feel visible, useful, and worth giving.

This is a practice-based case study rather than a controlled trial. The authors describe a school-wide review led by two short-life working groups that included staff and students. They revisited course evaluation questions, Student-Staff Liaison Committee structures, communication practices, and new advisory panels. For UK higher education teams, the research question is highly transferable: how do you redesign the operating model around student feedback so students can see where their input goes and why it matters?

Key findings

The central problem was credibility, not the absence of mechanisms. The school already had course evaluations and Student-Staff Liaison Committees, yet students still reported that feedback was not valued or acted upon. The paper therefore starts from a useful institutional truth: weak student voice systems often fail because they do not show follow-through clearly enough.

The authors describe a wider sector pattern in blunt terms:

students feel like their feedback "goes into a black hole"

The most important changes were procedural and communicative, not just technical. The school developed a revised course evaluation question set that kept the university's core items while adding more targeted questions. Staff were briefed on how to encourage participation and how to respond afterwards through Summary and Response Documents and "You Said, We Listened" slides. That matters because it moves feedback from collection to response, and it aligns with earlier evidence that student evaluations improve when staff and students redesign them together.

Student voice became stronger when students had more control over the conversation. Student-Staff Liaison Committees were reworked so that students chaired discussions, professional services colleagues supported them as co-chairs, and action tracking became more consistent. The school also piloted undergraduate and postgraduate Student Advisory Panels, using themes drawn from course feedback, SSLCs, and NSS results. Staff could attend as audience members, but not interrupt, which helped students speak more openly and surface deeper issues than a standard committee format often allows.

The strongest contribution is the four-stage framework itself: Ask, Analyse, Implement, Communicate. Feedback was gathered through class feedback, SSLCs, NSS, annual reviews, and a year-round "Have your Say" route. It was then analysed for action points, discussed with students, escalated through governance structures, and communicated back to staff and students. The paper also notes a practical enabler that many universities still lack: a dashboard that allowed themes to be reported at course, programme, subject, and school level.

The paper is useful partly because it stays realistic about its limits. The authors argue that structural change can support cultural change, but they do not pretend the work is finished. They note that staff changes affected governance consistency and that the impact of parts of the cycle still needs fuller review. For UK teams, that is an important reminder that a stronger feedback loop still needs ongoing evaluation if it is to stay credible.

Practical implications

For UK universities, the first implication is to treat student voice as a workflow, not a survey estate. Every route for feedback should map to a clear cycle of asking, analysing, implementing, and communicating, with named owners at each step. That gives institutions fewer orphaned comments and a clearer basis for accountability.
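As an illustrative sketch only (the stage names come from the paper's framework, but the data structure, field names, and example owners are our own assumptions, not anything the authors describe), the idea of a feedback item moving through a cycle with a named owner at each step can be modelled like this:

```python
from dataclasses import dataclass, field

# The four stages of the paper's framework, in order.
STAGES = ["Ask", "Analyse", "Implement", "Communicate"]

@dataclass
class FeedbackItem:
    """One piece of student feedback moving through the loop."""
    theme: str
    source: str                                  # e.g. "course evaluation", "SSLC"
    stage: str = "Ask"
    owners: dict = field(default_factory=dict)   # stage -> named owner

    def advance(self, owner: str) -> str:
        """Record who owned the current stage, then move to the next one."""
        self.owners[self.stage] = owner
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]
        return self.stage

item = FeedbackItem(theme="assessment feedback timeliness", source="SSLC")
item.advance("course lead")       # "Ask" is done; now in "Analyse"
item.advance("quality officer")   # "Analyse" is done; now in "Implement"
print(item.stage)
print(item.owners)
```

The point of the sketch is accountability: an item cannot leave a stage without a named owner being recorded, which is one concrete way to avoid the orphaned comments described above.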

Second, universities should connect feedback channels rather than reading them in isolation. If module evaluations, NSS comments, representative meetings, and local panels keep surfacing the same issue, that pattern should be analysed once and tracked properly rather than rewritten into separate reports. This is where Student Voice Analytics fits naturally: it helps institutions group recurring open-text themes at scale, and it works best alongside the kind of benchmarking and triangulation approach that stops single metrics carrying too much interpretive weight.
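A minimal sketch of that connecting step, under stated assumptions: the comments, channel names, and the toy keyword tagger below are invented for illustration (real theme grouping would use proper text analytics), but the logic shows the principle of flagging a theme once when it recurs across channels rather than re-reporting it per channel:

```python
from collections import defaultdict

# Hypothetical comments drawn from different feedback channels.
comments = [
    ("module evaluation", "assessment feedback is too slow"),
    ("NSS open text",     "slow assessment feedback"),
    ("SSLC minutes",      "timetable clashes in semester 1"),
    ("advisory panel",    "assessment feedback arrives late"),
]

# Toy theme tagger: in practice, this is where text analytics does the work.
def tag_theme(text: str) -> str:
    if "assessment feedback" in text:
        return "assessment feedback timeliness"
    return "other"

channels_by_theme = defaultdict(set)
for channel, text in comments:
    channels_by_theme[tag_theme(text)].add(channel)

# A theme surfacing in two or more channels is tracked once, not rewritten
# into separate reports for each channel.
recurring = {t: sorted(c) for t, c in channels_by_theme.items() if len(c) >= 2}
print(recurring)
```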

Third, communication should include the limits of change as well as the changes themselves. One of the strongest details in this paper is that staff explained not only what had changed, but also why some requests could not be implemented, for example because of accreditation rules or course intended learning outcomes. Combined with a repeatable NSS open-text analysis methodology, that kind of explanation gives students a more honest and more usable feedback system.

Finally, institutions should protect spaces where students can speak with less fear of being managed. Student-chaired committees, advance circulation of themes, and advisory-panel formats that reduce staff dominance can improve candour and reduce tokenism. The benefit is better evidence, earlier escalation of recurring issues, and a student voice process that feels more participatory than performative.

FAQ

Q: How should a university start applying this framework without rebuilding every feedback process at once?

A: Start by mapping existing feedback routes against the four stages in the paper: Ask, Analyse, Implement, Communicate. Most universities already have the "ask" stage. The gap is usually in analysis ownership, action tracking, and visible follow-up. A sensible first step is to pilot a shared action log and a standard response format in one school or faculty before extending it more widely.

Q: What are the methodological limits of this paper?

A: This is a practice-based case study from one UK business school, not a controlled evaluation with before-and-after outcome testing. It does not provide a formal causal claim that one framework will automatically improve NSS results or response rates. The authors also note that parts of the impact are still being reviewed, so the paper is best used as a strong implementation model rather than as final proof of effectiveness.

Q: What is the broader lesson for student voice work?

A: Universities should stop equating more feedback routes with better student voice. What matters is whether comments move through governance, trigger action, and return to students as visible change. When that loop is clear, students are more likely to trust the process, staff are more likely to act on evidence, and institutional teams have a firmer basis for improvement.

References

[Paper Source]: Gibb, A., Angus, G. and Clancey, K., "Enhancing Student Voice: Developing the Student Feedback Loop, a practice-based case study", Student Engagement in Higher Education Journal. DOI: 10.66561/sehej.v7i1.1421

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround


© Student Voice Systems Limited, All rights reserved.