Updated Mar 26, 2026
At Student Voice AI, we often see a gap between the amount of student feedback universities collect and the extent to which students can actually see that feedback shaping quality processes. That is why Begoña García-Domingo, Carmen López-Escribano and Jesús M. Rodríguez-Mantilla's paper in Quality in Higher Education, "Accreditation processes and their impact on improvements in the organisation and management of Spanish university degrees: student assessment", matters. For universities that rely on student surveys, representatives, and open comments to improve the student experience, the paper asks a practical question: do students think accreditation and quality assurance systems are making a difference?
Accreditation sits at the heart of institutional quality assurance, but it is often much more legible to quality teams than to students. Universities may gather evidence, write review documents, run enhancement cycles, and implement changes, yet students can still experience the whole process as distant or invisible. When that happens, student voice risks becoming a collection exercise rather than a credible route to improvement.
García-Domingo and colleagues examine this visibility problem through a quantitative study of Spanish higher education. Using a questionnaire designed and validated for the project, they surveyed 440 university students to analyse how students perceive the impact of accreditation systems on the organisation and management of degree programmes, and which factors are linked to more positive perceptions. Although the regulatory context is Spanish, the underlying issue transfers well to UK higher education: if students do not understand how quality systems work, they are less likely to trust them or recognise the changes they produce.
The first finding is more positive than some quality teams might expect. Most students reported a medium-to-high perception that accreditation systems contribute to improvements in the organisation and management of degree programmes. In other words, students were not dismissing quality assurance as meaningless bureaucracy. Many could see some value in it.
That said, the paper also shows that this perception is unevenly distributed. Students who participate in quality assurance systems as student representatives reported stronger perceptions of impact. That is a significant result for UK universities, because it suggests that participation does more than gather feedback. It also helps students understand how improvement happens. Where students are close enough to the process to see discussion, evidence, and follow-up, they are more likely to believe the system leads to change.
The second major finding is about visibility and knowledge. The authors argue that students' understanding of accreditation systems, and of the institutions responsible for them, is closely tied to whether they perceive those systems as useful. As they put it:
"the knowledge students have on quality accreditation systems ... plays a fundamental role in these perceptions."
This matters because it reframes a common student voice problem. Sometimes weak confidence in quality processes reflects weak practice. Sometimes it reflects weak communication about practice. If students do not know what accreditation is, who uses feedback, or what has changed as a result, improvement work can remain institutionally real but experientially invisible.
A third useful implication follows from the first two. Student participation and student understanding are not side issues in quality assurance; they are part of the mechanism that makes quality assurance credible. The paper therefore gives universities a practical way to think about quality culture: not just whether improvements are being made, but whether students can recognise them as improvements connected to their own voice.
For teams working with free-text data, this is especially relevant. Students often use open comments to describe timetabling confusion, inconsistent communication, unclear processes, or frustration with how course organisation works. Those comments are not only signals about operational problems. They can also show where institutional quality work is not visible enough to students. Analysing comment themes at scale can therefore help universities distinguish between a delivery problem and a quality-assurance communication problem.
For UK higher education teams, the first implication is to make quality assurance legible to students. If programme reviews, annual monitoring, module evaluations, or NSS action plans lead to changes, students should be shown where their feedback entered the process, what was discussed, and what changed as a result. A short "you said, we changed" summary tied to specific quality processes is often more credible than generic messaging about listening.
Second, universities should treat student representatives as quality interpreters, not just committee attendees. This paper suggests that students who participate directly in quality systems are better placed to recognise their effects. That means representative structures need briefing, support, and clear routes back into the wider student body. A rep who understands how an issue moved from feedback to action can make the process more visible for everyone else.
Third, institutions should measure awareness as well as satisfaction. Alongside questions about organisation, management, or course improvement, it is useful to ask whether students know how feedback is used and whether they can identify changes that resulted from previous feedback. Those questions help teams see whether a weak result reflects poor delivery, poor visibility, or both.
Fourth, this is an area where open-text analysis adds real value. Scaled items can tell you whether students think course organisation is improving, but comments explain why they believe that, or why they do not. Student Voice Analytics fits naturally here because categorising free-text comments on organisation, communication, feedback use, and student representation makes it easier to see whether quality assurance activity is landing in ways students can actually recognise.
Q: How can a UK university make accreditation or quality-assurance processes more visible to students in practice?
A: Start by mapping the route from feedback to action. Show where module evaluations, NSS comments, student-representative input, and committee discussions feed into annual monitoring or review processes. Then communicate specific outcomes in plain language, ideally at programme level. Students are more likely to trust quality systems when they can see the chain from comment to decision to change.
Q: What should we be cautious about when applying a Spanish questionnaire study to UK higher education?
A: The study is based on student perceptions within the Spanish accreditation context, so it does not prove that the same variables will operate identically in UK regulatory structures. It is also a self-report survey, which means it captures how students perceive impact rather than independently verifying each improvement. The transferable value lies in the mechanism: student knowledge and participation appear to shape whether quality systems are experienced as meaningful.
Q: What does this paper change about how universities should interpret student voice data?
A: It suggests that low confidence in course organisation or institutional responsiveness is not always just a service-performance issue. It can also reflect a visibility gap in the quality process itself. That is why student voice analysis should combine scores, representative feedback, and open-text comments. Together they show not only where students are dissatisfied, but whether they understand how improvement is supposed to happen.
[Paper Source]: Begoña García-Domingo, Carmen López-Escribano and Jesús M. Rodríguez-Mantilla, "Accreditation processes and their impact on improvements in the organisation and management of Spanish university degrees: student assessment", Quality in Higher Education. DOI: 10.1080/13538322.2025.2576322