QAA's Aberdeen review links student voice to clearer assessment feedback and stronger evidence use

Updated May 01, 2026

QAA's Aberdeen review matters because it shows how external quality scrutiny is now reading student voice as evidence, not just consultation. On 30 April 2026, QAA published its Tertiary Quality Enhancement Review (TQER) report for the University of Aberdeen. The report judges Aberdeen effective overall, but it also says the university should strengthen assessment feedback, make assessment expectations clearer, and use review datasets more consistently. For Student Experience teams, PVCs, and quality professionals, that is the useful signal: even where listening structures are strong, reviewers still want clearer proof that feedback is being translated into fairer, more transparent academic practice.

What has changed in QAA's Aberdeen review

The immediate context matters. QAA says Tertiary Quality Enhancement Review, or TQER, is the current review method used under Scotland's Tertiary Quality Enhancement Framework. On QAA's framework page, the method is described as covering all credit-bearing provision, regardless of level, mode, or location, with student engagement and partnership, plus data and evidence, embedded throughout. Against that background, Aberdeen's review visits took place on 9 to 10 December 2025 and 2 to 5 February 2026, with a team of five independent reviewers including a student reviewer. The published judgement is positive overall: QAA found the university effective in managing academic standards, enhancing the quality of the learning experience, and enabling student success.

The story is not, then, that Aberdeen lacks student voice mechanisms. QAA lists eight areas of good practice and five recommendations for action. Among the strongest positive findings for feedback teams are the university's embedded approach to listening to the student voice, which QAA says has led to meaningful change for students, and the work to develop the virtual learning environment so it goes beyond course content to include employability tools, closes the feedback loop with students, and offers a more consistent structure across the institution. QAA also highlights robust quality oversight through the Quality Assurance Committee. The takeaway is that Aberdeen already has a serious quality and enhancement culture in place.

What makes the review especially relevant to student feedback practice is the mix of recommendations that follow. QAA says the university should strengthen assessment feedback so it is more consistent, equitable, and transparent. It also says Aberdeen should ensure assessment expectations are communicated more clearly and consistently, so students understand the criteria against which work is assessed, how marks are allocated, and how grades contribute to awards. A further recommendation says the university should review the datasets used in course and programme review, and support staff in using that data more consistently to understand and enhance student outcomes and experience. There are also recommendations on collaborative provision and professional services review. Together, they point to a familiar institutional challenge: hearing students is not the same as turning what they say into consistent assessment practice and defensible review evidence.

"The University should strengthen its approach to assessment feedback to ensure greater consistency, equity, and transparency."

What this means for institutions

The first implication is that listening and consistency are different tasks. Universities can have active survey routes, representative structures, and visible follow-up, yet still find that assessment criteria, feedback quality, or marking explanations vary too much between programmes. That is why the Aberdeen review is useful alongside work such as Glasgow's Student Voice Framework. Quality teams should not stop at asking whether students were heard. They should also test whether students encounter a comparable standard of assessment communication and feedback across the institution.

The second implication is about data use. QAA's recommendation on course and programme review datasets suggests that fragmented evidence is still a live problem. Institutions often hold survey scores, open comments, committee notes, attainment data, and service intelligence in separate places, with different levels of interpretation across schools. Reviewers are increasingly interested in whether staff can use that evidence consistently enough to understand what students are experiencing and what should change next. The benefit of a tighter review pack is straightforward: decisions become easier to explain, compare, and defend.
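One practical way to address the fragmentation described above is to consolidate the separate evidence streams into a single record per programme before review. The sketch below is purely illustrative: the field names (`programme`, `satisfaction`, `text`, `action`) are hypothetical and will differ in any real institutional dataset.

```python
from collections import defaultdict

def build_review_pack(survey, comments, committee_notes):
    """Merge separate evidence streams into one record per programme,
    so review panels across schools see the same consolidated view.
    All field names here are illustrative assumptions."""
    pack = defaultdict(lambda: {"survey": None, "comments": [], "actions": []})
    for row in survey:            # e.g. {"programme": "BSc X", "satisfaction": 0.78}
        pack[row["programme"]]["survey"] = row["satisfaction"]
    for row in comments:          # e.g. {"programme": "BSc X", "text": "..."}
        pack[row["programme"]]["comments"].append(row["text"])
    for row in committee_notes:   # e.g. {"programme": "BSc X", "action": "..."}
        pack[row["programme"]]["actions"].append(row["action"])
    return dict(pack)
```

Even a simple join like this makes the point in the paragraph above concrete: once survey scores, open comments, and committee actions sit in one structure keyed the same way, differences in interpretation between schools become visible and discussable.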

The third implication is scope. Aberdeen's review also touches collaborative provision, professional services, and the wider learning environment. That matters because feedback issues rarely sit neatly inside one module or one survey category. Students may be reacting to assessment design, support processes, local delivery arrangements, or the way information is presented across systems. Institutions should therefore check whether feedback can be compared across on-campus, online, and partner-delivered settings, not only within one school or one annual cycle. That is what makes student voice usable as institutional evidence rather than local anecdote.

How student feedback analysis connects

The Aberdeen recommendations are exactly the kind that headline scores cannot resolve on their own. If students say assessment feedback is inconsistent, teams need to know whether they mean slow turnaround, vague comments, opaque criteria, confusing grade weightings, or uneven practice between programmes. A stable methodology for NSS open-text analysis helps separate those themes instead of treating assessment and feedback as one broad problem.
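The theme separation described above can be sketched in miniature with a keyword lexicon. This is an assumption-laden toy, not a production method: the theme names and keywords below are invented for illustration, and real open-text analysis would use a validated taxonomy rather than substring matching.

```python
from collections import Counter

# Hypothetical theme lexicon for illustration only, not a validated taxonomy.
THEMES = {
    "turnaround": ["late", "slow", "weeks", "turnaround"],
    "feedback_quality": ["vague", "generic", "unhelpful", "brief"],
    "criteria_clarity": ["criteria", "rubric", "unclear", "expectations"],
    "grade_weighting": ["weighting", "weighted", "allocation"],
}

def tag_themes(comment: str) -> set:
    """Return the set of themes whose keywords appear in a comment."""
    text = comment.lower()
    return {theme for theme, words in THEMES.items()
            if any(w in text for w in words)}

def theme_counts(comments: list) -> Counter:
    """Count how often each theme appears across a comment stream."""
    counts = Counter()
    for c in comments:
        counts.update(tag_themes(c))
    return counts
```

Run over a comment stream, this kind of tagging distinguishes "feedback arrived late" from "the criteria were unclear", which is the distinction that a single assessment-and-feedback score collapses.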

If institutions also need to show how comments, committee issues, and review data were used, governance matters as much as coding. For teams doing that at scale, Student Voice Analytics is one way to compare comment streams consistently, but the more basic requirement is a clear evidence trail. Our student comment analysis governance checklist is a practical starting point for defining ownership, version control, and what counts as a closed loop. That makes it easier to show not only what students said, but how the institution responded and what changed as a result.

FAQ

Q: What should institutions do now if they want to respond to this kind of review signal?

A: Start with a focused audit of assessment and student voice evidence. Check where concerns about criteria, feedback quality, and transparency appear across surveys, module evaluations, SSLC minutes, and annual monitoring. Then review whether the same issues are being interpreted consistently across schools, and whether programme teams can explain what action followed.

Q: What is the timeline and scope of the Aberdeen review?

A: QAA published the Aberdeen announcement on 30 April 2026. The review visits took place on 9 to 10 December 2025 and 2 to 5 February 2026. The immediate scope is the University of Aberdeen within Scotland's Tertiary Quality Enhancement Framework, but QAA says TQER covers all credit-bearing provision regardless of level, mode, or location, including collaborative provision.

Q: What is the broader implication for student voice in quality review?

A: Student voice is being treated more explicitly as review evidence. It is no longer enough to show that students had routes to comment. Institutions are increasingly expected to show that feedback was analysed consistently, used in course and programme review, and connected to clearer, fairer academic practice that students can recognise.

References

[Quality Assurance Agency for Higher Education]: "QAA publishes TQER report for the University of Aberdeen" Published: 2026-04-30

[Quality Assurance Agency for Higher Education]: "Tertiary Quality Enhancement Review (TQER)" Published: not stated

