Glasgow's assessment and feedback tool shows how universities can act on student voice

Updated Apr 30, 2026

Student feedback only matters if it changes practice before the next cycle repeats the same problems. That is why the University of Glasgow's A&F PET 2026 announcement, first published on 21 April 2026, is worth attention. This assessment and feedback tool is not a new student survey. It is a staff-facing mechanism for reviewing practice against Glasgow's Learning Through Assessment framework and deciding where support should go next. For teams working on student voice, the practical takeaway is clear: Glasgow is making the route from student concerns to assessment change more explicit.

What has changed in Glasgow's assessment and feedback approach

The immediate change is the 2026 relaunch of Glasgow's Assessment & Feedback Practice Enhancement Tool, with submissions open from 26 March to 26 April 2026. The university says the tool helps colleagues reflect on their assessment practices against the Learning Through Assessment, or LTA, framework, and that the 2026 iteration is designed to compare progress with 2025. It also says some questions are now optional, so colleagues at different stages of their teaching careers can take part more easily. That matters because the university is treating assessment review as a repeatable institutional cycle, not a one-off exercise.

"This tool provides an opportunity for you to reflect on your assessment practices"

The more important detail is what Glasgow says happened after the previous rounds. The 21 April announcement says earlier participation led to continued development of the Assessment & Feedback Hub, college-level Assessment Changes Workshops, integration of the LTA framework into PGCAP learning for around 120 staff each year, and the establishment of an AI Subgroup and a Feedback Literacy Subgroup. In other words, the tool is not just collecting staff reflections. It is being used to decide what support, training, and workstreams the university should build next.

That is easier to read when set beside Glasgow's wider feedback architecture. The university's student voice pages say formal routes include course evaluation surveys, Summary and Response Documents, and Staff-Student Liaison Committees, all sitting inside a broader process that Glasgow has already tried to standardise through its Student Voice Framework. Glasgow's Learning Through Assessment framework also treats assessment and feedback as meaningful, iterative, programmatic, and inclusive. The PET does not replace those student-facing routes. It sits after them, creating a structured point where staff are asked to review how assessment practice is changing in response to what students report. That is the more useful development here.

What this means for institutions

The first implication is that acting on assessment concerns needs its own workflow. Many universities already collect assessment complaints and suggestions through NSS comments, module evaluations, reps, and committees. The weak point is often what happens next. Glasgow's approach suggests that institutions need a deliberate stage between feedback collection and policy change, where staff review patterns, compare progress over time, and identify what kind of support or redesign is needed. That gives assessment and feedback work a clearer action trail.

The second implication is about continuity and participation. Because Glasgow is comparing 2026 with 2025 and making some questions optional to widen involvement, it is trying to balance consistency with usability. Student experience teams can take a useful lesson from that. If a review tool is too rigid, staff stop engaging. If it changes too much each year, progress is hard to track. The benefit of a stable but usable process is that institutions can see whether recurring assessment issues are actually shifting rather than relying on anecdote from one school or one survey cycle.

The third implication is strategic. Glasgow links this work to assessment changes workshops, AI, and feedback literacy, not just to generic quality enhancement. That matches what the sector has been discussing more openly, for example in the outcomes of QAA's assessment and feedback roadshows: assessment problems rarely sit in one neat box. Students may be talking about timing, clarity, criteria, AI boundaries, or how usable feedback feels. Institutions that treat those as separate problems too early can miss the fact that they often share the same underlying design issue. Glasgow's announcement is useful because it shows one university building a mechanism to keep those connections visible.

How student feedback analysis connects

This matters for student feedback analysis because assessment complaints rarely arrive neatly labelled. In one open-text response, students may combine concerns about unclear briefs, late feedback, workload spikes, inconsistent marking, and AI-related uncertainty. If those comments are collapsed into one general theme, institutions risk sending the wrong issue to the wrong team. A clearer thematic read is what makes a tool like PET more useful once the evidence reaches staff.
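To make the multi-theme point concrete, here is a minimal, illustrative sketch of multi-label tagging in Python. The theme names and keyword lists are hypothetical examples invented for this post, not Glasgow's taxonomy or any production classifier; the point is only that one comment can legitimately map to several themes at once.

```python
# Illustrative sketch: a naive keyword-based multi-label tagger.
# Theme names and keywords below are hypothetical, chosen for this example only.
THEMES = {
    "unclear_brief": ["unclear", "criteria", "what was expected"],
    "late_feedback": ["late feedback", "took weeks", "turnaround"],
    "workload": ["deadlines at the same time", "workload", "bunching"],
    "ai_uncertainty": ["ai tools", "chatgpt", "allowed to use"],
}

def tag_comment(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment,
    rather than collapsing the comment into a single label."""
    text = comment.lower()
    return [theme for theme, keys in THEMES.items()
            if any(key in text for key in keys)]

comment = ("The brief was unclear about criteria, feedback took weeks, "
           "and no one said whether AI tools were allowed to use.")
print(tag_comment(comment))
# → ['unclear_brief', 'late_feedback', 'ai_uncertainty']
```

A real pipeline would use proper text classification rather than substring matching, but even this toy version shows why routing matters: a single-label read of the comment above would send it to one team and lose the other two issues.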

At Student Voice AI, we see the value when institutions can separate those assessment and feedback themes consistently across NSS comments, PTES comments, module evaluations, and local surveys. A reproducible approach, backed by a student comment analysis governance checklist, makes it easier to turn large comment sets into something committees, workshops, and academic leads can act on without losing traceability. Glasgow's latest step is not about analytics software on its own. It is about making sure evidence can travel from student voice into practice change.

FAQ

Q: What should institutions do now if they want a similar approach?

A: Start by mapping where assessment and feedback concerns currently surface, such as NSS comments, module evaluations, rep feedback, and committee discussions. Then define who reviews those patterns, how progress will be compared across cycles, and what support mechanism follows, whether that is staff development, redesign workshops, or a formal enhancement project.

Q: What is the timeline and scope of Glasgow's latest update?

A: The University of Glasgow says the PET opened on 26 March 2026 and closed on 26 April 2026. The announcement itself was first published on 21 April 2026. The scope is one Scottish institution, and the tool is staff-facing rather than a new national survey or regulatory requirement.

Q: What is the broader implication for student voice?

A: Student voice becomes more useful when institutions pair student-facing feedback routes with a clear mechanism for academic staff to review patterns, receive support, and change practice. Surveys and committees on their own do not close the loop. A structured change stage is what turns recurring comments into visible improvement.

References

[University of Glasgow]: "A&F PET 2026: Open for submissions until 26th April" Published: 2026-04-21

[University of Glasgow]: "A&F Practice Enhancement Tool" Published: not stated

[University of Glasgow]: "What is the Universities Feedback Process?" Published: not stated

[University of Glasgow]: "Assessment and Feedback Project Webpage" Published: not stated

