Bath's 2026 student feedback system shows how to run the right survey at the right level

Updated Mar 24, 2026

The University of Bath's current "Have your say - give feedback on your course" page sets out a clear 2025/26 survey schedule: NSS from 2 February to 30 April 2026, the Course-level Survey from 2 March to 30 March, and PTES from 2 March to 30 April, with PRES returning in Spring 2027. For Student Experience teams, PVCs, and quality professionals, this matters because Bath is not asking one survey to do every job. It is running a student feedback system built around stage of study and survey purpose.

At Student Voice AI, we think that is the point worth noticing. Universities often accumulate surveys over time without a matching clarity about who each one is for, what it is meant to inform, or how the results fit together. Bath's current setup is a practical example of a more deliberate design.

What has changed in Bath's student feedback system

Bath and its Students' Union present the survey cycle as a joined-up institutional process. Final-year undergraduates are directed to the NSS. Non-final undergraduates who are not on placement or in suspense are directed to an internal Course-level Survey. Taught postgraduates, including students in the taught phase of professional doctorates, are directed to PTES. Postgraduate researchers, and those in the research phase of professional doctorates, move onto PRES when that biennial survey returns. In other words, Bath is explicitly matching feedback instrument to cohort and point in the student lifecycle.
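The routing described above can be sketched as a simple rule. This is an illustrative model only: the `Student` type, its field names, and the stage labels are our assumptions for the sketch, not Bath's actual data model.

```python
# Illustrative sketch of Bath's 2025/26 survey routing, as described above.
# The Student type, field names, and stage labels are assumptions for this
# example, not Bath's student records schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Student:
    stage: str               # "ug_final", "ug_non_final", "pgt", or "pgr"
    on_placement: bool = False
    in_suspense: bool = False


def route_survey(s: Student) -> Optional[str]:
    """Return the survey a student is directed to, or None if out of scope."""
    if s.stage == "ug_final":
        return "NSS"
    if s.stage == "ug_non_final":
        if s.on_placement or s.in_suspense:
            return None  # excluded from the internal survey this cycle
        return "Course-level Survey"
    if s.stage == "pgt":
        return "PTES"  # includes the taught phase of professional doctorates
    if s.stage == "pgr":
        return "PRES"  # biennial; next scheduled for Spring 2027
    return None


print(route_survey(Student("ug_non_final")))  # Course-level Survey
```

The value of writing the rule down this explicitly, even informally, is that the exclusions (placement, suspense) and edge cases (professional doctorates straddling two surveys) become visible and testable rather than buried in survey admin practice.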

The internal Course-level Survey is the most revealing part of the model. Bath says it asks non-final undergraduates about their course experience so far this academic year, and the question set includes course-level approach, assessment for learning, organisation and management, student voice, sustainability, learning resources, wellbeing support, overall satisfaction, up to two additional questions, and two free-text comment questions. That matters because it gives Bath a middle layer of evidence between unit evaluations and the NSS, with room to ask about current institutional priorities as well as core educational experience.

"Giving constructive and detailed feedback in the open comments is most helpful"

Bath is also unusually explicit about scope and method. The Course-level Survey runs in Semester 2 each year, and the page lists who is and is not eligible, including exceptions for specific courses and modes of study. The university also says responses are anonymised to academic and professional services staff, while offensive or discriminatory comments will be investigated under its Dignity & Respect policy. That combination of eligibility rules, open-text guidance, and anonymity expectations is operational detail many institutions keep buried.

What this means for institutions

First, Bath's model is a reminder that survey design is part of student voice strategy. If the same questions are asked of every cohort, institutions either create duplication or lose useful specificity. Bath's approach separates national benchmarking from local enhancement: finalists take the NSS, non-final undergraduates get a course-level instrument, and taught postgraduates move into PTES. That is a cleaner architecture than relying on one catch-all internal survey.

Second, the Course-level Survey shows how institutions can bring themes into their own student feedback system that are harder to capture cleanly through national instruments alone. Sustainability, wellbeing support, and up to two additional questions give Bath room to test issues that matter locally, while the two free-text questions keep space for students to explain what is driving their ratings. This creates a useful middle layer alongside models such as the Newcastle Experience Survey 2026 and Westminster's Mid-Module Check-ins.

Third, Bath's guidance on constructive comments is not cosmetic. Open-text quality shapes whether survey results are actually usable. Institutions that want stronger evidence should not focus only on response rates. They should also explain what useful feedback looks like, how anonymity works, and how comments will be handled if they cross behavioural lines. Our student comment analysis governance checklist and long-running discussion of closing the loop in student voice initiatives are directly relevant here.

How student feedback analysis connects

At Student Voice AI, we see the value in Bath's design because it creates comparable feedback streams across the student journey. If an institution collects open comments from non-final undergraduates through a Course-level Survey, from finalists through the NSS, and from taught postgraduates through PTES, it can start to see which issues are local, which are cohort-specific, and which are repeating across the institution.

That only works if the analysis design is as deliberate as the survey design. Terms such as assessment for learning, organisation and management, student voice, and wellbeing support need to map into a stable analysis framework, especially if teams want to compare themes across surveys or route issues to the right owners. Our NSS open-text analysis methodology and student feedback analysis glossary are useful starting points for institutions trying to make that comparison defensible.
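One way to make that mapping concrete is a small lookup table from survey-specific theme labels to canonical framework codes. This is a minimal sketch under stated assumptions: the canonical codes, the survey abbreviations, and the specific label pairings are ours for illustration, not an established cross-survey standard.

```python
# Illustrative sketch: mapping survey-specific theme labels onto one stable
# analysis framework so themes can be compared across surveys. The canonical
# codes and label pairings here are assumptions for the example.

THEME_MAP = {
    # (survey, local label) -> canonical theme code
    ("CLS", "assessment for learning"): "assessment_and_feedback",
    ("NSS", "assessment and feedback"): "assessment_and_feedback",
    ("CLS", "organisation and management"): "organisation_management",
    ("NSS", "organisation and management"): "organisation_management",
    ("CLS", "student voice"): "student_voice",
    ("NSS", "student voice"): "student_voice",
    ("CLS", "wellbeing support"): "wellbeing",
}


def canonical_theme(survey: str, label: str) -> str:
    """Map a survey-specific label to the shared framework, flagging gaps."""
    return THEME_MAP.get((survey, label.strip().lower()), "unmapped")


print(canonical_theme("NSS", "Assessment and feedback"))  # assessment_and_feedback
```

Anything that falls out as "unmapped" is a prompt to review the framework rather than silently drop comments, which is what makes cross-survey comparison defensible over time.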

FAQ

Q: What should institutions do now if they want a similar multi-survey student feedback system?

A: Start by mapping which student groups need which kinds of feedback route. Decide what should be benchmarked nationally, what should be captured through internal course-level surveys, and what should stay at module or service level. Then set clear ownership for each survey, the open-text analysis process, and how results will be communicated back to students and staff.

Q: What is the timeline and scope of Bath's 2026 survey cycle?

A: Bath's student voice pages show that NSS 2026 is open from 2 February to 30 April, the Course-level Survey from 2 March to 30 March, and PTES 2026 from 2 March to 30 April. PRES is not running in 2026 and is scheduled to return in Spring 2027. The immediate scope is one English institution, but the structure is relevant across UK higher education.

Q: What is the broader implication for student voice practice?

A: The broader implication is that universities should treat student voice as a system, not a single annual event. A coherent survey architecture helps institutions reduce overlap, gather more relevant evidence from each cohort, and connect feedback collection more clearly to enhancement work.

References

[University of Bath]: "Have your say - give feedback on your course" Published: not stated

[University of Bath]: "Take part in the Course-level Survey" Published: not stated

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
