Bath's Be Well Survey outcomes push student feedback strategy beyond standalone surveys

Updated Apr 13, 2026

Bath's latest wellbeing update matters because it shows how a university can move student feedback out of a standalone survey lane and into a joined-up student feedback system. On 1 April 2026, the University of Bath announced its spring Education & Student Experience Forum, with one agenda item standing out for anyone reviewing student feedback strategy: Be Well Survey outcomes and next steps. For Student Experience teams, PVCs, and quality professionals, the key point is that Bath is not presenting wellbeing feedback as a one-off survey result. It is treating that evidence as part of a wider student voice system that now reaches course-level surveys, placement feedback, assessment practice, and institutional planning. At Student Voice AI, we see that as the more useful sector signal: the strongest institutions are starting to design student feedback around decision-making, not around isolated survey windows.

What has changed in Bath's student feedback strategy

The immediate development is the 21 April 2026 forum itself. Bath says the session will cover the Inclusive Education project, Be Well Survey outcomes and next steps, the Access and Participation Plan, and wider employability activity. That matters because it pulls student wellbeing evidence into a live institutional forum on education and student experience, rather than leaving it inside a single service team or annual report. The move is institution-specific, not a national policy change, but the direction is clear: Bath is using student wellbeing evidence as an input into mainstream academic and student experience governance.

The more important detail sits in Bath's linked Be Well at Bath Annual Report 2024-25. In the "Learn" section, led by Professor Nathalia Gjersoe, Associate Pro-Vice-Chancellor (Student Voice), the university says it has aligned the Be Well Survey with sector best practice, increased response rates through new promotion and oversight, and embedded wellbeing questions into course-level and placement surveys. It also says its Student Voice Strategic Implementation Plan has made progress in enhancing student representation, reducing survey fatigue, and using student feedback more effectively. Those are concrete changes to how feedback is collected and connected, not just to how it is described.

"Be Well Survey aligned with sector best practice and significantly increased response rates"

Bath's annual report adds two further points that are easy to miss but important for practice. First, it says the university has created departmental Assessment & Feedback roles to make marking criteria clearer and more consistent for students. Second, it sets wellbeing measures alongside data from the Course-level Survey, NSS, PTES and PRES, plus a plan to pilot a student engagement analytics platform for taught students. In other words, Bath is tightening the links between wellbeing evidence, academic feedback, student representation, and survey architecture. That is what makes the April 2026 forum relevant beyond one institution.

What this means for institutions

The first implication is about survey design. Many universities still run wellbeing surveys, course surveys, placement surveys, NSS, PTES, and representative channels as separate exercises with overlapping questions and unclear ownership. Bath's current direction is closer to the model we saw in King's Wellbeing Survey and the wider survey architecture discussed in QAA's student representation research: use different routes for different purposes, but make the evidence work together. Embedding wellbeing questions into course-level and placement surveys can reduce duplication and bring student support issues closer to the parts of the experience that often drive them.

The second implication is that response rates and survey fatigue should be treated as strategy questions, not just comms questions. Bath's report does not claim that better promotion alone solves the problem. The more interesting point is that increased response rates sit alongside work to reduce survey fatigue and use feedback more effectively. That is a better framing for institutions than simply asking how to get more completions. The stronger question is whether the survey calendar is coherent enough to earn response effort in the first place. Our post on student participation fatigue in higher education is relevant here, because fewer duplicated asks and clearer follow-through usually matter more than another reminder email. It is also where question design in student feedback surveys matters, because shorter, clearer prompts help reduce burden without making the evidence thinner.

The third implication is operational ownership. Bath is connecting survey evidence to named institutional programmes, such as Inclusive Education, the Access and Participation Plan, and departmental Assessment & Feedback roles. That is a useful discipline for quality teams. Survey findings become more actionable when institutions can say which route is generating the evidence, who owns the response, and how different evidence sources will be read together. That is also the logic behind benchmarking and triangulating student survey data: local action gets stronger when survey signals are connected to adjacent data rather than interpreted in isolation. Visible ownership is also what lets institutions close the loop on student voice initiatives rather than leaving students with another set of untracked findings.
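For quality teams wanting to apply that discipline, a simple register can make the mapping explicit: each route, its purpose, its named owner, and the programme it feeds. The sketch below is a minimal illustration in Python; every route name, owner, and programme label in it is hypothetical rather than taken from Bath's documents.

```python
# Minimal sketch of an evidence-ownership register. Each feedback route
# records its purpose, a named owner, and the programme it feeds.
# All route names, owners, and programmes below are illustrative.
ROUTES = {
    "be_well_survey":   {"purpose": "wellbeing",          "owner": "Student Support", "feeds": "Be Well programme"},
    "course_survey":    {"purpose": "module experience",  "owner": "Dept A&F role",   "feeds": "Inclusive Education"},
    "placement_survey": {"purpose": "placement quality",  "owner": "Placements team", "feeds": "Employability"},
    "nss_open_text":    {"purpose": "final-year view",    "owner": "Quality team",    "feeds": "APP monitoring"},
}

def unowned(routes: dict) -> list[str]:
    """Return routes with no named owner -- the gaps an audit should close."""
    return [name for name, r in routes.items() if not r.get("owner")]

if __name__ == "__main__":
    for name, r in ROUTES.items():
        print(f"{name}: {r['purpose']} -> owned by {r['owner']}, feeds {r['feeds']}")
    print("Routes without owners:", unowned(ROUTES) or "none")
```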

How student feedback analysis connects

This matters for comment analysis because Bath is broadening the places where wellbeing-related student voice may appear. Once wellbeing questions are embedded into course-level and placement surveys, institutions are likely to collect open-text feedback that cuts across assessment, belonging, communication, workload, adjustments, support access, and placements. Closed-question results can show where pressure is surfacing. Open comments are what help teams see whether the issue is really about assessment bunching, unclear expectations, weak signposting, or a more specific cohort problem. Bath's earlier example of acting on student feedback through a neuroinclusive study space redesign shows why that wider read matters: the useful signal is often cross-service, not locked inside one survey.

At Student Voice AI, we see the value when universities analyse those comment streams together rather than leaving each survey in its own reporting silo. A reproducible method helps institutions compare themes across course-level surveys, PTES, NSS, wellbeing instruments, and placement feedback without losing traceability. Bath's latest direction does not require a new kind of analytics to be useful, but it does make joined-up analysis more important. If you are reviewing how to connect student wellbeing comments with academic experience evidence, Student Voice Analytics and our NSS open-text analysis methodology are practical starting points.
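To make joined-up analysis concrete, the sketch below shows one way comment streams from different surveys can be combined without losing traceability: a source tag travels with every comment, so cross-survey theme counts can always be traced back to the instrument that produced them. The survey names, comments, and themes are invented for illustration and do not describe Student Voice Analytics internals.

```python
# Minimal sketch: combine per-survey comment exports into one frame while
# keeping a "source" column for traceability. All data here is illustrative.
import pandas as pd

exports = {
    "course_survey":    [("Deadlines all land in week 11", "assessment bunching")],
    "placement_survey": [("Nobody told us who to contact", "signposting")],
    "be_well_survey":   [("Workload spikes before exams", "workload")],
}

frames = [
    pd.DataFrame(rows, columns=["comment", "theme"]).assign(source=name)
    for name, rows in exports.items()
]
comments = pd.concat(frames, ignore_index=True)

# Theme counts per source: the same theme surfacing in several instruments
# is a stronger signal than a spike in any one survey.
print(comments.groupby(["theme", "source"]).size())
```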

FAQ

Q: What should institutions do now if they want a similar approach to wellbeing and student feedback?

A: Start by auditing where wellbeing questions currently sit across your survey calendar. Check whether students are being asked similar things in several places, whether each route has a clear purpose, and who owns the action that follows. Then decide which questions should stay in dedicated surveys and which can be embedded into course-level, placement, or other existing feedback routes without creating duplication.
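As a practical starting point for that audit, the sketch below shows one simple way to surface questions that appear in more than one route. The routes and question wording are hypothetical; the approach is just to normalise each prompt and group identical ones across the calendar.

```python
# Minimal sketch of a survey-calendar duplication check. Routes and
# question wording are illustrative, not any institution's real calendar.
from collections import defaultdict

calendar = {
    "be_well_survey":   ["How often do you feel you belong at university?"],
    "course_survey":    ["How often do you feel you belong at university?",
                         "Are the marking criteria clear?"],
    "placement_survey": ["Do you know who to contact for support on placement?"],
}

asked_where = defaultdict(list)
for route, questions in calendar.items():
    for q in questions:
        asked_where[q.strip().lower()].append(route)

for question, routes in asked_where.items():
    if len(routes) > 1:
        print(f"Duplicated across {routes}: {question!r}")
```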

Q: What is the timeline and scope of Bath's latest update?

A: Bath published the forum announcement on 1 April 2026, and the Education & Student Experience Forum is scheduled for 21 April 2026. The linked annual report covers progress in the 2024-25 academic year. This is one English university's current approach, not a sector-wide regulatory change, but the practices it highlights are relevant across UK higher education.

Q: What is the broader implication for student voice?

A: The broader implication is that student voice works better when institutions connect wellbeing, academic experience, and representation evidence rather than treating them as separate reporting streams. Universities that reduce duplication, improve response quality, and give feedback a clear route into action are more likely to collect evidence they can trust and use.

References

[University of Bath]: "Watch the spring 2026 Education & Student Experience Forum" Published: 2026-04-01

[University of Bath]: "Be Well at Bath Annual Report 2024-25" Published: not stated

[University of Bath]: "Be Well at Bath: Our Principles in Action" Published: 2026-01-26

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
