Updated Apr 12, 2026
When wellbeing concerns surface only in annual surveys, universities lose the chance to respond while the academic year is still live. That is why King’s College London’s latest launch deserves attention: it treats local wellbeing feedback as part of a joined-up evidence system, not a standalone survey. On 25 February 2026, King’s published “Tell us how you’re doing by completing the King’s Wellbeing Survey”, announcing a university-wide wellbeing survey for undergraduate and postgraduate students running from 2 March to 2 April 2026. The survey matters not just because it exists, but because King’s places it alongside NSS and PTES within a broader model of student voice in higher education. For other institutions, the practical lesson is clear: gather feedback across the year, connect the evidence, and act while there is still time to improve the student experience.
The immediate lesson for other institutions is how to launch a local wellbeing survey without creating confusion. King’s has introduced a 10-minute local survey for all undergraduate and postgraduate students, covering life at King’s, daily routines, quality of life, social connections, and general wellbeing. The page also does two useful things to build trust: it makes clear that the survey is separate from NSS and PTES, and it tells students they can withdraw their data up to 1 May 2026. Framed as part of King’s wider work on student mental health and wellbeing, the survey shows institutions how to reduce friction at launch and build confidence in the process. Students can see what the survey covers, how it fits alongside other feedback, and what control they retain over their data.
The more useful lesson for other institutions is not the survey length or launch window. It is the role the survey plays in the university’s wider feedback system. King’s says it uses data from all three surveys (the King’s Wellbeing Survey, NSS, and PTES) to inform its whole-university approach to student mental health and wellbeing. That gives the local survey a clear job without isolating it from the rest of the evidence. The benefit is a cleaner institutional picture and quicker decisions. Define what each survey is for, then show how the findings combine into one institutional view through benchmarking and triangulating student survey evidence. That makes it easier to spot patterns, set priorities, and act with less guesswork.
"We use the data from all three surveys to inform our whole-university approach to student mental health and wellbeing."
King’s also links the survey to a visible “feedback in action” page, which gives students a clearer reason to take part and gives the launch more credibility. Published on 10 November 2025, that page says the university updated its module feedback and evaluation policy in September 2025 to create more opportunities to provide feedback during the module rather than at the end, while also shortening module evaluation surveys. It also lists concrete changes in assessment policy, personal tutoring, student support, accessibility, and wellbeing services. That matters because the survey is not just another request for student time. It sits inside a wider feedback system that shows what changed because students spoke up. For institutions, the benefit is practical: visible follow-through can strengthen trust, close the loop on student voice initiatives, support response rates, and make later survey requests easier to justify. Nottingham's PTES launch with visible follow-up reinforces the same lesson: when students can see change, institutions find it easier to sustain participation.
First, this is a strong example of survey architecture, not just survey promotion. Many universities run NSS, PTES, module evaluations, and occasional pulse surveys without being explicit about the job each instrument should do inside a wider student feedback system. Bath's 2026 student feedback system is a useful parallel example of matching survey purpose to cohort. King’s takes a more disciplined approach: it keeps a local survey focused on wellbeing, distinct from NSS and PTES, while tying it back to a single institutional picture. For Student Experience teams and PVCs, the benefit is clearer evidence and faster decisions. Define the job of each survey before adding another one, and it becomes easier to reduce duplication, spot gaps, and see what needs attention first.
Second, timing matters because late feedback limits what institutions can still change. A wellbeing issue that becomes visible only in NSS or another annual student experience survey often appears too late for teams to respond in the same academic year. Local wellbeing surveys can surface earlier signals on belonging, pressure, isolation, or access to support, especially when they run during term rather than after it. The benefit is earlier action, not just earlier measurement. That is why this story sits well alongside our recent coverage of the OfS student pulse survey and Westminster’s Mid-Module Check-ins. The same principle runs through all three cases: earlier listening gives institutions more time to intervene, more time for students to feel the benefit, and more evidence that feedback leads to action.
Third, joined-up listening still needs discipline if it is going to hold up under scrutiny. If universities combine local surveys with NSS, PTES, or service data, they need the kind of clear rules on response burden in student feedback surveys, representativeness, and ownership that Glasgow’s Student Voice Framework sets out. A short local survey can be valuable, but only if teams can explain who was asked, when, what the survey was for, and who owns the response. Our summary of non-response bias in student evaluations is relevant here, because a lighter-touch survey is not automatically a more representative one. The payoff from a layered feedback system is not a busier survey calendar. It is clearer governance, more consistent interpretation, and decisions that stand up when leaders ask teams to explain them.
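As a concrete version of the representativeness point, the sketch below compares who responded against who was enrolled, by cohort. All counts are invented for illustration; a real check would draw cohort totals from the student record, and under-represented groups would prompt reweighting or at least explicit caveats on headline scores.

```python
# A minimal representativeness check: respondent mix vs enrolled population.
# All figures are invented; a real check would use the student record.
import pandas as pd

population = pd.Series({"UG year 1": 6000, "UG year 2+": 9000, "PGT": 4000, "PGR": 1000})
respondents = pd.Series({"UG year 1": 900, "UG year 2+": 1100, "PGT": 700, "PGR": 60})

check = pd.DataFrame({
    "population_share": population / population.sum(),
    "respondent_share": respondents / respondents.sum(),
    "response_rate": respondents / population,
})
# Ratios well below 1 flag cohorts whose voices are under-represented
# in the headline results.
check["representation_ratio"] = check["respondent_share"] / check["population_share"]
print(check.round(3))
```

Running a check like this before publishing results is what lets teams answer the governance questions above: who was asked, who actually answered, and how much weight the headline figures can bear.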
The King’s story is not mainly about open-text analysis, but it points to a familiar challenge: a wellbeing survey can show that something is wrong, or improving, without showing why or what to fix first. That is where qualitative evidence becomes essential. Module feedback, service feedback, representative channels, and open comments can reveal whether a wellbeing signal is really about workload, assessment timing, communication, cost pressures, or access to support. The benefit is a sharper intervention plan. Teams can prioritise the right response instead of reacting to a headline score alone, and they can explain that decision more clearly.
Student Voice Analytics gives institutions a reproducible way to analyse those comment streams together rather than reading them in isolation. Connecting wellbeing surveys with module evaluations and annual survey evidence requires a consistent method for analysing comments and a clear governance model for turning findings into action. With one method applied across survey types, teams can compare signals, spot pressure points earlier, and act with more confidence. If you are building a joined-up evidence model for wellbeing, module evaluations, and annual surveys, explore Student Voice Analytics to see how institutions analyse those comment streams with one reproducible method, then use our NSS open-text analysis methodology and student comment analysis governance checklist to turn that approach into a repeatable process your team can defend.
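For illustration only, the sketch below shows the shape of that kind of reproducible step: comments from different streams are normalised into one schema and passed through a single, versioned tagging function. The keyword tagger is a deliberately naive stand-in, not Student Voice Analytics’ method; the point is that every stream goes through the same auditable step rather than being read ad hoc.

```python
# A minimal sketch of one consistent analysis step applied to comment
# streams from different surveys. The keyword tagger is a naive stand-in
# for whatever method an institution adopts and versions.
import pandas as pd

THEMES = {
    "workload": ["workload", "deadline", "too much"],
    "support": ["support", "advisor", "counselling"],
    "belonging": ["community", "isolated", "belong"],
}

def tag_comment(text: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    lowered = text.lower()
    return [theme for theme, kws in THEMES.items() if any(k in lowered for k in kws)]

# Hypothetical comment extracts from two different instruments.
streams = {
    "wellbeing_survey": ["Deadlines cluster and the workload spikes", "I feel isolated on campus"],
    "module_eval": ["More support from my advisor would help", "Too much reading each week"],
}
rows = [
    {"source": source, "comment": c, "themes": tag_comment(c)}
    for source, comments in streams.items()
    for c in comments
]

# One schema, one method, then a cross-stream view of where themes surface.
tagged = pd.DataFrame(rows).explode("themes").dropna(subset=["themes"])
print(tagged.groupby(["themes", "source"]).size().unstack(fill_value=0))
```

Because every stream is tagged by the same function, a workload signal in module evaluations and a workload signal in the wellbeing survey land in the same row of the output, which is what makes cross-survey comparison defensible.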
Q: What should institutions do now if they want a similar local wellbeing survey?
A: Start by defining the gap the survey needs to fill that NSS, PTES, or module evaluations do not already cover. Keep it short, make the scope explicit, document how students can withdraw or opt out where relevant, and decide in advance who owns the follow-up when results show pressure points. That gives students a clearer reason to respond and gives teams a practical plan to act on the results before the term moves on.
Q: What is the timeline and scope of the King’s Wellbeing Survey?
A: King’s published the survey announcement on 25 February 2026. The survey opened on 2 March 2026 and closes on 2 April 2026. It is open to all undergraduate and postgraduate students at King’s, and the page says students can request withdrawal of their data up to 1 May 2026.
Q: What is the broader implication for student voice work?
A: Universities need a more layered feedback system. National surveys still matter, but local surveys focused on wellbeing or in-term experience can fill gaps that annual instruments leave. The value comes when those streams are connected, interpreted consistently, and turned into visible action that students can see.
[King’s College London]: "Tell us how you’re doing by completing the King’s Wellbeing Survey" Published: 2026-02-25
[King’s College London]: "Your feedback in action: shaping a better university experience" Published: 2025-11-10