Updated Apr 02, 2026
Most universities run several student surveys, but far fewer can connect them quickly enough to improve the student experience before the academic year moves on. That is why King’s College London’s latest wellbeing survey deserves attention. On 25 February 2026, King’s published “Tell us how you’re doing by completing the King’s Wellbeing Survey”, announcing a university-wide wellbeing survey for undergraduate and postgraduate students running from 2 March to 2 April 2026. The King’s Wellbeing Survey matters not only because it exists, but because King’s places it alongside the NSS and PTES within a broader model of student voice in higher education. For other institutions, it offers a practical example of how to gather feedback across the year, connect the evidence, and act while there is still time to make a difference.
The immediate change is operational: King’s has launched a 10-minute local survey for all undergraduate and postgraduate students, covering life at King’s, daily routines, quality of life, social connections, and general wellbeing. The page makes clear that this is a separate survey from the NSS and PTES. It also handles governance more clearly than many local survey launches: students can withdraw their data up to 1 May 2026, and the survey is framed as part of King’s wider work on student mental health and wellbeing. For institutions considering their own pulse or wellbeing survey, that clarity reduces friction. Students can see what the survey covers, how it fits alongside other feedback, and what control they retain over their data.
The more important change is not the survey length or launch window, but the role the survey plays in the university’s wider feedback system. King’s says it uses data from all three surveys (the King’s Wellbeing Survey, the NSS, and PTES) to inform its whole-university approach to student mental health and wellbeing. That gives the local survey a clear job without isolating it from the rest of the evidence. For other institutions, the takeaway is straightforward: define what each survey is for, then show how the findings connect into one institutional view people can act on.
"We use the data from all three surveys to inform our whole-university approach to student mental health and wellbeing."
King’s also links the survey to a visible “feedback in action” page, giving the launch more credibility and students a clearer payoff. Published on 10 November 2025, that page says the university updated its module feedback and evaluation policy in September 2025 to create more opportunities to provide feedback during the module rather than at the end, while also shortening module evaluation surveys. It also lists concrete changes in assessment policy, personal tutoring, student support, accessibility, and wellbeing services. This matters because the survey is not just another request for student time. It sits inside a wider feedback system that shows what changed because students spoke up. For institutions, that visible follow-through can build trust in future surveys and make response requests easier to justify, a point reinforced by Nottingham's PTES launch with visible follow-up.
Three lessons stand out for other institutions. First, this is a strong example of survey architecture, not just survey promotion. Many universities run the NSS, PTES, module evaluations, and occasional pulse surveys without being explicit about what each instrument is meant to do. Bath’s 2026 student feedback system is a good parallel example of matching survey purpose to cohort. King’s does something more useful: it keeps a local survey focused on wellbeing, distinct from the NSS and PTES, while tying it back to a single institutional picture. For Student Experience teams and PVCs, the practical takeaway is to define the job of each survey before adding another one. That reduces duplication, helps manage response burden, and makes the outputs easier to interpret and act on.
Second, timing matters because late feedback narrows what institutions can still change. A wellbeing issue that becomes visible only in NSS or another annual student experience survey often appears too late for teams to respond in the same academic year. Local wellbeing surveys can surface earlier signals on belonging, pressure, isolation, or access to support, especially when they run during term rather than after it. That is why this story sits well alongside our recent coverage of the OfS student pulse survey and Westminster’s Mid-Module Check-ins. The same principle runs through all three cases: earlier listening gives institutions more time to intervene while students still feel the impact, and more time to show that feedback leads to action.
Third, joined-up listening still needs discipline. If universities combine local surveys with NSS, PTES, or service data, they need clear rules on response burden, representativeness, and ownership. A short local survey can be valuable, but only if teams can explain who was asked, when, what the survey was for, and who owns the response. Our summary of non-response bias in student evaluations is relevant here: a lighter-touch survey is not automatically a more representative one. The value of a layered feedback system comes from clear governance, consistent interpretation, and visible follow-through, not from simply running more surveys.
The King’s story is not mainly about open-text analysis, but it points to a familiar challenge. A wellbeing survey can show that something is wrong, or improving, without showing why or what to fix first. That is where qualitative evidence becomes essential. Module feedback, service feedback, representative channels, and open comments can reveal whether a wellbeing signal is really about workload, assessment timing, communication, cost pressures, or access to support. That extra context helps teams prioritise the right intervention instead of reacting to a headline score alone.
At Student Voice AI, we see the clearest institutional picture when those comment streams are analysed together rather than read in isolation. If a university wants to combine wellbeing surveys with module evaluations and annual survey evidence, it needs a repeatable way to analyse comments and a clear governance model for turning findings into action. Student Voice Analytics helps teams do that with reproducible analysis across survey types, so leaders can compare signals, spot pressure points earlier, and act with more confidence. If you are building that kind of evidence model, explore Student Voice Analytics to see how joined-up comment analysis works in practice, then start with our NSS open-text analysis methodology and student comment analysis governance checklist.
Q: What should institutions do now if they want a similar local wellbeing survey?
A: Start by defining the gap the survey needs to fill that NSS, PTES, or module evaluations do not already cover. Keep it short, make the scope explicit, document how students can withdraw or opt out where relevant, and decide in advance who owns the follow-up when results show pressure points. That gives students a clearer reason to respond and gives teams a practical plan for acting on the results.
Q: What is the timeline and scope of the King’s Wellbeing Survey?
A: King’s published the survey announcement on 25 February 2026. The survey opened on 2 March 2026 and closes on 2 April 2026. It is open to all undergraduate and postgraduate students at King’s, and the page says students can request withdrawal of their data up to 1 May 2026.
Q: What is the broader implication for student voice work?
A: Universities need a more layered feedback system. National surveys still matter, but local surveys focused on wellbeing or in-term experience can fill gaps that annual instruments leave. The value comes when those streams are connected, interpreted consistently, and turned into visible action.
[King’s College London]: "Tell us how you’re doing by completing the King’s Wellbeing Survey" Published: 2026-02-25
[King’s College London]: "Your feedback in action: shaping a better university experience" Published: 2025-11-10