Updated Apr 22, 2026
Fast feedback only helps if institutions design the survey cycle so evidence arrives in time to act on it. Sussex's April 2026 decision to open module evaluations and PTES in close succession shows what that looks like in practice.
On 13 April 2026, the University of Sussex announced its spring Module Evaluation Questionnaires and PTES 2026, with module evaluations opening immediately and PTES following on 15 April 2026. For Student Experience teams, PVCs, and quality professionals, that matters because Sussex is not just asking more questions. It is tightening the mechanics of student feedback so module-level and taught-postgraduate evidence arrives quickly, through clear routes, and with a stronger chance of being used before the academic year has fully closed.
The immediate change is operational. Sussex opened spring Module Evaluation Questionnaires on 13 April 2026 and kept them open for three weeks, while launching PTES for taught postgraduates from 15 April to 12 June 2026. That means taught students are being asked for feedback at two different levels in the same part of the academic cycle: short, module-specific evaluation for spring teaching, and a broader national survey for postgraduate taught provision. The university says MEQs are anonymous, take two to four minutes, and can be accessed through a personalised email link, a central survey portal, or directly from Canvas module sites. The practical gain is quicker evidence collection without asking one survey to do every job, and it reflects wider evidence on what gets students to fill in teaching evaluations: low-friction access and a clear purpose both matter.
The Sussex announcement also makes the intended use of the data explicit. It says past survey feedback has already led to clearer assessment guidance and marking criteria, improvements to Canvas and reading-list navigation, and changes to how seminars and lectures are run. That is important because it frames survey collection as part of enhancement work, not as a compliance exercise or a year-end metric. PTES is described in the same practical way: a national survey for taught postgraduates, run by Advance HE through Jisc Online Surveys, with institutional reminders and a defined data-protection position. The takeaway is clear: operational design matters most when institutions can show how feedback has already changed the student experience, much like Nottingham's PTES launch with visible follow-up.
"Achieving a good response rate for MEQs is essential to making feedback on students' experiences of their courses more robust."
A linked Sussex staff briefing adds further detail that matters for practice. Students are automatically attached to their module surveys, module convenors can see live response rates in the instructor portal, and student comments are available during the survey window for immediate action where relevant. Sussex also says results will be available as soon as the survey closes, reducing the lag between collection and use. The main development here is not a new survey instrument. It is a clearer operational model for getting feedback in, checking response quality, and moving findings to teaching teams faster.
The first implication is timing. Running module evaluations and PTES close together gives institutions a chance to compare very local teaching issues with broader postgraduate themes before the academic year disappears into boards and summer planning. That is close to the layered survey architecture described in Bath's 2026 student feedback system, where different surveys have different jobs but still contribute to one institutional picture. For institutions that want to move even earlier in the cycle, Westminster's Mid-Module Check-ins show what in-semester module feedback can look like. The benefit is faster interpretation: teams can see whether a concern is confined to one module, repeated across a department, or surfacing more widely in taught-postgraduate experience.
The second implication is survey operations. Sussex's staff note is unusually direct that response rate is a quality issue, not just a comms issue. Giving students multiple access routes, automatically loading the right modules, and letting convenors monitor uptake in real time are practical design choices that can make module feedback more usable, not just more plentiful. That matters because non-response bias in student evaluations can leave teams over-reading the views of the students who are easiest to reach. Those design choices also need guardrails. If institutions want faster collection without weaker governance, they need minimum expectations on timing, ownership, access, and follow-up of the kind set out in Glasgow's Student Voice Framework. The takeaway is simple: better survey plumbing usually produces better evidence.
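As one concrete illustration of the monitoring point, the sketch below flags modules whose in-window response rate is too low to read safely. The export format, column names, and 40 per cent threshold are our assumptions for illustration, not Sussex's instructor-portal fields or an agreed sector benchmark.

```python
# Illustrative sketch: flag modules whose in-window response rate is too low to
# read safely. Column names and the 40 per cent threshold are assumptions, not
# Sussex's instructor-portal fields or an agreed sector benchmark.
import pandas as pd

# Hypothetical export: one row per module with enrolment and responses so far
meq = pd.DataFrame({
    "module_code": ["MOD101", "MOD205", "MOD330"],
    "enrolled": [120, 45, 18],
    "responses": [22, 30, 4],
})

meq["response_rate"] = meq["responses"] / meq["enrolled"]

# Flag anything under the agreed minimum so convenors can prompt students while
# the survey window is still open, rather than discovering the gap after it closes.
MIN_RATE = 0.40
meq["needs_follow_up"] = meq["response_rate"] < MIN_RATE

print(meq.sort_values("response_rate"))
```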
The third implication is trust. When results are available quickly and comments may be visible during fieldwork, institutions need clear rules about who can read what, when they should intervene, and how action is communicated back to students. Running more surveys will not help if teams cannot explain what happened next. Survey collection and feedback follow-up have to be designed together, or speed will erode confidence rather than strengthen it.
This story matters for comment analysis because MEQs and PTES generate different kinds of open text. Module evaluations tend to surface highly local issues about assessment clarity, seminar pace, lecture structure, reading lists, or Canvas organisation. PTES comments are more likely to pick up cross-programme themes such as workload, academic support, belonging, dissertation supervision, or organisation and management. If those two comment sets land at roughly the same time, institutions need a defensible way to separate local fixes from recurring institutional themes, so action lands at the right level.
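One lightweight way to make that separation operational is to route each comment to a likely action level before anyone reads it in depth. The sketch below does this with simple keyword rules; the themes, keywords, and levels are illustrative assumptions, not the NSS open-text taxonomy or any Sussex classification.

```python
# Illustrative sketch: route open-text comments to a likely action level with
# simple keyword rules. The themes, keywords, and levels are assumptions made
# for illustration, not the NSS open-text taxonomy or any Sussex classification.
from dataclasses import dataclass

@dataclass
class RoutedComment:
    text: str
    theme: str
    level: str  # "module", "programme/institution", or "review"

# Module-local issues versus cross-programme themes
MODULE_THEMES = {
    "assessment clarity": ["marking criteria", "rubric", "assessment brief"],
    "canvas organisation": ["canvas", "reading list", "module page"],
}
PROGRAMME_THEMES = {
    "workload": ["workload", "deadlines"],
    "supervision": ["dissertation", "supervisor"],
}

def route(comment: str) -> RoutedComment:
    lowered = comment.lower()
    for theme, keywords in MODULE_THEMES.items():
        if any(k in lowered for k in keywords):
            return RoutedComment(comment, theme, "module")
    for theme, keywords in PROGRAMME_THEMES.items():
        if any(k in lowered for k in keywords):
            return RoutedComment(comment, theme, "programme/institution")
    # Anything unmatched goes to a human reviewer rather than being forced into a theme
    return RoutedComment(comment, "unclassified", "review")

print(route("The marking criteria for the essay were hard to find on Canvas."))
```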
At Student Voice AI, we see the value when universities treat those survey streams as connected evidence rather than separate reporting exercises. A practical governance starting point is our student comment analysis governance checklist, particularly when live comments and rapid turnaround are involved. For the analytical side, the NSS open-text analysis methodology is a useful model for classifying themes, documenting assumptions, and keeping results traceable across survey cycles. Student Voice Analytics helps teams make that comparison with a reproducible, governance-ready method rather than ad hoc coding.
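In practice, traceability mostly means recording enough provenance alongside each analysis run that the results can be reproduced or audited in a later cycle. A minimal sketch of what that record could contain, with field names and figures that are illustrative rather than a published schema, follows below.

```python
# Illustrative sketch: record each classification run with enough provenance to
# reproduce or audit it later. Field names and values are assumptions made for
# illustration, not a published schema or real survey results.
import json
from datetime import date

run_record = {
    "survey_cycle": "MEQ spring 2026",        # which survey the comments came from
    "method_version": "themes-v0.3",          # version of the theme rules applied
    "assumptions": [
        "module-level keywords take precedence over programme-level keywords",
        "comments with no keyword match are queued for human review",
    ],
    "run_date": date.today().isoformat(),
    "theme_counts": {"assessment clarity": 41, "workload": 17, "unclassified": 9},
}

# One JSON line per run keeps results traceable across survey cycles.
with open("classification_runs.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(run_record) + "\n")
```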
Q: What should institutions do now if they want to learn from Sussex's approach?
A: Review the survey timetable first. Check whether module evaluations, PTES, NSS, and local pulse work are sequenced so teams can compare findings rather than treating each survey in isolation. Then confirm who monitors response rates, who can view comments during fieldwork, what counts as immediate action, and how students will be told what changed.
Q: What is the timeline and scope of the Sussex change?
A: Sussex opened spring MEQs on 13 April 2026 and said they would run for three weeks, which takes them to 1 May 2026. PTES opened from 15 April 2026 for eligible taught postgraduates and runs until 12 June 2026. The scope is institution-specific, covering taught students on spring modules and taught postgraduates at one English university rather than creating a new national requirement.
Q: What is the broader implication for student voice?
A: The broader implication is that survey design matters as much as survey content. Universities collect better student voice evidence when access is simple, response quality is monitored, staff can act quickly, and module-level feedback can be read alongside wider institutional surveys rather than after them.
[University of Sussex]: "Module Evaluation Questionnaires and PTES 2026" Published: 2026-04-13
[University of Sussex]: "Help encourage students to share feedback in spring Module Evaluation Questionnaires (MEQs)" Published: 2026-04-14