Manchester's course unit surveys show how in-class time improves module evaluation response rates

Updated Apr 27, 2026

Module evaluation response rates are usually treated as a reminder problem. The University of Manchester's latest course unit survey update suggests they should be treated as a design problem instead. On 23 April 2026, Manchester published "Semester 2 Course Unit Surveys - now live", asking teaching teams to allocate time in teaching sessions for students to complete this semester's course unit surveys. For teams responsible for student voice, that matters because the university is linking response quality to in-class conditions, access routes, and faster reporting, not just to survey promotion.

What has changed in Manchester's course unit survey design

Manchester's Semester 2 Course Unit Surveys opened on 20 April 2026 and will remain open until 15 May 2026 for taught students. The 23 April staff announcement says teaching teams are expected to give students time in-session to complete the survey, while the university's wider survey guidance says students receive unique email links and can also enter through unit-specific or project-level QR codes. This is an institution-specific operational model, not a new sector-wide survey rule. The important shift is that Manchester is treating the conditions for completion as part of the collection method.

The wider guidance also shows a tighter survey structure than many local module evaluation processes. Every unit survey uses five core University questions, with three mandatory scale questions and two optional free-text prompts. Schools can add up to five school-specific questions, but those need formal sign-off. Reporting is similarly structured: instructor reports are available in the week after the survey closes, and aggregate reports then go to school and faculty leaders for quality assurance and monitoring. That means the route from response to review is more clearly defined than in many routine end-of-module surveys.
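To make that structure concrete, here is a minimal sketch of how a survey build could enforce a fixed core plus a capped, signed-off set of school questions. It is a hypothetical Python representation under our own assumptions; none of the names below come from Manchester's systems, and the guidance does not describe an implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: Manchester's survey tooling is not public.
# It encodes the structure described above: a fixed core question set on
# every unit survey, plus at most five school-specific questions, each of
# which needs formal sign-off before it can be used.

CORE_SCALE_QUESTIONS = 3      # mandatory scale items on every unit survey
CORE_FREE_TEXT_QUESTIONS = 2  # optional open-text prompts
MAX_SCHOOL_QUESTIONS = 5      # cap on school-specific additions

@dataclass
class SchoolQuestion:
    text: str
    approved: bool = False    # formal sign-off required before inclusion

@dataclass
class UnitSurvey:
    unit_code: str
    school_questions: list[SchoolQuestion] = field(default_factory=list)

    def validate(self) -> None:
        """Reject builds that break the core-plus-capped-additions model."""
        if len(self.school_questions) > MAX_SCHOOL_QUESTIONS:
            raise ValueError(
                f"{self.unit_code}: {len(self.school_questions)} school "
                f"questions exceeds the cap of {MAX_SCHOOL_QUESTIONS}"
            )
        unapproved = [q.text for q in self.school_questions if not q.approved]
        if unapproved:
            raise ValueError(
                f"{self.unit_code}: questions awaiting sign-off: {unapproved}"
            )
```

The design point is that comparability is enforced at build time: the core set never varies, and anything local is both capped and gated.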

"Actively promoting the survey during sessions has been shown to significantly increase response rates and ensures all student voices are heard."

The staff announcement also says reminder emails will go only to students who have not yet completed all assigned surveys, with up to four automatic reminders across the fieldwork period. Schools are asked to reinforce the message through newsletters, social media where possible, and downloadable slides. The core assumption is clear: email alone is not enough. Manchester is combining scheduled class time, low-friction access routes, and coordinated reminders to tackle the drop-off that often weakens module evaluation evidence.
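As a rough illustration of that reminder rule, the sketch below targets only students with outstanding surveys and stops after four reminders. The names and data shapes are our assumptions; the announcement does not publish an implementation.

```python
# Hypothetical sketch of the reminder rule described above. Only students
# with at least one outstanding assigned survey are emailed, and nobody
# receives more than four automatic reminders in the fieldwork window.

MAX_REMINDERS = 4

def reminder_recipients(
    assigned: dict[str, set[str]],    # student id -> surveys assigned
    completed: dict[str, set[str]],   # student id -> surveys completed
    reminders_sent: dict[str, int],   # student id -> reminders already sent
) -> list[str]:
    """Return the students due the next automatic reminder."""
    recipients = []
    for student, surveys in assigned.items():
        outstanding = surveys - completed.get(student, set())
        if outstanding and reminders_sent.get(student, 0) < MAX_REMINDERS:
            recipients.append(student)
    return recipients

# e.g. reminder_recipients({"s1": {"UNIT1"}}, {"s1": set()}, {}) -> ["s1"]
```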

What this means for institutions

First, Manchester's model is a reminder that local module evaluation response rates are shaped before a student answers the first question. If staff allocate time in-session, students can move straight from a teaching context into the survey, which reduces delay, forgotten links, and drop-off. That operational thinking sits well beside Sussex's response-rate approach to module evaluations. The practical takeaway is to design the conditions for response, not just the invitation.

Second, the mix of fixed core questions and limited school-specific additions is useful. It gives local teams some room to ask targeted questions without losing the ability to compare patterns across units. Institutions that want stronger internal benchmarking should review whether their own survey builds still support that balance, especially if different schools use different tools or templates. Comparability depends on survey mechanics as much as survey intent.

Third, faster reporting matters only if ownership is clear. Manchester's guidance distinguishes between instructor-level reports, unit-level analysis, and aggregate reporting for schools and faculties. For quality teams, the wider point is that response-rate improvement should feed directly into a named review process, not into another inbox of comments waiting to be read.

How student feedback analysis connects

Manchester's unit surveys ask the two open-text questions that matter most operationally: what was good, and what could be improved. If in-session completion and easier access lift participation, universities will collect a larger and potentially more representative body of short comments at exactly the point when teaching teams need to act quickly. That is useful, but only if institutions can separate unit-level issues from school-wide patterns and document how they reached that judgement.
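One hedged illustration of that judgement step: assuming comments have already been coded to themes (the hard part, not shown here), a simple pass like the sketch below can separate themes confined to one or two units from themes recurring across most of a school's units. All names and the 50% threshold are illustrative assumptions, not anything Manchester publishes.

```python
from collections import defaultdict

# Hypothetical sketch: assumes comments are already coded to themes.
# A theme seen in at least half a school's surveyed units is treated as
# school-wide; anything rarer stays a unit-level issue.

def split_themes(
    coded_comments: list[tuple[str, str, str]],  # (school, unit, theme)
    school_wide_share: float = 0.5,              # theme in >=50% of units
) -> tuple[dict[str, set[str]], dict[str, set[str]]]:
    """Split themes into unit-level and school-wide, per school."""
    units_in_school: dict[str, set[str]] = defaultdict(set)
    units_with_theme: dict[tuple[str, str], set[str]] = defaultdict(set)
    for school, unit, theme in coded_comments:
        units_in_school[school].add(unit)
        units_with_theme[(school, theme)].add(unit)

    unit_level: dict[str, set[str]] = defaultdict(set)
    school_wide: dict[str, set[str]] = defaultdict(set)
    for (school, theme), units in units_with_theme.items():
        share = len(units) / len(units_in_school[school])
        target = school_wide if share >= school_wide_share else unit_level
        target[school].add(theme)
    return dict(unit_level), dict(school_wide)
```

However crude, making the threshold explicit is what lets a team document how it reached the unit-versus-school judgement, which is the governance point below.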

This is where governed text analysis becomes practical rather than optional. A method such as our NSS open-text analysis methodology helps teams classify themes consistently, while the student comment analysis governance checklist helps define who can see comments, how outputs are checked, and how action is recorded. Where institutions need to compare comment themes across large volumes of module feedback, Student Voice Analytics is one way to do that with a reproducible audit trail.

FAQ

Q: What should institutions do now if they want to learn from Manchester's approach?

A: Audit the fieldwork conditions before you rewrite the questionnaire. Check whether teaching teams are expected to allocate in-class time, whether students can reach the survey through more than one route, whether all units use a stable core question set, and who owns the reporting once the survey closes. If response rates are weak, those operational conditions are often the first place to look.

Q: What is the timeline and scope of Manchester's current course unit survey cycle?

A: Manchester's staff announcement was published on 23 April 2026. The university's survey guidance says Semester 2 Unit Surveys opened on 20 April 2026 and run until 15 May 2026. The change applies to taught students and course units at one English university. It is a local operational model, not a national survey or regulatory change.

Q: What is the broader implication for student voice?

A: The broader implication is that response-rate management is part of evidence governance. Better conditions for completion can make module feedback more representative, but only if universities pair that with clear reporting routes, comparable question design, and a disciplined way to analyse the comments that follow.

References

[The University of Manchester StaffNet]: "Semester 2 Course Unit Surveys - now live" Published: 2026-04-23

[The University of Manchester]: "Unit surveys" Published: not stated

[The University of Manchester StaffNet]: "Surveys" Published: not stated

