City St George's module evaluation system shows how merged universities can keep student feedback usable

Updated Apr 26, 2026

When institutions merge, student feedback can easily split along legacy systems instead of producing one usable view. That is why the 30 March 2026 update to Module Evaluation at City St George's is worth attention beyond one university. The page sets out how the newly merged institution is collecting module feedback across Clerkenwell, Moorgate, and Tooting, with shared expectations on timing, access, anonymity, and follow-up. For Student Experience teams, PVCs, and quality professionals, the practical point is clear: if student voice is going to stay usable after organisational change, the feedback route needs to stay simple for students and structured for staff.

What has changed in City St George's module evaluation system

The immediate development is not a new national survey. It is an updated institution-wide module evaluation system for a newly merged university. City St George's says its online module evaluations were developed in partnership with academic and professional staff in its Schools, and that each taught module is normally evaluated once, usually in teaching week 9 of each term. Students can reach the survey through a direct email, through MyMoodle or Canvas, or through a central Student Survey Portal. The survey takes no more than five minutes and asks about teaching, academic support, learning resources, student voice, module delivery, and the module overall.

The page is especially clear about how that process now works across different legacy systems. Because City and St George's resources are still being brought together, students are told that some support content remains campus-specific. Clerkenwell and Moorgate students receive one support route and one survey email address. Tooting students receive another. Even so, the core evaluation model is consistent across campuses: one online process, one survey portal, and one shared expectation that module feedback should be accessible whether students are on or off campus.

"It also enables us to process results and respond to your comments more quickly."

City St George's also sets out a more explicit action loop than many routine module evaluation pages. The university says students will receive a detailed response, with associated actions, soon after the survey closes. Results are shared with staff within Schools, considered at relevant committees, and discussed through Staff-Student Liaison Committees where appropriate. Module teams are expected to prepare a cohort-level response summarising common themes, their own reflection, and any development points. The anonymity position is also clearer than usual: programme teams should not know who gave which answers unless students identify themselves in open comments. Central systems, meanwhile, can track who has responded, so reminders go only to non-responders and analysis can be linked to background data in non-identifying ways.
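To make that separation concrete, here is a minimal sketch of the pattern the page describes: completion tracking and answer content held apart, so reminders can target non-responders without module teams ever seeing who said what. Everything in the sketch, including the hashing scheme, store names, and functions, is an illustrative assumption, not City St George's actual implementation.

```python
import hashlib
import secrets

# Per-wave salt: held by the central survey team, never stored with answers.
SALT = secrets.token_hex(16)

def completion_key(student_id: str) -> str:
    """One-way key so the central team can mark completion without
    storing the student ID next to any survey answers."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()

# Store 1: completion keys only, visible to the central team for reminders.
completions: set[str] = set()

# Store 2: answers with no student identifier, which module teams receive.
responses: list[dict] = []

def submit(student_id: str, module: str, answers: dict) -> None:
    completions.add(completion_key(student_id))
    responses.append({"module": module, **answers})

def needs_reminder(student_id: str) -> bool:
    return completion_key(student_id) not in completions

submit("s1234567", "MA101", {"teaching": 4, "comment": "More worked examples"})
print(needs_reminder("s1234567"))   # False: no reminder email sent
print(needs_reminder("s7654321"))   # True: still on the reminder list
```

The design point, not the code, is what matters: the two stores share no common identifier, so the team that chases response rates and the team that reads comments cannot combine what they hold.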

What this means for institutions

The first implication is architectural. Post-merger and multi-campus universities should treat feedback operations as core quality infrastructure, not as an administrative clean-up exercise. City St George's shows why. When surveys live across different VLEs, email routes, and support contacts, the route to completion still has to feel coherent to students. That is the same wider discipline we saw in Bath's 2026 student feedback system: decide which route gathers which evidence, then make ownership and access rules explicit.

The second implication is timing. Running module evaluations in teaching week 9 aims to capture views before final exams and before the academic year fully closes. That does not guarantee changes for the same cohort, but it gives teams more room to respond than a process that starts only after teaching has finished. Institutions that want an even earlier read on live teaching issues should compare this model with Westminster's Mid-Module Check-ins. The useful question is not whether to ask students for more feedback. It is whether the survey window is early enough, and the response loop short enough, to make the answers usable.

The third implication is governance. City St George's is explicit that central teams can see who has and has not responded, while academic teams should not be able to identify responses from the content itself. Universities should review whether their own module evaluation privacy statements, reminder rules, and escalation limits are equally clear. If students do not understand who can see what, or how their comments will be used, response quality usually weakens before response rates do. For quality leaders, that makes anonymity wording and follow-up rules part of survey design, not a separate compliance task.

How student feedback analysis connects

Module evaluation comments are usually short, local, and numerous. Once an institution is collecting them across several Schools or campuses, the difficult part is no longer gathering the comments. It is making them comparable without flattening them into generic themes. A consistent analytical method helps teams distinguish between issues that belong with one module leader, one School, or a university-wide service, and it gives institutional teams a cleaner evidence base when they need to compare patterns across the year.
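As a rough illustration of that routing question, the sketch below assumes comments have already been tagged with a theme, whether by a coding frame or a classifier, and asks where each theme recurs. The data model and routing rules are hypothetical, not a description of any particular institution's or product's method.

```python
from dataclasses import dataclass

@dataclass
class TaggedComment:
    module: str
    school: str
    campus: str
    theme: str   # e.g. "assessment timing", "timetabling", "lab equipment"

def owner_for(theme: str, comments: list[TaggedComment]) -> str:
    """Route a recurring theme to the narrowest team that covers
    everywhere it occurs: module leader, School, or a central service."""
    hits = [c for c in comments if c.theme == theme]
    if not hits:
        return "no occurrences"
    modules = {c.module for c in hits}
    schools = {c.school for c in hits}
    if len(schools) > 1:
        return "university-wide service"
    if len(modules) > 1:
        return f"School-level: {schools.pop()}"
    return f"module leader: {modules.pop()}"

comments = [
    TaggedComment("MA101", "SST", "Clerkenwell", "assessment timing"),
    TaggedComment("BM205", "Health", "Tooting", "assessment timing"),
    TaggedComment("MA101", "SST", "Clerkenwell", "lab equipment"),
]
print(owner_for("assessment timing", comments))  # university-wide service
print(owner_for("lab equipment", comments))      # module leader: MA101
```

The same rule expressed in governance terms: a theme that surfaces in only one module stays with the module leader, one that spans modules within a School escalates to the School, and one that crosses Schools or campuses belongs with a central service.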

At Student Voice AI, we see the strongest results when institutions pair that analysis with a clear action log and a defined governance model. Student Voice Analytics can help teams group recurring module themes across campuses and cohorts, but the governance question matters just as much as the analytical one. If you are reviewing module feedback practice after structural change, our student comment analysis governance checklist is a practical starting point.

FAQ

Q: What should institutions do now if they run module evaluations across several campuses or systems?

A: Start by mapping the full student journey from invite to action. Check which VLEs are in scope, which portal students use, who monitors response rates, how anonymity is explained, where results are discussed, and when students are told what changed. A short governance review usually surfaces gaps faster than a new survey build.

Q: What is the timeline and scope of the City St George's change?

A: The page was last updated on 30 March 2026. It applies to module evaluations across taught provision at City St George's, University of London, including Clerkenwell, Moorgate, and Tooting. The university says evaluations normally run once per module, usually in teaching week 9 of each term, through online routes linked to email, MyMoodle, Canvas, and the Student Survey Portal. This is an institution-specific update, not a national sector rule.

Q: What is the broader implication for student voice?

A: The broader implication is that feedback quality depends on operational coherence. Universities collect more usable evidence when collection, anonymity, analysis, and follow-up are designed together, especially when provision spans multiple campuses, platforms, or legacy processes.

References

City St George's, University of London. "Module Evaluation at City St George's". Page last updated 30 March 2026; original publication date not stated.

