Updated May 03, 2026
The Annual Programme Survey matters because it shows how an internal feedback route can stay focused on the whole taught-postgraduate experience, not only one module at a time. On 24 April 2026, UCL asked taught postgraduates to complete a five-minute survey covering modules, dissertations and placements, open until 26 June 2026. For institutions trying to build more joined-up student voice evidence, that scope is significant: it keeps several parts of the programme experience within a single route.
This is not a new national survey, and it is not a replacement for PTES, the Postgraduate Taught Experience Survey. It is a UCL internal survey, but the current postgraduate wave shows a clear design choice. The 24 April announcement says the Annual Programme Survey gives taught postgraduates one route to comment on modules, and on dissertations and placements where relevant. That matters because postgraduate feedback is often split across module evaluations, dissertation reviews, and service questionnaires, which can make the evidence harder to read together.
UCL's 23 February 2026 APS briefing for continuing undergraduates helps explain the wider model. It says APS gathers feedback on teaching and learning, academic support, assessment and feedback, overall programme structure, and free-text comments on individual modules. In other words, UCL is using one survey framework across taught students, then adapting the live wave to the relevant cohort. The practical shift is towards programme-level coherence, not more surveys.
"APS feedback is entirely confidential"
That confidentiality point matters because UCL also says departments and support services receive anonymised open-text comments alongside numerical results, and that those results contribute to Faculty and Department Education Plans. The postgraduate announcement adds a prize draw worth up to £1,000, but the more important operational detail is that the survey is short and scoped to the places where taught postgraduates often see friction first: modules, dissertations and placements.
The first implication is survey architecture. A programme survey works best when institutions decide in advance what should be captured at module level, what should be held at programme level, and what should stay in national instruments such as PTES. UCL's APS suggests one answer: keep related feedback about teaching, structure, assessment, and applied elements of the programme in one route when students experience them as one journey. That is close to the layered model discussed in QAA's research on student feedback systems, where different routes have different jobs but still contribute to one evidence picture.
The second implication is that open text needs a clear destination. UCL's APS material is useful because it does not stop at collection. It says anonymised comments and scores are passed to departments and support services and used in Education Plans. Many internal surveys fall short at this point: they collect broadly, then struggle to show who owns the response. Institutions should treat that action route as part of survey design, not as a later reporting problem. That is also why survey benchmarking and triangulation matter. A programme survey becomes more useful when teams can compare what students say about structure, support, and assessment with what other surveys and committee evidence are already showing.
The third implication is about taught-postgraduate cohorts specifically. These students often move quickly through a compressed academic year, and dissertation or placement issues can surface too late for an annual national survey to explain them well. A short internal programme survey can close some of that gap, but only if institutions keep it concise, protect confidentiality, and avoid duplicating what students are already asked elsewhere. The useful lesson from UCL is not the prize draw. It is the attempt to keep programme feedback connected.
This story matters for analysis because a programme survey that covers modules, dissertations and placements is likely to produce comments at different levels of specificity. Some will point to one module or assessment. Others will describe wider issues with supervision, timetable design, course structure, or support. If institutions do not separate those levels consistently, programme surveys can generate plenty of feedback but very little clear action.
At Student Voice AI, we see the value when institutions use a stable method to group those comments, protect anonymity in smaller cohorts, and keep an audit trail from raw comment to action plan. Our student comment analysis governance checklist is a practical starting point for that work, especially when one survey is bringing together several parts of the taught-student experience. Student Voice Analytics is useful where teams need to compare programme-level patterns with module-level themes without losing traceability.
Q: What should institutions do now if they run internal programme surveys?
A: Review scope and ownership before the next cycle opens. Decide which issues belong at module, programme, and service level, confirm how anonymity will be protected in smaller cohorts, and define how open-text comments will be triaged into action plans rather than left in raw exports.
Q: What is the timeline and scope of UCL's Annual Programme Survey change?
A: UCL published the taught-postgraduate APS notice on 24 April 2026 and said the survey would stay open until 26 June 2026. It applies to taught postgraduates at one English university. A separate APS notice published on 23 February 2026 shows UCL also uses the same survey framework for continuing undergraduates, so this is an institution-wide taught-student model rather than a sector-wide change.
Q: What is the broader implication for student voice?
A: Programme surveys can reduce fragmentation by collecting feedback on modules, dissertations and placements in one route, but they only add value if they feed into clear ownership, anonymised open-text analysis, and visible follow-through. The survey itself is only the front end of the evidence trail.
[UCL]: "Postgraduates: share your feedback today in the Annual Programme Survey" Published: 2026-04-24
[UCL Teaching & Learning]: "Annual Programme Survey to open for continuing students" Published: 2026-02-23
© Student Voice Systems Limited, All rights reserved.