Updated Apr 18, 2026
Students' expectations about feedback are forming before the first lecture. On 16 April 2026, Advance HE published its pre-arrival questionnaire (PAQ) national pilot wave 1 initial results, giving the sector a new public read on what incoming undergraduates in England expect from higher education before they arrive. For teams working on student voice, that matters because misalignment on feedback, contact, digital confidence, finances, and wellbeing starts early. If institutions wait for module evaluations or the NSS to surface it, they are already late.
The immediate change is that the PAQ is now a visible national evidence source rather than only a local transition tool. Advance HE says the initial results draw on incoming undergraduates across 15 diverse higher education institutions in England and show what students bring with them on entry: prior learning histories, expectations, wellbeing concerns, and financial pressure. The supporting pilot documentation adds that the wider OfS-funded project runs until June 2027, with wave 1 fieldwork in autumn 2025 and wave 2 fieldwork scheduled for September to November 2026. That makes this more than a one-off transition exercise. It is the early shape of a repeatable survey architecture.
The headline findings are practical. Advance HE says students arrive with uneven academic, social, and practical confidence, and that disabled students, carers, commuters, and mature entrants report lower confidence on entry. It also says many students expect more structured contact and more accessible feedback than current higher education models routinely provide, while financial and wellbeing concerns are already present before term begins. Advance HE adds that institutions in the pilot were able to respond in real time by signposting students to relevant services and using questionnaire feedback to address misaligned expectations. The takeaway is not simply that students differ. It is that institutions can act on those differences before problems harden.
"Institutions may wish to consider how their transition support reflects the real diversity of students' starting points."
Advance HE's earlier 2025 funding announcement helps explain why the pilot matters. The project was designed to standardise how institutions collect and use pre-arrival information, align expectations with actual experience, and build more robust evidence on disparities in student experiences and outcomes. The wave 2 participation document also says the project aims to help institutions collect and act on information from students upon arrival and use it to support wellbeing, belonging, continuation, and attainment. In other words, the pre-arrival questionnaire is being positioned as operational evidence that can change practice, not as an interesting add-on. That is the real shift.
The first implication is about expectation-setting. If students arrive expecting closer contact and faster feedback than the institution is likely to provide, that gap needs to be addressed in induction, early assessment briefing, and first-term communication. Universities often talk about transition as a study-skills or belonging issue. The PAQ findings suggest it is also a feedback-design issue. Students need a realistic explanation of how contact hours, independent study, assessment, and feedback will work in practice, and they need that explanation early enough to use it.
The second implication is about sequence. A pre-arrival questionnaire is most useful when it is linked to the next feedback moment rather than left as a standalone survey. Institutions using similar transition instruments should decide in advance which issues will be checked again in the first few weeks of teaching, and how those findings will connect to routes such as Westminster's Mid-Module Check-ins. That creates a more coherent evidence chain from expectation, to lived experience, to action. Without that sequence, an early survey can become another isolated dataset.
The third implication is about method. If institutions want to build on this model locally, the survey itself needs to be clean, comprehensible, and comparable across cohorts. Recent Jisc changes to question types in Online Surveys are a useful reminder that small design choices affect whether results can be trusted later. Once early feedback starts feeding induction planning, school-level action, or access work, universities also need the discipline described in benchmarking and triangulating student survey data. The benefit is not more survey activity. It is better evidence about which groups arrive with which risks, and whether those risks actually reduce after intervention.
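One way to check "whether those risks actually reduce after intervention" is a simple before-and-after comparison of how many students flag a concern at entry versus at a later check-in. The sketch below is illustrative only: the counts are hypothetical, not PAQ data, and a two-proportion z-test is just one reasonable choice of method, assuming independent samples at each survey point.

```python
from math import sqrt, erf

def two_proportion_z(flagged_a: int, n_a: int, flagged_b: int, n_b: int):
    """Two-proportion z-test: did the rate of a flagged concern change
    between an entry survey (a) and a follow-up survey (b)?"""
    p_a, p_b = flagged_a / n_a, flagged_b / n_b
    pooled = (flagged_a + flagged_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical figures: 120 of 400 entrants flagged feedback worries
# pre-arrival; 80 of 380 flagged the same theme at mid-module.
p_a, p_b, z, p = two_proportion_z(120, 400, 80, 380)
print(f"entry {p_a:.0%} -> follow-up {p_b:.0%}, z={z:.2f}, p={p:.4f}")
```

The point of keeping the method this explicit is comparability: if the same test, thresholds, and theme definitions are used for every cohort, a fall in a concern rate can be read as evidence rather than as survey noise.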
This story matters for comment analysis because pre-arrival evidence tells institutions what to listen for once teaching starts. If students say they are worried about feedback availability, confidence, commuting, workload, or wellbeing before arrival, the next question is whether those same themes surface again in induction feedback, early pulse work, mid-module comments, or later annual surveys. A consistent approach to NSS open-text analysis methodology helps teams compare those signals rather than treating each survey window as a fresh start.
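The cross-window comparison described above can be sketched in a few lines, assuming comments have already been coded against a shared theme taxonomy. The theme labels and comment codings below are hypothetical examples, not a real coding scheme; the sketch only shows the shape of a reproducible comparison between two survey windows.

```python
from collections import Counter

def theme_rates(coded_comments: list[list[str]]) -> dict[str, float]:
    """Share of comments mentioning each theme (a comment may carry
    several theme codes from the shared taxonomy)."""
    counts = Counter(t for themes in coded_comments for t in set(themes))
    n = len(coded_comments)
    return {theme: c / n for theme, c in counts.items()}

def theme_shift(before: list[list[str]], after: list[list[str]]) -> dict[str, float]:
    """Change in theme prevalence between two survey windows,
    as a proportion of comments in each window."""
    before_r, after_r = theme_rates(before), theme_rates(after)
    themes = set(before_r) | set(after_r)
    return {t: after_r.get(t, 0.0) - before_r.get(t, 0.0) for t in themes}

# Hypothetical coded comments: pre-arrival window vs mid-module window.
pre = [["feedback", "wellbeing"], ["feedback"], ["commuting"], ["feedback"]]
mid = [["wellbeing"], ["feedback"], ["workload"], ["workload"]]
print(theme_shift(pre, mid))
```

Because both windows are scored against the same taxonomy, the output reads directly as "which early concerns faded and which new ones appeared", which is the comparison a fresh-start analysis of each survey cannot give.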
Student Voice Analytics is useful here when universities want to track whether expectation gaps close or harden. The practical value is simple: teams can compare transition, in-term, and annual comment themes with one reproducible method, and then show whether early interventions actually changed what students said. That makes the pre-arrival questionnaire more useful as a starting point for action, not just a snapshot.
Q: What should institutions do now if they want to act on the PAQ findings?
A: Start by reviewing where your students are most likely to meet an expectation gap in the first six weeks. Check induction messaging, early assessment guidance, feedback turnaround explanations, digital onboarding, and signposting for wellbeing and financial support. Then decide which issues should be rechecked through an early in-term route, so you can see whether the gap has narrowed rather than waiting for end-of-module evidence.
Q: What is the timeline and scope of the pilot?
A: Advance HE published the initial results on 16 April 2026. The public summary covers incoming undergraduates across 15 higher education institutions in England. Supporting pilot guidance says the OfS-funded project runs until June 2027, with wave 2 fieldwork scheduled for September to November 2026. This is an England-based pilot, not a UK-wide statutory survey change.
Q: What is the broader implication for student voice?
A: Student voice should start earlier than most institutional feedback calendars currently allow. The broader implication of the pre-arrival questionnaire is that universities can begin collecting useful evidence before students arrive, then connect that early signal to induction, belonging, assessment communication, and later survey evidence. That gives student experience teams a better chance to prevent predictable problems instead of only recording them after the fact.
[Advance HE]: "Pre-arrival questionnaire (PAQ) national pilot wave 1 initial results" Published: 2026-04-16
[Advance HE]: "Joint bid wins OfS funding for a pre-arrival survey" Published: 2025-02-19
[Advance HE]: "Information for participating institutions: Pre-arrival Academic Questionnaire (PAQ) National Pilot - Wave 2" Published: not stated
© Student Voice Systems Limited, All rights reserved.