Advance HE spotlights student experiences of GenAI in UK universities, and what it means for student voice

Updated Apr 10, 2026

Student voice on GenAI is moving from speculation to evidence. Advance HE has put that shift back in front of the sector by highlighting findings from the StudentXGenAI Survey 2025. On 9 April 2026, it published "It is a temptation to get it to do the work..." - student experiences of GenAI in UK universities. For Student Experience teams, PVCs, and quality professionals, that matters because AI policy is moving from abstract principle to measurable student voice. Institutions now have a clearer sector signal on AI literacy, usefulness, trust, and perceived impact, not just local anecdotes, and it sits alongside wider evidence on how hope, worry, and guilt shape students' use of AI.

What has changed in student experiences of GenAI in UK universities

The most useful detail currently available sits in Advance HE's Artificial Intelligence Symposium 2026 session abstracts. The abstract for the StudentXGenAI session says the Leverhulme-funded survey explores student experiences of GenAI across 10 UK universities and is on track for 5,000 responses. It also says the survey maps demographic information against three dimensions: Knowledge & Access (GenAI literacy), Use and Usefulness, and Attitudes, including trust, values, and perceived impact. That gives institutions more than a headline figure. It shows the survey is structured to distinguish literacy, usefulness, and attitudes rather than rolling everything into a single question about AI use.

"The survey is on track for 5000 responses making it one of the highest response rates for a survey of this kind in the UK."

The scope is UK-wide rather than institution-specific. A UCL Digital Education team blog post from the survey launch phase says the UK version was led by Edinburgh Napier University and involved UCL, Manchester, Birmingham, St Andrews, York, and Queen's University Belfast, with South Wales, Edge Hill, and Coventry expected to complete the 10-university group. The same post says the survey was anonymous, open to students over 18, and designed to help institutions benchmark their students' experiences against work already carried out in Australia and planned in Europe. That matters because institutions are not looking at a one-campus snapshot. They have the outline of a benchmarkable UK evidence base, linked to parallel work in Australia and planned work in Europe.

This is not a new regulatory requirement or a mandated national survey. The change is in the evidence now visible to UK higher education. Advance HE is giving wider visibility to a large cross-institutional dataset on how students actually describe GenAI, moving the conversation beyond generic anxiety or enthusiasm towards a more structured picture of where students feel informed, helped, conflicted, or unconvinced.

What this means for institutions

The first practical implication is survey design. If institutions want to understand how GenAI is affecting the student experience, simple usage questions are too thin. The StudentXGenAI framework suggests local surveys and module evaluations should separate literacy and access from usefulness, and both from attitudes such as trust or perceived harm. That gives teams a better basis for deciding whether the right response is policy clarification, assessment literacy work, assessment redesign, staff development, or student guidance, rather than reaching for one generic AI fix.
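To make that separation concrete, here is a minimal sketch of how a local evaluation team might tag survey items by the three StudentXGenAI dimensions so each can be reported separately. The dimension labels mirror the abstract's wording, but the item codes, question text, and the `SurveyItem` structure are all hypothetical illustrations, not the survey's actual instrument.

```python
from dataclasses import dataclass

# Dimension labels mirroring the StudentXGenAI structure described in the
# symposium abstract; the survey's actual item wording and coding are not
# public, so everything below is illustrative.
DIMENSIONS = ("knowledge_access", "use_usefulness", "attitudes")

@dataclass(frozen=True)
class SurveyItem:
    code: str        # local item identifier (hypothetical)
    text: str        # question wording shown to students (hypothetical)
    dimension: str   # one of DIMENSIONS

    def __post_init__(self):
        if self.dimension not in DIMENSIONS:
            raise ValueError(f"unknown dimension: {self.dimension}")

ITEMS = [
    SurveyItem("AI1", "I understand what GenAI tools can and cannot do.",
               "knowledge_access"),
    SurveyItem("AI2", "GenAI tools help me make sense of feedback on my work.",
               "use_usefulness"),
    SurveyItem("AI3", "I trust my university's guidance on GenAI use.",
               "attitudes"),
]

def by_dimension(items):
    """Group item codes so each dimension can be analysed and reported separately."""
    grouped = {d: [] for d in DIMENSIONS}
    for item in items:
        grouped[item.dimension].append(item.code)
    return grouped
```

The point of the structure is simply that a single "do you use AI?" question cannot populate all three buckets, so the gaps become visible at design time rather than after fielding.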

The second implication is scope. Because the survey spans multiple UK providers, AI feedback can no longer be treated as a niche issue owned only by digital education teams. It now sits alongside other student voice evidence that quality and enhancement teams should review systematically. If your institution is piloting AI feedback tools, changing assessment rules, or rewriting guidance, this is a strong case for adding targeted student questions before confusion surfaces later in NSS, PTES, PRES, or local pulse work.

The third implication is comparability. A common survey structure makes it easier to compare AI-related concerns across faculties, cohorts, and years. That helps institutions separate broad sector patterns from local implementation problems, and it makes year-on-year review more defensible when local policies and tools change. It also fits the wider direction of travel on this site, from AI-resilient assessment design to governed survey benchmarking. The inference from the survey design is that the real opportunity lies in collecting AI-related student voice in a way that still stands up to comparison over time.

How student feedback analysis connects

At Student Voice AI, we see this as a comment analysis problem as much as a policy problem. Closed questions can tell you whether students use GenAI. They do not explain where students find it useful, where they mistrust it, or what they think crosses a line. Open-text feedback can surface those distinctions, especially between drafting, feedback, revision, study support, and assessment completion. That is also why recent work on students using Generative AI for feedback matters alongside this sector update.

That makes governed, reproducible analysis more important, not less. If institutions start collecting more AI-related comments through module evaluations, pulse surveys, or dedicated AI questionnaires, they need a method that can group themes consistently and keep an audit trail. If your institution is collecting that feedback, see how Student Voice Analytics helps teams analyse AI-related comments with a reproducible method rather than DIY spreadsheet-based comment analysis or generic LLM workflows. For a practical next step, use our NSS open-text analysis methodology, our student comment analysis governance checklist, and our comparison of Student Voice Analytics vs generic LLMs. The issue is not only hearing more student voice, but making it usable for policy and quality decisions.
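What "grouping themes consistently with an audit trail" can look like in miniature is sketched below. This is not Student Voice Analytics' actual method; it is a toy keyword-based tagger under stated assumptions, where the theme lexicon, taxonomy version string, and audit-entry fields are all invented for illustration. The property it demonstrates is the governed one: the same comment and the same taxonomy version always produce the same themes, and every output records why.

```python
import hashlib

# Hypothetical theme lexicon for illustration only; a production taxonomy
# would be versioned, reviewed, and far richer than hard-coded keywords.
THEMES = {
    "drafting": ["draft", "rewrite", "structure my essay"],
    "feedback": ["feedback", "marking", "comments on my work"],
    "trust":    ["trust", "accurate", "wrong answers"],
}
TAXONOMY_VERSION = "2026-04-example"  # invented version label

def tag_comment(comment: str) -> dict:
    """Tag one open-text comment and return an audit entry explaining the result."""
    text = comment.lower()
    matched = sorted(
        theme for theme, keywords in THEMES.items()
        if any(keyword in text for keyword in keywords)
    )
    return {
        # Hash rather than raw text, so the log can be kept without storing comments.
        "comment_hash": hashlib.sha256(comment.encode()).hexdigest()[:12],
        "themes": matched,
        "taxonomy_version": TAXONOMY_VERSION,
    }

audit_log = [tag_comment(c) for c in [
    "I use it to structure my essay drafts",
    "I don't trust it, it gives wrong answers",
]]
```

Rerunning the pipeline on the same comments with the same taxonomy version reproduces the log exactly, which is the minimum a quality team needs before putting comment-derived themes in front of a board.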

FAQ

Q: What should institutions do now?

A: Review any local AI survey or module evaluation questions and check whether they distinguish between access, usefulness, trust, and perceived impact. If they do not, refine them before the next survey cycle and add at least one open-text prompt so students can explain where policy, assessment, or support feels unclear. That gives teams something more actionable than a simple usage rate, especially when AI detectors, privacy, and false positives are already shaping how students interpret policy.

Q: What is the timeline and scope of this change?

A: Advance HE published its News + Views item on 9 April 2026. The supporting symposium abstract describes the StudentXGenAI Survey 2025, which was fielded across 10 UK universities and is now being discussed at sector level in 2026. This is UK-wide in institutional scope, but it is not a regulatory change and does not impose a mandatory survey requirement.

Q: What is the broader implication for student voice?

A: The broader implication is that student voice on AI needs to go beyond adoption and misconduct. Institutions need evidence on trust, literacy, values, and perceived educational impact if they want AI policy to reflect how students actually experience these tools. Universities that listen in that broader way will be better placed to design guidance, review assessment, and act on student feedback before concerns harden.

References

[Advance HE]: "It is a temptation to get it to do the work... - student experiences of GenAI in UK universities" Published: 2026-04-09

[Advance HE]: "Artificial Intelligence Symposium 2026: Session abstracts" Published: 2026-02-10

[UCL Digital Education team blog]: "Student survey on their experiences of Generative AI: call for participants" Published: 2025-10-14

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
