Jisc is retiring Digital Experience Insights: what it means for student feedback benchmarking

Updated Apr 09, 2026

Jisc Digital Experience Insights now has a retirement date, and universities have one live cycle left to preserve benchmark evidence. As of 9 April 2026, Jisc's Key dates for our surveys page states that Digital Experience Insights will retire on 31 July 2026. For student experience teams, PVCs, and quality professionals, that matters because Digital Experience Insights has been a structured route for collecting, benchmarking, and acting on digital student feedback across UK higher education. The immediate question is not only what replaces the service, but how institutions preserve comparable evidence on students' digital experience while the final cycle is still live.

What has changed in Jisc Digital Experience Insights

The core announcement is straightforward. Jisc's key dates page says Digital Experience Insights (DEI) will retire on 31 July 2026, that the service will continue to operate as normal until that date, and that Jisc remains committed to supporting institutions through the final cycle. The same page also sets out the timetable for that final cycle: the 2025/26 surveys opened on 6 October 2025, the student or learner survey closes on 1 May 2026, and the teaching staff and professional services staff surveys close on 3 July 2026. That gives institutions a short but usable window to finish current fieldwork, export the evidence they need, and document how results have been used before the service ends.

"Digital Experience Insights (DEI) will retire on 31 July 2026."

The scope of that change is wider than a single student questionnaire. Jisc's Our surveys page says the service supports student, teaching staff, and professional services staff surveys, can be run annually or as pulse surveys, allows institutions to add local questions, and provides sector benchmarking data plus annual reports. In other words, the retirement affects a whole survey architecture for understanding digital experience, not just one reporting dashboard. For universities that have used DEI to connect student feedback with staff perspectives and digital strategy, this is a change in evidence infrastructure, not just software procurement.

Jisc's Our reports page also shows why the change matters at sector level. It says over 200 organisations have engaged with the surveys so far, and that 21,279 students and learners from 46 UK organisations participated in the 2024/25 survey cycle. The pages we reviewed do not state when the retirement notice was first published, so the safest reading is that this is a live April 2026 service update rather than a dated press release. Either way, institutions should treat the closure timetable as active guidance for their final DEI cycle.

What the Digital Experience Insights retirement means for institutions

The first task is evidence preservation. If your university uses DEI for student experience, digital strategy, or committee reporting, do not wait until July to decide what you need to keep. Export response data, archive benchmark outputs, record which local questions were appended, and note how results have been grouped and reported internally. That matters most if your institution expects to compare a future replacement survey with historical DEI findings, or explain year-on-year shifts in digital experience to senior committees.
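As a minimal sketch of the preservation step above, an institution could record each export alongside a checksum, the cycle it belongs to, and the local questions that were appended, so the archived copy can be verified and interpreted years later. All filenames, field names, and question text here are hypothetical, not part of any Jisc export format:

```python
import hashlib
import json
from datetime import date
from pathlib import Path

def build_manifest(export_path: Path, local_questions: list[str], cycle: str) -> dict:
    """Describe one archived survey export: what it is, when it was
    archived, and a SHA-256 checksum to detect later corruption.
    All field names are illustrative."""
    data = export_path.read_bytes()
    return {
        "cycle": cycle,
        "file": export_path.name,
        "sha256": hashlib.sha256(data).hexdigest(),
        "archived_on": date.today().isoformat(),
        "local_questions": local_questions,
    }

# Example with a hypothetical export file and local question.
export = Path("dei_student_responses_2025_26.csv")
export.write_text("respondent_id,q1\n1,agree\n")
manifest = build_manifest(export, ["Campus Wi-Fi reliability"], "2025/26")
Path("dei_manifest.json").write_text(json.dumps(manifest, indent=2))
```

A manifest like this is cheap to produce during the live cycle and much harder to reconstruct after the service closes, which is why the export and the record of local questions belong together.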

The second task is survey redesign. Many universities already run layered feedback systems that combine national surveys, local pulse work, and service-specific feedback. If DEI currently fills the digital experience part of that system, teams need to decide now whether the replacement should be a dedicated institutional survey, a set of targeted pulse surveys, or a smaller number of digital questions built into existing routes. Recent examples on this site, including Bath's 2026 student feedback system and Jisc Online Surveys question type changes, point to the same lesson: survey architecture matters as much as response rate. Without clear ownership, timing, and reporting rules, digital experience feedback becomes harder to compare and harder to act on.

The third task is benchmarking discipline. DEI has given institutions not only a way to collect data, but a way to place local results in sector context. If that context is going away, teams should be explicit about what will replace it, and how they will benchmark student survey evidence once DEI is gone. That may mean building stronger internal baselines, triangulating local survey results with helpdesk themes and learning analytics evidence, or setting clearer governance so that student voice on digital provision is tracked consistently across years. The gain is continuity and credible decision-making: teams can show whether changes to platforms, support, accessibility, or digital skills provision are actually improving what students report.

How student feedback analysis connects

This is where open-text analysis becomes especially useful. When a survey instrument changes, closed-question trend lines often become harder to compare. Qualitative comments can help institutions retain continuity by showing whether the same issues, such as platform reliability, access to software, digital assessment, online learning design, or digital skills support, keep appearing across different feedback routes. That matters if universities end up replacing one sector survey with a mix of local surveys and service feedback rather than a like-for-like national benchmark.
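The continuity idea above can be sketched very simply: tag comments from different feedback routes against a shared set of themes, then count mentions across the combined pool. The theme names, keywords, and comments below are all hypothetical; a production taxonomy would be far richer than keyword matching:

```python
from collections import Counter

# Hypothetical theme keywords; real taxonomies are much larger.
THEMES = {
    "platform reliability": ["crash", "down", "outage", "slow"],
    "digital skills support": ["training", "how to use", "tutorial"],
}

def tag_comment(comment: str) -> set[str]:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return {theme for theme, kws in THEMES.items() if any(k in text for k in kws)}

def theme_counts(comments: list[str]) -> Counter:
    """Count theme mentions across a pool of comments."""
    counts: Counter = Counter()
    for comment in comments:
        counts.update(tag_comment(comment))
    return counts

# Comments drawn from two different feedback routes (illustrative).
dei_comments = ["The VLE keeps crashing during assessments."]
module_eval_comments = ["More training on the lecture capture tool, please."]
combined = theme_counts(dei_comments + module_eval_comments)
```

Because the theme set, not the survey instrument, carries the trend line, the same counts can be produced before and after a change of survey, which is the continuity the paragraph above describes.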

Digital experience problems rarely surface in only one place. Students mention them in dedicated digital surveys, but also in module evaluations, NSS comments on learning resources and organisation, and ad hoc service feedback. A governed approach to student comment analysis helps institutions connect those signals once the survey landscape shifts. That does not remove the need for a clear replacement survey strategy, but it does make it easier to keep digital student voice visible while methods change.

FAQ

Q: What should institutions do now if they use Jisc Digital Experience Insights?

A: Finish the current cycle with closure in mind. Export the data you need, archive benchmark reports and questionnaire versions, confirm who owns any replacement survey design, and decide before summer how digital experience evidence will be collected in 2026/27. If digital feedback is currently spread across teams, bring those owners together now, rather than after the service closes.

Q: What is the timeline and scope of the change?

A: Jisc's key dates page, as accessed on 9 April 2026, says DEI will retire on 31 July 2026. The current 2025/26 cycle opened on 6 October 2025, the student or learner survey closes on 1 May 2026, and the teaching staff and professional services staff surveys close on 3 July 2026. The service covers students, teaching staff, and professional services staff, with use across UK higher education and further education, plus some international organisations.

Q: What is the broader implication for student voice?

A: The broader implication is that digital student voice is now an infrastructure issue, not just an IT satisfaction issue. If institutions want to keep listening well, they need a replacement that preserves longitudinal evidence, clear ownership, and a route from findings to action. Otherwise, digital experience becomes harder to track just as it remains central to teaching, support, and assessment.

References

[Jisc Digital Insights]: "Key dates for our surveys" Published: not stated

[Jisc Digital Insights]: "Our surveys" Published: not stated

[Jisc Digital Insights]: "Our reports" Published: not stated

