Jisc: building a business case for learning analytics, keeping student feedback in the loop

Updated Apr 02, 2026

On 27 February 2026, Jisc published practical guidance on the business case for learning analytics, focusing on stakeholder engagement and long-term support. We are highlighting it because learning analytics programmes often succeed or fail on the same thing student feedback programmes do: whether students and staff understand the purpose, trust the process, and see improvement over time. [Jisc blog post]

What has changed in Jisc’s guidance on the business case for learning analytics

Jisc’s blog is the second part of a short series on building a business case for learning analytics. This instalment is less about the technical platform and more about adoption: how you take a pilot into business-as-usual without losing buy-in, quality, or safeguards.

For institutions, the key shift is the emphasis on learning analytics as a change programme. Jisc highlights the need to plan for staff capacity, role-based training, and communication that builds confidence rather than suspicion. It also makes explicit that implementation should include an improvement loop, not a one-off launch.

"Gather staff and student feedback, refine thresholds and workflows, and scale in phases"

Jisc also flags governance considerations that will feel familiar to student experience and quality teams: transparency with students about what data is collected and why, involvement of student representatives, and clear safeguards to manage bias, false positives, and the risk of over-alerting.

What this means for institutions

If you are building (or refreshing) a learning analytics business case, treat student feedback as part of the core evidence, not an add-on. Quantitative indicators can tell you where engagement or continuation risks may sit, but student voice evidence tells you why. That matters when you are making decisions about which interventions are acceptable, which are effective, and which create unintended friction.

Practically, Jisc’s guidance translates into a few immediate actions:

  • Make the pilot measurable, and include feedback loops on the intervention itself, not only the dashboard.
  • Agree what “good” looks like operationally: who triages alerts, what the response time is, and what escalation routes exist.
  • Communicate clearly with students about purpose, benefits, and safeguards, and involve student representatives early.

For teams already doing large-scale comment analysis, the overlap is strong. The same governance disciplines apply, including repeatability, privacy controls, and being able to explain method choices to panels and committees. If you need a lightweight starting point, see our student comment analysis governance checklist. For a reminder of why versioning and data refresh matter in governance packs, see OfS TEF dashboard corrections.

How student feedback analysis connects

At Student Voice AI, we see learning analytics initiatives work best when they are paired with open-text feedback. Free-text comments from NSS, PTES, PRES, pulse surveys, and module evaluations help teams interpret what a metric cannot: whether students experience an intervention as supportive, confusing, or intrusive.

If you are scaling learning analytics, it is worth putting a repeatable open-text workflow alongside it, so you can track what changes in student voice as you adjust thresholds and support models. For a practical workflow, see our NSS open-text analysis methodology.

FAQ

Q: What should we do now to build a credible learning analytics business case?

A: Start with a scoped pilot and define the operational model before you scale. Include staff and student feedback in the pilot evaluation, and document how thresholds, workflows, and safeguards will be reviewed over time.

Q: Is Jisc’s guidance mandatory, and who does it apply to?

A: No. This is non-regulatory guidance published by Jisc on 27 February 2026. It is written for UK universities and colleges considering learning analytics, but the adoption and trust principles apply more widely.

Q: What is the biggest student voice risk in learning analytics programmes?

A: Treating learning analytics as purely technical. If students are not informed and involved, and if feedback on the intervention is not captured, institutions can undermine trust and miss unintended impacts.

References

[Jisc blog post]: Jisc, “Building a business case for learning analytics: securing stakeholder engagement and ongoing support”, published 27 February 2026.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
