Jisc adds file uploads to Online Surveys, and why it matters for student feedback surveys

Updated Apr 09, 2026

Student feedback surveys usually ask students to describe a problem. Jisc's latest update means they can now show it too. On 9 February 2026, Jisc announced in its [Online Surveys product updates] that Online Surveys now includes a File upload question. For teams running student feedback surveys as part of a wider student voice practice, that is more than a minor product release. It gives institutions a new way to collect supporting evidence alongside ratings and open-text comments, at the point of response.

What has changed in Jisc Online Surveys for student feedback surveys

Jisc says respondents can now attach a PDF or an image directly within a survey response. The same update frames the feature as a way to collect supporting documents, screenshots, and photos, and says it is available now in all surveys. Jisc's [change log] confirms the release in version 3.34.0, also dated 9 February 2026. For institutions already using Jisc Online Surveys, the takeaway is immediate: richer evidence can now be collected without changing platforms.

This is a platform change, not a national survey methodology change. It does not alter NSS, PTES, PRES, or UKES question wording, and it does not affect institutions using other survey platforms. But for universities already using Jisc Online Surveys for local student feedback work, the feature is live immediately and can be added to student experience, service review, and issue-reporting surveys without waiting for a new survey cycle. That matters most where teams need clearer evidence on operational problems this term, not after the next annual survey round.

"Reduce follow-up emails by getting everything you need at the point of response."

Jisc also positions the feature as a way to support a broader set of survey workflows, including application-style surveys and research submissions. For higher education teams, the practical significance is that some student feedback processes can now move from description only to description plus evidence, within the same response flow. Used carefully, that can shorten the gap between a reported problem and a useful response.

What this means for institutions

The first implication is survey design. Student Experience teams, PVCs, and quality professionals should review where attachments genuinely improve decision-making, rather than adding friction. The same principle appears in our summary of how teaching evaluation surveys work better when students and staff help design them, where question format and question purpose need to stay aligned. The strongest use cases are likely to be operational feedback processes where context matters most: digital access problems, learning environment issues, or service failures that students currently have to explain in a follow-up email after submitting a survey. In those cases, an upload field can reduce back-and-forth and help teams diagnose the issue faster.

The second implication is governance. A file upload option can make student feedback more actionable, but it can also increase the chance of collecting identifiable or sensitive material. Institutions should decide in advance when attachments are appropriate, who can access them, how long they are retained, and whether uploaded files are handled as part of case triage rather than institution-wide reporting. That governance boundary is easier to manage when teams use a student feedback governance framework rather than ad hoc local rules. That clarity protects both students and staff, and it reduces the risk of turning a useful evidence feature into a data-handling problem. Our student comment analysis governance checklist is a useful starting point for that discussion.

The third implication is expectation-setting. If students are invited to upload evidence, the survey needs to explain what kind of file is useful, when not to upload anything, and whether an individual response should be expected. Without that clarity, institutions risk collecting more material than they can review promptly, or encouraging students to share evidence that should have gone through a different support route. The gain is better evidence only if the request is tightly scoped.

How student feedback analysis connects

At Student Voice AI, we see open-text analysis doing most of the work in large-scale reporting because open comments remain the best source for theme detection, benchmarking, and trend analysis across courses and cohorts. File uploads serve a different job. They are most useful as high-context evidence for specific operational issues, not as a replacement for structured comment analysis.

The practical opportunity is to connect the two. Use surveys to collect consistent open-text at scale, then use attachments selectively where a screenshot, document, or image materially improves understanding of the issue being reported. That joined workflow is strongest when teams benchmark and triangulate survey evidence rather than treating each feedback channel in isolation. If you are refining that workflow, Student Voice Analytics helps institutions analyse comment data consistently across cohorts while keeping high-context case evidence in the right operational channel. Our NSS open-text analysis methodology and student feedback analysis glossary are useful reference points when designing that process.

FAQ

Q: Should institutions add file uploads to all student feedback surveys now?

A: No. Start with a small number of use cases where supporting evidence will clearly improve action, such as reporting digital access or facilities issues. For broad experience surveys, open-text comments will usually remain the more scalable and comparable source of evidence.

Q: When is the change live, and who does it affect?

A: Jisc released the File upload question on 9 February 2026, with the change recorded in Online Surveys version 3.34.0. It is available in Jisc Online Surveys now, so it affects institutions that already use that platform for survey work. It is optional, and it does not change national survey methodology.

Q: What is the broader implication for student voice practice?

A: The change makes it easier to collect richer evidence in some feedback workflows, but it also sharpens the distinction between large-scale insight and case handling. Universities still need a consistent way to analyse open-text comments across cohorts, while deciding carefully where attachments improve understanding and where they simply add operational overhead.

References

[Online Surveys product updates]: Jisc Online Surveys, "Product updates". Published: 2026-02-09.

[change log]: Jisc Online Surveys, "Change log". Published: 2026-02-09.

