Updated Apr 08, 2026
One poorly configured survey question can weaken student feedback evidence before analysis even begins. On 6 March 2026, Jisc's Online Surveys product update introduced separate single-answer and multi-answer versions of its Choice and Grid question types, a small builder change with real implications for universities running module evaluations designed with students and staff, pulse checks, and internal student feedback surveys. For Student Experience teams, PVCs, and quality professionals, that means one less avoidable source of ambiguity in how response data is collected and trusted.
At Student Voice AI, we pay close attention to upstream survey design because weak collection rules bake ambiguity into the data long before anyone opens a dashboard. A cleaner question structure will not solve every methodology problem, but it does remove one common source of confusion in internal feedback workflows.
The core change is straightforward. In release v3.34.2, Jisc says it added new standalone question types for single-choice and multi-choice Choice and Grid questions and redesigned the Add item menu to accommodate them. In the accompanying product update, Jisc explains that the earlier setup made it too easy to overlook whether respondents were being asked to choose one answer or several. The practical benefit is simple: survey builders are less likely to create a question whose wording and response logic do not match. That matters for any institution using internal surveys to compare responses across modules, schools, or services.
"That made it easy to miss what you were creating."
The same 6 March release also changed the Average response time display on the Insights page. Jisc says the format now appears as HH MM SS, and that the metric now shows the median instead of the mean. Our inference from that switch is that Jisc wants the indicator to be less distorted by unusually long sessions or abandoned surveys, which should make it a better warning sign for forms that are causing friction, as we explored in more detail in our post on why Jisc's move to median response time matters for student feedback surveys.
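To make the statistical point concrete, here is a minimal sketch in plain Python, using invented toy numbers, of how a couple of abandoned sessions drag the mean far above the typical completion time while the median stays put. The `as_hh_mm_ss` helper is our own illustration of the HH MM SS display Jisc describes, not Jisc's code.

```python
import statistics

# Toy response times in seconds for one survey: most students finish
# quickly, but two abandoned sessions sat open for a long time.
response_times = [95, 110, 120, 130, 140, 150, 2700, 5400]

mean_s = statistics.mean(response_times)      # dragged up by the outliers
median_s = statistics.median(response_times)  # the "typical" respondent

def as_hh_mm_ss(seconds: float) -> str:
    """Format seconds as HH MM SS, mirroring the display Jisc describes."""
    s = int(round(seconds))
    return f"{s // 3600:02d} {(s % 3600) // 60:02d} {s % 60:02d}"

print("mean:  ", as_hh_mm_ss(mean_s))    # mean:   00 18 26  (misleading)
print("median:", as_hh_mm_ss(median_s))  # median: 00 02 15  (typical)
```

On these toy numbers the mean suggests students spend over eighteen minutes on the form; the median correctly reports a typical completion of just over two minutes. That is the distortion the switch appears designed to avoid.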
Jisc followed with another relevant fix on 16 March 2026. In release v3.35.0, the change log says it fixed an issue where manually closing a survey could remove drop-out data from the Insights page. Taken together, the March updates do not amount to a new regulatory requirement, but they do alter the live survey and reporting environment many universities use for local feedback collection. The takeaway is clear: survey evidence depends on platform behaviour as well as question wording.
The first implication is practical quality control. If your institution uses Jisc Online Surveys for module evaluations, rep systems, or ad hoc pulse work, review your survey templates and local question banks now. Separate single-answer and multi-answer question types should reduce accidental misconfiguration, but only if teams update old habits, question wording, and template guidance. A prompt such as "select all that apply" should now be matched deliberately to the correct question type, rather than left to assumption. That gives institutions a cleaner basis for comparing like with like later.
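What might that audit look like in practice? The sketch below is an illustration only: Jisc does not publish survey definitions in this form, so the record layout, field names, and the `choice_single`/`choice_multi` labels are all hypothetical. The useful idea is the check itself: compare what the wording implies with what the response logic allows.

```python
# Hypothetical audit: flag questions whose wording implies multiple answers
# but whose configured type is single-answer (and vice versa). The data
# structure here is illustrative, not a Jisc export format.
MULTI_CUES = ("select all that apply", "choose all", "tick all")

questions = [
    {"id": "Q1", "wording": "Select all that apply: which services did you use?",
     "question_type": "choice_single"},
    {"id": "Q2", "wording": "Which one option best describes your experience?",
     "question_type": "choice_multi"},
]

for q in questions:
    implies_multi = any(cue in q["wording"].lower() for cue in MULTI_CUES)
    is_multi = q["question_type"].endswith("_multi")
    if implies_multi != is_multi:
        print(f"{q['id']}: wording and response logic may not match "
              f"({q['question_type']!r})")
```

Even a crude keyword pass like this surfaces the mismatches the new question types are meant to prevent; the point is to run some version of it over existing templates, not to trust the heuristic itself.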
The second implication is comparability. Universities often treat local survey setup as an operational detail, but it shapes whether results can be compared across departments and over time. Recent examples such as Bath's 2026 student feedback system and Westminster's Mid-Module Check-ins show that institutions are building more layered feedback architectures. Once that happens, consistency in survey design matters more, because small question-logic differences can quietly undermine otherwise sensible benchmarking. Cleaner setup protects the value of the comparisons institutions want to make.
The third implication is interpretation discipline. The median response-time update, and the later fix to drop-out data in Insights, are reminders that dashboard metrics are only as robust as the underlying survey mechanics. Teams should record which platform version or release notes were current when a survey ran, especially if completion patterns or abandonment rates are being discussed in committee papers. That is the same logic behind OfS guidance on protecting NSS integrity: evidence quality depends on the collection process, not just the final chart. Version awareness is a governance issue, not just an admin detail.
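One lightweight way to build that version awareness, assuming nothing about Jisc's own tooling, is to keep a small survey-run record alongside the results. The schema below is entirely our own sketch; the field names are not a Jisc or OfS standard.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative survey-run record: field names are ours, not a Jisc schema.
@dataclass
class SurveyRunRecord:
    survey_name: str
    opened: str            # ISO dates kept as strings for simplicity
    closed: str
    platform_release: str  # release notes current while the survey ran
    notes: str

record = SurveyRunRecord(
    survey_name="PHY202 mid-module pulse",
    opened="2026-03-02",
    closed="2026-03-20",
    platform_release="v3.34.2 (2026-03-06); v3.35.0 fix landed 2026-03-16",
    notes="Average response time switched from mean to median mid-run.",
)

print(json.dumps(asdict(record), indent=2))
```

A record like this costs minutes to keep and answers the committee question "did the platform change while this survey was open?" without archaeology.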
Survey design and comment analysis are closer than they look. If institutions want to combine closed-question results with open-text feedback, they need confidence that the structured questions were configured clearly and interpreted consistently. Otherwise, it becomes harder to tell whether a pattern reflects the student experience or a survey design issue. That is one reason we recommend treating internal survey setup as part of the wider student comment analysis governance checklist, not as a separate technical task.
For teams analysing open comments at scale, cleaner survey construction also improves downstream reporting. Stable question structures make it easier to compare modules, merge Jisc-collected surveys with NSS-style reporting, and interpret qualitative themes alongside quantitative measures. Our NSS open-text analysis methodology is focused on national survey comments, but the same principle applies locally: if collection rules shift, teams should document the change before they interpret movement as a real change in student experience.
Q: What should institutions do now if they use Jisc Online Surveys for student feedback?
A: Audit your live templates and guidance. Check that every Choice and Grid question uses the correct single-answer or multi-answer format, update wording so it matches the response logic, and note the March 2026 release changes in local survey guidance for colleagues who build forms. That gives teams a clearer baseline for later analysis and comparison.
Q: What is the timeline and scope of the Jisc Online Surveys change?
A: Jisc says the new standalone single-choice and multi-choice Choice and Grid question types were added in release v3.34.2 on 6 March 2026. A further fix affecting drop-out data on the Insights page was released in v3.35.0 on 16 March 2026. This applies to institutions using Jisc Online Surveys rather than to a national statutory survey such as the NSS.
Q: What is the broader implication for student voice?
A: The broader implication is that student voice evidence depends on survey mechanics as well as survey content. Clearer question structures, cleaner analytics, and better version awareness make internal feedback more defensible when teams use it to prioritise action, compare cohorts, or explain decisions.
[Jisc Online Surveys]: "Clearer question types: Single answer and Multi answer are now separate" Published: 2026-03-06
[Jisc Online Surveys]: "Change log (v3.34.2 release notes)" Published: 2026-03-06
[Jisc Online Surveys]: "Change log (v3.35.0 release notes)" Published: 2026-03-16