Updated Mar 09, 2026
At Student Voice AI, we spend a lot of time helping universities interpret what students are really saying about assessment and feedback. That is why Robert A. Nash's paper in Assessment & Evaluation in Higher Education, "Universities' policies on feedback speed do not meaningfully predict students' satisfaction with assessment and feedback", matters. For UK institutions relying on NSS results, module evaluations, and free-text comments to improve the student experience, the paper challenges a common assumption: tightening institutional turnaround targets does not automatically produce better feedback satisfaction.
Students frequently report slow feedback as a source of frustration, and many universities have responded with institution-wide policies that specify how quickly marked work should be returned. Those policies are easy to monitor and easy to communicate, which makes them attractive to senior teams looking for a clear lever to pull. But they also create a blunt performance target: they say something about elapsed time, not necessarily about whether feedback is useful, specific, or timely enough to inform the next piece of work.
Nash addresses that gap through a secondary analysis of UK universities' feedback turnaround policies and their relationship to National Student Survey outcomes. The practical research question is straightforward: if a university mandates quicker feedback return times, does that translate into meaningfully higher student satisfaction with assessment and feedback?
The answer was largely no. Across the institutions analysed, the number of turnaround days permitted in feedback policies was negatively associated with NSS assessment-and-feedback satisfaction, but the relationship was weak and not statistically significant. In other words, universities with stricter published feedback deadlines did not reliably outperform institutions with longer permitted turnaround times.
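The shape of the relationship Nash reports, a weak negative association that does not reach significance, can be illustrated with a small sketch. The numbers below are invented for illustration and are not taken from the paper; the correlation is computed from first principles so no statistics library is needed:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: permitted turnaround days vs. NSS-style satisfaction (%)
policy_days = [15, 20, 15, 25, 10, 20, 30, 15, 20, 25]
satisfaction = [73, 71, 70, 72, 74, 73, 70, 69, 72, 71]

r = pearson_r(policy_days, satisfaction)
print(f"r = {r:.2f}")  # r = -0.32: weakly negative, the pattern Nash describes
```

With a sample this small, a correlation of that size would not be statistically significant, which is exactly why a raw negative sign should not be read as evidence that stricter policies work.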
That matters because it undercuts a common improvement strategy in UK higher education. If faster institutional policy targets are not strongly linked to better NSS outcomes, then policy speed on its own is probably not the main driver of student satisfaction. The paper pushes institutions to distinguish between what is easy to count and what actually changes the experience students report.
"Students are more concerned about the quality and utility of feedback, rather than receiving it quickly."
The paper also found a stronger association between feedback policy and students' views of organisation and management. That suggests, though does not prove, that turnaround policies may function more as a signal of institutional coordination than as a direct route to better feedback itself. A clearly communicated policy may reassure students that a course is organised, even if it does not guarantee that comments are detailed, actionable, or well timed in relation to the next assessment.
For Student Experience teams, the deeper lesson is that feedback dissatisfaction is usually more local and more qualitative than a university-wide service standard can capture. A school can meet a 15-day policy and still attract negative comments because feedback is generic, inconsistent across markers, or arrives after the next assignment. That is exactly where open-text analysis becomes useful: it reveals whether students are complaining about lateness, usefulness, fairness, or all three.
The first implication is to stop treating feedback speed as the headline solution to weak NSS assessment scores. Reasonable turnaround standards still matter, especially when feedback arrives too late to be used, but institutions should be wary of shortening deadlines further unless they can show that speed is the real constraint in the student experience.
The second implication is to analyse feedback at a more granular level. Programme and module teams should combine turnaround compliance data with what students actually write in surveys and evaluations. If complaints cluster around vague comments, poor feed-forward, or inconsistent marking, then the problem is feedback design and implementation, not the formal policy clock.
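One minimal way to start separating speed complaints from quality and fairness complaints in open text is keyword tagging. This is a sketch under the assumption that a handful of indicative phrases is enough for a first pass; the theme names, keyword lists, and comments are invented for illustration, not a production taxonomy:

```python
# Hypothetical theme taxonomy: each theme maps to illustrative keywords.
THEMES = {
    "lateness": ["late", "slow", "weeks", "turnaround"],
    "usefulness": ["generic", "vague", "unhelpful", "no detail"],
    "fairness": ["inconsistent", "unfair", "different markers"],
}

def tag_comment(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, words in THEMES.items()
            if any(w in text for w in words)]

comments = [
    "Feedback was three weeks late and too vague to act on.",
    "Different markers gave inconsistent comments on the same work.",
]
for c in comments:
    print(tag_comment(c))
```

A first comment like the one above would be tagged with both lateness and usefulness, which is the point: the same student can be unhappy about two different things, and only one of them is fixed by a faster policy clock.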
The third implication is methodological. When universities want to improve assessment and feedback, they need evidence that goes beyond summary scores and central service standards. This is where Student Voice Analytics fits naturally: comment analysis can separate concerns about slow feedback from concerns about quality, benchmark those themes across departments, and help senior teams target action where dissatisfaction is actually being generated.
Q: How should a university respond if it already has a fast feedback policy but NSS scores on assessment are still weak?
A: Start by checking whether students are objecting to speed at all. Review module-level comments, look for recurring themes such as vague feedback, unclear marking criteria, inconsistent moderation, or feedback arriving too late to use, and compare those patterns with compliance data. If the policy target is being met but dissatisfaction remains high, the likely issue is feedback quality or usability rather than turnaround time alone.
Q: What are the limits of a secondary analysis of policies and NSS scores?
A: The paper tests association, not causation. A published policy does not guarantee consistent practice across modules, and NSS responses capture students' perceptions of a broad assessment-and-feedback experience rather than a single operational measure. The value of the study is that it shows universities should not assume policy speed is a strong predictor; local audits are still needed to understand what is happening on the ground.
Q: Does this mean speed no longer matters in student voice data about feedback?
A: No. Speed still matters when delayed feedback prevents students from acting on comments before the next task, or when it signals weak organisation. The paper's point is narrower and more useful: faster deadlines on paper are not enough. To improve the student voice on assessment and feedback, universities need to know whether students are asking for quicker returns, better comments, clearer criteria, or stronger follow-through.
[Paper Source]: Nash, Robert A. "Universities' policies on feedback speed do not meaningfully predict students' satisfaction with assessment and feedback". Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2025.2450226