Updated May 06, 2026
Student feedback rarely changes assessment rules this explicitly. The University of Portsmouth has published Changes to Assessment Regulations 2026, a MyPort guidance page setting out changes that take effect from 1 September 2026, including an earlier-referral model the university says is based on student feedback. For Student Experience teams, PVCs, and quality professionals, that matters because it is a clear example of an institution moving from student feedback about assessment pressure to a published rule change, not just another promise to "take comments on board". It also shows why a credible student voice process needs enough structure to turn a recurring concern into something operational.
The most relevant change for student feedback practice is the new approach to referrals. Portsmouth says that waiting until the summer period to retake a failed assessment can create long delays, bunch reassessments together, and add stress. From September 2026, it says students will be supported to take a referral at the next available window through the year, rather than only in summer. The source also says module leaders will specify when those referral opportunities fall, and notes that earlier referrals may not suit every assessment type.
"next available window through the year rather than only in summer"
The wider package matters too, even though not all of it is directly attributed to student feedback. Portsmouth says some courses will move to a single 120-credit module running across the whole year, with repeat-year fees based on how many assessments a student has already passed. It also says undergraduate provision will start moving from 20-credit modules to 30-credit modules from September 2026 at Level 4, with Levels 5 and 6 following from September 2027, and that degree classification rules will continue to discount the equivalent of 20 credits, rather than 30, under the new structure. In other words, the earlier-referral change sits inside a broader assessment and curriculum redesign, not in isolation.
The university's wider Exams and assessments guidance says this assessment information applies to undergraduate and postgraduate taught students, including degree apprenticeships, across Portsmouth, London, and distance learning routes. But some of the detailed structural changes are clearly undergraduate-specific, especially the move to new credit sizes. That distinction matters for other institutions reading the story. This is one English university's assessment package, not a sector-wide rule change, and different parts of the package affect different groups.
The first implication is that assessment feedback often becomes most useful when it points to process problems, not just teaching quality problems. Students do not always describe a concern as "assessment regulation". More often, they talk about delayed reassessments, workload bunching, stress, or uncertainty about what happens next. Portsmouth's change is useful because it shows one way an institution can act when those themes keep appearing. Universities already testing earlier in-term routes, such as Westminster's Mid-Module Check-ins, should treat assessment timing questions as part of that same evidence base, not as a separate policy issue that only surfaces later.
The second implication is about evidence quality. If a university wants to justify changes to referral timing, repeat rules, or assessment sequencing, it needs more than a handful of anecdotes. It needs a clear trail showing what students said, which cohorts were affected, and why the issue was serious enough to warrant a policy or process change. That is especially important when assessment changes sit alongside wider curriculum restructuring, as they do here. The useful discipline is the same one we saw in QAA's recent work on pre-grade assessment feedback: separate out whether the problem is timing, clarity, grade anxiety, or the rules themselves, then decide what actually needs to change.
The third implication is that acting on student feedback is only half the task. Institutions also need to communicate the scope of the response precisely. Portsmouth's pages do that reasonably clearly by separating the student-feedback-led earlier-referral change from the broader credit-structure and fee changes, and by stating exact implementation dates. That matters because vague "you said, we did" messaging can blur who is affected and when. For quality teams, the takeaway is straightforward: when rules change, the response should be visible, dated, and specific enough for students and staff to understand what has actually shifted.
This kind of story also shows why open-text analysis matters. A score can tell you that students are dissatisfied with assessment and feedback, but it will not tell you whether the underlying problem is reassessment timing, workload concentration, unclear rules, or poor communication. Open comments can. When those themes are analysed consistently across module evaluations, annual surveys, and local pulse work, institutions are in a better position to decide whether they need a communications fix, a teaching fix, or an assessment-regulation fix.
That is where a repeatable method becomes useful. Our NSS open-text analysis methodology and student comment analysis governance checklist are both relevant if you want to track whether students keep raising timing and assessment-pressure issues after a change has been introduced. Student Voice Analytics is one practical route for doing that across NSS, PTES, module evaluation, and other institutional surveys, but the wider point is methodological rather than commercial: if you cannot compare what students are saying before and after a policy change, it is harder to know whether the change actually worked.
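The before-and-after comparison described above can be sketched very simply. The example below is illustrative only: the theme labels, the tiny hand-coded dataset, and the `theme_rate` helper are all hypothetical, standing in for a real coding frame applied consistently across survey rounds.

```python
# Illustrative sketch: comparing how often a coded theme appears in
# open-text comments before and after a policy change. All data and
# theme labels here are invented for the example.
from collections import Counter

def theme_rate(comments, theme):
    """Share of comments tagged with the given theme (0.0 if no comments)."""
    tagged = Counter(t for c in comments for t in c["themes"])
    return tagged[theme] / len(comments) if comments else 0.0

# Hypothetical coded comments from the survey round before the change...
before = [
    {"themes": ["reassessment timing", "stress"]},
    {"themes": ["reassessment timing"]},
    {"themes": ["communication"]},
]
# ...and from the round after the change took effect.
after = [
    {"themes": ["communication"]},
    {"themes": ["reassessment timing"]},
    {"themes": ["workload"]},
]

change = (theme_rate(after, "reassessment timing")
          - theme_rate(before, "reassessment timing"))
print(f"Change in 'reassessment timing' prevalence: {change:+.0%}")
```

In practice the comparison would need consistent theme definitions across rounds and attention to cohort composition, but the underlying check is this simple: the same theme, counted the same way, before and after the change.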
Q: What should institutions do now if they want to respond to similar assessment feedback?
A: Start by checking where concerns about reassessment timing and assessment pressure are already appearing. They may sit in module evaluations, rep minutes, PTES comments, or annual survey free text rather than in a policy review. Then test whether the issue is really about timing, communication, or assessment design. If the pattern is persistent, build a dated action trail that shows what changed, which cohorts it affects, and how you will review the impact in the next survey cycle.
Q: What is the timeline and scope of Portsmouth's change?
A: The earlier-referral change takes effect from 1 September 2026. The undergraduate move to new 30-credit structures starts at Level 4 from September 2026, with Levels 5 and 6 following from September 2027, although Portsmouth says some courses will move all levels from September 2026. The wider assessment guidance applies across undergraduate and postgraduate taught provision, including degree apprenticeships, but some of the detailed structural changes are clearly undergraduate-specific.
Q: What is the broader implication for student voice?
A: The broader implication is that student voice becomes more credible when it changes a rule, timetable, or process that students can actually notice. Institutions do not need every concern to lead to a regulation change, but they do need a defensible route from recurring student feedback to a visible response. That is what turns feedback from a listening exercise into institutional evidence.
[University of Portsmouth]: "Changes to Assessment Regulations 2026" Published: not stated
[University of Portsmouth]: "Exams and assessments" Published: not stated