DMU's block teaching evaluation links faster feedback with stronger student survey evidence

Updated Apr 14, 2026

Block teaching often comes with bold claims about focus, workload, and faster feedback. Published evidence that ties those claims back to student surveys is much rarer. On 10 April 2026, De Montfort University published its block teaching evaluation, saying the model has improved student experience, pass rates, continuation, and recruitment. For institutions following the current sector debate on block learning and student-staff partnership, DMU's block teaching evaluation matters because it treats student feedback evidence as part of the case for course redesign, not just as a retrospective claim.

What has changed in DMU's block teaching evaluation

The institutional shift itself is not new. DMU says it introduced block teaching across all courses in the 2022/23 academic year, so students study one module at a time rather than several in parallel. The university says this structure is intended to reduce stress, improve engagement, and provide faster feedback by moving exams to the end of each module. What is new is the evaluation now being published. DMU says the first cohort taught entirely under the model completed their studies in 2025, which has allowed the university to assess the approach using institutional survey data, national benchmarks, and continuation metrics. The scope is institution-wide at one English university, not a UK-wide policy change, but it is still a live example of how a provider is testing a major delivery redesign.

The student experience detail is what makes the announcement useful. DMU says its surveys asked students whether they felt connected to course mates, how easy it was to secure time with a tutor, how much interaction they had with teaching staff, and whether the timetable worked for them. In every category, the university says students taught under block delivery gave more positive responses, with some increases above 15%. It also says the 2025 National Student Survey, the first to reflect a fully block-taught graduating cohort, showed gains across all core themes compared with 2022, including Teaching on my Course, Learning Opportunities, Academic Support, and Organisation and Management. DMU says Assessment and Feedback improved strongly too, supported by a 15-day feedback turnaround.

"those students who have been taught entirely within the block model have performed better"

The announcement also goes beyond satisfaction measures. DMU says programmes that moved to block delivery in 2022 improved continuation rates year on year, outperforming non-block programmes at the university and exceeding the wider sector trend in the latest Office for Students dataset. It also reports stronger confidence, a better sense of belonging, improved ability to manage workload, and fewer reports of stress affecting study. On recruitment, DMU says 60% of undergraduates and 71% of postgraduates entering in 2025/26 said the model influenced their decision to enrol. The practical takeaway is that DMU is presenting block teaching as a whole-system change, measured through student voice, academic outcomes, and student choice.

What this means for institutions

The first implication is about what universities measure when they redesign delivery. DMU is not only pointing to overall satisfaction. It is pointing to specific aspects of the student experience that a structural change should plausibly affect: tutor access, peer connection, staff interaction, timetable fit, workload, and stress. That is a better model for institutional evaluation than relying on a single headline score after the event. If a university is moving to block, trimester, or other intensive delivery, it should be clear in advance which parts of the student experience the new structure is supposed to improve, and which survey questions or comment prompts will test that claim.

The second implication is about feedback speed. In a block model, turnaround time matters because students move quickly from one module or assessment point to the next. DMU's 15-day turnaround is therefore relevant, especially where the goal is to keep feedback close enough to the learning activity to remain usable. But institutions should be careful not to treat turnaround targets as the whole answer. As we noted in our review of why faster feedback policies do not guarantee better NSS results, speed alone does not tell you whether students found the feedback clear, specific, or helpful. The real question is whether students receive comments early enough, and usefully enough, to improve the next stage of learning.
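A turnaround target like this is only auditable if someone actually measures it per assessment. The sketch below shows one minimal way to do that; the records are hypothetical, and because the source does not say how DMU counts the 15 days, it assumes calendar days purely for illustration.

```python
from datetime import date

# Hypothetical assessment records: (submission date, feedback returned date).
# Calendar days are assumed here; a real policy might count working days.
records = [
    (date(2026, 1, 12), date(2026, 1, 26)),
    (date(2026, 1, 12), date(2026, 1, 30)),
    (date(2026, 2, 2), date(2026, 2, 16)),
]

TARGET_DAYS = 15

def turnaround_report(records, target=TARGET_DAYS):
    """Return per-record turnaround in days and the share meeting the target."""
    days = [(returned - submitted).days for submitted, returned in records]
    met = sum(d <= target for d in days) / len(days)
    return days, met

days, met = turnaround_report(records)
print(days)                               # → [14, 18, 14]
print(f"{met:.0%} within {TARGET_DAYS} days")  # → 67% within 15 days
```

Reporting the distribution, not just the average, matters in a block model: one late batch of feedback can land after students have already moved to the next module.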

The third implication is evidence discipline. DMU is comparing the fully block-taught 2025 cohort with the final pre-block year in 2021/22, and also comparing block-delivery programmes with non-block programmes inside the institution. That kind of claim is valuable only if the evidence trail is clear enough to defend. Universities considering similar redesigns should keep tight records on which cohorts are in scope, whether question wording or survey timing changed, how response patterns shifted, and which metrics are being treated as comparable. The benefit is straightforward: senior teams get a more credible picture of whether the redesign genuinely improved the student experience, rather than a looser narrative built from selectively positive indicators.
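The record keeping described above can be as simple as one structured entry per survey wave, so any comparison can state exactly what changed between waves. This is a minimal sketch under that assumption; the field names and example values are illustrative, not a published standard.

```python
from dataclasses import dataclass, field

@dataclass
class SurveyWave:
    """One survey wave in the evidence trail for a delivery-redesign comparison."""
    year: str                       # e.g. "2021/22" (final pre-block baseline)
    cohorts_in_scope: list          # which cohorts the wave actually covers
    delivery_model: str             # "block" or "non-block"
    question_wording_changed: bool  # flag any change to the comparable core
    survey_window: str              # fielding period, to catch timing shifts
    response_rate: float            # to flag response-pattern changes
    comparable_metrics: list = field(default_factory=list)

baseline = SurveyWave(
    year="2021/22",
    cohorts_in_scope=["final pre-block cohort"],
    delivery_model="non-block",
    question_wording_changed=False,
    survey_window="Jan-Apr",
    response_rate=0.62,
    comparable_metrics=["overall satisfaction", "continuation"],
)
print(baseline.delivery_model)  # → non-block
```

The point of the structure is that a later reviewer can see at a glance whether two waves are genuinely comparable before any year-on-year claim is made.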

How student feedback analysis connects

DMU's published evaluation is mostly about survey results and headline outcome measures. What it cannot show on its own is which parts of block delivery students are actually responding to. Open-text feedback is where institutions are more likely to see whether the improvement comes from less assessment bunching, clearer timetables, easier tutor access, better pacing, or a stronger sense of connection within each block. That is where a repeatable framework such as our NSS open-text analysis methodology becomes useful. It helps teams separate assessment timing, feedback quality, organisation and management, staff availability, and belonging rather than treating block teaching as a single theme.
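Separating themes in open-text comments can start from something as simple as rule-based tagging before any heavier modelling. The sketch below shows the idea; the theme names and keywords are illustrative only, not the actual taxonomy behind any published methodology.

```python
# Illustrative keyword rules mapping themes to trigger terms.
THEMES = {
    "assessment_timing": {"deadline", "bunching", "exam", "turnaround"},
    "feedback_quality": {"feedback", "comments", "marking"},
    "staff_availability": {"tutor", "office hours", "appointment"},
    "organisation": {"timetable", "schedule", "organised"},
    "belonging": {"connected", "community", "course mates"},
}

def tag_comment(comment: str) -> list:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, keys in THEMES.items()
            if any(k in text for k in keys)]

print(tag_comment(
    "Feedback arrived before the next deadline and my tutor was easy to reach"
))  # → ['assessment_timing', 'feedback_quality', 'staff_availability']
```

Even a crude tagger like this makes the analytical point: a comment praising block teaching usually praises something specific, and those specifics are what a redesign evaluation needs to surface.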

If institutions want to compare block and non-block delivery credibly, they also need a documented process for how comments are collected, coded, reviewed, and turned into action. Our student comment analysis governance checklist is a practical starting point. Student Voice Analytics is useful where teams need one reproducible method across NSS, module evaluations, and local pulse surveys, especially when they want to trace whether a delivery redesign changed what students actually say, not only how they score it.

FAQ

Q: What should institutions do now if they are reviewing block teaching or another intensive delivery model?

A: Start by defining the outcomes the redesign is supposed to influence, then collect evidence against those outcomes in a stable way. That means keeping a comparable survey core, adding open-text prompts on workload, tutor access, assessment, and timetable experience, and reading those findings alongside pass, continuation, and progression data. The stronger the baseline, the easier it is to see whether the redesign improved the student experience or just changed the reporting frame.

Q: What is the timeline and scope of DMU's block teaching evaluation?

A: DMU published the evaluation on 10 April 2026. The block teaching model was introduced across all DMU courses in the 2022/23 academic year, and the university says 2025 was the first year in which a fully block-taught cohort completed their studies. This is an institution-specific case from England, not a sector-wide regulatory change or a new NSS rule.

Q: What is the broader implication for student voice?

A: The broader implication is that student voice is most useful when it is tied to live institutional design choices, not only annual reporting. If universities are changing how teaching is sequenced, assessed, and supported, they need student feedback evidence that can test whether those changes improved focus, belonging, support, and workload management in practice. That makes student voice part of course design evaluation, not just part of the comms cycle after results are published.

References

[De Montfort University]: "New analysis shows block teaching model delivers improved outcomes and experience for DMU students" Published: 2026-04-10

