Students describe the impact of COVID-19 on mathematics as predominantly negative in National Student Survey (NSS) comments: sentiment sits at −24.0 across the topic, with mathematical sciences among the most negative subjects at −34.2. Within the sector’s mathematics subject grouping, sentiment on remote learning is −11.5 and on assessment methods −36.4, pinpointing where delivery and assessment faltered. Younger students account for 69.4% of comments and write in a more critical tone than mature students. These sector patterns shape the themes in this case study.
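To make these headline figures concrete, the sketch below shows one way topic-level sentiment and cohort shares could be aggregated from comment-level scores. It is a minimal illustration only: the column names, the toy data and the sentiment scale are assumptions, not the NSS schema or the Student Voice Analytics pipeline.

```python
import pandas as pd

# Toy comment-level data. Column names and the sentiment scale are
# illustrative assumptions, not the NSS or Student Voice Analytics schema.
comments = pd.DataFrame({
    "subject": ["Mathematical sciences", "Mathematical sciences",
                "Mathematical sciences", "Physics"],
    "topic": ["remote learning", "assessment methods",
              "assessment methods", "remote learning"],
    "age_group": ["under 21", "under 21", "mature", "under 21"],
    "sentiment": [-12.0, -40.0, -28.0, -5.0],
})

# Mean sentiment per subject/topic pair, mirroring figures such as
# "remote learning -11.5" and "assessment methods -36.4" above.
by_topic = (comments
            .groupby(["subject", "topic"])["sentiment"]
            .mean()
            .round(1))
print(by_topic)

# Share of comments from younger students (cf. the 69.4% figure above).
young_share = (comments["age_group"] == "under 21").mean() * 100
print(f"{young_share:.1f}% of comments come from younger students")
```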
How do sector insights frame mathematics students’ COVID-19 experience?
The pandemic reshaped teaching for a discipline that relies on systematic problem‑solving, iterative feedback and worked examples. Mathematics students need interactive explanation and rapid clarification to progress through complex proofs and techniques; moving online constrained those rhythms. Drawing on student surveys and text analysis, we examine how the switch affected learning, collaboration, assessment and wellbeing, and how providers responded to sustained disruption.
What changed in the online learning experience?
The loss of real‑time interaction reduced opportunities to test understanding and receive immediate feedback on problem steps. Students valued flexibility and the ability to replay material, yet reported that asynchronous delivery weakened concept formation, reduced engagement with proofs and left less room for the informal questions that typically surface misconceptions. The resulting gap between taught content and assessment expectations was most visible in modules that depend on stepwise reasoning.
How did pre-recorded lectures affect learning?
Pre‑recorded lectures diluted the scaffolding students expect from live, problem‑led sessions. Without spontaneous questions and targeted explanations, learners spent longer inferring intent from slides than working through examples. Variable audio, pacing and annotation quality compounded the problem, and the absence of peer dialogue limited exposure to alternative methods. Students asked for concise, well‑structured recordings with explicit worked solutions and signposted points at which to pause and attempt problems.
What happened to collaborative learning?
Group problem‑solving shifted unevenly online. Virtual whiteboards and breakout rooms helped some cohorts maintain momentum, but many reported a loss of immediacy and diminished accountability. Asynchronous chat rarely substituted for the step‑by‑step scrutiny of an effective study group. Where staff scaffolded collaboration with clear roles, artefacts to produce and timely check‑ins, students reported better continuity of learning.
How did universities handle COVID-19?
Students described a mixed institutional response. Where providers kept a single, up‑to‑date source of truth for changes, named owners for timetabling and module communications, and explained what changed and why, disruption felt manageable. Elsewhere, fragmented messaging, patchy access to specialist software and hurried assessment changes undermined trust. Students noted that disciplines offering continuity of learning and clarity about assessment fared better; adopting those practices helped programmes sustain progress under pressure.
How did the pandemic affect mental health and wellbeing?
Isolation, blurred study boundaries and the loss of cohort interaction increased anxiety. Unpredictable schedules and shifting assessment briefs heightened stress, particularly when students could not calibrate their progress against peers. While many universities expanded their digital wellbeing offers, students asked for proactive check‑ins from personal tutors, predictable study rhythms and signposting to services embedded within programmes rather than generic portals.
How did students adapt to online assessments?
Rapid shifts to new formats exposed gaps in assessment design. Students questioned the alignment between taught material and assessed tasks, and asked for transparent marking criteria, annotated exemplars and realistic turnaround times for feedback. Online proctoring raised privacy and anxiety concerns; providers that piloted alternatives (open‑book papers with academic‑integrity guidance, staged submissions, oral follow‑ups) reported fewer disputes. Reliable technical support and rehearsal opportunities reduced the inequity created by variable connectivity and devices.
What should providers change next time?
Taken together, the sections above point to a consistent set of practices: keep a single, current source of truth for changes, with named owners for timetabling and module communications; structure recordings around explicit worked solutions and signposted pause points; scaffold online collaboration with clear roles, artefacts and timely check‑ins; publish transparent marking criteria and pilot proctoring alternatives before scaling them; and embed proactive wellbeing check‑ins within programmes rather than relying on generic portals.
How Student Voice Analytics helps you
Student Voice Analytics turns open‑text feedback into priorities you can act on. It tracks topic volume and sentiment over time, then lets you drill down from institution to school/department, cohort and site. You can compare like‑for‑like across mathematics and other subject groupings, segment by demographics and mode, and evidence change against relevant peers. The platform produces concise, anonymised summaries and export‑ready outputs for programme and quality teams, so you can brief staff quickly, target fixes where sentiment is lowest, and monitor the effect of changes to remote delivery and assessment.
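As a rough illustration of the drill-down and benchmarking workflow described above, the sketch below compares institution-level topic sentiment against a sector benchmark and flags the weakest topic. All field names and values are assumptions for illustration; this is not the platform’s export schema or API.

```python
import pandas as pd

# Hypothetical scored-comment extract. Field names and values are
# assumptions for illustration, not the platform's export format.
scored = pd.DataFrame({
    "institution": ["A", "A", "A", "Sector", "Sector", "Sector"],
    "subject_group": ["Mathematics"] * 6,
    "topic": ["remote learning", "assessment methods", "remote learning",
              "remote learning", "assessment methods", "assessment methods"],
    "mode": ["full-time", "full-time", "part-time",
             "full-time", "part-time", "full-time"],
    "sentiment": [-15.0, -38.0, -8.0, -11.5, -30.0, -36.4],
})

# Like-for-like comparison: institution A versus the sector, per topic.
benchmark = scored.pivot_table(index="topic", columns="institution",
                               values="sentiment", aggfunc="mean").round(1)
print(benchmark)

# Target fixes where sentiment is lowest.
worst = benchmark["A"].idxmin()
print(f"Lowest-scoring topic at institution A: {worst}")
```

The same pivot extends to demographic and mode segments by adding those fields to the index, which mirrors the segmentation the platform describes.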
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality-and-standards and NSS requirements.