What did mathematics students say about COVID-19?

Updated Mar 28, 2026

COVID-19 · mathematics

COVID-19 hit mathematics teaching where students most need live explanation: when they are following complex steps, testing their understanding, and correcting errors quickly. In National Student Survey (NSS) comments, COVID-19 sentiment in mathematics is predominantly negative: the topic sits at −24.0 overall, while mathematical sciences fall to −34.2, making them one of the most negative subject areas. Within the sector’s mathematics grouping (used widely in subject classification), remote learning sentiment is −11.5 and assessment methods reach −36.4, showing where delivery and assessment broke down. Younger cohorts account for 69.4% of comments, and their tone is more critical than mature students’. These patterns shape the practical lessons in this case study.

How do sector insights frame mathematics students’ COVID-19 experience?

The pandemic reshaped teaching in a discipline that depends on systematic problem‑solving, iterative feedback, and worked examples. Mathematics students need interactive explanations and rapid clarification to keep progressing through complex proofs and techniques, so moving online disrupted the core learning rhythm, not just the delivery channel. Drawing on student surveys and open-text analysis of NSS comments, this case study shows where learning, collaboration, assessment, and wellbeing came under strain, and what providers can do to reduce that strain if disruption happens again.

What changed in the online learning experience?

Losing real‑time interaction reduces opportunities to test understanding and receive immediate feedback on each step. Students value flexibility and the ability to replay material, but many report that asynchronous delivery weakens concept formation, reduces engagement with proofs, and leaves less space for the informal questions that surface misconceptions early. When programmes restore regular live touchpoints, they make it easier for students to close the gap between taught content and assessment expectations, a pattern echoed in what mathematics students say about teaching methods.

How did pre-recorded lectures affect learning?

Pre‑recorded lectures dilute the scaffolding students expect from live, problem‑led sessions. Without spontaneous questions and targeted explanations, learners spend more time inferring intent from slides than working through examples. Variable audio, pace, and annotation quality compound the issue, while the loss of peer dialogue limits exposure to alternative methods. Students benefit most when recordings are short, clearly structured, and built around explicit worked solutions with signposted pause points for practice.

What happened to collaborative learning?

Group problem‑solving transfers unevenly online. Virtual whiteboards and breakout rooms help some cohorts maintain momentum, but many students describe lower immediacy and weaker accountability. Asynchronous chats rarely replace the step‑by‑step scrutiny that makes study groups effective. The practical takeaway is to scaffold collaboration with clear roles, a shared artefact, and timely check‑ins, because structure makes online group work more useful and less frustrating.

How did universities handle COVID-19?

Students describe a mixed institutional response. Where providers keep a single, up‑to‑date source of truth for changes, name owners for timetabling and module communications, and explain “what changed and why,” disruption feels more manageable. Elsewhere, fragmented messaging, patchy access to specialist software, and hurried assessment changes undermine trust. Clear ownership and consistent communication do not remove disruption, but they do make it easier for students to keep learning.

How did the pandemic affect mental health and wellbeing?

Isolation, blurred study boundaries, and the loss of cohort interactions increase anxiety. Unpredictable schedules and shifting assessment briefs heighten stress, especially when students cannot gauge their progress against peers. Many universities expanded digital wellbeing offers, but students still need proactive check‑ins from personal tutors, predictable study rhythms, and support services signposted within the mathematics programme itself rather than buried in generic portals. Help that is visible at programme level is easier to use when pressure peaks.

How did students adapt to online assessments?

Rapid shifts to new formats expose gaps in assessment design. Students question whether tasks still match what they were taught, and they ask for assessment methods that feel fair and usable, with transparent marking criteria, annotated exemplars, and realistic turnaround times for feedback. Online proctoring can increase privacy concerns and anxiety, so lower‑friction alternatives, such as open‑book formats with integrity guidance, staged submissions, or oral follow‑ups, often feel fairer. Reliable technical support and rehearsal opportunities reduce inequity created by variable connectivity and devices.

What should providers change next time?

  • Publish disruption‑ready playbooks: keep a single source of truth, issue brief weekly updates, and make disability‑related adjustments explicit when arrangements change.
  • Stabilise delivery: name an owner for timetabling and course communications, and coordinate module pacing to avoid assessment bunching.
  • Clarify assessment: provide concise rubrics, aligned learning outcomes, and annotated exemplars; test new formats with students and communicate integrity expectations up front.
  • Strengthen online learning design: use shorter, structured recordings, embed worked examples, and build in low‑stakes checks with rapid feedback.
  • Target support to cohorts most affected: younger and full‑time students typically report more negative experiences; offer micro‑briefings and Q&A slots at peak stress points.
  • Capture and reuse stronger practice: lift approaches from subjects that sustain continuity and clarity, then run time‑bound reviews of assessment fairness and access to specialist tools in mathematics.

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text feedback on remote learning, assessment, and wellbeing into priorities you can act on. It tracks topic volume and sentiment over time, then lets you drill down from institution to school, department, cohort, and site. You can compare like‑for‑like across mathematics and other subject groupings, segment by demographics and mode, and evidence change against relevant peers. The platform produces concise, anonymised summaries and export‑ready outputs for programme and quality teams, so you can brief staff quickly, target fixes where sentiment is lowest, and measure whether changes to delivery or assessment are improving the experience.
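As a rough illustration of the topic-level drill-down described above, the sketch below aggregates per-comment sentiment scores into a net score per topic. The function name, the input shape (topic, score) pairs, and the assumed score range of −100 to +100 are all assumptions for illustration, not the platform's actual method.

```python
from collections import defaultdict

def net_sentiment_by_topic(comments):
    """Illustrative aggregation: average per-comment sentiment scores
    (assumed to lie in -100..100) into one net score per topic.

    `comments` is an iterable of (topic, score) pairs. Returns a dict
    mapping each topic to its mean score, rounded to one decimal place.
    """
    totals = defaultdict(lambda: [0.0, 0])  # topic -> [running sum, count]
    for topic, score in comments:
        totals[topic][0] += score
        totals[topic][1] += 1
    return {topic: round(total / n, 1) for topic, (total, n) in totals.items()}

# Example: two comments on remote learning averaging to a net score of -11.5.
scores = net_sentiment_by_topic([("remote learning", -20), ("remote learning", -3)])
```

The same grouping step generalises to any segmentation, such as cohort, mode, or site, by widening the key used for `totals`.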

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround



© Student Voice Systems Limited, All rights reserved.