Did the pandemic reshape psychology students’ learning and wellbeing?

Published May 12, 2024 · Updated Feb 25, 2026

COVID-19 · Psychology (non-specific)

COVID-19 did not just change how psychology was taught; it also sharpened student concerns about assessment clarity and wellbeing. In analysis of National Student Survey (NSS) open-text comments, the COVID-19 topic remains net negative overall (sentiment index −24.0), and younger students contribute much of the volume (69.4%).

Within the Common Aggregation Hierarchy (CAH) grouping for psychology (non-specific), which organises UK psychology provision for sector analysis, the subject's COVID-19 tone follows the same sector-wide pattern and intensifies long-standing friction around assessment clarity. Psychology contributes a proportionate share of COVID-19 comments (6.1%), and across 2018–2025 the subject accounts for around 23,488 comments overall.

Strengths cluster around staff and resources. Pressure points sit in assessment methods and, most notably, marking criteria (−45.0). Taken together, these insights help programmes decide where to adapt delivery, placements, wellbeing support, and research activity.
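As an illustration of the kind of headline metric quoted above, a net sentiment index can be read as the share of positive comments minus the share of negative ones, scaled to ±100. The function and the counts below are a hypothetical sketch, not the production scoring methodology:

```python
def net_sentiment_index(positive: int, negative: int, neutral: int) -> float:
    """Net sentiment on a -100..+100 scale: % positive minus % negative.

    Hypothetical illustration only; the production methodology may differ.
    """
    total = positive + negative + neutral
    if total == 0:
        raise ValueError("no comments to score")
    return round(100 * (positive - negative) / total, 1)

# Illustrative counts only (not real NSS data):
print(net_sentiment_index(positive=190, negative=430, neutral=380))  # → -24.0
```

On this reading, an index of −24.0 simply means negative comments outnumber positive ones by 24 percentage points of the topic's total volume.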

How did the shift to online learning change psychology teaching?

Providers moved quickly from physical classrooms to virtual environments and adapted teaching to suit the medium (see applied psychology students’ perspectives on remote learning). Psychology relies on interaction and practical engagement, so role-plays and group work needed redesign.

Institutions supported staff to teach online, maintained student engagement, and expanded access to digital libraries and resources. In psychology, tone around remote learning sits close to neutral when online materials match live teaching and are tidy, searchable, and easy to navigate. Many teams maintained standards while accelerating blended approaches that now underpin delivery. The takeaway for course teams is that organisation matters: well-structured materials that mirror live teaching keep remote learning close to neutral.

What happened to practical components?

Practical components, especially clinical observation and client-facing practice, were disrupted. Programmes replaced or supplemented in‑person activity with virtual simulations and structured online casework to protect learning outcomes.

Student feedback drove iterative improvements, and many teams now retain blended formats that widen access while safeguarding the integrity of practical training. The takeaway is to be explicit about learning outcomes and supervision when practical work shifts online, so students trust the alternatives.

What mental health challenges emerged?

Wellbeing moved to centre stage as students managed isolation, uncertainty, and emotionally demanding content. Younger and full‑time cohorts tended to report more critical experiences, so teams prioritised targeted communications, early contact, and flexible support.

Online counselling expanded, and routine check‑ins were built into modules and personal tutoring. This normalised help‑seeking and enabled earlier intervention. The takeaway is to keep support visible and proactive, with clear signposting and regular touchpoints that help teams spot issues earlier.

How did students adapt research projects?

Students and supervisors retooled methods to meet safety and ethical requirements. Face‑to‑face studies shifted to online surveys and virtual interviews, with protocols adapted for consent, privacy, and data quality.

Staff shared exemplars and troubleshooting sessions to help students keep projects rigorous and feasible. The experience widened methodological repertoires and strengthened confidence with digital tools that are now common across the discipline. The takeaway is to provide templates, examples, and ethical guidance early, so method changes do not derail projects.

How did assessment and feedback change?

Assessment shifted towards continuous and open‑book formats. Students consistently asked for clearer guidance on what “good” looks like and how work is judged, echoing psychology’s pattern of tension around assessment methods and marking criteria (see common challenges in psychology assessments).

Programmes responded by publishing clearer assessment briefs, aligning marking criteria across modules, and sharing annotated exemplars. Many teams also committed to predictable turnaround times with feed‑forward, so students know what to do next. Regular virtual office hours supported quick clarification. The takeaway is to reduce ambiguity with shared criteria, annotated examples, and feedback that tells students what to improve next.

How did students access resources?

Access to libraries and specialist materials shifted online quickly. Providers consolidated a single source of truth for changes to timetabling, assessment, and resources, and expanded e‑collections and open educational resources.

Staff also offered guidance on using digital repositories and building study skills for online environments. The result was a more equitable baseline for access and stronger digital literacy across cohorts. The takeaway is that a single source of truth plus study-skills support improves equity when everything moves online.

What should providers do next?

Maintain a disruption‑ready playbook that covers teaching, assessment, and access to resources. Target communications and flexible access routes where tone is most critical for specific student groups. Continue calibrating standards and sharing exemplars to reduce ambiguity in assessment.

Protect the positives: staff availability, learning resources, and tidy remote materials. Sustain the blended options for practical learning and research that proved effective.

Done well, this keeps delivery resilient while reducing the assessment uncertainty students flag most strongly.

How Student Voice Analytics helps you

Student Voice Analytics surfaces what matters most for psychology during and beyond COVID-19 by tracking topic volume and sentiment over time, from institution to programme. It benchmarks psychology against the wider subject mix, shows where younger and full‑time cohorts report more negative experiences, and pinpoints assessment clarity issues so teams can act on marking criteria and feedback usefulness.

Segment by cohort or site, generate concise, anonymised summaries, and export ready‑to‑use tables and figures to brief programme, quality, and wellbeing teams.
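The segmentation described above (volume and tone per cohort and topic) can be sketched in a few lines. The record layout and field names here are hypothetical, purely to show the shape of the computation:

```python
from collections import defaultdict

# Hypothetical tagged comments: (cohort, topic, sentiment score in -1..+1)
comments = [
    ("under_21", "marking criteria", -0.6),
    ("under_21", "learning resources", 0.4),
    ("mature", "marking criteria", -0.2),
    ("under_21", "marking criteria", -0.5),
]

def summarise(records):
    """Volume and mean sentiment for each (cohort, topic) segment."""
    buckets = defaultdict(list)
    for cohort, topic, score in records:
        buckets[(cohort, topic)].append(score)
    return {
        key: {
            "volume": len(scores),
            "mean_sentiment": round(sum(scores) / len(scores), 2),
        }
        for key, scores in buckets.items()
    }

for segment, stats in summarise(comments).items():
    print(segment, stats)
```

A table like this, aggregated over anonymised comments, is the kind of output a programme or quality team would export to brief a board.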

See it in your own data: Explore Student Voice Analytics, or read the buyer’s guide to choose the right approach to NSS comment analysis.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround


© Student Voice Systems Limited, All rights reserved.