What did COVID-19 change in biomedical sciences education?
By Student Voice Analytics
COVID-19 reshaped delivery, assessment and student support, pushing biomedical programmes to stabilise remote teaching while tightening assessment clarity and preserving accessible staff support. In National Student Survey (NSS) open‑text data, COVID-19 aggregates cross‑disciplinary student commentary on pandemic-era practice, while biomedical sciences (non-specific) denotes the broad subject grouping used in sector analyses. Across 12,355 COVID‑19‑tagged comments the tone skews negative (sentiment index −24.0), yet biomedical sciences overall trends more positive on balance (roughly 51.0% Positive). Within this discipline, assessment dominates discourse: Feedback attracts 10.6% of comments and sits at a strongly negative −31.5, signalling that clarity and turnaround matter as much as format.
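For readers who want to reproduce headline figures like these from their own comment-level exports, the sketch below shows one plausible way to compute a net sentiment index and a topic share. The record schema, the −100 to +100 scaling, and the definition of the index as positive share minus negative share are assumptions for illustration only, not a description of the NSS or Student Voice Analytics methodology.

```python
from collections import Counter

def net_sentiment_index(comments):
    """Net sentiment on a -100..+100 scale: % positive minus % negative.
    `comments` is an iterable of dicts with a 'sentiment' key in
    {'positive', 'neutral', 'negative'} - an assumed schema, not the
    official NSS or Student Voice Analytics format."""
    counts = Counter(c["sentiment"] for c in comments)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return 100.0 * (counts["positive"] - counts["negative"]) / total

def topic_share(comments, topic):
    """Share of comments tagged with a given topic, e.g. 'feedback'."""
    total = len(comments)
    tagged = sum(1 for c in comments if topic in c.get("topics", []))
    return 100.0 * tagged / total if total else 0.0

# Example with made-up records, mirroring the shape of the headline figures:
sample = [
    {"sentiment": "negative", "topics": ["feedback"]},
    {"sentiment": "positive", "topics": ["staff support"]},
    {"sentiment": "neutral",  "topics": ["online delivery"]},
]
print(round(net_sentiment_index(sample), 1))      # 0.0 for this tiny sample
print(round(topic_share(sample, "feedback"), 1))  # 33.3
```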
How did the rapid move online reshape the biomedical student experience?
The shift to online learning changed the cadence of study and expectations for access. Flexibility and recordings improved revision and enabled students balancing work or caring responsibilities to keep pace. At the same time, remote delivery made conceptual explanations and practical demonstrations harder to internalise without live, iterative interaction. Institutions introduced virtual simulations and online lab sessions; students report value in being able to revisit content, but question whether simulations fully substitute for learning by doing. Younger and full‑time cohorts drive more critical commentary in the NSS dataset, so programmes now target briefing rhythm and access routes accordingly.
What happened to practical labs and hands-on experience?
Loss of physical lab time limited the formative, tacit learning that comes with handling equipment, troubleshooting, and teamwork. Digital tools offered repeatability and access when campuses were constrained, but many students missed the confidence‑building aspects of in‑person experimentation. Providers respond by sequencing on‑campus intensives alongside simulation and by aligning pre‑lab preparation to post‑lab reflection. The design aim is not to replace practicals but to scaffold them so scarce lab hours build competence efficiently.
How did these shifts affect student wellbeing?
Disrupted routines and reduced social contact heightened stress and isolation. Where staff and peer networks were easy to reach, motivation and continuity improved; where communication was fragmented, students reported avoidable anxiety. Universities therefore strengthen signposting, timetable predictable check‑ins, and blend online counselling with rapid triage to in‑person services. Cohort‑level micro‑briefings and module‑level Q&As reduce uncertainty without overloading students with messages.
What shifts in assessment design actually worked?
Assessment flexibility reduced pressure but exposed gaps in briefing, marking criteria and feedback quality. Biomedical sciences comments highlight that unclear criteria and variable marking conventions undermine trust more than the mode of assessment itself. Programmes now prioritise assessment literacy: publish annotated exemplars, translate marking criteria into plain English, use checklist‑style rubrics, and calibrate in class so students can test their understanding before submission. Visible turnaround commitments and feedback that is specific and forward‑looking help address the most negative themes.
How have graduate prospects been affected?
Interrupted lab access and constrained research projects raised concerns about job readiness, especially for roles requiring extensive bench skills. At the same time, graduates developed adaptive habits—version control, data handling, virtual collaboration—that employers increasingly expect. Schools mitigate risk by documenting competence across modules, offering short, intensive skills refreshers, and partnering with employers for authentic tasks. Structured, work‑integrated rhythms also support students who missed routine during disruption.
Which innovations should biomedical programmes keep?
Augmented reality to visualise complex processes and flipped classrooms to free up contact time for application both improve engagement when well‑sequenced. Students report most benefit where these approaches sit within a stable weekly rhythm, with one source of truth for any change to delivery, access to materials, or assessment. Capturing the tactics used in disciplines that maintained continuity and assessment clarity, then reapplying them across modules, lifts consistency.
What should programmes prioritise next?
- Keep disruption‑ready delivery plans and a single, up‑to‑date information hub so changes are predictable and well‑explained.
- Stabilise timetabling and communications ownership at programme level to reduce friction points that otherwise erode perceived learning quality.
- Treat assessment clarity as design work: exemplars, rubrics, calibration, and predictable feedback windows.
- Protect people‑centred strengths by safeguarding time for Personal Tutors and staff availability, which students rate highly and rely on when delivery patterns shift.
- Retain project support practices that worked well and repurpose them in taught modules to build assessment literacy and independence.
How Student Voice Analytics helps you
Student Voice Analytics turns open‑text feedback into programme‑level priorities. For COVID‑19 and biomedical sciences, it shows where sentiment deteriorates (e.g., around assessment clarity or remote delivery) and where it holds up (staff accessibility, project support), then benchmarks performance against the sector. You can track topic volume and tone over time, compare cohorts and modes, and export concise, anonymised summaries for action planning with module and quality teams.
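As a minimal sketch of how such a trend view could be assembled from a comment-level export, the snippet below groups records by survey year and topic and reports volume and net sentiment for each bucket. The field names ('survey_year', 'topics', 'sentiment') and the net-sentiment formula are assumptions for illustration, not the product's actual export schema.

```python
from collections import defaultdict

def topic_trends(comments):
    """Aggregate comment volume and net sentiment by (survey_year, topic).
    Assumes each record carries 'survey_year', 'topics' and 'sentiment'
    fields - an illustrative schema, not the actual export format."""
    buckets = defaultdict(lambda: {"n": 0, "pos": 0, "neg": 0})
    for c in comments:
        for topic in c.get("topics", []):
            b = buckets[(c["survey_year"], topic)]
            b["n"] += 1
            b["pos"] += c["sentiment"] == "positive"
            b["neg"] += c["sentiment"] == "negative"
    return {
        key: {
            "volume": b["n"],
            "net_sentiment": 100.0 * (b["pos"] - b["neg"]) / b["n"],
        }
        for key, b in buckets.items()
    }

# Toy example: one negative comment in 2021, one positive in 2022.
trend = topic_trends([
    {"survey_year": 2021, "topics": ["feedback"], "sentiment": "negative"},
    {"survey_year": 2022, "topics": ["feedback"], "sentiment": "positive"},
])
for (year, topic), stats in sorted(trend.items()):
    print(year, topic, stats)
```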
Request a walkthrough
Book a Student Voice Analytics demo
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.
- All-comment coverage with HE-tuned taxonomy and sentiment.
- Versioned outputs with TEF-ready governance packs.
- Benchmarks and BI-ready exports for boards and Senate.