Updated Mar 12, 2026
COVID-19 did not just move biomedical teaching online; it exposed how quickly assessment confusion, patchy communication, and reduced lab access can erode student confidence. In National Student Survey (NSS) open-text data, the COVID-19 tag captures cross-disciplinary commentary on pandemic-era practice, while biomedical sciences (non-specific) denotes the broad subject grouping used in sector analysis. Across 12,355 COVID-19-tagged comments, the tone is strongly negative (sentiment index -24.0), while biomedical sciences overall is more balanced at roughly 51.0% positive. Within the discipline, assessment dominates the conversation: Feedback accounts for 10.6% of comments and sits at a sharply negative -31.5, which means briefing clarity and turnaround matter as much as format.
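The exact scoring behind the sentiment index is not spelled out here. As a minimal sketch, assuming the index is simply the share of positive comments minus the share of negative ones (with neutral comments diluting both), it could be computed like this:

```python
from collections import Counter

def net_sentiment_index(labels):
    """Net sentiment index for a list of comment sentiment labels.

    Assumed formula: percentage of positive comments minus
    percentage of negative comments; neutral comments count
    toward the total but toward neither side.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    pos = 100.0 * counts["positive"] / total
    neg = 100.0 * counts["negative"] / total
    return round(pos - neg, 1)

# Illustrative mix: 30% positive, 54% negative, 16% neutral
labels = ["positive"] * 30 + ["negative"] * 54 + ["neutral"] * 16
print(net_sentiment_index(labels))  # -24.0
```

Under this assumption, an index of -24.0 means negative comments outnumber positive ones by 24 percentage points, regardless of how many comments are neutral.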
How did the rapid move online reshape the biomedical student experience?
The shift to online learning changed the rhythm of study and reset expectations about access. Recordings and flexible materials helped students balancing work or caring responsibilities keep up, which is a benefit worth preserving. At the same time, remote delivery made conceptual explanations and practical demonstrations harder to absorb without live, iterative interaction, a problem explored further in how biomedical sciences can be taught more effectively remotely. Institutions introduced virtual simulations and online lab sessions; students valued the chance to revisit content, but many still questioned whether simulations could fully replace learning by doing. Younger and full-time cohorts contribute more critical commentary in the NSS dataset, so programmes benefit from tighter briefing rhythms, simpler access routes, and more deliberate live interaction.
What happened to practical labs and hands-on experience?
Loss of physical lab time reduced the formative, tacit learning that comes from handling equipment, troubleshooting, and working in teams. Digital tools helped maintain continuity and gave students repeatable practice, but they rarely built the same confidence as in-person experimentation. Providers get better results when they sequence on-campus intensives alongside simulation, then connect pre-lab preparation to post-lab reflection. The practical takeaway is clear: use digital delivery to extend lab learning, not to stand in for it.
How did these shifts affect student wellbeing?
Disrupted routines and reduced social contact heightened stress and isolation. Where staff and peer networks were easy to reach, motivation and continuity held up better; where communication was fragmented, students reported avoidable anxiety, consistent with wider findings on communication in biomedical sciences education. Universities can reduce that drag by strengthening signposting, scheduling predictable check-ins, and blending online counselling with rapid triage into in-person services. Cohort-level briefings and module-level Q&As cut uncertainty before it hardens into disengagement.
What shifts in assessment design actually worked?
Assessment flexibility reduced pressure, but it also exposed gaps in briefing, marking criteria, and feedback quality. Biomedical sciences comments suggest that unclear criteria and variable marking conventions undermine trust more than the mode of assessment itself. Programmes now get more value from assessment change when they publish annotated exemplars, translate marking criteria into plain English, use checklist-style rubrics, and calibrate expectations in class before submission, echoing fair and consistent biology assessment design. Visible turnaround commitments and feedback that is specific and forward-looking help turn flexibility into reassurance rather than confusion.
How have graduate prospects been affected?
Interrupted lab access and constrained research projects raised understandable concerns about job readiness, especially for roles requiring extensive bench skills. At the same time, graduates developed adaptive habits, including version control, data handling, and virtual collaboration, that employers increasingly expect. Schools can close the gap by documenting competence across modules, offering short skills refreshers, and partnering with employers on authentic tasks. That combination helps students rebuild confidence in their practical readiness without losing the digital habits that now matter in scientific work.
Which innovations should biomedical programmes keep?
Some pandemic-era innovations are worth keeping, but only when they sit inside a stable course design. Augmented reality can help students visualise complex processes, and flipped classrooms can free contact time for application, discussion, and troubleshooting. Students report the greatest benefit when weekly rhythms are predictable and there is one source of truth for changes to delivery, material access, or assessment. Programmes should retain the tools that improve comprehension and flexibility, then standardise the communication practices that make those tools usable.
How Student Voice Analytics helps you
Student Voice Analytics turns open-text feedback into clear programme priorities. For COVID-19 and biomedical sciences, it shows where sentiment deteriorates, for example around assessment clarity, remote delivery, or lab access, and where it holds up, such as staff accessibility and project support, then benchmarks those patterns against the sector. You can track topic volume and tone over time, compare cohorts and delivery modes, and export concise, anonymised summaries for module and quality teams. Explore Student Voice Analytics if you need faster evidence on which pandemic-era fixes are still helping, and which frictions are still dragging student confidence down.