Updated Mar 13, 2026
COVID-19 did not just push chemical engineering teaching online; it exposed how quickly confidence drops when assessments, labs, and communications become unpredictable. In National Student Survey (NSS) open-text comments, analysed using our NSS open-text analysis methodology, the COVID-19 topic acts as a cross-subject signal of disruption: across 12,355 comments, 68.6% are negative and the sentiment index is -24.0. For chemical, process and energy engineering, a Common Aggregation Hierarchy grouping used across UK higher education to compare programmes, around 1,928 comments show that unclear marking criteria and unstable operations drove dissatisfaction, with sentiment on that theme reaching -50.8. The sections below show which responses helped programmes protect continuity and rebuild trust.
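The article reports these headline figures without publishing the formula behind them. As a point of reference only, here is a minimal Python sketch of one plausible construction: each comment carries a model-assigned score in [-1.0, 1.0], the index is the mean score scaled to ±100, and the negative share is the fraction of comments below a small threshold. The scores, threshold, and function names are illustrative assumptions, not the methodology behind the published numbers.

```python
# Illustrative only: the article does not publish its sentiment formula.
# Assumption: each comment carries a model-assigned score in [-1.0, 1.0].

def sentiment_index(scores: list[float]) -> float:
    """Mean per-comment sentiment, scaled to a -100..+100 index."""
    if not scores:
        raise ValueError("no scores supplied")
    return 100.0 * sum(scores) / len(scores)

def negative_share(scores: list[float], threshold: float = -0.05) -> float:
    """Fraction of comments scored below a negativity threshold."""
    return sum(s < threshold for s in scores) / len(scores)

# Hypothetical scores for eight comments, not real NSS data.
scores = [-0.8, -0.6, 0.4, -0.3, 0.7, -0.9, 0.1, -0.4]
print(f"index: {sentiment_index(scores):+.1f}")   # index: -22.5
print(f"negative: {negative_share(scores):.1%}")  # negative: 62.5%
```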
How did universities respond to the pandemic for engineering programmes?
Departments pivoted quickly to protect continuity, moving content online and introducing virtual lab simulations and online group projects. The strongest responses gave students one source of truth for timetable and assessment updates, used a simple playbook for communicating changes, and reviewed comments regularly to improve delivery. Engineering schools that borrowed continuity approaches common in medicine and dentistry stabilised cohorts faster because students knew what would happen next. Flexible assessment windows and extended project deadlines also protected academic standards while recognising uneven study environments. The benefit for programme teams was clear: predictable operations reduced avoidable stress.
What changed in the transition to online learning?
The shift exposed how little of this discipline could move online without redesign, especially laboratory work and applied problem-solving. Recorded lectures lacked the immediacy of live teaching, and group work often drifted without more structure. Because chemical engineering students value structured collaboration, the most effective departments set regular facilitated team tasks with clear outputs and feedback points. Some also piloted virtual labs mapped to module learning outcomes and used text analytics to identify where students were still getting stuck. That structure made online learning feel less improvised and more useful.
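As an illustration of that kind of triage, the sketch below tallies how often hypothetical theme keywords appear in negative comments. The theme lists, sample comments, and keyword-matching approach are assumptions for demonstration; a production pipeline would typically use a trained topic model rather than keyword search.

```python
import re
from collections import Counter

# Hypothetical theme keyword lists for illustration; a production
# pipeline would typically use a trained topic model instead.
THEMES = {
    "labs": ["lab", "practical", "experiment"],
    "group work": ["group", "team", "meeting"],
    "assessment": ["exam", "marking", "rubric", "feedback"],
}

def theme_counts(comments: list[str]) -> Counter:
    """Count how many comments mention each theme at least once."""
    counts: Counter = Counter()
    for text in comments:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            # Whole-word match so "collaboration" does not count as "lab".
            if any(re.search(rf"\b{re.escape(k)}\b", lowered) for k in keywords):
                counts[theme] += 1
    return counts

# Hypothetical negative comments, not real NSS data.
negative = [
    "No idea how the exam will be marked this year.",
    "Group project stalled because nobody organised meetings.",
    "The virtual lab could not replace the real experiment.",
]
for theme, n in theme_counts(negative).most_common():
    print(f"{theme}: {n}")  # assessment: 1, group work: 1, labs: 1
```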
How did the pandemic affect student mental health and what support worked?
Anxiety around exams, disrupted routines, and isolation increased demand for support, especially among younger full-time cohorts. Programmes expanded online counselling, ran engineering-specific stress workshops, and increased proactive personal tutor outreach. Short briefings, predictable Q&A sessions, and flexible access routes reduced uncertainty and helped students keep momentum on assessment deadlines, especially when paired with accessible support for chemical engineering students. The lesson is that support works best when it is closely tied to the pressure points students are already facing.
How were field trips and practical learning handled, and what was lost?
Field trips and in-person labs were cancelled or constrained, with Year 3 students feeling the loss most sharply. Virtual simulations kept progression moving, but they could not replicate tactile skills, instrumentation practice, or safe operating procedures. Where providers later scheduled short catch-up labs and tightly supervised site visits, students regained confidence and applied knowledge more effectively. Placements and trips appear less often in comments, but they land positively when delivered well. The takeaway is to protect high-value practical experiences, even in a blended model.
How did fees and institutional support evolve?
Questions about fee value centred on outcomes and support rather than price alone. Providers expanded hardship funds, device loans, software access, and academic guidance for cohorts starting remotely. Because student support appears less often in discipline comments and trends negative overall, the strongest response was to make help easy to find: one front door for advice, clear service levels for replies, and transparent escalation for complex cases. That clarity makes support feel credible, not theoretical.
Were resources and curricula adapted effectively?
Rapid expansion of remote licences for industry software and VPN capacity kept modelling, design work, and safety cases moving. Staff re-sequenced practicals, used open-book or take-home assessments, and aligned virtual tasks with learning outcomes. Library teams' online provision and study skills support were a consistent strength; programmes that added targeted guidance on assessment tasks reduced repeat queries and improved preparation. The practical benefit was continuity without losing sight of professional standards.
What happened to study abroad and software access?
International opportunities narrowed, so departments leaned harder on employer-linked projects and UK-based placements to preserve professional exposure. Expanded software licences and more reliable VPNs enabled home-based simulations and technical analysis. Regular feedback on usability helped teams remove bottlenecks quickly and keep learning moving. For students, that meant fewer avoidable interruptions and better access to core tools.
How did communication and assessment strategies change?
Providers consolidated communications into weekly digests, live Q&A sessions, and searchable portals, reducing ambiguity around protocols and teaching changes. Assessment design shifted toward coursework and open-book tasks completed remotely. To address the discipline's strongest pain point, departments focused on assessment clarity and feedback. They published checklist-style rubrics mapped to learning outcomes, shared annotated exemplars showing what pass, merit, or distinction work looked like, and added a short feed-forward note with each grade. A realistic feedback service-level agreement also helped manage expectations without compromising marking quality. The takeaway is straightforward: when students know how they will be assessed, confidence recovers faster.
How Student Voice Analytics helps you
Student Voice Analytics turns open-text survey comments into prioritised actions for programme and quality teams. You can track topic volume and sentiment over time, drill from institution to programme and cohort, and see how tone shifts across COVID-19 themes and the chemical, process and energy engineering discipline. Compare like-for-like across Common Aggregation Hierarchy groups and demographics such as age, domicile, mode, or commuter status, then generate export-ready briefings for teams who need to act. That gives you a faster way to see whether changes to assessment, communication, and support are improving the student experience. To review disruption themes in your own feedback, explore Student Voice Analytics.
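Conceptually, this kind of tracking reduces to an aggregation over labelled comments. The pandas sketch below groups hypothetical comments by topic and year to produce volume and mean sentiment; the column names and data are illustrative assumptions, not the product's actual schema.

```python
import pandas as pd

# Hypothetical labelled comments; column names are illustrative,
# not Student Voice Analytics' actual schema.
comments = pd.DataFrame({
    "year":      [2020, 2020, 2021, 2021, 2021, 2022],
    "topic":     ["COVID-19", "assessment", "COVID-19",
                  "assessment", "COVID-19", "assessment"],
    "sentiment": [-0.9, -0.6, -0.4, -0.7, -0.1, 0.3],
    "cohort":    ["UG", "UG", "UG", "PGT", "UG", "UG"],
})

# Topic volume and mean sentiment per year: the trend a quality team
# would read to check whether interventions are landing.
trend = (
    comments.groupby(["topic", "year"])["sentiment"]
    .agg(volume="count", index="mean")
    .reset_index()
)
print(trend)
```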
Request a walkthrough
See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.
UK-hosted · No public LLM APIs · Same-day turnaround