Yes: student feedback shows the pandemic depressed experience sector-wide, and in this discipline the sharpest pressure fell on assessment clarity and predictable operations. In National Student Survey (NSS) open‑text comments, the COVID-19 topic acts as a cross‑subject lens on disruption: of 12,355 comments, 68.6% are negative and the sentiment index is −24.0. For chemical, process and energy engineering, a Common Aggregation Hierarchy grouping used across UK higher education to compare programmes, roughly 1,928 comments show that uncertainty about marking criteria drives dissatisfaction, with sentiment on that theme reaching −50.8. These signals shape the decisions described below.
How did universities respond to the pandemic for engineering programmes?
Departments pivoted quickly to protect continuity, moving content-heavy materials online and introducing virtual lab simulations and online group projects. Responses worked best where teams used a simple playbook for changes, kept a single source of truth for timetabling and assessment updates, and analysed student comments to iterate delivery. Engineering schools that adapted approaches common in medicine and dentistry (continuity of learning and assessment clarity) stabilised cohorts faster. Flexible assessment windows and extended project deadlines protected integrity while recognising varied study environments.
What changed in the transition to online learning?
The shift exposed preparation gaps for fully online delivery, especially for laboratory work and applied problem‑solving. Recorded lectures lacked the immediacy of live sessions, and coordination of group work was uneven. Because students in this discipline value structured peer collaboration, staff prioritised regular, facilitated team tasks with explicit outcomes and feedback points. Some departments piloted virtual labs mapped to module learning outcomes, used open‑ended simulations to develop experimental method, and then targeted fixes using text analytics on student comments.
How did the pandemic affect student mental health and what support worked?
Anxiety around exams, disrupted routines and isolation increased demand for support, especially among younger, full‑time cohorts. Programmes expanded online counselling, ran engineering‑specific stress workshops, and increased proactive personal tutor outreach. Timely micro‑briefings, predictable Q&A sessions and flexible access routes reduced uncertainty and helped students keep momentum on assessment deadlines.
How were field trips and practical learning handled, and what was lost?
Field trips and in‑person labs were cancelled or constrained, with Year 3 students feeling the loss most acutely. Virtual simulations maintained progression but could not replicate tactile skills, instrumentation practice or safe operating procedures. Where providers scheduled short, intensive catch‑up labs and tightly supervised site visits once restrictions eased, students regained confidence. Placements and trips are mentioned less often but land positively when delivered well, so blended models should protect these experiences.
How did fees and institutional support evolve?
Questions about fee value centred on outcomes and support rather than price alone. Providers expanded hardship funds, device loans and software access, and increased academic guidance for cohorts starting remotely. Because student support appears less often in comments for this discipline and tends to be negative in tone, programmes made routes explicit: a single front door for advice, published service levels for replies, and transparent escalation when cases became complex.
Were resources and curricula adapted effectively?
Rapid expansion of remote licences for industry software and VPN capacity kept modelling and safety cases moving. Staff re‑sequenced practicals, used open‑book or take‑home assessments, and aligned virtual tasks with learning outcomes. Library teams’ online provision and study skills support were a consistent strength; programmes that integrated targeted guidance on assessment tasks reduced repeat queries and improved preparation.
What happened to study abroad and software access?
International opportunities narrowed, so departments emphasised employer‑linked projects and UK‑based placements to sustain professional exposure. Expanded software licences and more reliable VPNs enabled home‑based simulations and technical analyses. Regular feedback on usability allowed teams to remove bottlenecks quickly and maintain continuity.
How did communication and assessment strategies change?
Providers consolidated communications into weekly digests, live Q&A and searchable portals, limiting ambiguity about protocols and teaching changes. Assessment design shifted towards coursework and open‑book tasks taken remotely. To address the discipline’s strongest pain point—assessment clarity—departments published checklist‑style rubrics mapped to learning outcomes, annotated exemplars showing what constitutes a pass, merit or distinction, and a short feed‑forward note with each grade. A realistic feedback service level agreement helped manage expectations while protecting marking quality.
How Student Voice Analytics helps you
Student Voice Analytics turns open‑text survey comments into concise, prioritised actions. You can track topic volume and sentiment over time, drill from institution to programme and cohort, and see how tone shifts across COVID-19 category themes and the chemical, process and energy engineering discipline. Compare like‑for‑like across Common Aggregation Hierarchy groups and demographics (age, domicile, mode, commuter status), segment by campus or provider, and generate briefings for programme and quality teams with export‑ready tables and figures.
See all-comment coverage, sector benchmarks, and governance packs designed to support OfS quality and standards expectations and NSS requirements.
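In data terms, the drill-down and segmentation described above amount to grouping comment-level records by topic, time period and demographic, then summarising volume and tone. The sketch below illustrates that shape of analysis in Python with pandas; the column names, sentiment scale and sample values are hypothetical, chosen only for illustration, and do not represent Student Voice Analytics’ schema or API.

```python
# Illustrative sketch only: hypothetical comment-level data, not the
# Student Voice Analytics schema or API.
import pandas as pd

# One row per open-text comment, with a topic label, a sentiment score
# in [-1, 1], and a demographic attribute.
comments = pd.DataFrame(
    {
        "year": [2021, 2021, 2021, 2022, 2022, 2022],
        "topic": ["COVID-19", "COVID-19", "Assessment clarity",
                  "COVID-19", "Assessment clarity", "Assessment clarity"],
        "age_group": ["under 21", "21-25", "under 21",
                      "21-25", "under 21", "21-25"],
        "sentiment": [-0.8, -0.4, -0.9, 0.1, -0.6, -0.3],
    }
)

# Topic volume and mean sentiment per year: a miniature version of
# "track topic volume and sentiment over time".
by_topic_year = (
    comments.groupby(["year", "topic"])["sentiment"]
    .agg(volume="count", mean_sentiment="mean")
    .reset_index()
)

# Share of negative comments per topic: a simple proxy for a net sentiment measure.
comments["is_negative"] = comments["sentiment"] < 0
negative_share = comments.groupby("topic")["is_negative"].mean().mul(100).round(1)

# The same summary segmented by a demographic field (here, age group).
by_topic_age = (
    comments.groupby(["topic", "age_group"])["sentiment"]
    .agg(volume="count", mean_sentiment="mean")
    .reset_index()
)

print(by_topic_year)
print(negative_share)
print(by_topic_age)
```

The same grouping extends to provider, campus, mode of study or commuter status by adding those columns to the groupby keys; once comments carry consistent labels, that is all a like-for-like comparison across Common Aggregation Hierarchy groups requires.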