Partly. Across remote learning comments in the National Student Survey (NSS), tone is net-negative (42.0% positive, 53.8% negative; sentiment index −3.4), while the sector’s mechanical engineering subject grouping is more mixed and marginally positive overall (49.8% positive). Students value collaboration but question assessment fairness and clarity, with marking criteria the sharpest pain point (−46.1). These sector patterns frame the experiences below and explain why our cohort prioritised predictable delivery, transparent assessment briefs and remote‑appropriate access to resources.
How did students navigate remote learning?
A discipline that formerly relied on labs and group projects moved rapidly to digital delivery. Live sessions and recorded lectures became standard, but students asked for a consistent weekly rhythm, stable links and signposted tasks to keep workloads manageable across modules. Providers responded with redesigned teaching, more interactive sessions and better access to materials. Student surveys drove iterative fixes, such as searchable recordings and concise weekly summaries for those learning asynchronously. The experience underlined that quality depends on reliable delivery mechanics, not only staff expertise, and that quick feedback loops help staff prioritise what to improve next.
Where did course organisation and delivery falter?
Students reported scattered timetables, shifting links and limited access to practical work. For a hands‑on discipline, the loss of labs disrupted learning flow and confidence in applying theory. Programmes that maintain a single source of truth per module, keep a “no surprises” window on timetable changes, and use virtual labs or simulations reduce friction. The student voice points to the same conclusion as sector evidence: stabilise delivery and communication first, then layer in practice‑oriented digital demonstrations to sustain engagement.
How should assessment adapt to remote contexts?
Assessment standards often stayed static while learning moved online, creating a perceived misalignment between how students learned and how they were graded. Students called for criteria written in plain language, exemplars mapped to learning outcomes, and predictable feedback turnaround. Programme teams that pilot open‑book formats, structured project work and continuous assessment aligned to remote learning environments report less confusion and fewer challenges to grades. Sector‑level feedback also shows that marking criteria attract the strongest negativity when they are ambiguous; tightening briefs and rubrics reduces repeat queries and resubmissions.
How did students access software and resources?
Students needed remote access to specialist software, libraries and datasets without heavy setup barriers. Licensing, installation and bandwidth all surfaced as friction points. Programmes improved the experience by providing remote desktops, low‑bandwidth versions, captioned recordings, and a single hub per module for links and updates. A short orientation on “how we work online” and responsive IT support helped students become productive sooner and reduced duplicate requests to academic staff.
What changed in the wider university experience?
The shift online reduced informal contact and made peer support harder to find, even as formal touchpoints continued. Students valued timely office hours, written follow‑ups to critical announcements and time‑zone‑aware options for international peers. While digital platforms kept learning going, the absence of in‑person workshops and labs remained the most acute loss for mechanical engineering. Regular check‑ins on access issues, audio quality and timetable slips, with a visible “what we fixed” update, maintained trust during a volatile period.
What did digital collaboration enable?
Despite constraints, students sustained teamwork through structured online projects, shared milestones and peer review. Access to collaboration software and virtual workshops supported this. Many cohorts reported that defined roles and expectations made group work more equitable online than in ad‑hoc in‑person arrangements. Tutor interactions shifted, but scheduled drop‑ins and clear escalation routes kept guidance available when needed.
Does the balance of cost and value still hold?
Students weighed fees against reduced access to physical facilities. They accepted that remote formats bring flexibility and continuity, but did not see them as full substitutes for practical learning. Providers narrowed the gap with multi‑angle demonstrations, simulation‑based tasks and transparent specifications for remote submissions. Where these measures were systematic and well‑communicated, students recognised the value added; where they were piecemeal, dissatisfaction persisted.
What should providers carry forward?
Prioritise stable delivery mechanics, assessment clarity and remote‑first resources. Protect the practices that worked—predictable schedules, asynchronous parity, structured teamwork, visible support—while re‑centring practical elements through demonstrations and simulations. Align assessment with the modes you teach, and keep the weekly feedback loop open so students can see changes land.
How Student Voice Analytics helps you
See all-comment coverage, sector benchmarks, and governance packs designed to support OfS quality and standards conditions and NSS requirements.