Updated Mar 30, 2026
COVID-19 forced computer science teaching to change overnight, and students are still describing the knock-on effects in their feedback.
Across the sector, student feedback on the COVID-19 topic in the National Student Survey (NSS) shows sustained disruption and a negative tone (12,355 comments; overall sentiment index of −24.0, with terms explained in our student feedback analysis glossary), with younger students accounting for 69.4% of that volume. Within computer science, the overall mood across 2018–2025 skews slightly positive, at 50.1% positive comments, yet assessment clarity remains the weakest area, with marking criteria sentiment around −47.6 (see NSS open-text analysis methodology).
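For readers unfamiliar with net sentiment scores, here is a minimal sketch of one common way such an index is computed: the share of positive comments minus the share of negative comments, on a −100 to +100 scale. This is an illustration only; the actual definition used for the NSS figures above is the one in the glossary, and the comment counts below are invented purely to produce a value near −24, not the real breakdown of the 12,355 comments.

```python
def net_sentiment_index(positive: int, negative: int, neutral: int = 0) -> float:
    """Return a net sentiment index: (% positive - % negative) across all comments.

    A score of +100 means every comment is positive, -100 means every
    comment is negative, and 0 means positives and negatives balance out.
    """
    total = positive + negative + neutral
    if total == 0:
        raise ValueError("no comments to score")
    return round(100 * (positive - negative) / total, 1)

# Invented counts (not the real NSS data): 3,000 positive, 5,964 negative
# and 3,391 neutral comments out of 12,355 give an index of -24.0.
print(net_sentiment_index(3000, 5964, 3391))  # -24.0
```

The same function shows why a cohort can be "50.1% positive" overall yet deeply negative on one theme: the index is computed per topic, so a theme like marking criteria can sit near −47.6 while the subject-wide balance stays just above even.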
The sections below translate those patterns into practical actions for computer science programmes.
How did online learning change?
The shift to online learning necessitated by COVID-19 brought significant change to computer science teaching, prompting staff and students to engage in continuous digital adjustment. This transition was not just about swapping physical classrooms for virtual ones; it required rethinking pedagogic strategies to ensure engagement and learning outcomes were not compromised (see blended learning best practices from the perspective of students). For instance, traditional lectures were transformed into interactive webinars, while practical coding sessions moved to platforms that support real-time collaboration and feedback.
Student surveys and text analysis, central to a strong student voice approach and often supported by text analysis software for education teams, helped gauge the effectiveness of these formats. Computer science students expressed a mixed response: some appreciated the flexibility and on-demand access, while others felt disconnected from their learning community, echoing student feedback on remote learning in computer science that highlights the same trade-offs. The analysis surfaced where to prioritise improvements: more interaction, less isolation, and better support for computer science students, especially for younger cohorts, who tended to be more negative.
Addressing these challenges required a balanced approach. It was not solely about technological solutions; it also meant building deliberate touchpoints for community and support.
What happened to the quality of teaching?
The rapid transition to online learning raised substantive questions about teaching quality for computer science students. Staff adapted their teaching methods for digital platforms, a shift that felt very different from face-to-face interaction. Flexibility in access improved, but without enough practical sessions the depth of understanding and hands-on experience sometimes suffered, which also shaped what students say about computer science course content.
Feedback from students indicated a mixed experience. While some valued the new formats and staff efforts, others missed interaction and on‑the‑spot guidance. Variations in staff digital fluency affected delivery, reinforcing the case for ongoing professional development in digital education tools and techniques, a theme that also comes through in what computer science students say about their teaching staff. The digital divide further complicated the picture, so accessible and inclusive practice mattered even more, especially for diverse cohorts in computer science.
A clear baseline for digital delivery, with support for staff to meet it, helps students feel teaching quality is consistent.
How did safety measures affect learning?
As institutions implemented mask use and social distancing in labs and lecture halls, computer science students experienced restricted or adapted practical work. Some accepted the precautions; others questioned their impact on learning and the clarity of communication. Providers that maintained a single source of truth for course changes and explained what changed and why reduced confusion and helped sustain continuity.
Clear, timely updates and practical alternatives when access is limited help protect learning time.
How did exam formats shift, and what do students want now?
The rapid move to timed online tests reframed assessment. Some students welcomed location independence; others cited technical glitches and the loss of a controlled environment, concerns that overlap with academic integrity in online assessments. Platform reliability and connectivity issues undermined perceptions of fairness and accessibility.
Computer science feedback during this period aligns with longer-term concerns about assessment clarity, especially across the Feedback, Marking criteria and Assessment methods themes used in undergraduate comment analysis. Students call for transparent marking standards and actionable feedback; computer science students' views on marking criteria remain sharply negative, around −47.6 in this subject area. In response, many departments re-evaluated their strategies, piloting open-book exams and project-based assessment. These options value application over recall and can mitigate technical risk, while strengthening alignment between assessment briefs, marking criteria and learning outcomes, which matters all the more where COVID-era changes intensified workload concerns among computer science students.
Whatever the format, clear rubrics, exemplars and a low‑stakes practice run can improve confidence and perceptions of fairness.
How did fees and resources feel during the shift?
The continuation of full fees amid the digital shift prompted debate about value for money in computer science education. Students questioned whether reduced access to libraries, laboratories and other physical assets should be reflected in costs, and whether computer science learning resources still met students' needs and wider university facilities for computer science students remained adequate for long project hours. Universities faced pressure from reduced international enrolment and invested in online platforms and digital infrastructure, arguing these expenses offset any savings from reduced on-campus activity. The challenge remains to balance cost and quality so tuition aligns with what is delivered.
Where possible, be transparent about what students can access and what is being invested in, so expectations match reality.
Did interaction and community suffer?
Reduced social interaction constrained collaborative projects and teamwork. Group work and practical engagement, central to computer science education, proved harder to replicate online, which makes group work assessment best practice even more important. Digital tools enabled baseline communication but could not reproduce spontaneous exchange and real‑time problem‑solving. The effects extended beyond project outcomes to students’ sense of belonging, a theme that also appears in computer science students’ views on university life and in emotional engagement in online forums. Structured online collaboration spaces and well‑scaffolded tasks help to rebuild cohort connection.
Smaller groups, clear roles and structured collaboration tasks can make community easier to sustain online.
What technical barriers did students report?
Frequent technical issues during online lectures, such as poor microphone quality and video lag, disrupted learning. For complex topics, audio problems made explanations hard to follow, and lag impaired live coding demonstrations. This underlined the need for reliable infrastructure, responsive IT support, and staff readiness to use tools effectively.
Audio quality is often the quickest win; fixing it can prevent a lot of avoidable frustration.
How did support and communication change?
Delayed feedback and slow replies from staff caused frustration. In a period of heightened uncertainty, timely answers and detailed feedback matter for navigating complex programming problems, while clearer career guidance for computer science students became more important as placements, internships and employer contact grew less predictable. Institutions that set predictable feedback turnaround in computer science modules, used short weekly updates, and signposted a single point of contact for course changes saw fewer escalations, which matches wider computer science student feedback on communication with academic staff. Clearer communication norms, supported by digital literacy, helped students feel integrated and supported.
How Student Voice Analytics helps you
Want to spot these issues quickly and track whether fixes are working? Student Voice Analytics can help.
Request a walkthrough
See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.
© Student Voice Systems Limited, All rights reserved.