How did COVID-19 affect computer science students’ learning and assessment?

By Student Voice Analytics
COVID-19, computer science

Across the sector, student feedback on the COVID-19 topic in the National Student Survey (NSS) shows sustained disruption and a negative tone: 12,355 comments, an overall sentiment index of −24.0, and younger students accounting for 69.4% of that volume. Within Computer Science, the overall mood across 2018–2025 skews slightly positive, with 50.1% of comments coded positive, yet assessment clarity remains the weakest area: marking criteria sentiment sits around −47.6. These patterns shape the experiences described here and point to practical fixes for computer science programmes.
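
The exact methodology behind these figures is not described in this post, so the sketch below is illustrative only: it assumes the sentiment index is a net score (the share of positive comments minus the share of negative comments, scaled to ±100) and uses hypothetical column names.

```python
# Illustrative sketch only: a net-sentiment index of the form
# (positive share - negative share) * 100, ranging from -100 to +100.
# The "sentiment" labels and this definition are assumptions, not the
# methodology behind the figures quoted above.
import pandas as pd

def net_sentiment_index(comments: pd.DataFrame) -> float:
    """Return (positive share - negative share) * 100 for a set of comments."""
    shares = comments["sentiment"].value_counts(normalize=True)
    return 100 * (shares.get("positive", 0.0) - shares.get("negative", 0.0))

# Toy data: 3 negative, 1 positive, 1 neutral comment -> index of -40.0.
toy = pd.DataFrame({"sentiment": ["negative", "negative", "negative", "positive", "neutral"]})
print(round(net_sentiment_index(toy), 1))  # -40.0
```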

How did online learning change?

The shift to online learning necessitated by COVID-19 changed the computer science curriculum significantly, prompting staff and students to adjust continuously to digital delivery. The transition was not just a matter of swapping physical classrooms for virtual ones: it involved rethinking pedagogic strategies so that engagement and learning outcomes were not compromised. Traditional lectures became interactive webinars, while practical coding sessions required online platforms that allowed real-time collaboration and feedback.

Student surveys and text analysis proved useful for gauging the effectiveness of these formats. They showed where to prioritise improvements, chiefly enhancing interaction and reducing isolation, especially for younger cohorts, who tended to be more negative. Computer science students gave mixed responses: some appreciated the flexibility and on-demand access, while others felt disconnected from their learning community, which affected motivation and engagement.

Addressing these challenges required a balanced approach, not solely focusing on technological solutions but also fostering a strong sense of community and support among students.

What happened to the quality of teaching?

The quick transition to online learning raised substantive questions about teaching quality for computer science students. Staff adapted their teaching methods to digital platforms, which lack the immediacy of face-to-face interaction. Flexibility of access improved, but with practical sessions restricted, depth of understanding and hands-on experience suffered.

Feedback from students indicated a mixed experience. While some valued the new formats and staff efforts, others missed interaction and on-the-spot guidance. Variation in staff digital fluency affected delivery, reinforcing the case for ongoing professional development in digital education tools and techniques to uphold standards. The digital divide further complicated the picture, underlining the need for accessible and inclusive practice.

How did safety measures affect learning?

As institutions implemented mask use and social distancing in labs and lecture halls, computer science students experienced restricted or adapted practical work. Some accepted the precautions; others questioned their impact on learning and the clarity of communication. Providers that maintained a single source of truth for changes and explained what changed and why reduced confusion and helped sustain continuity.

How did exam formats shift, and what do students want now?

The rapid move to timed online tests reframed assessment. Some students welcomed location independence; others cited technical glitches and the loss of a controlled environment. Platform reliability and connectivity issues undermined perceptions of fairness and accessibility.

Computer science feedback during this period aligns with longer‑run concerns about assessment clarity. Students call for transparent marking standards and actionable feedback, and marking criteria sentiment sits around −47.6 in this subject area. In response, many departments are re‑evaluating their strategies, piloting open‑book exams and project‑based assessment. These options value application over recall and can mitigate technical risk while strengthening alignment between assessment briefs, marking criteria and learning outcomes.

How did fees and resources feel during the shift?

The continuation of full fees amid the digital shift prompted debate. Students questioned whether reduced access to libraries, laboratories and other physical assets should be reflected in costs. Universities faced pressure from reduced international enrolment and invested in robust online platforms and digital infrastructure, arguing these expenses offset any savings from reduced on‑campus activity. The challenge remains to balance cost and quality so tuition aligns with what is delivered.

Did interaction and community suffer?

Reduced social interaction constrained collaborative projects and teamwork. Group work and practical engagement—central to computer science education—proved harder to replicate online. Digital tools enabled baseline communication but could not reproduce spontaneous exchange and real‑time problem‑solving. The effects extended beyond project outcomes to students’ sense of belonging. Structured online collaboration spaces and well‑scaffolded tasks help to rebuild cohort connection.

What technical barriers did students report?

Frequent technical issues during online lectures, such as poor microphone quality and video lag, disrupted learning. For complex topics, audio problems made explanations hard to follow and lag impaired live coding demonstrations. This underlined the need for reliable infrastructure and responsive IT support, alongside staff readiness to use tools effectively.

How did support and communication change?

Delayed feedback and slow replies from staff caused frustration. In a period of heightened uncertainty, timely answers and detailed feedback matter when students are navigating complex programming problems. Institutions that set predictable feedback turnaround times, sent short weekly updates and signposted a single point of contact for course changes saw fewer escalations. Investment in digital literacy and clear communication protocols improved students' sense of integration and support.

How Student Voice Analytics helps you

  • Track COVID-19 topic volume and sentiment over time, then drill from institution to school/department, cohort and site for Computer Science (a minimal sketch of this kind of roll-up follows this list).
  • Compare like‑for‑like across CAH groups and demographics (age, mode, disability), and segment by campus or provider to target younger full‑time cohorts where sentiment is lowest.
  • Generate concise, anonymised summaries and export tables and figures for rapid briefing to programme and quality teams.
  • Evidence progress on assessment clarity by monitoring Feedback, Marking criteria and Assessment methods topics, and by surfacing representative comments to guide rubric, exemplar and turnaround improvements.
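
As a rough illustration of the first bullet above, the sketch below aggregates hypothetical comment records into quarterly topic volume and net sentiment split by age band. The column names, the sentiment coding and the net-sentiment definition are assumptions for illustration, not the Student Voice Analytics schema or method.

```python
# Sketch: quarterly volume and net sentiment for one topic, split by age band.
# Schema ("date", "topic", "sentiment", "age_band") and the +1/0/-1 sentiment
# coding are assumptions for illustration only.
import pandas as pd

def topic_trend(comments: pd.DataFrame, topic: str) -> pd.DataFrame:
    """Quarterly comment volume and net sentiment (scaled to +/-100) for a topic."""
    df = comments[comments["topic"] == topic].copy()
    df["quarter"] = pd.to_datetime(df["date"]).dt.to_period("Q")
    df["score"] = df["sentiment"].map({"positive": 1, "neutral": 0, "negative": -1})
    return (
        df.groupby(["quarter", "age_band"])
          .agg(volume=("score", "size"),
               net_sentiment=("score", lambda s: 100 * s.mean()))
          .reset_index()
    )

# Example with toy data spanning two quarters.
toy = pd.DataFrame({
    "date": ["2020-04-10", "2020-05-02", "2020-07-15", "2020-08-01"],
    "topic": ["COVID-19"] * 4,
    "sentiment": ["negative", "negative", "positive", "neutral"],
    "age_band": ["under 21", "under 21", "21 and over", "under 21"],
})
print(topic_trend(toy, "COVID-19"))
```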

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and standards and NSS requirements.
