What does student feedback say about remote learning in computer science?

By Student Voice Analytics
remote learning · computer science

Student feedback shows that remote delivery for computer science is workable but uneven: in the National Student Survey (NSS), the remote learning category carries a sentiment index of −3.4, with full‑time cohorts at −11.2 compared with +6.5 among part‑time students. Within Computer Science in the Common Aggregation Hierarchy (CAH), assessment clarity dominates concerns: Feedback accounts for 8.5% of comments and Marking criteria sentiment sits at −47.6. These are sector‑level lenses: the category view aggregates experience across providers, while the discipline view focuses on subject‑specific delivery and assessment. Together they frame the issues below and explain why engagement, tooling and assessment practice warrant particular attention.
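As a point of reference, one common way to construct such an index is the percentage‑point gap between positive and negative comments, with topic share measured as the proportion of comments tagged to a theme. The Python sketch below illustrates that reading only; the formula, field names and sample records are assumptions for illustration, not the methodology behind the figures above.

```python
from collections import Counter

def net_sentiment_index(labels):
    """Percentage-point gap between positive and negative comments.

    `labels` is an iterable of 'positive' / 'neutral' / 'negative' tags.
    The formula (% positive minus % negative) is an assumption for
    illustration: a value of -3.4 would mean negative comments outweigh
    positive ones by 3.4 points.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return 100.0 * (counts["positive"] - counts["negative"]) / total

def topic_share(comments, topic):
    """Share of all comments tagged with a given theme, e.g. 'Feedback'."""
    if not comments:
        return 0.0
    tagged = sum(1 for c in comments if topic in c["topics"])
    return 100.0 * tagged / len(comments)

# Hypothetical records for illustration; no real NSS comments are shown.
comments = [
    {"sentiment": "negative", "topics": {"Feedback", "Marking criteria"}},
    {"sentiment": "positive", "topics": {"Remote learning"}},
    {"sentiment": "neutral",  "topics": {"Feedback"}},
]
print(net_sentiment_index(c["sentiment"] for c in comments))
print(topic_share(comments, "Feedback"))  # 2 of 3 comments mention Feedback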

Examining the dimensions and impacts of remote learning on computer science students during the pandemic reveals a mixed picture. The move to digital classrooms changes how staff deliver content and how students engage with complex computational concepts away from campus labs. Drawing on student surveys and text analysis of open comments, this discussion focuses on how the student voice shapes adjustments to software access, lab experience, assessment integrity and communication.

What challenges does remote learning create for engagement in computer science?

Maintaining engagement in practical work is difficult when virtual labs cannot fully replicate the hands‑on experience of campus facilities. Plagiarism attracts greater attention as assessments move online and academic integrity procedures adapt. International students face additional language barriers, compounded by fewer opportunities for real‑time clarification. Younger cohorts tend to report a more negative experience, so providers should prioritise predictable routines, explicit activity signposting and responsive support for these groups to ensure equitable access.

Where do software and tooling constraints limit learning?

Students frequently struggle to run required tooling, such as specific Ubuntu versions, on personal devices with varying specifications. Without campus labs, compatibility and bandwidth issues impede participation. Staff should provide remote‑first materials, including captioned recordings, transcripts and low‑bandwidth options, and maintain a single, stable link hub per module. Regularly audit core software stacks and offer containerised or remote‑desktop alternatives so all students can complete tasks reliably.
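A lightweight way to operationalise such an audit is a self‑check script that students run before the first lab. The Python sketch below is a minimal illustration: the tool list and commands are hypothetical placeholders for whatever stack a module actually requires.

```python
import shutil
import subprocess
import sys

# Hypothetical requirements; substitute the real stack for your module.
REQUIRED_TOOLS = {
    "python3": ["python3", "--version"],
    "gcc": ["gcc", "--version"],
    "git": ["git", "--version"],
    "docker": ["docker", "--version"],
}

def check_environment():
    """Print a per-tool report and return the names of any missing tools."""
    missing = []
    for name, version_cmd in REQUIRED_TOOLS.items():
        if shutil.which(version_cmd[0]) is None:
            missing.append(name)
            print(f"[MISSING] {name}")
            continue
        result = subprocess.run(version_cmd, capture_output=True, text=True)
        lines = (result.stdout or result.stderr).splitlines()
        print(f"[OK]      {name}: {lines[0] if lines else 'version unknown'}")
    return missing

if __name__ == "__main__":
    # Non-zero exit code so the check can be wired into onboarding tasks.
    sys.exit(1 if check_environment() else 0)
```

Students whose machines fail the check can then be pointed straight to the containerised image or remote‑desktop service rather than debugging local installs.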

How does remote assessment affect integrity and wellbeing?

The transition to online exams exposes variation in standards and conditions. Proctoring software can deter misconduct but may heighten stress and disadvantage students without suitable space or connectivity. Given discipline‑level findings that Feedback accounts for 8.5% of comments and Marking criteria sentiment sits at −47.6, programmes should publish annotated exemplars, concise rubrics and feed‑forward guidance, and ensure consistent communication about expectations. Co‑designing mitigations with students can reduce anxiety and improve perceived fairness.

How can providers equalise access to labs and specialist resources remotely?

High‑performance hardware and specialist environments underpin much of the computer science curriculum. Virtual labs and remote desktops help, but the experience varies with home connectivity and devices. Establish robust support services, including responsive online help desks and extended lab windows, and run short “getting set up online” orientations so students can test access early. Where possible, provide asynchronous parity for lab content through recordings and short written digests of key steps.

Does course content align with remote‑first delivery?

Fundamentals remain essential, but heavy reliance on generic asynchronous materials leaves gaps for complex topics that benefit from interaction. Students report a mismatch between content and practical skill‑building when live guidance and iterative feedback are limited. Programme teams should integrate more structured live coding, rapid Q&A loops and small‑group problem‑solving, alongside timely summaries and searchable recordings so students can revisit critical concepts.

Which lecture formats best support computational learning online?

Pre‑recorded lectures allow students to pause and revisit difficult material, while live sessions support immediate clarification and a sense of cohort. A blended model performs best when providers keep a consistent weekly rhythm, minimise platform switching and publish session artefacts promptly. Tracking friction points such as access, audio and timetable slips, then closing the loop with quick updates on fixes, sustains engagement.

How should we structure communications so students can act on them?

When staff and students are dispersed, communication must be predictable and concise. Students respond well to a single source of truth for changes, short weekly updates on what changed and why, and time‑zone‑aware office hours with written follow‑ups to critical announcements. Improving the cadence and clarity of updates also addresses recurring concerns about organisation, timetabling and student voice.

What should providers prioritise next?

Computer science thrives when delivery is stable and assessment expectations are unambiguous. Focus on consistent online rhythms for full‑time and younger cohorts, invest in remote‑first resources and virtual lab capacity, and make marking criteria and feedback practices transparent. Balance technological infrastructure with accessible human support so students can participate fully and demonstrate their learning under fair conditions.

How Student Voice Analytics helps you

  • Track topic volume and sentiment over time for remote learning and computer science, from provider level to school and cohort.
  • Slice by mode, age, domicile and CAH subject groups to see where full‑time and younger cohorts need targeted interventions.
  • Produce concise, anonymised summaries for programme teams and governance, with export‑ready tables and charts for briefing.
  • Evidence progress with like‑for‑like comparisons to the sector and rapid feedback loops on access, delivery and assessment changes.

Request a walkthrough

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready governance packs.
  • Benchmarks and BI-ready exports for boards and Senate.
