How well do CS students communicate with academic staff?

Updated Mar 18, 2026

communication with supervisor, lecturer, tutor · computer science

When communication breaks down in computer science, confusion about assessments, coding work, and support escalates quickly. NSS open-text comments in the "communication with supervisor, lecturer, tutor" theme are broadly positive overall, with 6,373 comments and a sentiment index of +5.5, but students still point to avoidable friction when guidance is unclear or hard to access.

Within computing, sentiment on this theme is slightly higher at +7.1. Yet comments linked to computer science through the sector's Common Aggregation Hierarchy show that students care as much about clear expectations as about responsive contact: the "feedback" topic accounts for 8.5% of comments, and sentiment on computer science marking criteria sits at -47.6. That makes predictable availability, actionable guidance, and follow-up that closes the loop more important than simply sending more messages.

What should supervisor engagement look like?

Supervisor engagement should reduce uncertainty, not add to it. Accessible supervisors who give timely, relevant feedback and use meetings to promote critical thinking and real-world application help students keep moving when modules get complex. Combine scheduled face-to-face time for complex issues with digital check-ins for quick clarifications. Publish office hours, set a programme-level reply-within-X-working-days norm, and name back-up contacts during leave. Short, proactive check-ins at assessment pinch points particularly help disabled and mature students, who often face higher barriers to contact.

What does lecturer accessibility require?

Lecturer accessibility works best when students know where to ask, when they will hear back, and what happens next. Students value lecturers who are responsive and willing to discuss intricate theories and fast-changing technologies. Effective practice blends virtual office hours, discussion forums, and brief recorded updates with in-person opportunities for deeper exploration. Standardise expectations for response times, and route different query types to the right place, for example VLE forums for routine questions and email for personal matters. Close the loop by summarising actions and common Q&A on the VLE so cohorts can self-serve and staff can spot recurring issues quickly enough to adjust teaching, mirroring the single-source course communication practices computer science students ask for.

How do tutors best support problem‑solving?

Tutors are most valuable when they turn stuck moments into progress without removing the challenge. They bridge theory and practice by guiding students through coding problems while still building independent problem-solving. Calibrate support: use worked examples and targeted hints where needed, and step back to encourage students to test, debug, and reason. Offer approachable drop-ins and small-group clinics, vary methods for different learning preferences, and escalate patterns of misunderstanding to module leaders so assessment briefs and marking criteria are clarified early. Tracking missed responses and follow-ups helps ensure no student stalls between sessions.

Which communication channels work best?

The best channel mix saves time and reduces duplication because each route has a clear purpose. Keep email for formal, documented exchanges; run VLE forums for routine technical questions; provide virtual office hours for real-time help; and retain face-to-face sessions for complex topics and assessment discussions. Maintain a single source of truth on the VLE where decisions, deadlines, and changes are summarised after meetings. For time-poor cohorts such as part-time learners and those on placements or apprenticeships, provide predictable, asynchronous updates and some out-of-hours slots.

What feedback on assignments do computer science students need?

Assignment feedback matters most when students can use it straight away. Given the persistent concerns in computer science about assessment clarity and usable feedback, prioritise annotated exemplars, checklist-style rubrics, and explicit marking criteria alongside realistic turnaround times and feed-forward guidance. Feedback should highlight strengths as well as targeted improvements, mapping comments to the marking criteria students will meet again in later modules. Where feasible, run brief feedback surgeries so students can query points before the next submission.

How do we balance autonomous learning with guidance?

Students handle independent learning better when expectations and support routes are explicit. Define what autonomy looks like in each module, give choice within assessment briefs, and make support routes transparent, building on what students say about support that works best in computer science. Use midpoint check-ins and light-touch pulse surveys to surface where students need more guidance, then adjust seminars, labs, or resources accordingly. This adaptive approach preserves exploration while keeping students aligned to programme outcomes.

What should we improve next?

The next gains will come from making communication standards visible and consistent across the programme.

  • Set programme-wide service standards for academic communication: channels by query type, a simple reply-within-X-working-days norm, visible office hours, and named back-ups.
  • Fit communication to time-poor cohorts with weekly digests, recorded briefings, and some out-of-hours options; summarise key decisions in one VLE location.
  • Reduce barriers for disabled and mature students by offering alternative modes, including captioned recordings and written summaries, and confirming adjustments in writing; schedule short proactive check-ins at key assessment points.
  • Stabilise delivery and assessment clarity: name an owner for scheduling and course communications, and publish exemplars, rubrics, and turnaround expectations to address weak sentiment on feedback and marking criteria.
  • Measure and learn fast: track response-time compliance and common communication issues by cohort; review at programme meetings and act within the next teaching block.

How Student Voice Analytics helps you

If you want to spot these patterns before they harden into poor NSS outcomes, Student Voice Analytics helps you:

  • See topic and sentiment for this communication theme over time, with drill downs by school, programme, cohort, and site.
  • Compare like for like across computer science and other discipline groupings, and by demographics, including age, domicile, mode, disability, and commuter status.
  • Get concise, anonymised summaries that highlight what to fix now and what to scale, with export-ready outputs for programme boards, TEF, and quality reviews.
  • Evidence progress by tracking response-time compliance, issue volumes, and assessment-related sentiment shift across teaching blocks.

See Student Voice Analytics to explore the platform, or read the buyer's guide for a practical checklist when comparing approaches.

