Student Voice Analytics for Computer Science — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Computer Science (CAH11-01-01) students across academic years 2018–2025.
Volume. 9,781 comments; 96.8% categorised to a single primary topic.
Overall mood. 50.1% positive, 46.2% negative, 3.8% neutral (positive:negative ≈ 1.09:1).

What students are saying

Across the period, students talk most about the substance of the course and how it is assessed. “Type and breadth of course content” is the top topic by volume (≈9.2% share), with a mildly positive tone overall (index ≈ +8.9) but notably below the sector benchmark for the same topic. In contrast, the Assessment and Feedback cluster (Feedback, 8.5%; Marking criteria, 4.7%; Assessment methods, 3.5%) is consistently negative (indices ≈ −27.8, −47.6 and −31.0 respectively), signalling a need for clearer expectations, more actionable feedback, and transparent marking standards.

Students also comment frequently on the human and delivery sides of teaching. “Teaching Staff” features strongly (6.6%) and is on balance positive (index ≈ +18.3) though below sector, while “Delivery of teaching” (5.1%) trends negative (index ≈ −8.3). “Student support” (4.3%) has a near-neutral tone, and “Availability of teaching staff” (2.2%) is more clearly positive, suggesting that access to people is valued when it is predictable.

Operational topics carry less weight here than in many disciplines but still matter: “Organisation & management of course,” “Scheduling/timetabling,” and “Communication about course and teaching” together point to a familiar desire for better planning and clearer updates. Notably, “Workload” (2.4%) is strongly negative, and “Student voice” (2.1%) is also negative—signals that students want both feasible pacing and visible responsiveness. On the resource side, “General facilities” and the “Library” lean positive, with IT‑related comments nearer to neutral but a little better than sector.

Some themes take a smaller share than the sector average, for example “Placements/fieldwork/trips” and “Student support”, while others are more prominent, such as “Module choice/variety” and “Learning resources.” That distribution shows where Computer Science students concentrate their attention across these years: content, assessment clarity, and the reliability of teaching delivery.

Top categories by share (Computer Science vs sector)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Type and breadth of course content | Learning opportunities | 9.2 | 6.9 | +2.2 | +8.9 | −13.7
Feedback | Assessment and feedback | 8.5 | 7.3 | +1.2 | −27.8 | −12.7
Teaching Staff | The teaching on my course | 6.6 | 6.7 | −0.2 | +18.3 | −17.2
Module choice / variety | Learning opportunities | 5.4 | 4.2 | +1.2 | +3.1 | −14.2
Delivery of teaching | The teaching on my course | 5.1 | 5.4 | −0.4 | −8.3 | −17.0
Learning resources | Learning resources | 5.0 | 3.8 | +1.3 | +10.3 | −11.2
Marking criteria | Assessment and feedback | 4.7 | 3.5 | +1.1 | −47.6 | −1.9
Student support | Academic support | 4.3 | 6.2 | −1.9 | +3.6 | −9.6
Organisation & management of course | Organisation and management | 3.7 | 3.3 | +0.4 | −24.6 | −10.6
Assessment methods | Assessment and feedback | 3.5 | 3.0 | +0.5 | −31.0 | −7.2

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Marking criteria | Assessment and feedback | 4.7 | 3.5 | +1.1 | −47.6 | −1.9
Workload | Organisation and management | 2.4 | 1.8 | +0.6 | −43.3 | −3.3
COVID-19 | Others | 2.0 | 3.3 | −1.4 | −41.9 | −9.0
Assessment methods | Assessment and feedback | 3.5 | 3.0 | +0.5 | −31.0 | −7.2
Student voice | Student voice | 2.1 | 1.8 | +0.3 | −30.1 | −10.8
Feedback | Assessment and feedback | 8.5 | 7.3 | +1.2 | −27.8 | −12.7
Organisation & management of course | Organisation and management | 3.7 | 3.3 | +0.4 | −24.6 | −10.6

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Personal development | Learning community | 2.3 | 2.5 | −0.2 | +58.2 | −1.6
General facilities | Learning resources | 2.6 | 1.8 | +0.9 | +37.4 | +13.9
Student life | Learning community | 2.8 | 3.2 | −0.4 | +34.1 | +2.0
Career guidance, support | Learning community | 3.3 | 2.4 | +0.9 | +32.6 | +2.5
Availability of teaching staff | Academic support | 2.2 | 2.1 | +0.1 | +30.1 | −9.2
Teaching Staff | The teaching on my course | 6.6 | 6.7 | −0.2 | +18.3 | −17.2
Learning resources | Learning resources | 5.0 | 3.8 | +1.3 | +10.3 | −11.2

What this means in practice

  • Make assessment clarity the first lever. Publish annotated exemplars, checklist‑style rubrics, and clear marking criteria; timetable realistic feedback turnaround and feed‑forward guidance. These steps directly address the weakest sentiment areas (Feedback, Marking criteria, Assessment methods).

  • Stabilise the delivery rhythm. Reduce “unknowns” by setting and keeping a single source of truth for changes, naming an owner for scheduling/organisation, and providing short weekly updates (“what changed and why”). This tends to lift Organisation, Scheduling and Student voice simultaneously.

  • Strengthen the teaching experience where students already respond well. Keep the strengths of approachable staff and applied delivery, and use structured session signposting to improve perceptions of “Delivery of teaching.” Maintain the positives in facilities and library access; monitor IT access and tooling so it stays at least sector‑standard.

  • Keep the growth agenda visible. Students notice personal development and career support; make progression links explicit in modules and assessments, and signpost employability touchpoints clearly.

Data at a glance (2018–2025)

  • Top topics by share: Type & breadth of course content (≈9.2%), Feedback (≈8.5%), Teaching Staff (≈6.6%), Module choice/variety (≈5.4%), Delivery of teaching (≈5.1%), Learning resources (≈5.0%).
  • Assessment & Feedback cluster (Feedback, Marking criteria, Assessment methods, Dissertation) accounts for ≈17.5% of comments and is the clearest improvement opportunity.
  • Delivery & ops cluster (Placements/fieldwork, Scheduling, Organisation & management, Course communications, Remote learning) is ≈11.7% by share; tone is mixed to negative across scheduling/organisation, with remote learning close to sector.
  • People & growth cluster (Personal Tutor, Student support, Teaching Staff, Availability of teaching staff, Delivery of teaching, Personal development, Student life) accounts for ≈25.0% and is broadly positive, led by Personal development and Student life.
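The cluster figures above are simple sums of per-topic shares. A minimal sketch of that roll-up for the Assessment & Feedback cluster, assuming only the shares quoted in this report (the Dissertation share is not published here, so the value below is a placeholder chosen to illustrate the mechanic, not a reported figure):

```python
# Roll up per-topic comment shares (%) into a cluster total.
# Feedback, Marking criteria and Assessment methods figures come from the
# report above; the Dissertation share is a PLACEHOLDER, not a reported value.
assessment_and_feedback = {
    "Feedback": 8.5,
    "Marking criteria": 4.7,
    "Assessment methods": 3.5,
    "Dissertation": 0.8,  # placeholder for illustration only
}

cluster_share = sum(assessment_and_feedback.values())
print(f"Assessment & Feedback cluster ≈ {cluster_share:.1f}% of comments")
```

With these inputs the total lands on the ≈17.5% cluster share quoted above; the same roll-up applies to the delivery & ops and people & growth clusters.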

How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is summarised as an index ranging from −100 (entirely negative) to +100 (entirely positive); values below zero mean negative comments outnumber positive ones, and indices are averaged at category level.
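The report does not state the exact construction of the index, but a standard net-sentiment measure with this −100 to +100 range is the positive share minus the negative share. A sketch under that assumption (the function and formula are illustrative, not taken from the product):

```python
def sentiment_index(positive: int, negative: int, neutral: int) -> float:
    """Net-sentiment index on a -100..+100 scale.

    Assumed construction: percentage of positive comments minus percentage
    of negative comments; neutral comments dilute both but add nothing.
    """
    total = positive + negative + neutral
    return 100.0 * (positive - negative) / total

# 40 positive, 50 negative and 10 neutral comments -> index of -10.0
print(sentiment_index(40, 50, 10))
```

Under this reading, an index of −47.6 (Marking criteria) means negative comments exceed positive ones by nearly half of all comments in that category.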

How Student Voice Analytics helps you

Student Voice Analytics turns open-text survey comments into clear, prioritised actions by tracking topics and sentiment over time for your discipline and cohorts. It supports whole‑institution views as well as fine‑grained department and school analysis, producing concise, anonymised theme summaries and representative comments for programme teams and external partners.

It also enables like‑for‑like proof of progress, with sector comparisons across CAH codes and by demographics (e.g., year of study, domicile group, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year to focus on the pockets where change will move the dial most. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and track improvement against the sector on a comparable basis.

Insights into specific areas of computer science education