Student Voice Analytics for Software Engineering — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Software Engineering (CAH11-01-04) students across academic years 2018–2025.
Volume. ~1,500 comments; 97.3% successfully categorised to a single primary topic.
Overall mood. 49.1% Positive, 47.5% Negative, 3.4% Neutral (positive:negative ≈ 1.03:1).

What students are saying

Software Engineering students focus most on the mechanics of teaching and on assessment. “Teaching Staff” is the single largest topic (8.8% share) and trends positive overall (sentiment index +13.9), although below the sector’s stronger baseline for the same topic. “Delivery of teaching” is widely discussed (6.2%) but hovers near neutral (−1.5), while “Remote learning” (5.6%) carries a distinctly negative tone (−27.0) and lags sector sentiment.

The assessment picture is clear and consistent: it’s a priority and a pain point. Together, “Feedback” (8.0%), “Marking criteria” (4.6%) and “Assessment methods” (4.3%) account for roughly 17% of all comments and are all negative in tone (−22.3, −50.5 and −34.8 respectively), each sitting below sector sentiment. Themes here typically centre on whether expectations are explicit, criteria are interpretable, and turnaround is timely.

Set against those pressures are steady strengths around people and community. “Student support” (4.0%) is a net positive (+16.9, above sector), and students speak well of “Opportunities to work with other students” (3.6%, +9.1, above sector) and of their broader experience (“Student life” +41.8; “Personal development” +52.0). Resource-related topics are mixed: “General facilities” (+40.2), “Learning resources” (+24.8) and “Library” (+55.9) are positive, while “IT Facilities” is weaker (−19.2). Placements are mentioned far less than in the sector (1.4% vs 3.4%) and carry a positive slant when they do arise.

Operational delivery remains a recurrent friction point: “Organisation, management of course” (2.9%, −22.7), “Scheduling/timetabling” (2.0%, −18.2) and “Student voice” (2.3%, −26.0) indicate that predictability, communications and closing the loop on feedback matter to students’ day‑to‑day experience.

Top categories by share (Software Engineering vs sector):

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Teaching Staff | The teaching on my course | 8.8 | 6.7 | +2.0 | +13.9 | −21.6
Feedback | Assessment and feedback | 8.0 | 7.3 | +0.7 | −22.3 | −7.2
Type and breadth of course content | Learning opportunities | 6.4 | 6.9 | −0.5 | +10.5 | −12.1
Delivery of teaching | The teaching on my course | 6.2 | 5.4 | +0.8 | −1.5 | −10.3
Remote learning | The teaching on my course | 5.6 | 3.5 | +2.1 | −27.0 | −18.0
Marking criteria | Assessment and feedback | 4.6 | 3.5 | +1.0 | −50.5 | −4.8
Module choice / variety | Learning opportunities | 4.6 | 4.2 | +0.4 | +2.3 | −15.1
COVID-19 | Others | 4.5 | 3.3 | +1.1 | −36.2 | −3.2
Assessment methods | Assessment and feedback | 4.3 | 3.0 | +1.3 | −34.8 | −11.1
Student support | Academic support | 4.0 | 6.2 | −2.2 | +16.9 | +3.7

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Marking criteria | Assessment and feedback | 4.6 | 3.5 | +1.0 | −50.5 | −4.8
COVID-19 | Others | 4.5 | 3.3 | +1.1 | −36.2 | −3.2
Assessment methods | Assessment and feedback | 4.3 | 3.0 | +1.3 | −34.8 | −11.1
Remote learning | The teaching on my course | 5.6 | 3.5 | +2.1 | −27.0 | −18.0
Student voice | Student voice | 2.3 | 1.8 | +0.6 | −26.0 | −6.8
Organisation, management of course | Organisation and management | 2.9 | 3.3 | −0.5 | −22.7 | −8.8
Feedback | Assessment and feedback | 8.0 | 7.3 | +0.7 | −22.3 | −7.2

Shares are the proportion of all Software Engineering comments whose primary topic is the category. Sentiment index ranges from −100 (more negative than positive) to +100 (more positive than negative).

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Personal development | Learning community | 2.1 | 2.5 | −0.3 | +52.0 | −7.8
Student life | Learning community | 2.9 | 3.2 | −0.2 | +41.8 | +9.7
General facilities | Learning resources | 3.2 | 1.8 | +1.5 | +40.2 | +16.7
Career guidance, support | Learning community | 2.1 | 2.4 | −0.4 | +40.1 | +10.0
Learning resources | Learning resources | 3.8 | 3.8 | +0.1 | +24.8 | +3.4
Student support | Academic support | 4.0 | 6.2 | −2.2 | +16.9 | +3.7
Teaching Staff | The teaching on my course | 8.8 | 6.7 | +2.0 | +13.9 | −21.6

What this means in practice

  • Make assessment clarity non‑negotiable. Publish annotated exemplars, checklist‑style rubrics and clear grade descriptors; explain how criteria are applied; calibrate marking across modules; and commit to a realistic feedback service level that includes “what to do next” feedforward.

  • Reduce operational friction. Establish a single, reliable source of truth for course communications; name owners for scheduling and organisation; provide timely change rationales; and close the loop on “you said, we did” so Student Voice sentiment improves.

  • Lift the learning experience in mixed modes. For remote or hybrid elements, emphasise structure and expectations, align platforms and timelines, and ensure materials and interactions are easy to find and revisit.

  • Protect and promote people‑centred strengths. Maintain visibility of supportive staff and effective signposting. Given that “Personal Tutor” appears far less often here than in the sector, clarify roles and contact expectations so students know where to go for help.

Data at a glance (2018–2025)

  • Top topics by share: Teaching Staff (≈8.8%), Feedback (≈8.0%), Type & breadth of course content (≈6.4%), Delivery of teaching (≈6.2%), Remote learning (≈5.6%).
  • Delivery & ops cluster (placements, scheduling, organisation, comms, remote): ≈13.6% of all comments, with weaker tone driven by Remote learning (−27.0) and Organisation/management (−22.7).
  • People & growth cluster (personal tutor, student support, teaching staff, availability of staff, delivery of teaching, personal development, student life): ≈26.1% of comments, generally positive in tone.
  • Assessment & feedback topics (feedback, marking criteria, assessment methods, dissertation) together account for ≈17.7% and skew negative.
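The cluster figures above are straightforward roll‑ups of the per‑topic shares. As a minimal sketch of that arithmetic, using the three published assessment topics as illustrative inputs (the “Dissertation” share is not listed above, so it is omitted; the share‑weighted mean for cluster tone is our assumption about how clusters are summarised):

```python
# Illustrative roll-up for the assessment & feedback cluster.
# Values are (share %, sentiment index) taken from the tables above.
topics = {
    "Feedback":           (8.0, -22.3),
    "Marking criteria":   (4.6, -50.5),
    "Assessment methods": (4.3, -34.8),
}

# Cluster share is simply the sum of topic shares.
cluster_share = sum(share for share, _ in topics.values())

# Cluster tone as a share-weighted mean of topic indices (an assumption,
# not a documented Student Voice Analytics formula).
weighted_sentiment = (
    sum(share * idx for share, idx in topics.values()) / cluster_share
)
```

With these inputs the cluster share comes to 16.9%, consistent with the “≈17.7%” figure once the unlisted dissertation topic is included.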

How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
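As a minimal sketch of that calculation, assuming a simple per‑sentence positive/negative/neutral labelling (the actual sentiment model behind these figures is not specified here):

```python
def sentiment_index(sentence_labels):
    """Index in [-100, +100]: the share of positive sentences minus the
    share of negative sentences, scaled to percentage points."""
    pos = sum(1 for label in sentence_labels if label == "positive")
    neg = sum(1 for label in sentence_labels if label == "negative")
    total = len(sentence_labels)
    if total == 0:
        return 0.0
    return 100.0 * (pos - neg) / total

def category_index(comments):
    """Category-level index: the mean of per-comment indices."""
    return sum(sentiment_index(c) for c in comments) / len(comments)
```

So a comment with two positive, one negative and one neutral sentence would score +25, and a category full of such comments would average +25 overall.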

How Student Voice Analytics helps you

Student Voice Analytics converts thousands of open‑text survey responses into clear, prioritised actions. It tracks topics, sentiment and movement by year across the whole institution and at fine‑grained levels (school, department, programme), so teams can focus on high‑impact areas like assessment clarity, delivery/communications and student support.

Most importantly, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year to target interventions where they will move sentiment most. Concise, anonymised theme summaries with representative comments make it straightforward to brief programme teams and external partners without trawling raw text, and export‑ready outputs (reports, decks, dashboards) help you share priorities and progress across the institution.
