Student Voice Analytics for Artificial Intelligence — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Artificial Intelligence (CAH11-01-05) students across academic years 2018–2025.
Volume. ~287 comments; 97% successfully categorised to a single primary topic.
Overall mood. Roughly 50.2% Positive, 46.6% Negative, 3.2% Neutral (positive:negative ≈ 1.08:1).
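
As a quick sanity check, the headline ratio follows directly from the published percentages. A minimal sketch in Python (the underlying per-comment counts are not reproduced here, so this works from the rounded figures):

```python
# Recompute the positive:negative ratio from the published (rounded) percentages.
positive_pct, negative_pct, neutral_pct = 50.2, 46.6, 3.2

# The three shares should account for (essentially) all comments.
assert abs(positive_pct + negative_pct + neutral_pct - 100.0) < 0.1

ratio = positive_pct / negative_pct
print(f"positive:negative ≈ {ratio:.2f}:1")  # -> ≈ 1.08:1
```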

What students are saying

Artificial Intelligence students place a clear emphasis on assessment clarity and fairness. Feedback is the single largest topic (≈9.0% share) and is strongly negative (index around −44.8), sitting well below the sector tone for the same category. Comments about Marking criteria (≈7.5% share) are even more negative (≈−54.0). Assessment methods also lean negative. The through-line is consistent: students want transparent criteria, exemplars that show “what good looks like”, and feedback that is timely and actionable.

Set against these pressures are notable positives in teaching and curriculum. Teaching Staff are well regarded (index ≈+34.3), Delivery of teaching trends positive, and students are upbeat about the Type and breadth of course content. Module choice/variety stands out with a very positive tone (≈+58.9) and above-sector performance.

Support structures appear more mixed. Student support and Personal Tutor are discussed at volumes similar to the sector but skew negative and sit below sector tone, suggesting variability in availability, outreach and follow‑through. Operational categories such as Organisation and management of course and Workload also carry negative sentiment.

Learning resources are generally a bright spot: General facilities and Learning resources trend positive and above sector. Smaller-volume topics such as Library and Study Space, while limited in share, are very positive.

External factors show up clearly. Strike Action (≈4.3%) is strongly negative; COVID‑19 appears less frequently and is negative but less central to the overall narrative.

Top categories by share (AI vs sector):

| Category | Section | Share % | Sector % | Δ pp | Sentiment index | Δ index vs sector |
|---|---|---|---|---|---|---|
| Feedback | Assessment and feedback | 9.0 | 7.3 | 1.7 | -44.8 | -29.8 |
| Student support | Academic support | 7.5 | 6.2 | 1.3 | -13.5 | -26.7 |
| Marking criteria | Assessment and feedback | 7.5 | 3.5 | 4.0 | -54.0 | -8.3 |
| Teaching Staff | The teaching on my course | 7.5 | 6.7 | 0.8 | +34.3 | -1.3 |
| Delivery of teaching | The teaching on my course | 6.8 | 5.4 | 1.4 | +12.7 | +3.9 |
| Type and breadth of course content | Learning opportunities | 6.8 | 6.9 | -0.1 | +29.5 | +6.9 |
| Module choice / variety | Learning opportunities | 4.7 | 4.2 | 0.5 | +58.9 | +41.5 |
| Strike Action | Others | 4.3 | 1.7 | 2.6 | -53.4 | +9.6 |
| Organisation, management of course | Organisation and management | 3.9 | 3.3 | 0.6 | -28.3 | -14.3 |
| Personal Tutor | Academic support | 3.9 | 3.2 | 0.8 | -20.7 | -39.4 |
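
The two delta columns are read here as simple differences. A minimal sketch, assuming Δ pp is subject share minus sector share and Δ index vs sector is subject index minus sector index (the sector index of −15.0 below is back-calculated from the Feedback row and is illustrative):

```python
# Minimal sketch of how the two delta columns are assumed to be derived:
# Δ pp        = subject share minus sector share (percentage points)
# Δ vs sector = subject sentiment index minus sector sentiment index
from dataclasses import dataclass

@dataclass
class CategoryRow:
    name: str
    share_pct: float         # share of all AI comments
    sector_share_pct: float  # share of all sector comments
    sentiment_idx: float     # AI sentiment index (−100..+100)
    sector_idx: float        # sector sentiment index (−100..+100)

    @property
    def delta_pp(self) -> float:
        return round(self.share_pct - self.sector_share_pct, 1)

    @property
    def delta_vs_sector(self) -> float:
        return round(self.sentiment_idx - self.sector_idx, 1)

# Sector index (-15.0) is back-calculated from the table, not published data.
feedback = CategoryRow("Feedback", 9.0, 7.3, -44.8, -15.0)
print(feedback.delta_pp)         # 1.7
print(feedback.delta_vs_sector)  # -29.8
```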

Most negative categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment index | Δ index vs sector |
|---|---|---|---|---|---|---|
| Marking criteria | Assessment and feedback | 7.5 | 3.5 | 4.0 | -54.0 | -8.3 |
| Strike Action | Others | 4.3 | 1.7 | 2.6 | -53.4 | +9.6 |
| Workload | Organisation and management | 3.2 | 1.8 | 1.4 | -48.1 | -8.1 |
| Feedback | Assessment and feedback | 9.0 | 7.3 | 1.7 | -44.8 | -29.8 |
| Organisation, management of course | Organisation and management | 3.9 | 3.3 | 0.6 | -28.3 | -14.3 |
| Assessment methods | Assessment and feedback | 2.5 | 3.0 | -0.5 | -25.0 | -1.2 |
| Personal Tutor | Academic support | 3.9 | 3.2 | 0.8 | -20.7 | -39.4 |

Shares are the proportion of all Artificial Intelligence comments whose primary topic is the category. Sentiment index ranges from −100 (more negative than positive) to +100 (more positive than negative).
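
At this volume, shares translate into small absolute numbers, which is worth bearing in mind when comparing categories. A rough illustration, assuming the ~287-comment total quoted above:

```python
# Convert a category share into an approximate comment count,
# assuming the ~287-comment volume quoted in the overview.
total_comments = 287
feedback_share = 0.090  # Feedback, the largest topic (≈9.0%)
approx_count = round(total_comments * feedback_share)
print(approx_count)  # ≈ 26 comments
```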

Most positive categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment index | Δ index vs sector |
|---|---|---|---|---|---|---|
| Module choice / variety | Learning opportunities | 4.7 | 4.2 | 0.5 | +58.9 | +41.5 |
| General facilities | Learning resources | 3.9 | 1.8 | 2.2 | +41.2 | +17.8 |
| Teaching Staff | The teaching on my course | 7.5 | 6.7 | 0.8 | +34.3 | -1.3 |
| Career guidance, support | Learning community | 2.5 | 2.4 | 0.1 | +32.5 | +2.4 |
| Learning resources | Learning resources | 3.9 | 3.8 | 0.2 | +30.9 | +9.4 |
| Type and breadth of course content | Learning opportunities | 6.8 | 6.9 | -0.1 | +29.5 | +6.9 |
| Delivery of teaching | The teaching on my course | 6.8 | 5.4 | 1.4 | +12.7 | +3.9 |

What this means in practice

  • Make assessment clarity non‑negotiable. Publish annotated exemplars aligned to the marking criteria; use checklist‑style rubrics; set realistic turnaround times and honour them. Hold routine marker calibration so students see consistency in standards.

  • Stabilise the operational rhythm. Name an owner for course organisation and workload planning; maintain a single, up‑to‑date source of truth for schedules and changes; provide a brief weekly “what changed and why” update to cut ambiguity.

  • Strengthen student support touchpoints. Clarify the Personal Tutor role, establish proactive check‑ins at predictable points in term, and offer clear signposting for academic and wellbeing support so that help feels available and joined‑up.

  • Build on what works. Protect the strengths students value—engaged teaching staff, well‑structured delivery, broad content and module choice—and keep facilities and resources accessible and reliable.

Data at a glance (2018–2025)

  • Top topics by share: Feedback (≈9.0%), Student support (≈7.5%), Marking criteria (≈7.5%), Teaching Staff (≈7.5%), Delivery of teaching (≈6.8%), Type and breadth of course content (≈6.8%).
  • Clusters:
    • Assessment & feedback cluster (Feedback, Marking criteria, Assessment methods, Dissertation): ≈21.2% of all comments; tone is predominantly negative and below sector for the largest items.
    • People & growth cluster (Personal Tutor, Student support, Teaching Staff, Availability of teaching staff, Delivery of teaching, Personal development, Student life): ≈30.4%, with teaching and content positive but support/tutor below sector.
    • Delivery & ops cluster (Organisation & management, Scheduling, Comms about course/teaching, Remote learning, Placements/fieldwork): ≈6.8%, mixed but tending negative for organisation and workload.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
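
The exact scoring formula is not given here, so the sketch below shows one plausible reading: per-sentence labels from a hypothetical classifier, a comment-level index of 100 × (positive − negative) / total sentences, and a simple mean at category level.

```python
# A minimal sketch of the scoring described above, under assumptions the
# report does not spell out. Sentence labels ("pos" | "neg" | "neu") would
# come from a hypothetical sentence-level sentiment classifier.
from statistics import mean

def comment_index(sentence_labels: list[str]) -> float:
    # Index in −100..+100: +100 if all sentences positive, −100 if all negative.
    pos = sentence_labels.count("pos")
    neg = sentence_labels.count("neg")
    return 100.0 * (pos - neg) / len(sentence_labels)

def category_index(comments: list[list[str]]) -> float:
    # Average the per-comment indices across all comments in the category.
    return mean(comment_index(labels) for labels in comments)

# Three comments, pre-labelled sentence by sentence (illustrative data):
comments = [
    ["pos", "pos", "neg"],  # index ≈ +33.3
    ["neg", "neg", "neu"],  # index ≈ −66.7
    ["pos", "neu"],         # index =  +50.0
]
print(f"category index ≈ {category_index(comments):+.1f}")  # ≈ +5.6
```

Other formulations (for example, ignoring neutral sentences in the denominator) would still land in the −100 to +100 range described above.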

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear priorities you can act on. It tracks topics and sentiment year by year for all disciplines, including Artificial Intelligence, so programme teams can focus on high‑impact categories like Feedback, Marking criteria, Organisation and workload, and Teaching quality.

It supports analysis at whole‑institution level as well as fine‑grained department and school views, producing concise, anonymised theme summaries and representative comments for boards, partners and programme teams. Most importantly, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence improvement against the right peer group. You can also segment by site/provider, cohort and year to target interventions where they will move sentiment most. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress across the institution.
