Student Voice Analytics for Statistics — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Statistics (CAH09-01-03) students across academic years 2018–2025.
Volume. ~765 comments; 95.6% successfully categorised to a single primary topic.
Overall mood. Roughly 55.1% Positive, 41.4% Negative, 3.4% Neutral (positive:negative ≈ 1.33:1).

What students are saying

Statistics students talk first about resources and curriculum shape. The single biggest topic is Learning resources (≈12.6% of all comments), with a positive tone (sentiment index +22.4) broadly in line with the sector. Discussion of the Type and breadth of course content is also prominent (≈10.7% share) but more mixed in tone (index +6.8), sitting well below the sector benchmark for the same topic.

Assessment and feedback draws sustained attention. Feedback appears in ~7.0% of comments and is near‑neutral overall (index −0.6), though notably more positive than the sector baseline. By contrast, Assessment methods (index −47.6) and Marking criteria (index −38.5) attract the most negative sentiment within the assessment family, pointing to questions about format, weighting or clarity of expectations.

People‑centred support is a strength. Teaching Staff are rated highly (index +48.1), and Student support is a clear positive (index +29.3). Career guidance/support is especially strong for Statistics (index +57.4). Personal Tutor is mentioned less often and trends only mildly positive (index +4.9), below the sector on tone.

Operationally, this cohort is relatively upbeat. Organisation and management (index +16.5) and Scheduling/timetabling (index +18.8) both score well above sector tone, and Remote learning is also more positive than the wider benchmark. Placements/fieldwork are rarely mentioned in Statistics (≈0.1% vs 3.4% sector), indicating that work‑based learning is not a defining feature of the day‑to‑day experience here.

Top categories by share (statistics vs sector):

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Learning resources | Learning resources | 12.6 | 3.8 | +8.8 | +22.4 | +1.0
Type and breadth of course content | Learning opportunities | 10.7 | 6.9 | +3.7 | +6.8 | −15.8
Feedback | Assessment & feedback | 7.0 | 7.3 | −0.3 | −0.6 | +14.5
Module choice / variety | Learning opportunities | 6.8 | 4.2 | +2.7 | +23.2 | +5.8
Delivery of teaching | The teaching on my course | 6.0 | 5.4 | +0.6 | −5.3 | −14.0
Student support | Academic support | 5.6 | 6.2 | −0.6 | +29.3 | +16.1
Teaching Staff | The teaching on my course | 5.1 | 6.7 | −1.7 | +48.1 | +12.6
Organisation & management of course | Organisation & management | 4.4 | 3.3 | +1.0 | +16.5 | +30.5
Scheduling/timetabling | Organisation & management | 4.0 | 2.9 | +1.1 | +18.8 | +35.4
Remote learning | The teaching on my course | 3.6 | 3.5 | +0.1 | +9.0 | +18.0

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Costs / Value for money | Others | 2.1 | 1.6 | +0.4 | −56.7 | −3.9
Assessment methods | Assessment & feedback | 3.6 | 3.0 | +0.6 | −47.6 | −23.9
Marking criteria | Assessment & feedback | 2.6 | 3.5 | −0.9 | −38.5 | +7.2
IT Facilities | Learning resources | 3.4 | 1.2 | +2.2 | −24.6 | −10.6
Delivery of teaching | The teaching on my course | 6.0 | 5.4 | +0.6 | −5.3 | −14.0
Feedback | Assessment & feedback | 7.0 | 7.3 | −0.3 | −0.6 | +14.5

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Career guidance/support | Learning community | 3.6 | 2.4 | +1.1 | +57.4 | +27.3
Teaching Staff | The teaching on my course | 5.1 | 6.7 | −1.7 | +48.1 | +12.6
Student life | Learning community | 2.3 | 3.2 | −0.8 | +46.8 | +14.7
Student support | Academic support | 5.6 | 6.2 | −0.6 | +29.3 | +16.1
Module choice / variety | Learning opportunities | 6.8 | 4.2 | +2.7 | +23.2 | +5.8
Learning resources | Learning resources | 12.6 | 3.8 | +8.8 | +22.4 | +1.0
Scheduling/timetabling | Organisation & management | 4.0 | 2.9 | +1.1 | +18.8 | +35.4

What this means in practice

  • Put clarity at the heart of assessment. The most negative topics are Assessment methods and Marking criteria. Publish annotated exemplars, checklist‑style rubrics, and clear grade descriptors; explain the rationale for methods and weightings ahead of time; close the loop by mapping feedback to criteria.

  • Keep resource quality high while fixing IT friction. Learning resources are a major, well‑rated feature of the course, but IT Facilities trend negative. A simple “resource reliability” SLA (uptime, access routes, turnaround on issues) and a single, current index of key resources help sustain satisfaction.

  • Separate people from delivery mechanics. Students value staff and general support highly, but Delivery of teaching is softer. Make the mechanics explicit: brief, consistent session outlines; what to do before/after each session; where to find materials; and how each activity connects to assessment.

  • Maintain operational reliability. Organisation and Scheduling are strong relative to sector; keep a single source of truth for changes and a clear owner for timetables. Remote learning is also above sector on tone—carry forward the practices that made it work.

  • Address value‑for‑money perceptions. Where possible, make visible what students receive (contact, support, resources, skills development) and link those benefits to future outcomes. Small wins in transparency matter here.

Data at a glance (2018–2025)

  • Top topics by share: Learning resources (≈12.6%), Type & breadth of course content (≈10.7%), Feedback (≈7.0%), Module choice/variety (≈6.8%), Delivery of teaching (≈6.0%).
  • Delivery & ops cluster (placements, scheduling, organisation, comms, remote): ≈12.5% of all comments.
  • People & growth cluster (personal tutor, student support, teaching staff, delivery of teaching, personal development, student life): ≈23.5% of all comments, generally positive in tone.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
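  • Worked sketch. The share and index calculations above can be illustrated in a few lines of Python. The data here is purely illustrative, and the index formula — 100 × (positive − negative sentences) ÷ total sentences, averaged per category — is one plausible construction that matches the stated −100..+100 range; the exact formula used by Student Voice Analytics is not specified in this report.

```python
# Illustrative reconstruction of topic share and sentiment index.
# All comment data below is made up; the index formula is an assumption.
from collections import Counter

# Each comment: (primary topic, per-sentence sentiment labels).
comments = [
    ("Learning resources", ["pos", "pos", "neg"]),
    ("Learning resources", ["pos", "neu"]),
    ("Feedback", ["neg", "pos"]),
]

# Share: each comment has one primary topic; share is that topic's
# proportion of all comments, expressed as a percentage.
topic_counts = Counter(topic for topic, _ in comments)
share = {t: 100 * n / len(comments) for t, n in topic_counts.items()}

def sentiment_index(sentences):
    """Index in [-100, +100]: balance of positive vs negative sentences."""
    pos = sentences.count("pos")
    neg = sentences.count("neg")
    return 100 * (pos - neg) / len(sentences)

# Category-level index: average the per-comment indices within each topic.
per_topic = {}
for topic, sents in comments:
    per_topic.setdefault(topic, []).append(sentiment_index(sents))
category_index = {t: sum(v) / len(v) for t, v in per_topic.items()}
```

On this toy data, "Learning resources" holds two of three comments (share ≈ 66.7%) and "Feedback" nets to an index of 0, since its one positive and one negative sentence cancel.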

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics and sentiment over time (2018–2025), so you can see where Statistics is improving, where it is static, and where it lags. It supports whole‑institution views as well as fine‑grained analysis by department and school, with concise, anonymised theme summaries and representative comments for programme teams and external stakeholders.

Critically, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year to target interventions where they will shift sentiment most. Export‑ready outputs (for web, decks and dashboards) make it straightforward to share priorities and progress across the institution.

Insights into specific areas of statistics education