Student Voice Analytics for Social Policy — UK student feedback 2018–2025

Scope. UK NSS open‑text comments for Social Policy (CAH15-01-03) students across academic years 2018–2025.
Volume. ~528 comments; 97.3% successfully categorised to a single primary topic.
Overall mood. Approximately 50.7% positive, 46.3% negative and 3.1% neutral (positive:negative ≈ 1.09:1).

What students are saying

The Social Policy comment set is anchored in three areas: assessment clarity, people‑centred support and the mechanics of delivery.

  • Assessment and feedback is front‑of‑mind. Feedback (8.6% share) is notably negative (index −19.7), and Marking criteria (2.1%) is strongly negative (−59.6), both below sector on tone. Students tend to reward quick, actionable feedback and clear standards; they react poorly when criteria feel ambiguous or turnaround is unpredictable. Assessment methods (1.6%) is more balanced overall.

  • People and teaching are an evident strength. Student support is the largest single category (9.7%) and net positive (+13.7). Teaching Staff (6.6%) is also positive but sits well below sector on sentiment, while Delivery of teaching (4.7%) is positive and slightly ahead of sector. Personal Tutor (4.5%) is only marginally positive (+1.6) and well below sector, suggesting inconsistency in the reach or quality of one‑to‑one guidance. Students also speak positively about Learning resources (4.1%) and Student life (3.5%).

  • Delivery and operations attract a smaller but meaningful share. Remote learning (5.3%) trends slightly positive (+3.0) and sits well above sector on tone. Organisation, management of course (2.9%) is negative (−20.8), and Scheduling/timetabling (2.7%) sits close to neutral (−3.9), much less negative than sector. Where Communication about course and teaching appears (1.2%), it runs markedly more positive than sector. Comments about COVID‑19 (5.6%) and Strike Action (2.5%) are, unsurprisingly, negative, with Strike Action the most negative of the higher‑volume categories.

  • Some topics are under‑represented compared with the sector. Placements/fieldwork/trips are rarely mentioned (0.4% vs 3.4% sector), implying they are not a defining feature of the Social Policy experience. Library (2.3%) is a weak spot on tone (−5.0 vs sector +26.7).

Taken together, the “delivery & ops” cluster (remote learning, scheduling, organisation, communications and placements) accounts for around 12.5% of comments, while the “people & growth” cluster (student support, teaching staff, staff availability, delivery of teaching, personal tutor, personal development, student life) is about 32.7%. Students respond best to clear expectations, accessible people and reliable rhythms.
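For transparency, the cluster figures above are simple sums of per‑topic shares. The short Python sketch below reproduces the delivery & ops total from the rounded percentages quoted in this report; the topic names and values are copied from the tables and are illustrative only, not an extract of the underlying pipeline.

```python
# Minimal sketch: a cluster's share is the sum of its topics' shares.
# Values are the rounded percentages quoted above, so totals are approximate.

DELIVERY_AND_OPS = {
    "Remote learning": 5.3,
    "Scheduling/timetabling": 2.7,
    "Organisation, management of course": 2.9,
    "Communication about course and teaching": 1.2,
    "Placements/fieldwork/trips": 0.4,
}

def cluster_share(topic_shares: dict[str, float]) -> float:
    """Sum per-topic shares (percent of all comments) into a cluster share."""
    return round(sum(topic_shares.values()), 1)

print(cluster_share(DELIVERY_AND_OPS))  # 12.5
```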

Top categories by share (Social Policy vs sector):

Category | Section | Share % | Sector share % | Δ pp | Sentiment index | Δ vs sector (pts)
Student support | Academic support | 9.7 | 6.2 | 3.5 | 13.7 | 0.5
Feedback | Assessment & feedback | 8.6 | 7.3 | 1.3 | −19.7 | −4.7
Type and breadth of course content | Learning opportunities | 7.6 | 6.9 | 0.7 | 10.7 | −11.9
Teaching Staff | The teaching on my course | 6.6 | 6.7 | −0.1 | 14.7 | −20.9
COVID-19 | Others | 5.6 | 3.3 | 2.3 | −20.2 | 12.7
Remote learning | The teaching on my course | 5.3 | 3.5 | 1.8 | 3.0 | 12.0
Delivery of teaching | The teaching on my course | 4.7 | 5.4 | −0.8 | 12.8 | 4.0
Personal Tutor | Academic support | 4.5 | 3.2 | 1.3 | 1.6 | −17.1
Module choice / variety | Learning opportunities | 4.5 | 4.2 | 0.3 | 21.9 | 4.5
Learning resources | Learning resources | 4.1 | 3.8 | 0.3 | 20.8 | −0.6

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector share % | Δ pp | Sentiment index | Δ vs sector (pts)
Strike Action | Others | 2.5 | 1.7 | 0.8 | −64.3 | −1.3
Marking criteria | Assessment & feedback | 2.1 | 3.5 | −1.4 | −59.6 | −13.9
Organisation, management of course | Organisation & management | 2.9 | 3.3 | −0.4 | −20.8 | −6.8
COVID-19 | Others | 5.6 | 3.3 | 2.3 | −20.2 | 12.7
Feedback | Assessment & feedback | 8.6 | 7.3 | 1.3 | −19.7 | −4.7
Communication with supervisor, lecturer, tutor | Academic support | 2.1 | 1.7 | 0.4 | −17.9 | −9.8
Library | Learning resources | 2.3 | 1.8 | 0.5 | −5.0 | −31.7

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector share % | Δ pp | Sentiment index | Δ vs sector (pts)
Student life | Learning community | 3.5 | 3.2 | 0.3 | 30.1 | −2.0
Availability of teaching staff | Academic support | 2.3 | 2.1 | 0.2 | 24.5 | −14.8
Module choice / variety | Learning opportunities | 4.5 | 4.2 | 0.3 | 21.9 | 4.5
Learning resources | Learning resources | 4.1 | 3.8 | 0.3 | 20.8 | −0.6
Teaching Staff | The teaching on my course | 6.6 | 6.7 | −0.1 | 14.7 | −20.9
Student support | Academic support | 9.7 | 6.2 | 3.5 | 13.7 | 0.5
Delivery of teaching | The teaching on my course | 4.7 | 5.4 | −0.8 | 12.8 | 4.0

What this means in practice

  • Make assessment clarity non‑negotiable. Publish explicit marking criteria with annotated exemplars; use checklist‑style rubrics; set and communicate realistic feedback SLAs. Where work is graded by multiple markers, show how consistency is ensured. These steps address the clear pain points in Feedback and Marking criteria.

  • Strengthen the everyday touchpoints. Maintain a predictable Personal Tutor cadence (named contact, agreed frequency, clear remit) and ensure staff availability windows are visible. Keep “Delivery of teaching” predictable: signpost learning outcomes, session structure and expectations ahead of time.

  • Tighten operational rhythm and comms. Use a single source of truth for course communications; issue short weekly updates covering “what changed and why”; lock change windows for timetables where feasible. For remote elements, state the ground rules (materials, interaction, recording policy) to preserve the positive tone already present.

  • Resource experience matters. Where library sentiment is weak, check access routes (digital holdings, authentication off‑campus), peak‑time availability, and navigation to subject‑relevant materials. For sector‑wide disruptions (e.g., strikes), explain mitigations and assessment adjustments clearly and early.

Data at a glance (2018–2025)

  • Top topics by share: Student support (≈9.7%), Feedback (≈8.6%), Type & breadth of course content (≈7.6%), Teaching Staff (≈6.6%), COVID‑19 (≈5.6%), Remote learning (≈5.3%).
  • Cluster view:
    • Delivery & ops (placements, scheduling, organisation, comms, remote): ≈12.5% of all comments.
    • People & growth (personal tutor, student support, teaching staff, delivery of teaching, personal development, student life): ≈32.7% of all comments.
  • Notable differences vs sector:
    • Placements/fieldwork/trips are discussed far less than sector (0.4% vs 3.4%).
    • Remote learning is more positive than sector (+12.0 index points).
    • Library sentiment trails sector markedly (−31.7 points).
    • Scheduling is less negative than sector (+12.7 points).
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
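
To make the methodology note concrete, the sketch below shows the share and index arithmetic under one plausible reading: each comment’s index is the gap between its positive and negative sentence proportions, scaled to −100..+100, then averaged across the category’s comments. The exact scoring used by Student Voice Analytics is not specified here, so treat the function names, labels and counts as illustrative assumptions.

```python
# Illustrative sketch of the share and sentiment-index arithmetic described
# above. Assumes the index is the gap between positive and negative sentence
# proportions, scaled to -100..+100; the real pipeline may differ.
from collections import Counter
from statistics import mean

def topic_share(topic_counts: Counter, topic: str) -> float:
    """Share = comments assigned this primary topic / all comments, as a %."""
    total = sum(topic_counts.values())
    return 100 * topic_counts[topic] / total

def comment_index(sentence_labels: list[str]) -> float:
    """Per-comment index in -100..+100 from sentence labels
    ('pos', 'neg', 'neu'): +100 all positive, -100 all negative."""
    pos = sentence_labels.count("pos")
    neg = sentence_labels.count("neg")
    return 100 * (pos - neg) / len(sentence_labels)

def category_index(comments: list[list[str]]) -> float:
    """Category-level index = mean of the per-comment indices."""
    return mean(comment_index(labels) for labels in comments)

# Hypothetical counts for illustration only (not the real topic totals)
counts = Counter({"Student support": 51, "Feedback": 45, "Other": 432})
print(topic_share(counts, "Student support"))              # ≈ 9.7
print(category_index([["pos", "pos", "neg"], ["neg", "neu"]]))  # ≈ -8.3
```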

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics and sentiment over time so you can see what is rising, what is falling and where to intervene across the whole institution and within specific departments, schools and programmes.

It also enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence change against the right peer group. You can segment by site/provider, cohort and year to pinpoint issues, generate concise anonymised summaries for stakeholders and programme teams, and export insights in web, deck or dashboard formats for easy sharing.

Insights into specific areas of social policy education