Student Voice Analytics for History — UK student feedback 2018–2025

Scope. UK NSS open-text comments for History (CAH20-01-01) students across academic years 2018–2025.
Volume. ~10,636 comments; 97.5% successfully categorised to a single primary topic.
Overall mood. 51.9% Positive, 44.7% Negative, 3.5% Neutral (figures rounded, so they may not sum exactly to 100%; positive:negative ≈ 1.16:1).

What students are saying

History students focus first on the shape and quality of the academic experience. The largest topic is Module choice/variety (7.5% share), followed closely by Feedback (7.1%), Teaching Staff (7.0%), and Type and breadth of course content (6.4%). Tone is notably positive around choice and content (sentiment indices +25.2 and +30.7 respectively, both above sector) and strongly positive for Teaching Staff (+41.1). Delivery of teaching also trends positive (+13.4).

Assessment and feedback remains a pressure point. While Feedback carries a negative tone (−11.0), it is less negative than the sector. By contrast, Marking criteria (−46.8) and Assessment methods (−25.9) are more clearly negative, with students signalling uncertainty about expectations and how work is judged.

“Strike Action” is unusually prominent in the History narrative (4.6% share, +2.8 pp vs sector) and very negative (−63.1). COVID-19 and Remote learning also contribute negative tone. Value for money is a smaller but strongly negative theme (−58.3).

Resources are a general strength: Learning resources (+28.4) and the Library (+25.9) are both well regarded, and Availability of teaching staff stands out (+46.7). Personal support topics (Personal Tutor, Student support) appear frequently, though their tone is more mixed relative to sector.

Some areas are less salient than in the wider sector. Placements/fieldwork are rarely mentioned (0.3% share, −3.2 pp vs sector). Student life remains broadly positive but below sector on tone, and topics like Career guidance/support and Opportunities to work with other students underperform relative to sector on sentiment when they are raised.

Top categories by share (History vs sector):

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Module choice / variety | Learning opportunities | 7.5 | 4.2 | +3.3 | +25.2 | +7.8 |
| Feedback | Assessment and feedback | 7.1 | 7.3 | −0.3 | −11.0 | +4.1 |
| Teaching Staff | The teaching on my course | 7.0 | 6.7 | +0.3 | +41.1 | +5.6 |
| Type and breadth of course content | Learning opportunities | 6.4 | 6.9 | −0.5 | +30.7 | +8.1 |
| Student support | Academic support | 5.6 | 6.2 | −0.6 | +6.4 | −6.8 |
| Delivery of teaching | The teaching on my course | 5.5 | 5.4 | 0.0 | +13.4 | +4.7 |
| Strike Action | Others | 4.6 | 1.7 | +2.8 | −63.1 | −0.1 |
| Learning resources | Learning resources | 3.8 | 3.8 | 0.0 | +28.4 | +7.0 |
| Library | Learning resources | 3.8 | 1.8 | +2.0 | +25.9 | −0.8 |
| Marking criteria | Assessment and feedback | 3.7 | 3.5 | +0.1 | −46.8 | −1.1 |
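The Δ pp column is the History share minus the sector share, in percentage points. A minimal sketch of that calculation, using three rows from the table above (the helper function is illustrative, not part of the platform):

```python
# Percentage-point difference between a subject's topic share and the
# sector's share for the same topic. Figures are taken from the table above.
rows = [
    ("Module choice / variety", 7.5, 4.2),
    ("Teaching Staff", 7.0, 6.7),
    ("Library", 3.8, 1.8),
]

def delta_pp(subject_share: float, sector_share: float) -> float:
    """Subject share minus sector share, rounded to one decimal place."""
    return round(subject_share - sector_share, 1)

for name, subject, sector in rows:
    print(f"{name}: {delta_pp(subject, sector):+.1f} pp")
```

Because the published shares are themselves rounded to one decimal, a recomputed delta can occasionally differ from the table by ±0.1 pp.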

Most negative categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Strike Action | Others | 4.6 | 1.7 | +2.8 | −63.1 | −0.1 |
| Costs / Value for money | Others | 2.2 | 1.6 | +0.6 | −58.3 | −5.5 |
| Marking criteria | Assessment and feedback | 3.7 | 3.5 | +0.1 | −46.8 | −1.1 |
| COVID-19 | Others | 3.4 | 3.3 | 0.0 | −38.2 | −5.3 |
| Assessment methods | Assessment and feedback | 2.6 | 3.0 | −0.4 | −25.9 | −2.2 |
| Organisation, management of course | Organisation and management | 2.5 | 3.3 | −0.9 | −21.4 | −7.4 |
| Remote learning | The teaching on my course | 3.5 | 3.5 | 0.0 | −16.9 | −7.9 |

Most positive categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Personal development | Learning community | 2.3 | 2.5 | −0.1 | +60.1 | +0.3 |
| Availability of teaching staff | Academic support | 2.1 | 2.1 | 0.0 | +46.7 | +7.4 |
| Teaching Staff | The teaching on my course | 7.0 | 6.7 | +0.3 | +41.1 | +5.6 |
| Type and breadth of course content | Learning opportunities | 6.4 | 6.9 | −0.5 | +30.7 | +8.1 |
| Learning resources | Learning resources | 3.8 | 3.8 | 0.0 | +28.4 | +7.0 |
| Library | Learning resources | 3.8 | 1.8 | +2.0 | +25.9 | −0.8 |
| Module choice / variety | Learning opportunities | 7.5 | 4.2 | +3.3 | +25.2 | +7.8 |

What this means in practice

  • Make clarity the currency in assessment. Publish annotated exemplars, checklist-style rubrics and plain‑English marking criteria; align feedback to those criteria and commit to realistic turnaround times. This directly addresses the strongest assessment pain points (Marking criteria and Assessment methods) while improving the usefulness of Feedback.

  • Protect the positive academic core. Students respond well to strong teaching and rich module choice. Keep that momentum by maintaining transparent option selection windows, mapping modules to clear learning outcomes, and providing short overviews or sample materials that help students choose well.

  • Acknowledge and mitigate disruption. Where industrial action or legacy COVID-19 impacts are present, be explicit about catch‑up teaching, adjusted assessment loads, and “what changed and why.” A single source of truth for updates reduces confusion and frustration.

  • Tackle perceived value head‑on. Show how contact time, seminar quality, access to staff, rich library resources and skills support add up. Where contact time is a concern in specific areas, be clear about the balance between scheduled teaching and guided independent study, and signpost academic skills support.

  • Strengthen community and progression. Sentiment lags sector for Student life, Career guidance/support and Opportunities to work with other students. Practical steps include timetabled peer‑learning, small‑group seminars, alumni Q&As, employer‑linked briefs and visible careers signposting within core modules.

Data at a glance (2018–2025)

  • Top topics by share are stable: Module choice/variety (≈7.5%), Feedback (≈7.1%), Teaching Staff (≈7.0%), Type and breadth of course content (≈6.4%), Student support (≈5.6%), Delivery of teaching (≈5.5%). Strike Action (≈4.6%) features more than the sector and is strongly negative.
  • Cluster view:
    • Delivery & ops cluster (placements, scheduling, organisation, course comms, remote): ≈9.3% of all comments; tone pulled down by Remote learning and Organisation.
    • People & growth cluster (personal tutor, student support, teaching staff, availability of staff, delivery of teaching, personal development, student life): ≈28.9% of comments, with strong positives for Teaching Staff, Availability of staff and Personal development.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
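One plausible implementation of such an index (the exact scoring behind Student Voice Analytics is not specified here, so the formula below is an assumption for illustration) counts positive and negative sentences per category and scales the balance to ±100:

```python
def sentiment_index(pos: int, neg: int, neu: int = 0) -> float:
    """Hypothetical sentence-level index: the balance of positive vs
    negative sentences, scaled to -100..+100. Neutral sentences dilute
    the score. Mirrors the description above, not the platform's exact
    method."""
    total = pos + neg + neu
    if total == 0:
        return 0.0
    return round(100 * (pos - neg) / total, 1)

# All-positive comments score +100, all-negative -100, an even split 0.
print(sentiment_index(10, 0))    # -> 100.0
print(sentiment_index(0, 10))    # -> -100.0
print(sentiment_index(5, 5))     # -> 0.0
print(sentiment_index(3, 1, 6))  # -> 20.0
```

Category-level figures in the tables above would then be the average of this per-comment index across all comments assigned to that category.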

How Student Voice Analytics helps you

Student Voice Analytics turns open-text survey comments into clear, prioritised actions. It tracks topics and sentiment by year for History (CAH20-01-01) and all other disciplines, so teams can focus on high‑impact categories such as Feedback, Marking criteria, Teaching quality, and key operational themes.

It also lets you prove change on a like‑for‑like basis: benchmark and monitor against sector peers across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). Analyse at whole‑institution level and drill down to faculties, schools and programmes; segment by cohort, site/provider and year of study to target interventions where they will move sentiment most. Concise, anonymised theme summaries and representative comments make it easy to brief programme teams and external or partner stakeholders. Export‑ready outputs (web, deck, dashboard) support straightforward sharing of priorities and progress.

Insights into specific areas of history education