What UK Social Sciences (Non-Specific) Students Say: NSS Feedback Analysis (1,682 Comments, 2018–2025)

Key findings

  • 1,682 comments analysed across UK social sciences (non-specific) programmes (2018–2025); 54% positive overall
  • Type and breadth of course content is the most-discussed topic (7.9% of comments, sentiment index +22.2)
  • Marking criteria is the biggest pain point (sentiment index −36.9, though still +8.8 above the sector average)
  • Personal development is a clear strength (sentiment +55.6)

What students are saying

Students’ comments centre first on the shape and content of the curriculum. “Type and breadth of course content” is the single largest topic by share (~7.9%) and carries a positive tone (index ~+22.2), broadly in line with the sector. Alongside content, there is sustained attention to the mechanics of delivery: “Remote learning” (~7.2%) is widely discussed and leans negative (−10.2), and “Delivery of teaching” (~5.4%) trends slightly negative (−3.8), even as students rate “Teaching Staff” highly (index ~+40.8). In short, the people are praised, while elements of format and experience—especially online—draw more criticism.

Support-focused themes are a clear strength. “Student support” (~7.4%) and “Personal Tutor” (~6.1%) both trend positive and sit above the sector on tone. “Availability of teaching staff” (2.0%) is also warmly viewed. Students also report gains in “Personal development” (3.9%, index ~+55.6), and the wider “Learning resources” offer (6.1%, +26.6) is seen as helpful.

In Assessment & Feedback, patterns split. “Feedback” appears in ~6.6% of comments and is mildly negative (−6.9), though notably less negative than the sector. “Marking criteria” (4.5%) is a clear pain point (−36.9), typically when expectations are unclear. “Assessment methods” (2.8%) also leans negative (−20.7). Clarity and exemplars remain the quickest route to improvement.

Operational themes are mixed but comparatively lighter in volume for this discipline. “Organisation and management of course” (3.4%) sits near neutral (−0.1) and above the sector benchmark on tone, while “Scheduling/timetabling” (2.9%) is positive (+16.8) and well above sector. “Communication about course and teaching” appears less often (1.5%) but is sharply negative (−39.8). Notably, “Placements/fieldwork” barely feature (0.1% vs sector 3.4%), indicating that practice-based logistics are not a dominant part of the student experience in this area.

Top categories by share (Social Sciences vs sector)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Type and breadth of course content | Learning opportunities | 7.9 | 6.9 | +1.0 | +22.2 | −0.4 |
| Student support | Academic support | 7.4 | 6.2 | +1.2 | +26.1 | +12.9 |
| Remote learning | The teaching on my course | 7.2 | 3.5 | +3.7 | −10.2 | −1.1 |
| Feedback | Assessment & feedback | 6.6 | 7.3 | −0.7 | −6.9 | +8.1 |
| Learning resources | Learning resources | 6.1 | 3.8 | +2.4 | +26.6 | +5.2 |
| Personal Tutor | Academic support | 6.1 | 3.2 | +2.9 | +23.7 | +5.0 |
| Delivery of teaching | The teaching on my course | 5.4 | 5.4 | +0.0 | −3.8 | −12.6 |
| Teaching Staff | The teaching on my course | 4.5 | 6.7 | −2.2 | +40.8 | +5.3 |
| Marking criteria | Assessment & feedback | 4.5 | 3.5 | +1.0 | −36.9 | +8.8 |
| Module choice / variety | Learning opportunities | 4.5 | 4.2 | +0.3 | +13.2 | −4.2 |

Most negative categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Marking criteria | Assessment & feedback | 4.5 | 3.5 | +1.0 | −36.9 | +8.8 |
| COVID-19 | Others | 3.6 | 3.3 | +0.2 | −34.9 | −2.0 |
| Assessment methods | Assessment & feedback | 2.8 | 3.0 | −0.1 | −20.7 | +3.1 |
| Remote learning | The teaching on my course | 7.2 | 3.5 | +3.7 | −10.2 | −1.1 |
| Feedback | Assessment & feedback | 6.6 | 7.3 | −0.7 | −6.9 | +8.1 |
| Delivery of teaching | The teaching on my course | 5.4 | 5.4 | +0.0 | −3.8 | −12.6 |
| Organisation and management of course | Organisation & management | 3.4 | 3.3 | +0.1 | −0.1 | +13.9 |

Most positive categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Personal development | Learning community | 3.9 | 2.5 | +1.4 | +55.6 | −4.2 |
| Teaching Staff | The teaching on my course | 4.5 | 6.7 | −2.2 | +40.8 | +5.3 |
| Availability of teaching staff | Academic support | 2.0 | 2.1 | −0.1 | +33.7 | −5.7 |
| Learning resources | Learning resources | 6.1 | 3.8 | +2.4 | +26.6 | +5.2 |
| Student support | Academic support | 7.4 | 6.2 | +1.2 | +26.1 | +12.9 |
| Student life | Learning community | 3.7 | 3.2 | +0.5 | +24.7 | −7.4 |
| Personal Tutor | Academic support | 6.1 | 3.2 | +2.9 | +23.7 | +5.0 |

What this means in practice

  • Make assessment expectations unmissable. Where “Marking criteria” and “Assessment methods” drive negativity, publish annotated exemplars, checklist-style rubrics, and plain‑English marking guides. Agree and communicate realistic turnaround times, then meet them.

  • Stabilise online learning journeys. For “Remote learning” and “Delivery of teaching”, set a single source of truth for materials and updates; use consistent layouts; provide short “what to do this week” summaries; and capture recordings with clear audio and slides.

  • Keep investing in people-centred support. Protect the responsiveness of personal tutors and the broader support offer—students notice and value it. Use these touchpoints to signpost assessment support and study resources.

  • Tighten delivery comms. Even with relatively positive scheduling and organisation, the subset of comments on course communications remains negative. A weekly digest (“what changed and why”) and named ownership for updates reduce confusion quickly.

Data at a glance (2018–2025)

  • Top topics by share: Type & breadth of course content (≈7.9%), Student support (≈7.4%), Remote learning (≈7.2%), Feedback (≈6.6%), Learning resources (≈6.1%), Personal Tutor (≈6.1%), Delivery of teaching (≈5.4%).
  • Cluster view:
    • Delivery & ops (remote learning, scheduling, organisation, comms, placements): ~15.1% of all comments.
    • People & growth (student support, personal tutor, teaching staff, availability of staff, delivery of teaching, personal development, student life): ~33.0%.
    • Assessment & feedback (feedback, marking criteria, assessment methods, dissertation): ~15.0%.
  • Topics under- or over-discussed vs sector: placements/fieldwork appear far less (0.1% vs 3.4%), while learning resources and support topics appear more often than sector averages.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), averaged at category level.
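One common way to construct an index of this shape is the net share of positive over negative comments, scaled to ±100. The exact formula used on this page is not stated, so the sketch below is an assumption, and the label names (`"positive"`, `"negative"`, `"neutral"`) are illustrative:

```python
from collections import Counter

def sentiment_index(labels):
    """Sentiment index from -100 to +100 for one category's comments.

    labels: an iterable of per-comment sentiment labels,
    e.g. "positive", "negative", "neutral".
    Computed as 100 * (positive - negative) / total comments,
    so neutral comments dilute the index toward zero.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return 100.0 * (counts["positive"] - counts["negative"]) / total

# Example: 6 positive, 3 negative, 1 neutral -> 30.0
print(sentiment_index(["positive"] * 6 + ["negative"] * 3 + ["neutral"]))
```

Under this construction, an index of −36.9 (as for marking criteria) means negative comments outnumber positive ones by roughly 37 percentage points of the category's total.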

How Student Voice Analytics helps you

Student Voice Analytics turns open-text survey responses into clear, defensible priorities. It tracks topics and sentiment over time so programme, department and school teams can see which categories move, why, and where to act—across the whole institution or within a specific course.

The platform enables like-for-like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence progress relative to the right peer group. You can segment by site/provider, cohort, and year to pinpoint where interventions will have most impact. Concise, anonymised summaries and representative comments make it easy to brief partners and programme teams, and export‑ready outputs (web, deck, dashboard) support straightforward sharing of priorities and progress.

How to use this data

This page presents sector-level student feedback analysis for social sciences (non-specific), with sentiment benchmarks and topic breakdowns you can reference directly in institutional documents.

Use this for

  • Annual Programme Review (APR) — reference the top-categories table and sentiment benchmarks to contextualise your programme's results against the discipline.
  • TEF and quality enhancement — cite the sentiment index and sector delta columns as evidence of awareness of student priorities relative to the sector.
  • Professional body revalidation — draw on placement, assessment and support data for evidence of responsiveness to student feedback in your discipline.
  • Staff-Student Liaison Committees (SSLCs) — share the key findings and most-negative categories as discussion starters with student representatives.
  • New programme design — use the topic share and sentiment data to anticipate which aspects of the student experience will need proactive attention.

Recommended next steps

  1. Look for repeatability: which themes recur across years and modules?
  2. Check whether issues are structural (resources/staffing) or local (one module/team).
  3. Define what “good” looks like for the subject (examples, rubrics, assessment clarity).
  4. Track movement: do actions reduce volume/negativity for key themes next cycle?
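Step 4 can be operationalised as a simple cycle-on-cycle comparison. In this sketch, the previous-cycle figures for the two example topics come from the tables above, while the current-cycle figures are hypothetical placeholders:

```python
# Previous-cycle figures from this page; current-cycle figures are hypothetical.
prev = {"Marking criteria": {"share": 4.5, "index": -36.9},
        "Remote learning": {"share": 7.2, "index": -10.2}}
curr = {"Marking criteria": {"share": 4.1, "index": -28.0},
        "Remote learning": {"share": 6.5, "index": -6.0}}

def movement(prev, curr):
    """Change per topic between cycles: share in percentage points,
    sentiment as the difference in index values."""
    out = {}
    for topic in prev.keys() & curr.keys():
        out[topic] = {
            "share_pp": round(curr[topic]["share"] - prev[topic]["share"], 1),
            "index_delta": round(curr[topic]["index"] - prev[topic]["index"], 1),
        }
    return out

print(movement(prev, curr))
```

A falling share plus a rising index for a pain-point topic (fewer comments, less negative tone) is the signal that an intervention is landing; a rising share with a flat index suggests the issue is growing despite it.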

Cite this page

Student Voice AI (2025). "Social Sciences (non-specific) student feedback analysis (CAH15-01-01)." Student Voice AI. https://www.studentvoice.ai/cah3/social-sciences-(non-specific)/
