Scope. UK NSS open-text comments for Health Sciences (CAH02-06-01) students across academic years 2018–2025.
Volume. ~3,934 comments; 97.5% successfully categorised to a single primary topic.
Overall mood. 54.6% positive, 42.1% negative, 3.4% neutral (positive:negative ≈ 1.30:1).
Students in Health Sciences discuss a broad mix of learning and delivery topics. The single largest theme is placements and fieldwork (≈7.9% of all comments, about one in thirteen). Sentiment on placements is mildly positive (index ~+7.2) but trails the sector for the same topic. Comments typically weigh the practicalities of site availability, scheduling and clarity of expectations against the value of applied learning.
A second centre of gravity is assessment. Feedback (≈6.2% share) is negative on balance (index ~−11.2) but notably less negative than the sector. Where sentiment dips, students question the usefulness and timeliness of the feedback they receive. Marking criteria (≈3.7%) is the most negative assessment subtopic (index ~−42.8), signalling a need for clearer standards and exemplars. Assessment methods (≈3.0%) also lean negative, though again less so than the sector.
The operational delivery of the programme matters: scheduling/timetabling (≈4.3%) is consistently critical (index ~−16.0, roughly in line with sector). Organisation and management of the course is closer to neutral (index ~−2.2) and is judged more favourably than sector peers. Remote learning (≈4.0%) remains slightly negative (index ~−2.6), but sentiment is better than sector averages. Together with placements and course communications, this “delivery & ops” cluster accounts for just over one‑fifth of all comments.
Set against those frictions are people‑centred strengths. Teaching Staff (index ~+39.9), Personal Tutor support (+29.2), and Student support (+19.6) are all clear positives and sit above sector on tone. Delivery of teaching itself is viewed positively (+18.0, well above sector), and students cite gains in personal development (+58.5) and career guidance (+35.1). Learning resources are also a net positive (+25.3), with the Library particularly well‑rated.
Some topics surface less often here than in the wider sector: module choice/variety, student life and the Library all take a slightly smaller share of comments, suggesting that day-to-day delivery and assessment clarity occupy more mindshare for these students.
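The narrative above quotes topic shares, a sentiment index and sector gaps without defining how the index is calculated. A minimal sketch, assuming the index is net sentiment (the percentage of positive comments minus the percentage of negative ones, on a −100 to +100 scale) and using hypothetical labelled comments; the data and function are illustrative, not the pipeline actually used:

```python
from collections import Counter

# Hypothetical (topic, sentiment) labels; in practice these come from the
# upstream categorisation step described above.
labelled = [
    ("Teaching Staff", "positive"),
    ("Feedback", "negative"),
    ("Feedback", "positive"),
    ("Scheduling/ timetabling", "negative"),
    ("Teaching Staff", "positive"),
    ("Remote learning", "neutral"),
]

def sentiment_index(sentiments):
    """Assumed definition: % positive minus % negative, on a -100..+100 scale."""
    counts = Counter(sentiments)
    total = sum(counts.values())
    return 100 * (counts["positive"] - counts["negative"]) / total

overall = sentiment_index([s for _, s in labelled])
positives = sum(s == "positive" for _, s in labelled)
negatives = sum(s == "negative" for _, s in labelled)
print(f"overall index {overall:+.1f}, positive:negative {positives / negatives:.2f}:1")
```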

Top 10 topics by share of comments:

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Idx Δ vs sector |
---|---|---|---|---|---|---|
Placements/ fieldwork/ trips | Learning opportunities | 7.9 | 3.4 | +4.5 | +7.2 | −4.6 |
Type and breadth of course content | Learning opportunities | 6.4 | 6.9 | −0.5 | +32.6 | +10.0 |
Student support | Academic support | 6.4 | 6.2 | +0.2 | +19.6 | +6.4 |
Feedback | Assessment and feedback | 6.2 | 7.3 | −1.1 | −11.2 | +3.9 |
Teaching Staff | The teaching on my course | 5.7 | 6.7 | −1.0 | +39.9 | +4.4 |
Delivery of teaching | The teaching on my course | 4.8 | 5.4 | −0.7 | +18.0 | +9.3 |
Learning resources | Learning resources | 4.6 | 3.8 | +0.8 | +25.3 | +3.8 |
Scheduling/ timetabling | Organisation and management | 4.3 | 2.9 | +1.4 | −16.0 | +0.5 |
Personal Tutor | Academic support | 4.3 | 3.2 | +1.1 | +29.2 | +10.5 |
Remote learning | The teaching on my course | 4.0 | 3.5 | +0.5 | −2.6 | +6.4 |

Most negative topics by sentiment index:

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Idx Δ vs sector |
---|---|---|---|---|---|---|
Marking criteria | Assessment and feedback | 3.7 | 3.5 | +0.2 | −42.8 | +2.9 |
Workload | Organisation and management | 2.5 | 1.8 | +0.7 | −40.6 | −0.6 |
Assessment methods | Assessment and feedback | 3.0 | 3.0 | +0.0 | −18.6 | +5.1 |
Scheduling/ timetabling | Organisation and management | 4.3 | 2.9 | +1.4 | −16.0 | +0.5 |
COVID-19 | Others | 3.5 | 3.3 | +0.2 | −15.0 | +18.0 |
Feedback | Assessment and feedback | 6.2 | 7.3 | −1.1 | −11.2 | +3.9 |
Communication with supervisor, lecturer, tutor | Academic support | 2.0 | 1.7 | +0.3 | −10.3 | −2.2 |

Most positive topics by sentiment index:

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Idx Δ vs sector |
---|---|---|---|---|---|---|
Personal development | Learning community | 2.6 | 2.5 | +0.1 | +58.5 | −1.3 |
Teaching Staff | The teaching on my course | 5.7 | 6.7 | −1.0 | +39.9 | +4.4 |
Student life | Learning community | 2.4 | 3.2 | −0.8 | +36.7 | +4.6 |
Career guidance, support | Learning community | 2.6 | 2.4 | +0.2 | +35.1 | +5.0 |
Availability of teaching staff | Academic support | 2.5 | 2.1 | +0.4 | +34.5 | −4.8 |
Type and breadth of course content | Learning opportunities | 6.4 | 6.9 | −0.5 | +32.6 | +10.0 |
Personal Tutor | Academic support | 4.3 | 3.2 | +1.1 | +29.2 | +10.5 |
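
The derived columns in the tables above follow directly from the published shares and indices. A short sketch of that arithmetic using a few rows from the tables; pandas is used purely for illustration, and the sector sentiment values are back-calculated from the published deltas rather than taken from a separate source:

```python
import pandas as pd

# A few rows from the tables above; sector figures are treated as given benchmarks.
topics = pd.DataFrame({
    "category": ["Placements/ fieldwork/ trips", "Feedback", "Marking criteria"],
    "share_pct": [7.9, 6.2, 3.7],
    "sector_share_pct": [3.4, 7.3, 3.5],
    "sentiment_idx": [7.2, -11.2, -42.8],
    "sector_sentiment_idx": [11.8, -15.1, -45.7],  # implied by the published deltas
})

# Delta columns: share gap in percentage points and sentiment gap vs sector.
topics["delta_pp"] = (topics["share_pct"] - topics["sector_share_pct"]).round(1)
topics["idx_delta_vs_sector"] = (
    topics["sentiment_idx"] - topics["sector_sentiment_idx"]
).round(1)

# The three table views: largest share, most negative tone, most positive tone.
top_by_share = topics.sort_values("share_pct", ascending=False)
most_negative = topics.sort_values("sentiment_idx")
most_positive = topics.sort_values("sentiment_idx", ascending=False)
print(topics[["category", "delta_pp", "idx_delta_vs_sector"]])
```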
Prioritise clarity in assessment. Publish annotated exemplars, checklist‑style rubrics and clear turnaround expectations. These straightforward moves address the most negative categories—marking criteria, workload and assessment methods—and lift the perceived usefulness of feedback.
Treat the delivery layer as a designed service. Name an owner for timetabling and organisation; set and communicate a “single source of truth” for course changes; and use short weekly updates to explain what changed and why. This stabilises scheduling sentiment and helps remote/online components land well.
Sustain the people‑centred strengths. Keep visibility high for Personal Tutors and Teaching Staff, and invest in structured guidance conversations (careers, progression, confidence). These are already well above sector on tone and are highly valued.
For placements and fieldwork, lock in practical predictability: confirm capacity early, minimise late changes, and be explicit about expectations and support on site. Small improvements here compound across the wider delivery experience.
Student Voice Analytics turns open‑text survey comments into clear, actionable priorities. It tracks topics, sentiment and movement by year so you can see what’s changing across the institution and within specific schools, departments and programmes.
It provides concise, anonymised theme summaries and representative comments for boards, partners and programme teams—so you can brief stakeholders without trawling thousands of responses. Crucially, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence improvement against the right peer group. You can also segment by site/provider, cohort and year to target interventions where they will shift sentiment most. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress.
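To illustrate the kind of segmentation described, here is a minimal sketch in pandas; the column names, codes and figures are hypothetical and do not reflect the product's actual schema or output:

```python
import pandas as pd

# Hypothetical comment-level records; column names are illustrative only.
df = pd.DataFrame({
    "cah_code": ["CAH02-06-01"] * 6,
    "year": [2023, 2023, 2024, 2024, 2025, 2025],
    "mode_of_study": ["Full-time", "Part-time", "Full-time",
                      "Full-time", "Part-time", "Full-time"],
    "sentiment": [1, -1, 1, 0, -1, 1],  # +1 positive, 0 neutral, -1 negative
})

# Like-for-like view: net sentiment (x100) by CAH code, year and mode of study,
# ready to compare against the matching sector benchmark.
segmented = (
    df.groupby(["cah_code", "year", "mode_of_study"])["sentiment"]
      .mean()
      .mul(100)
      .round(1)
      .rename("sentiment_index")
)
print(segmented)
```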