Student Voice Analytics for Biosciences — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Biosciences (CAH03-01-01) students across academic years 2018–2025.
Volume. ~1,168 comments; 98% successfully categorised to a single primary topic.
Overall mood. Sentiment splits 50.4% Positive, 45.7% Negative and 3.9% Neutral (positive:negative ≈ 1.10:1).

What students are saying

Across the period, students concentrate most on Assessment & Feedback. Feedback carries the single largest share (8.0%), followed by Assessment methods (5.2%), Marking criteria (4.8%) and Dissertation (2.8%). Together these account for about one in five comments (~20.8%). Tone is typically negative: Feedback (−22.7), Assessment methods (−21.9) and especially Marking criteria (−52.3). The pattern is familiar: clarity of expectations, exemplars, rubrics and turnaround standards are decisive for sentiment.

By contrast, students are consistently positive about the human and pedagogic core of the course. Teaching Staff (7.2%; +41.0), Delivery of teaching (7.2%; +11.1) and the Type and breadth of course content (5.9%; +35.4) all attract strong approval, and Module choice/variety (4.0%; +40.8) outperforms the sector by a sizeable margin (+23.4 points). Personal Tutor references are less frequent (2.7%) but still positive (+15.9).

Operational topics show a mixed picture. Remote learning (5.9%; −19.4) and the COVID-19 period (5.5%; −36.3) lean negative. Communication about course and teaching (2.1%; −32.8) and Scheduling/timetabling (1.7%; −32.3) also carry negative tone, although overall Organisation and management of the course sits near neutral (2.8%; −3.1). Student support (5.0%) trends slightly negative (−1.7) and sits well below the sector on sentiment (−14.9 points), suggesting room to tighten coordination and signposting.

Two further themes to watch: Placements/fieldwork/trips are a smaller share in this discipline (2.7%) but trend strongly positive (+47.9, well above sector), and Costs/Value for money, while a modest share (1.5%), is very negative (−66.7).

Top categories by share (biosciences vs sector):

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Sentiment Δ vs sector
Feedback | Assessment & feedback | 8.0 | 7.3 | 0.7 | −22.7 | −7.6
Teaching Staff | The teaching on my course | 7.2 | 6.7 | 0.4 | +41.0 | +5.4
Delivery of teaching | The teaching on my course | 7.2 | 5.4 | 1.7 | +11.1 | +2.3
Remote learning | The teaching on my course | 5.9 | 3.5 | 2.5 | −19.4 | −10.4
Type & breadth of course content | Learning opportunities | 5.9 | 6.9 | −1.0 | +35.4 | +12.8
COVID-19 | Others | 5.5 | 3.3 | 2.2 | −36.3 | −3.4
Assessment methods | Assessment & feedback | 5.2 | 3.0 | 2.2 | −21.9 | +1.8
Student support | Academic support | 5.0 | 6.2 | −1.2 | −1.7 | −14.9
Marking criteria | Assessment & feedback | 4.8 | 3.5 | 1.3 | −52.3 | −6.6
Module choice / variety | Learning opportunities | 4.0 | 4.2 | −0.2 | +40.8 | +23.4

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Sentiment Δ vs sector
Marking criteria | Assessment & feedback | 4.8 | 3.5 | 1.3 | −52.3 | −6.6
Workload | Organisation & management | 2.1 | 1.8 | 0.3 | −51.5 | −11.5
Student voice | Student voice | 2.2 | 1.8 | 0.4 | −39.4 | −20.1
COVID-19 | Others | 5.5 | 3.3 | 2.2 | −36.3 | −3.4
Communication about course and teaching | Organisation & management | 2.1 | 1.7 | 0.4 | −32.8 | +3.0
Feedback | Assessment & feedback | 8.0 | 7.3 | 0.7 | −22.7 | −7.6
Assessment methods | Assessment & feedback | 5.2 | 3.0 | 2.2 | −21.9 | +1.8

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector share % | Δ pp | Sentiment idx | Sentiment Δ vs sector
Placements/fieldwork/trips | Learning opportunities | 2.7 | 3.4 | −0.7 | +47.9 | +36.1
Teaching Staff | The teaching on my course | 7.2 | 6.7 | 0.4 | +41.0 | +5.4
Module choice / variety | Learning opportunities | 4.0 | 4.2 | −0.2 | +40.8 | +23.4
Type & breadth of course content | Learning opportunities | 5.9 | 6.9 | −1.0 | +35.4 | +12.8
Career guidance, support | Learning community | 2.1 | 2.4 | −0.3 | +27.1 | −3.0
Personal Tutor | Academic support | 2.7 | 3.2 | −0.5 | +15.9 | −2.8
Delivery of teaching | The teaching on my course | 7.2 | 5.4 | 1.7 | +11.1 | +2.3

What this means in practice

  • Make assessment clarity non‑negotiable. Publish annotated exemplars at each grade band, use checklist‑style rubrics linked to learning outcomes, and set a visible feedback SLA with progress tracking. Where possible, run calibration workshops so staff mark to the same standard and students understand what “good” looks like.

  • Stabilise delivery and communications. Provide a single source of truth for timetables and changes; issue a short weekly “what changed and why” digest; and set a freeze window before assessments. This should lift sentiment in Communication, Scheduling and Student voice, especially where students currently report uncertainty.

  • Keep what works highly visible. Protect contact with Teaching Staff, preserve the structure students value in Delivery of teaching, and continue to curate coherent programme content and module choice—these are reputational strengths that offset operational noise.

  • Address pain points with intent. Remote learning is still a drag on tone: keep layouts consistent across modules, record sessions where appropriate, and ensure parity between remote and in‑person expectations. Monitor workload peaks and bunching across modules, and be transparent about contact vs self‑study hours.

Data at a glance (2018–2025)

  • Top topics by share: Feedback (8.0%), Teaching Staff (7.2%), Delivery of teaching (7.2%), Remote learning (5.9%), Type & breadth of course content (5.9%), COVID‑19 (5.5%).
  • Assessment & feedback cluster (Feedback, Assessment methods, Marking criteria, Dissertation): ~20.8% of all comments; tone generally negative.
  • Delivery & ops cluster (Placements, Scheduling, Organisation & management of course, Communication about course and teaching, Remote learning, Workload): ~17.3% of all comments; mixed tone with remote/scheduling/comms negative.
  • People & growth cluster (Personal Tutor, Student support, Teaching Staff, Availability of teaching staff, Delivery of teaching, Personal development, Student life): ~26.9% of all comments; strong positives around staff and teaching.
  • How to read the numbers. Each comment is assigned one primary topic; a topic’s share is its proportion of all comments. Sentiment is summarised as an index running from −100 (entirely negative) to +100 (entirely positive), then averaged at category level; the sketch below illustrates one plausible construction.
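
For readers who want the mechanics, the following minimal Python sketch shows how per-category share and sentiment figures could be computed from comment-level labels. The report does not specify the exact index construction, so the sketch assumes the common “percentage positive minus percentage negative” definition, and the small DataFrame is purely hypothetical.

```python
import pandas as pd

# Hypothetical comment-level data: one primary topic per comment plus a
# positive / neutral / negative label, mirroring the report's methodology.
comments = pd.DataFrame({
    "category": ["Feedback", "Feedback", "Feedback",
                 "Teaching Staff", "Teaching Staff"],
    "sentiment": ["negative", "positive", "neutral",
                  "positive", "positive"],
})

# Share: each topic's proportion of all comments, as a percentage.
share_pct = comments["category"].value_counts(normalize=True).mul(100).round(1)

def sentiment_index(labels: pd.Series) -> float:
    """Assumed construction: % positive minus % negative. The index runs
    from -100 (all negative) to +100 (all positive); neutral comments
    pull it toward 0."""
    pos = (labels == "positive").mean()
    neg = (labels == "negative").mean()
    return round(100 * (pos - neg), 1)

index_by_category = comments.groupby("category")["sentiment"].apply(sentiment_index)

print(share_pct)          # Feedback 60.0, Teaching Staff 40.0
print(index_by_category)  # Feedback 0.0, Teaching Staff 100.0
```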

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics and sentiment year by year so you can see whether changes in areas like Feedback, Marking criteria, Remote learning and Scheduling are moving in the right direction.

It supports whole‑institution views as well as fine‑grained analysis by faculty, school and department. You get concise, anonymised summaries and representative comments for programme teams and external partners, without trawling thousands of responses.

Most importantly, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence progress against the right peer group. Flexible segmentation (site/provider, cohort, year) and export‑ready outputs (web, deck, dashboard) make it straightforward to brief stakeholders and share progress.

Insights into specific areas of biosciences education