Scope. UK NSS open‑text comments for Economics (CAH15-02-01) students across academic years 2018–2025.
Volume. 9,472 comments; 97% categorised to a single primary topic.
Overall mood. 51.6% Positive, 43.5% Negative, 4.9% Neutral (positive:negative ≈ 1.18:1).
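For concreteness, the headline figures can be reproduced with simple arithmetic. This is an illustrative sketch only: the report does not state its exact sentiment-index formula, so the common net-sentiment convention (% positive minus % negative) is an assumption here.

```python
# Illustrative only: the report's index formula is assumed, not documented.

def net_sentiment(pos_pct: float, neg_pct: float) -> float:
    """Net sentiment on a -100..+100 scale (assumed convention)."""
    return pos_pct - neg_pct

def pos_neg_ratio(pos_pct: float, neg_pct: float) -> float:
    """Ratio of positive to negative comment shares."""
    return pos_pct / neg_pct

pos, neg, neu = 51.6, 43.5, 4.9  # overall Economics mood from the report
print(round(net_sentiment(pos, neg), 1))  # 8.1
print(round(pos_neg_ratio(pos, neg), 2))  # 1.19
```

Note that the ratio recomputed from the rounded shares comes out at ≈1.19; the report's ≈1.18 presumably reflects the unrounded underlying counts.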
The Economics conversation is led by assessment. Close to one in five comments centre on Assessment & Feedback (Feedback ≈9.8% share; Assessment methods ≈3.8%; Marking criteria ≈3.7%; Dissertation ≈1.1%). Tone is notably negative in the areas students rely on for progress: Marking criteria (index −48.1), Assessment methods (−32.5) and Feedback (−21.2). The themes are familiar: clarity of expectations, fairness and consistency across markers and modules, and the usefulness and timeliness of feedback.
By contrast, students value the course design they can see. Two high‑volume categories, Type and breadth of course content (8.3%) and Module choice/variety (8.2%), are clearly positive (indices +23.9 and +22.7 respectively) and are discussed more than in the sector. This suggests students notice choice and coherence and reward them with positive sentiment.
Teaching comes through as a split story: Teaching Staff attract strong positivity (+32.5), while Delivery of teaching is neutral overall (−0.1) and below sector on tone. The signal here is about the mechanics: structure, pacing and the link between sessions and assessed outcomes. Remote learning carries a negative tone (−16.7), reinforcing that delivery format and clarity still matter.
Operational topics are a smaller share of the Economics conversation but remain material. Organisation & management of course sits near neutral (−2.0) and above sector on tone, yet Scheduling/timetabling (−29.2) and Communication about course and teaching (−28.0) trend negative where predictability and a single source of truth are missing.
Elsewhere, Learning resources and the Library are positively rated, and Student life is a net positive though below sector on tone. “Costs / Value for money” appears in a small share of comments but is strongly negative (−55.0). Placements/fieldwork feature only marginally in Economics compared with the sector.
Top 10 categories by share of comments:

Category | Section | Share % | Sector share % | Δ share (pp) | Sentiment index | Index Δ vs sector |
---|---|---|---|---|---|---|
Feedback | Assessment and feedback | 9.8 | 7.3 | 2.5 | −21.2 | −6.2 |
Type and breadth of course content | Learning opportunities | 8.3 | 6.9 | 1.3 | +23.9 | +1.3 |
Module choice / variety | Learning opportunities | 8.2 | 4.2 | 4.1 | +22.7 | +5.4 |
Delivery of teaching | The teaching on my course | 6.6 | 5.4 | 1.1 | −0.1 | −8.9 |
Teaching Staff | The teaching on my course | 6.3 | 6.7 | −0.4 | +32.5 | −3.0 |
Student support | Academic support | 4.7 | 6.2 | −1.5 | +7.6 | −5.6 |
Student life | Learning community | 4.4 | 3.2 | 1.2 | +19.9 | −12.2 |
Assessment methods | Assessment and feedback | 3.8 | 3.0 | 0.8 | −32.5 | −8.7 |
Marking criteria | Assessment and feedback | 3.7 | 3.5 | 0.2 | −48.1 | −2.4 |
Organisation, management of course | Organisation and management | 3.7 | 3.3 | 0.4 | −2.0 | +12.0 |
Most negative categories by sentiment index:

Category | Section | Share % | Sector share % | Δ share (pp) | Sentiment index | Index Δ vs sector |
---|---|---|---|---|---|---|
Marking criteria | Assessment and feedback | 3.7 | 3.5 | 0.2 | −48.1 | −2.4 |
Assessment methods | Assessment and feedback | 3.8 | 3.0 | 0.8 | −32.5 | −8.7 |
COVID-19 | Others | 2.2 | 3.3 | −1.1 | −32.1 | +0.9 |
Feedback | Assessment and feedback | 9.8 | 7.3 | 2.5 | −21.2 | −6.2 |
Remote learning | The teaching on my course | 2.4 | 3.5 | −1.1 | −16.7 | −7.7 |
Organisation, management of course | Organisation and management | 3.7 | 3.3 | 0.4 | −2.0 | +12.0 |
Delivery of teaching | The teaching on my course | 6.6 | 5.4 | 1.1 | −0.1 | −8.9 |
Most positive categories by sentiment index:

Category | Section | Share % | Sector share % | Δ share (pp) | Sentiment index | Index Δ vs sector |
---|---|---|---|---|---|---|
Teaching Staff | The teaching on my course | 6.3 | 6.7 | −0.4 | +32.5 | −3.0 |
Career guidance, support | Learning community | 3.4 | 2.4 | 1.0 | +29.7 | −0.4 |
Library | Learning resources | 2.0 | 1.8 | 0.2 | +25.1 | −1.6 |
Type and breadth of course content | Learning opportunities | 8.3 | 6.9 | 1.3 | +23.9 | +1.3 |
Learning resources | Learning resources | 3.4 | 3.8 | −0.3 | +22.8 | +1.4 |
Module choice / variety | Learning opportunities | 8.2 | 4.2 | 4.1 | +22.7 | +5.4 |
Student life | Learning community | 4.4 | 3.2 | 1.2 | +19.9 | −12.2 |
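The views above can all be derived mechanically from one underlying category table. The sketch below uses a small subset of the report's rows and assumes the column conventions (Δ share = category share minus sector share, in percentage points); deltas recomputed from rounded shares can differ by 0.1 pp from the published figures where the originals used unrounded values.

```python
# Minimal sketch of how the table views are derived; rows are a subset
# of the report's figures, and the delta convention is an assumption.

rows = [
    {"category": "Feedback", "share": 9.8, "sector": 7.3, "index": -21.2},
    {"category": "Marking criteria", "share": 3.7, "sector": 3.5, "index": -48.1},
    {"category": "Teaching Staff", "share": 6.3, "sector": 6.7, "index": 32.5},
    {"category": "Student life", "share": 4.4, "sector": 3.2, "index": 19.9},
]

# Delta in percentage points: category share minus sector share.
for r in rows:
    r["delta_pp"] = round(r["share"] - r["sector"], 1)

# The negative/positive tables are just sorts on the sentiment index.
most_negative = min(rows, key=lambda r: r["index"])["category"]
most_positive = max(rows, key=lambda r: r["index"])["category"]
print(most_negative, most_positive)  # Marking criteria Teaching Staff
```

Sorting the full table ascending by index yields the negative view; descending yields the positive view.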
Make assessment clarity the first lever. Publish annotated exemplars and checklist‑style rubrics; map each assessment to the relevant learning outcomes; and agree a realistic service level for feedback turnaround. Moderation and calibration notes (what “meets” vs “exceeds” looks like) reduce noise on fairness and consistency.
Tune delivery so structure is visible. Set simple, repeatable session templates (learning aims, worked example/application, “how this will be assessed”), and close the loop at the end of sessions with a short “what to do next” guide. Where remote elements persist, prioritise interaction and signposting to avoid the tone seen in remote learning.
Reduce operational friction. Name an owner for timetabling and course communications; publish changes and rationales in one place; and send a weekly “what changed” digest. This stabilises sentiment in scheduling and comms and supports delivery quality.
Protect what works. Keep the breadth and choice that students value, but make pathways explicit (recommended sequences and prerequisites). Maintain visible access to staff—availability scores well—and keep investing in resources and careers guidance where sentiment is strong.
Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics, sentiment and movement by year for every discipline, with roll‑ups for the whole institution and drill‑downs to schools and programmes, so teams can focus on where change will move the needle most (e.g., Assessment & Feedback, Delivery, Organisation, Communications).
It also gives you proof. Like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status) show whether Economics is improving relative to the right peer group, not just the sector average. You can segment by site/provider, cohort and year of study, and produce concise, anonymised summaries for programme teams and external stakeholders. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress across the institution.