Scope. UK NSS open-text comments for Marketing (CAH17-01-03) students across academic years 2018–2025.
Volume. 2,609 comments; 97.6% categorised to a single primary topic.
Overall mood. 53.3% positive, 43.0% negative, 3.7% neutral (positive:negative ≈ 1.24:1).
Marketing students talk first about assessment and teaching. Two topics lead the conversation in equal measure: Feedback (7.9% share) and Teaching Staff (7.9%). Feedback is a consistent pain point (sentiment index −13.7), though slightly less negative than the sector overall. The sharper edge sits with Marking criteria (4.2% share; −52.1), where students signal uncertainty about what “good” looks like. Assessment methods (4.1%; −11.9) draws mixed sentiment but is notably less negative than the sector benchmark.
Set against this, people-centred themes are a strength. Teaching Staff comments are strongly positive (+36.0), and Student support (5.8%; +20.1) also trends well above sector on tone. Delivery of teaching (5.8%) lands close to neutral (+1.5), implying variability in structure or expectations. Students value choice and resources: Module choice/variety (3.8%; +27.9), Learning resources (3.1%; +26.1) and the Library (2.2%; +37.0) are clear plus points.
Career preparation stands out. Career guidance/support carries both an elevated share (4.2%, +1.8 pp vs sector) and a very positive tone (+44.1), suggesting that practical signposting, live briefs and mentoring are resonating. Personal development sentiment is also high (+59.8), albeit on a smaller share of comments (1.9%).
Operational themes are present but not dominant. Remote learning (3.0%; −32.8), Scheduling/timetabling (2.9%; −22.1), and Communication about course and teaching (1.7%; −35.7) are the main friction points. Placements/fieldwork/trips appear less frequently than in the sector (1.7% vs 3.4%) and are relatively positive (+31.3). A small but strongly negative subset concerns Costs/Value for money (1.7%; −58.5).
Topics with the largest share of comments.

Category | Section | Share % | Sector share % | Share Δ (pp) | Sentiment index | Sentiment Δ vs sector
---|---|---|---|---|---|---
Feedback | Assessment and feedback | 7.9 | 7.3 | +0.6 | −13.7 | +1.4 |
Teaching Staff | The teaching on my course | 7.9 | 6.7 | +1.2 | +36.0 | +0.5 |
Type and breadth of course content | Learning opportunities | 7.0 | 6.9 | +0.1 | +20.7 | −1.9 |
Student support | Academic support | 5.8 | 6.2 | −0.4 | +20.1 | +6.9 |
Delivery of teaching | The teaching on my course | 5.8 | 5.4 | +0.3 | +1.5 | −7.3 |
Career guidance, support | Learning community | 4.2 | 2.4 | +1.8 | +44.1 | +14.1 |
Marking criteria | Assessment and feedback | 4.2 | 3.5 | +0.6 | −52.1 | −6.4 |
Assessment methods | Assessment and feedback | 4.1 | 3.0 | +1.1 | −11.9 | +11.8 |
Module choice / variety | Learning opportunities | 3.8 | 4.2 | −0.3 | +27.9 | +10.5 |
Opportunities to work with other students | Learning community | 3.6 | 2.0 | +1.6 | −2.4 | −3.5 |
Topics with the most negative sentiment.

Category | Section | Share % | Sector share % | Share Δ (pp) | Sentiment index | Sentiment Δ vs sector
---|---|---|---|---|---|---
Marking criteria | Assessment and feedback | 4.2 | 3.5 | +0.6 | −52.1 | −6.4 |
COVID-19 | Others | 2.8 | 3.3 | −0.5 | −43.7 | −10.8 |
Remote learning | The teaching on my course | 3.0 | 3.5 | −0.5 | −32.8 | −23.7 |
Scheduling/timetabling | Organisation and management | 2.9 | 2.9 | +0.0 | −22.1 | −5.6
Feedback | Assessment and feedback | 7.9 | 7.3 | +0.6 | −13.7 | +1.4 |
Assessment methods | Assessment and feedback | 4.1 | 3.0 | +1.1 | −11.9 | +11.8 |
Organisation, management of course | Organisation and management | 2.2 | 3.3 | −1.1 | −11.3 | +2.6 |
Topics with the most positive sentiment.

Category | Section | Share % | Sector share % | Share Δ (pp) | Sentiment index | Sentiment Δ vs sector
---|---|---|---|---|---|---
Career guidance, support | Learning community | 4.2 | 2.4 | +1.8 | +44.1 | +14.1 |
Student life | Learning community | 3.1 | 3.2 | −0.1 | +41.1 | +9.0 |
Library | Learning resources | 2.2 | 1.8 | +0.4 | +37.0 | +10.3 |
Teaching Staff | The teaching on my course | 7.9 | 6.7 | +1.2 | +36.0 | +0.5 |
Availability of teaching staff | Academic support | 2.1 | 2.1 | +0.0 | +29.0 | −10.3 |
Module choice / variety | Learning opportunities | 3.8 | 4.2 | −0.3 | +27.9 | +10.5 |
Learning resources | Learning resources | 3.1 | 3.8 | −0.6 | +26.1 | +4.7 |
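For anyone reproducing the comparison columns from raw shares and sentiment scores, a minimal pandas sketch follows. The sector sentiment values are back-derived from the deltas in the tables above purely for illustration, and the column names are assumptions rather than the report's own schema.

```python
import pandas as pd

# Illustrative rows only; sector sentiment values are back-derived from the
# deltas shown in the tables above, not taken from a real sector extract.
df = pd.DataFrame({
    "category": ["Feedback", "Teaching Staff", "Marking criteria"],
    "share_pct": [7.9, 7.9, 4.2],
    "sector_share_pct": [7.3, 6.7, 3.5],
    "sentiment_idx": [-13.7, 36.0, -52.1],
    "sector_sentiment_idx": [-15.1, 35.5, -45.7],
})

# Comparison columns: share gap in percentage points, sentiment gap vs sector.
df["share_delta_pp"] = df["share_pct"] - df["sector_share_pct"]
df["sentiment_delta_vs_sector"] = df["sentiment_idx"] - df["sector_sentiment_idx"]

# The three tables above are the same data ordered three ways.
top_by_share  = df.sort_values("share_pct", ascending=False)
most_negative = df.sort_values("sentiment_idx")
most_positive = df.sort_values("sentiment_idx", ascending=False)

print(top_by_share[["category", "share_delta_pp", "sentiment_delta_vs_sector"]])
```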
Start with assessment clarity. Students are confident about teaching quality but want to see exactly how to succeed. Publish annotated exemplars at multiple grade bands, map criteria to visible “hallmarks of quality,” and use a concise, checklist-style rubric that travels with the work. Calibrate markers with short norming exercises and commit to a realistic feedback SLA, then publish actual turnaround times.
Lean into people-led strengths. Make it easy for students to access the staff they rate highly: clearly signposted office hours, timely replies, and consistent channels for quick questions. Keep the effective elements of career guidance—live briefs, alumni input, employer touchpoints—and make outcomes visible so students understand the value they’re getting.
Tidy up the delivery basics. Set a baseline for digital delivery (platform, structure, materials availability), reduce last‑minute timetable changes, and maintain a single source of truth for course communications with a brief weekly update (“what changed and why”).
Where collaboration is mixed, structure it. For group tasks, use contribution logs, interim check‑ins, and brief peer‑review moments so students experience teamwork as fair and developmental rather than risky.
How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
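As a rough sketch of that arithmetic, assuming each comment arrives already tagged with one primary topic and with its sentences labelled positive/negative/neutral, and assuming the index is (positive − negative) sentences scaled to −100..+100 (the report only states the range):

```python
from collections import Counter

# Hypothetical input: one record per comment, tagged with a primary topic,
# with each sentence labelled "positive", "negative" or "neutral".
comments = [
    {"topic": "Feedback", "sentences": ["negative", "neutral", "negative"]},
    {"topic": "Teaching Staff", "sentences": ["positive", "positive"]},
    {"topic": "Feedback", "sentences": ["positive", "negative"]},
]

# Share %: each topic's proportion of all comments.
topic_counts = Counter(c["topic"] for c in comments)
share_pct = {t: 100 * n / len(comments) for t, n in topic_counts.items()}

# Per-comment sentiment index: (positive - negative) sentences over all
# sentences, scaled to -100..+100. Assumed formula; only the range is stated.
def comment_index(sentences):
    pos = sentences.count("positive")
    neg = sentences.count("negative")
    return 100 * (pos - neg) / len(sentences)

# Category-level index: average of the per-comment indices for that topic.
by_topic = {}
for c in comments:
    by_topic.setdefault(c["topic"], []).append(comment_index(c["sentences"]))
sentiment_index = {t: sum(v) / len(v) for t, v in by_topic.items()}

print(share_pct)        # topic -> share of comments (%)
print(sentiment_index)  # topic -> average sentiment index (-100..+100)
```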
Student Voice Analytics turns open-text survey comments into clear priorities you can act on. It tracks topics, sentiment and movement by year for every discipline, enabling whole‑institution views as well as fine‑grained analysis by school, department and programme.
Most importantly, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence progress against the right peer group. You can segment by site/provider, cohort or year of study, and generate concise, anonymised theme summaries and representative comments for programme teams and external partners. Export‑ready outputs (for web, decks or dashboards) make it straightforward to share priorities and track progress.