Scope. UK NSS open‑text comments for Mechanical Engineering (CAH10-01-02) students across academic years 2018–2025.
Volume. ~4,576 comments; 97.6% successfully categorised to a single primary topic (≈4,464 used).
Overall mood. Roughly 49.8% Positive, 45.9% Negative, 4.3% Neutral (positive:negative ≈ 1.08:1).
Mechanical Engineering students focus first on the substance of their degree: the type and breadth of course content (8.9% share) and how teaching is delivered (7.0%). Content is viewed positively overall (sentiment index +14.7) but sits below the sector tone, while Delivery of teaching lands negative (−11.4), notably under sector. Staff themselves are rated positively (+17.9), suggesting the issues lie more in delivery mechanics (structure, clarity, timing) than in staff commitment or expertise.
Assessment and feedback are prominent and challenging. Together, Feedback (8.3% share, −25.7), Marking criteria (4.1%, −46.1), and Assessment methods (3.0%, −28.1) account for roughly 15% of all comments and trend more negative than sector. The pattern is consistent with students wanting clearer criteria, more actionable feedback, and predictable turnaround.
Operational reliability also matters. Organisation and management (3.9%, −26.5), Scheduling/timetabling (2.1%, −31.9) and Communication about course and teaching (1.3%, −36.1) collectively point to friction around planning and change‑handling. Remote learning (2.8%, −23.0) remains a negative outlier.
Set against this are several bright spots. Opportunities to work with other students (4.3%) are strongly positive (+30.9, well above sector), echoing good experiences of teamwork and peer learning. Career guidance/support (3.1%, +38.0), Availability of teaching staff (2.1%, +37.0), Student life (2.3%, +34.2), Library (1.5%, +33.4) and Learning resources (3.9%, +18.6) also contribute to a positive “people and support” narrative. Placements/fieldwork/trips are far less central here than in the wider sector (0.9% vs 3.4%).
| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
|---|---|---|---|---|---|---|
| Type and breadth of course content | Learning opportunities | 8.9 | 6.9 | +2.0 | +14.7 | −7.9 |
| Feedback | Assessment & feedback | 8.3 | 7.3 | +1.0 | −25.7 | −10.7 |
| Delivery of teaching | The teaching on my course | 7.0 | 5.4 | +1.6 | −11.4 | −20.2 |
| Teaching Staff | The teaching on my course | 6.7 | 6.7 | 0.0 | +17.9 | −17.7 |
| Opportunities to work with other students | Learning community | 4.3 | 2.0 | +2.3 | +30.9 | +29.8 |
| Student support | Academic support | 4.2 | 6.2 | −2.0 | +6.1 | −7.1 |
| Marking criteria | Assessment & feedback | 4.1 | 3.5 | +0.5 | −46.1 | −0.4 |
| General facilities | Learning resources | 4.0 | 1.8 | +2.2 | +14.6 | −8.9 |
| Learning resources | Learning resources | 3.9 | 3.8 | +0.1 | +18.6 | −2.8 |
| Organisation & management of course | Organisation & management | 3.9 | 3.3 | +0.5 | −26.5 | −12.6 |
| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
|---|---|---|---|---|---|---|
| Marking criteria | Assessment & feedback | 4.1 | 3.5 | +0.5 | −46.1 | −0.4 |
| Workload | Organisation & management | 2.6 | 1.8 | +0.8 | −45.9 | −5.9 |
| COVID-19 | Others | 3.8 | 3.3 | +0.5 | −32.3 | +0.7 |
| Scheduling/timetabling | Organisation & management | 2.1 | 2.9 | −0.8 | −31.9 | −15.4 |
| Assessment methods | Assessment & feedback | 3.0 | 3.0 | +0.1 | −28.1 | −4.3 |
| Organisation & management of course | Organisation & management | 3.9 | 3.3 | +0.5 | −26.5 | −12.6 |
| Feedback | Assessment & feedback | 8.3 | 7.3 | +1.0 | −25.7 | −10.7 |
Shares are the proportion of all Mechanical Engineering comments whose primary topic is the category. Sentiment index ranges from −100 (more negative than positive) to +100 (more positive than negative).
| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
|---|---|---|---|---|---|---|
| Career guidance/support | Learning community | 3.1 | 2.4 | +0.7 | +38.0 | +7.9 |
| Availability of teaching staff | Academic support | 2.1 | 2.1 | 0.0 | +37.0 | −2.3 |
| Student life | Learning community | 2.3 | 3.2 | −0.9 | +34.2 | +2.1 |
| Opportunities to work with other students | Learning community | 4.3 | 2.0 | +2.3 | +30.9 | +29.8 |
| Learning resources | Learning resources | 3.9 | 3.8 | +0.1 | +18.6 | −2.8 |
| Teaching Staff | The teaching on my course | 6.7 | 6.7 | 0.0 | +17.9 | −17.7 |
| Type and breadth of course content | Learning opportunities | 8.9 | 6.9 | +2.0 | +14.7 | −7.9 |
Make assessment clarity a design principle. Publish annotated exemplars, checklist‑style rubrics and sample marked scripts; map marking criteria to learning outcomes in plain language; and agree realistic feedback SLAs with visible progress tracking. Taken together, these steps should lift Feedback, Marking criteria and Assessment methods simultaneously.
Stabilise delivery mechanics. Use a single source of truth for module information and weekly plans; set and keep a “no surprises” window for timetable changes; and share a brief “what changed and why” update when plans must move. These steps address Organisation & management, Scheduling and course Comms together.
Reinforce what already works. Protect time for staff availability and signpost office hours; keep teamwork structures (roles, milestones, peer review) that are driving the strong sentiment around collaboration; and continue targeted Career guidance with tangible touchpoints (clinics, CV/interview support, employer connects).
Tune remote elements for purpose. Where online components remain, make expectations explicit (format, interaction, assessment fit) to avoid the persistent negative tail on Remote learning and Delivery of teaching.
How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level.
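The share and index calculations described above can be sketched in Python. This is a minimal illustration only: the comment data, sentence labels, index formula ((positive − negative) / total sentences, scaled to ±100) and the averaging step are assumptions, not the actual Student Voice Analytics pipeline.

```python
from collections import Counter

# Each comment: (primary_topic, [sentence-level sentiment labels]).
# Topics and labels here are illustrative, not real survey data.
comments = [
    ("Feedback", ["neg", "neg", "pos"]),
    ("Feedback", ["neg"]),
    ("Teaching Staff", ["pos", "pos"]),
    ("Teaching Staff", ["pos", "neu"]),
]

# Share: proportion of all comments whose primary topic is the category.
topic_counts = Counter(topic for topic, _ in comments)
shares = {t: 100 * n / len(comments) for t, n in topic_counts.items()}

def sentiment_index(sentences):
    """Assumed formula: (positive - negative) / total sentences, scaled to ±100.

    Yields -100 when every sentence is negative and +100 when every
    sentence is positive; neutral sentences pull the index toward 0.
    """
    pos = sentences.count("pos")
    neg = sentences.count("neg")
    return 100 * (pos - neg) / len(sentences)

# Category-level index: mean of per-comment indices (one plausible aggregation).
per_topic = {}
for topic, sentences in comments:
    per_topic.setdefault(topic, []).append(sentiment_index(sentences))
indices = {t: sum(vals) / len(vals) for t, vals in per_topic.items()}
```

With the toy data above, each topic holds a 50% share, "Teaching Staff" scores +75.0, and "Feedback" scores about −66.7, showing how the index summarises the balance of positive and negative sentences per category.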
Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics and sentiment by year for Mechanical Engineering and every other discipline, so teams can see where teaching delivery, assessment clarity or course operations most affect student experience. Outputs include concise, anonymised theme summaries and representative comments for programme teams and external partners, without trawling through thousands of responses.
Crucially, it enables like‑for‑like proof of change with sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year to target interventions, and view results at whole‑institution level as well as by faculty, school or department. Export‑ready views (web, decks, dashboards) make it straightforward to share priorities and progress across the organisation.