Scope. UK NSS open-text comments for Electrical and Electronic Engineering (CAH10-01-08) students across academic years 2018–2025.
Volume. ~1,935 comments; 96.9% successfully categorised to a single primary topic.
Overall mood. Roughly 51.2% Positive, 44.6% Negative, 4.2% Neutral (positive:negative ≈ 1.15:1).
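To make the headline figures easy to sanity-check, here is a minimal Python sketch that recomputes the positive:negative ratio from the stated percentages; the `sentiment_index` helper assumes the index is simply % positive minus % negative, which is our assumption rather than a definition given in the report.

```python
# Quick cross-check of the headline mood figures quoted above.
# Assumption: the sentiment index is the percentage-point gap between
# positive and negative comments; the report does not define the metric,
# so treat this as illustrative only.

positive, negative, neutral = 51.2, 44.6, 4.2  # % of comments, per the summary

ratio = positive / negative
print(f"positive:negative ≈ {ratio:.2f}:1")  # ≈ 1.15:1, matching the stated figure


def sentiment_index(pos_pct: float, neg_pct: float) -> float:
    """Assumed construction: % positive minus % negative."""
    return pos_pct - neg_pct


print(f"overall sentiment index (assumed) ≈ {sentiment_index(positive, negative):+.1f}")  # +6.6
```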
Electrical and Electronic Engineering students talk most about assessment. Feedback is the single largest topic (10.5% share) and carries a negative tone (sentiment index −21.7). Two closely related themes, Marking criteria (5.6%, −48.8) and Assessment methods (3.9%, −33.5), reinforce that clarity and consistency in assessment design, expectations and marking standards are the main friction points. All three assessment topics sit above the sector on share and below it on tone.
Teaching and curriculum are prominent and mixed. Students are positive about Teaching Staff (7.5%, +21.1) and the Type and breadth of course content (8.1%, +16.0), but the Delivery of teaching trends negative overall (6.4%, −8.1) and Remote learning is notably negative (2.5%, −24.9), both below sector on tone. Module choice/variety is discussed less and is close to neutral (3.7%, +1.2).
Operational delivery issues appear, though at lower volumes than assessment. Organisation and management of the course is negative (4.2%, −25.6), as are Scheduling/timetabling (1.9%, −19.9) and Communication about course and teaching (1.8%, −32.2). Student voice is present (2.8%) but leans negative (−10.4), suggesting students want clearer routes to input and to see actions taken.
Support and resources show contrasts. Student support is under-represented relative to the sector (3.4% vs 6.2%) and slightly negative (−3.0). Personal Tutor is rarely mentioned (1.1%) and sits near neutral (+1.5), well below sector on both share and tone. By contrast, General facilities (3.6%, +47.3), Career guidance/support (3.4%, +43.6) and Student life (2.8%, +44.7) are strong positives and often above sector on tone. Placements/fieldwork/trips are far less prominent than in the sector (1.0% vs 3.4%) but, when mentioned, are strikingly positive (+51.0).
Top topics by share of comments

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
---|---|---|---|---|---|---|
Feedback | Assessment and feedback | 10.5 | 7.3 | +3.1 | −21.7 | −6.6 |
Type & breadth of course content | Learning opportunities | 8.1 | 6.9 | +1.2 | +16.0 | −6.6 |
Teaching Staff | The teaching on my course | 7.5 | 6.7 | +0.8 | +21.1 | −14.4 |
Delivery of teaching | The teaching on my course | 6.4 | 5.4 | +1.0 | −8.1 | −16.8 |
Marking criteria | Assessment and feedback | 5.6 | 3.5 | +2.1 | −48.8 | −3.1 |
Organisation & management of course | Organisation and management | 4.2 | 3.3 | +0.9 | −25.6 | −11.7 |
Assessment methods | Assessment and feedback | 3.9 | 3.0 | +0.9 | −33.5 | −9.7 |
Module choice/variety | Learning opportunities | 3.7 | 4.2 | −0.5 | +1.2 | −16.2 |
General facilities | Learning resources | 3.6 | 1.8 | +1.8 | +47.3 | +23.9 |
Career guidance/support | Learning community | 3.4 | 2.4 | +1.0 | +43.6 | +13.6 |
Most negative topics by sentiment index

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
---|---|---|---|---|---|---|
Workload | Organisation and management | 2.7 | 1.8 | +0.9 | −50.1 | −10.1 |
Marking criteria | Assessment and feedback | 5.6 | 3.5 | +2.1 | −48.8 | −3.1 |
COVID-19 | Others | 2.9 | 3.3 | −0.4 | −42.4 | −9.5 |
Assessment methods | Assessment and feedback | 3.9 | 3.0 | +0.9 | −33.5 | −9.7 |
Organisation & management of course | Organisation and management | 4.2 | 3.3 | +0.9 | −25.6 | −11.7 |
Remote learning | The teaching on my course | 2.5 | 3.5 | −1.0 | −24.9 | −15.9 |
Feedback | Assessment and feedback | 10.5 | 7.3 | +3.1 | −21.7 | −6.6 |
Most positive topics by sentiment index

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
---|---|---|---|---|---|---|
General facilities | Learning resources | 3.6 | 1.8 | +1.8 | +47.3 | +23.9 |
Student life | Learning community | 2.8 | 3.2 | −0.3 | +44.7 | +12.6 |
Career guidance/support | Learning community | 3.4 | 2.4 | +1.0 | +43.6 | +13.6 |
Teaching Staff | The teaching on my course | 7.5 | 6.7 | +0.8 | +21.1 | −14.4 |
Type & breadth of course content | Learning opportunities | 8.1 | 6.9 | +1.2 | +16.0 | −6.6 |
Learning resources | Learning resources | 3.1 | 3.8 | −0.7 | +13.3 | −8.1 |
Opportunities to work with others | Learning community | 3.1 | 2.0 | +1.1 | +11.4 | +10.4 |
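The three tables above are different rankings of the same category-level figures. For teams who want to rebuild or extend these views, the minimal pandas sketch below shows one way to do it; the DataFrame columns (`share_pct`, `sector_share_pct`, `sentiment_idx`, and so on) are names assumed for illustration, not a real export schema.

```python
# Illustrative reconstruction of the three ranked views above from a flat
# category-level extract. Column names and the DataFrame layout are
# assumptions for the sketch, not an export schema.
import pandas as pd

df = pd.DataFrame(
    [
        # category, section, share %, sector share %, sentiment idx, Δ vs sector
        ("Feedback", "Assessment and feedback", 10.5, 7.3, -21.7, -6.6),
        ("Marking criteria", "Assessment and feedback", 5.6, 3.5, -48.8, -3.1),
        ("Workload", "Organisation and management", 2.7, 1.8, -50.1, -10.1),
        ("General facilities", "Learning resources", 3.6, 1.8, 47.3, 23.9),
        ("Career guidance/support", "Learning community", 3.4, 2.4, 43.6, 13.6),
        # ... remaining categories from the tables above
    ],
    columns=[
        "category", "section", "share_pct", "sector_share_pct",
        "sentiment_idx", "sentiment_delta_vs_sector",
    ],
)

# Share gap vs sector in percentage points.
df["share_delta_pp"] = df["share_pct"] - df["sector_share_pct"]

top_by_share = df.nlargest(10, "share_pct")       # first table: highest-share topics
most_negative = df.nsmallest(7, "sentiment_idx")  # second table: most negative tone
most_positive = df.nlargest(7, "sentiment_idx")   # third table: most positive tone

print(top_by_share[["category", "share_pct", "share_delta_pp", "sentiment_idx"]])
```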
Make assessment clarity the first priority. Publish annotated exemplars, checklist-style rubrics and indicative grade profiles; state how criteria map to outcomes; and agree a realistic feedback SLA with visible ownership. Run light-touch marker calibration to reduce variance between modules.
Stabilise delivery operations. Use a single source of truth for timetables and changes, share a brief weekly “what changed and why,” and minimise late adjustments. Where remote elements remain, set expectations for format, interaction and materials so the experience is predictable.
Strengthen support visibility. Ensure the Personal Tutor model and wider support routes are easy to find and consistent across cohorts. Keep doing what works—students rate facilities, careers support and the broader student experience highly—so signpost these at key points in the academic cycle.
Student Voice Analytics turns open-text survey comments into clear priorities you can act on. It tracks topics and sentiment by year for every discipline, including Electrical and Electronic Engineering, so teams can focus on high-impact areas such as Feedback, Marking criteria, Assessment methods, Delivery of teaching, Organisation and management, and Communication about course and teaching.
It supports whole‑institution views as well as fine‑grained department and school analysis. The platform produces concise, anonymised theme summaries and representative comments for programme teams and external stakeholders, so you can brief colleagues without trawling thousands of responses. Most importantly, it enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), and segmentation by site/provider, cohort and year—letting you evidence whether change is real relative to the right peer group. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress across the institution.
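As a purely illustrative sketch of the segmentation and like-for-like benchmarking described above (not the platform's actual API or schema), the snippet below groups comment-level sentiment by a hypothetical `year_of_study` field and compares each category against a sector benchmark; the benchmark values are back-derived from the tables earlier in this report (category sentiment index minus its Δ vs sector).

```python
# Hypothetical illustration of segment-level benchmarking. Column names,
# the comment-level layout and the sentiment coding are assumptions, not
# Student Voice Analytics outputs.
import pandas as pd

comments = pd.DataFrame({
    "category": ["Feedback", "Feedback", "Teaching Staff", "Teaching Staff"],
    "year_of_study": [1, 2, 1, 2],
    "sentiment": [-1, -1, 1, 1],  # assumed coding: +1 positive, 0 neutral, -1 negative
})

# Mean sentiment (scaled to an index) and comment volume per category within each segment.
segment_view = (
    comments.groupby(["year_of_study", "category"])["sentiment"]
    .agg(n="count", sentiment_idx=lambda s: 100 * s.mean())
    .reset_index()
)

# Sector benchmark at category level, back-derived from the tables above.
sector = pd.DataFrame({
    "category": ["Feedback", "Teaching Staff"],
    "sector_sentiment_idx": [-15.1, 35.5],
})

benchmarked = segment_view.merge(sector, on="category")
benchmarked["delta_vs_sector"] = (
    benchmarked["sentiment_idx"] - benchmarked["sector_sentiment_idx"]
)
print(benchmarked)
```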