Student Voice Analytics for Aeronautical and Aerospace Engineering — UK student feedback 2018–2025

Scope. UK NSS open‑text comments for Aeronautical and Aerospace Engineering (CAH10-01-04) students across academic years 2018–2025.
Volume. ~1,124 comments; 97.4% successfully categorised to a single primary topic.
Overall mood. Roughly 48.9% Positive, 46.3% Negative, 4.8% Neutral (positive:negative ≈ 1.06:1).

What students are saying

The conversation in Aeronautical and Aerospace Engineering is led by the content and mechanics of the course. The largest single topic is the type and breadth of course content (7.9% share), which is discussed positively (sentiment index +22.9) and in line with the sector. Teaching is also prominent: Teaching Staff (7.0%) draw a positive tone (+16.1) but sit markedly below the sector benchmark (−19.4 points), while Delivery of teaching (6.4%) is mixed to negative (−6.5, also well below sector).

On the infrastructure side, General facilities stand out: they are mentioned far more often than in the sector (5.4% vs 1.8%) and with strong positivity (+41.2). Peer collaboration also features more than average—Opportunities to work with other students (5.2%)—and is viewed favourably (+12.7).

Assessment & Feedback is a clear pressure point. Feedback (7.6%) is negative (−20.4); Assessment methods (4.1%) is very negative (−40.5) and below sector; Marking criteria (3.6%) is among the lowest‑scoring topics (−51.6). In short, expectations, methods and marking transparency are the common threads. Related operational topics also lean negative: Scheduling/timetabling (3.0%, −43.9) and Organisation & management of the course (3.0%, −29.4). Remote learning (3.1%, −27.4) and COVID‑19 (3.9%, −46.4) retain a negative tail.

Some topics are less present here than in the wider sector. Placements/fieldwork/trips appear in just 1.0% of comments (vs 3.4% sector), albeit with a positive tone (+33.4). Student support is mentioned less than sector (3.0% vs 6.2%) and is only slightly positive (+4.6).

Top categories by share (discipline vs sector)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Type and breadth of course content | Learning opportunities | 7.9 | 6.9 | +0.9 | +22.9 | +0.3 |
| Feedback | Assessment and feedback | 7.6 | 7.3 | +0.3 | −20.4 | −5.3 |
| Teaching Staff | The teaching on my course | 7.0 | 6.7 | +0.3 | +16.1 | −19.4 |
| Delivery of teaching | The teaching on my course | 6.4 | 5.4 | +0.9 | −6.5 | −15.2 |
| General facilities | Learning resources | 5.4 | 1.8 | +3.6 | +41.2 | +17.7 |
| Opportunities to work with other students | Learning community | 5.2 | 2.0 | +3.2 | +12.7 | +11.7 |
| Assessment methods | Assessment and feedback | 4.1 | 3.0 | +1.1 | −40.5 | −16.8 |
| COVID-19 | Others | 3.9 | 3.3 | +0.6 | −46.4 | −13.4 |
| Learning resources | Learning resources | 3.7 | 3.8 | +0.0 | +14.9 | −6.5 |
| Marking criteria | Assessment and feedback | 3.6 | 3.5 | +0.0 | −51.6 | −5.9 |

Most negative categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| Marking criteria | Assessment and feedback | 3.6 | 3.5 | +0.0 | −51.6 | −5.9 |
| COVID-19 | Others | 3.9 | 3.3 | +0.6 | −46.4 | −13.4 |
| Scheduling/timetabling | Organisation and management | 3.0 | 2.9 | +0.1 | −43.9 | −27.4 |
| Workload | Organisation and management | 2.9 | 1.8 | +1.1 | −42.0 | −2.0 |
| Assessment methods | Assessment and feedback | 4.1 | 3.0 | +1.1 | −40.5 | −16.8 |
| Organisation & management of the course | Organisation and management | 3.0 | 3.3 | −0.3 | −29.4 | −15.4 |
| Remote learning | The teaching on my course | 3.1 | 3.5 | −0.4 | −27.4 | −18.4 |

Most positive categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | --- | --- | --- | --- | --- |
| General facilities | Learning resources | 5.4 | 1.8 | +3.6 | +41.2 | +17.7 |
| Student life | Learning community | 2.6 | 3.2 | −0.5 | +40.0 | +7.9 |
| Type and breadth of course content | Learning opportunities | 7.9 | 6.9 | +0.9 | +22.9 | +0.3 |
| Teaching Staff | The teaching on my course | 7.0 | 6.7 | +0.3 | +16.1 | −19.4 |
| Learning resources | Learning resources | 3.7 | 3.8 | +0.0 | +14.9 | −6.5 |
| Career guidance & support | Learning community | 2.5 | 2.4 | +0.1 | +13.4 | −16.6 |
| Opportunities to work with other students | Learning community | 5.2 | 2.0 | +3.2 | +12.7 | +11.7 |
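The Δ columns in these tables are simple differences: Δ pp is the discipline's comment share minus the sector share (in percentage points), and Δ vs sector is the discipline's sentiment index minus the sector's. A minimal sketch of that arithmetic, using a hypothetical two-row excerpt (the sector sentiment indices shown are back-calculated from the table and are illustrative):

```python
# Illustrative check of the Δ columns:
#   Δ pp        = discipline share %  − sector share %
#   Δ vs sector = discipline sentiment index − sector sentiment index
# (category, share %, sector %, sentiment idx, sector sentiment idx)
rows = [
    ("General facilities", 5.4, 1.8, 41.2, 23.5),
    ("Assessment methods", 4.1, 3.0, -40.5, -23.7),
]

for name, share, sector, idx, sector_idx in rows:
    delta_pp = round(share - sector, 1)      # percentage-point gap in share
    delta_idx = round(idx - sector_idx, 1)   # sentiment gap vs sector
    print(f"{name}: Δ pp = {delta_pp:+}, Δ vs sector = {delta_idx:+}")
```

Small mismatches between a printed Δ and the difference of the rounded columns (e.g. +0.9 where the rounded shares differ by 1.0) reflect rounding of the underlying, more precise values.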

What this means in practice

  • Make assessment clarity a priority. Publish annotated exemplars, checklist‑style rubrics, and short marking rationales. Calibrate markers and communicate how assessment methods align to learning outcomes. A simple service‑level commitment for feedback (what students will receive, when, and where) will lift sentiment across Feedback, Assessment methods and Marking criteria.

  • Stabilise the operational rhythm. Name an owner for scheduling and organisation; keep a single source of truth for changes; share a weekly “what changed and why” note. This tackles the recurring negatives in Scheduling/timetabling, Organisation & management and Remote learning.

  • Protect what works. Students respond well to the breadth of content and to General facilities. Keep those experiences visible (e.g., structured overviews of content, clear access to facilities) and maintain the conditions that enable effective peer collaboration.

  • Sense‑check support touchpoints. Student support is less discussed and only slightly positive compared with the sector. Make it easy to find the right person, and close the loop when issues are raised.

Data at a glance (2018–2025)

  • Top topics by share: Type & breadth of course content (7.9%), Feedback (7.6%), Teaching Staff (7.0%), Delivery of teaching (6.4%), General facilities (5.4%), Opportunities to work with other students (5.2%).
  • Cluster view: the delivery & ops cluster (placements, scheduling, organisation, comms, remote) accounts for ~11.7% of all comments; the people & growth cluster (personal tutor, student support, teaching staff, delivery, personal development, student life) is ~21.9%.
  • Assessment & Feedback topics together are ~15.7% of all comments and lean negative overall.
  • Relative presence: Placements/fieldwork/trips are less discussed than sector (1.0% vs 3.4%) but skew positive; General facilities are discussed much more than sector (5.4% vs 1.8%) and are strongly positive.
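The cluster figures above are roll-ups of per-topic shares. A minimal sketch of that aggregation, using only the topics tabulated in this report (quoted cluster totals such as ~15.7% also include smaller topics not listed here, so these sums fall slightly short):

```python
# Sketch of the cluster roll-up: group each primary topic under a cluster
# label and sum its comment share. Topic lists are partial (only topics
# reported above), so totals approximate the quoted cluster figures.
topic_shares = {
    "Feedback": 7.6,
    "Assessment methods": 4.1,
    "Marking criteria": 3.6,
    "Scheduling/timetabling": 3.0,
    "Organisation & management of the course": 3.0,
    "Remote learning": 3.1,
    "Placements/fieldwork/trips": 1.0,
}
clusters = {
    "Assessment & Feedback": [
        "Feedback", "Assessment methods", "Marking criteria"],
    "Delivery & ops (partial)": [
        "Scheduling/timetabling", "Organisation & management of the course",
        "Remote learning", "Placements/fieldwork/trips"],
}

totals = {name: round(sum(topic_shares[t] for t in topics), 1)
          for name, topics in clusters.items()}
print(totals)
```

Here the three listed Assessment & Feedback topics sum to 15.3%, against the quoted ~15.7% for the full cluster.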

How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all categorised comments. Sentiment is calculated per sentence and summarised as an index from −100 (entirely negative) to +100 (entirely positive), then averaged at category level.
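The exact scoring model is not specified here, but a minimal sketch consistent with that description (per-sentence polarity, summarised to a −100…+100 comment index, then a plain mean per category) might look like the following. The −1/0/+1 sentence labels are an assumption; the real pipeline may weight or aggregate differently.

```python
# Hedged sketch: assumes each sentence is labelled -1 (negative), 0 (neutral)
# or +1 (positive) by some classifier (not shown). The comment-level index is
# (positives - negatives) / total sentences * 100; the category index is the
# unweighted mean of its comments' indices.

def comment_index(sentence_labels):
    """Index in [-100, +100] from per-sentence polarity labels."""
    pos = sum(1 for s in sentence_labels if s > 0)
    neg = sum(1 for s in sentence_labels if s < 0)
    return 100 * (pos - neg) / len(sentence_labels)

def category_index(comments):
    """Average the per-comment indices for one topic category."""
    return sum(comment_index(c) for c in comments) / len(comments)

# Example: two comments tagged to the same category
comments = [[1, 1, -1], [1, 0, 0]]
print(round(category_index(comments), 1))  # → 33.3
```

Under this reading, an index of +41.2 (General facilities) means positive sentences heavily outnumber negative ones, while −51.6 (Marking criteria) means the reverse.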

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear priorities you can act on. It tracks topics and sentiment over time, so schools can see what is changing year‑on‑year at whole‑institution level and within specific faculties, departments or programmes.

It supports concise, anonymised summaries for programme teams and external stakeholders, and lets you prove change on a like‑for‑like basis with sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year to target interventions where they will move sentiment most. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress across the institution.

Insights into specific areas of aeronautical and aerospace engineering education