Student Voice Analytics for Information Technology — UK student feedback 2018–2025

Scope. UK NSS open‑text comments for Information Technology (CAH11-01-02) students across academic years 2018–2025.
Volume. ~4,130 comments; 95.7% successfully categorised to a single primary topic.
Overall mood. Roughly 53.0% Positive, 43.4% Negative, 3.6% Neutral (positive:negative ≈ 1.22:1).

What students are saying

Information Technology students talk first about the basics that underpin their studies: learning resources. At ~10.5% of all comments, “Learning resources” is the largest single topic, and the tone is moderately positive (index ~+13.7), though a little below the sector average for the same topic. Within the same space, “IT Facilities” features prominently (4.0%) but carries a negative tone (−10.1), albeit slightly less negative than the sector. By contrast, mentions of the “Library” are strongly positive, though they appear less often.

The curriculum and how it is put together are a close second. “Type and breadth of course content” (8.6%) and “Module choice/variety” (4.3%) both trend mildly positive but sit below sector on tone. Experiences of “Delivery of teaching” are finely balanced to slightly negative (−1.9), even as “Teaching Staff” themselves are viewed warmly (+30.4).

Operational delivery stands out positively against the sector. “Scheduling/timetabling” (5.2%) and “Organisation, management of course” (3.9%) are both net positive (indices ~+22.6 and +10.4 respectively) and far above sector on tone. “Remote learning” (5.8%) is also notably more positive than sector. One caveat is “Communication about course and teaching”: although a smaller topic by volume (1.5%), it leans negative (−21.5), though still less negative than the sector.

In Assessment & Feedback, students’ priorities mirror the sector but with their own emphases. “Feedback” (7.7%) sits close to neutral overall (−2.2) and is notably less negative than the sector benchmark. However, “Marking criteria” (3.6%) and “Assessment methods” (3.5%) are strongly negative (−43.3 and −36.2), pointing to uncertainty about expectations and perceived fairness. Collaboration‑related experience is also a pain point: “Opportunities to work with other students” (2.5%) is negative (−18.7) and well below sector on tone.

Finally, some topics are far less present here than in the sector—most obviously “Placements/fieldwork/trips” (0.1% vs 3.4% sector)—indicating that day‑to‑day experience is shaped more by resources, curriculum and delivery than by off‑site learning.

Top categories by share (Information Technology vs sector)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | ---: | ---: | ---: | ---: | ---: |
| Learning resources | Learning resources | 10.5 | 3.8 | 6.8 | 13.7 | −7.7 |
| Type and breadth of course content | Learning opportunities | 8.6 | 6.9 | 1.7 | 8.7 | −13.9 |
| Feedback | Assessment and feedback | 7.7 | 7.3 | 0.4 | −2.2 | 12.8 |
| Student support | Academic support | 5.9 | 6.2 | −0.3 | 15.3 | 2.1 |
| Remote learning | The teaching on my course | 5.8 | 3.5 | 2.3 | 12.6 | 21.6 |
| Scheduling/timetabling | Organisation and management | 5.2 | 2.9 | 2.3 | 22.6 | 39.1 |
| Teaching Staff | The teaching on my course | 5.2 | 6.7 | −1.5 | 30.4 | −5.2 |
| Module choice/variety | Learning opportunities | 4.3 | 4.2 | 0.1 | 5.1 | −12.3 |
| Personal Tutor | Academic support | 4.2 | 3.2 | 1.0 | 16.0 | −2.7 |
| IT Facilities | Learning resources | 4.0 | 1.2 | 2.8 | −10.1 | 3.9 |

Most negative categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | ---: | ---: | ---: | ---: | ---: |
| Marking criteria | Assessment and feedback | 3.6 | 3.5 | 0.1 | −43.3 | 2.4 |
| Assessment methods | Assessment and feedback | 3.5 | 3.0 | 0.5 | −36.2 | −12.5 |
| COVID-19 | Others | 2.8 | 3.3 | −0.5 | −32.4 | 0.5 |
| Opportunities to work with other students | Learning community | 2.5 | 2.0 | 0.6 | −18.7 | −19.7 |
| IT Facilities | Learning resources | 4.0 | 1.2 | 2.8 | −10.1 | 3.9 |
| Feedback | Assessment and feedback | 7.7 | 7.3 | 0.4 | −2.2 | 12.8 |
| Delivery of teaching | The teaching on my course | 3.1 | 5.4 | −2.4 | −1.9 | −10.7 |

Most positive categories (share ≥ 2%)

| Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector |
| --- | --- | ---: | ---: | ---: | ---: | ---: |
| Personal development | Learning community | 2.7 | 2.5 | 0.2 | 47.8 | −12.1 |
| Teaching Staff | The teaching on my course | 5.2 | 6.7 | −1.5 | 30.4 | −5.2 |
| Availability of teaching staff | Academic support | 2.1 | 2.1 | 0.0 | 28.5 | −10.8 |
| Student life | Learning community | 2.7 | 3.2 | −0.4 | 23.8 | −8.3 |
| Scheduling/timetabling | Organisation and management | 5.2 | 2.9 | 2.3 | 22.6 | 39.1 |
| Personal Tutor | Academic support | 4.2 | 3.2 | 1.0 | 16.0 | −2.7 |
| Student support | Academic support | 5.9 | 6.2 | −0.3 | 15.3 | 2.1 |
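Ranked tables like these follow directly from topic-level aggregates: filter to topics with share ≥ 2%, then sort by sentiment index. A minimal sketch in Python, using a small illustrative subset of the figures above (the tuple layout is an assumption, not the report's actual data structure):

```python
# Each row: (category, share_pct, sentiment_idx) — illustrative subset only
topics = [
    ("Marking criteria", 3.6, -43.3),
    ("Assessment methods", 3.5, -36.2),
    ("Teaching Staff", 5.2, 30.4),
    ("Personal development", 2.7, 47.8),
    ("Communication about course and teaching", 1.5, -21.5),  # below threshold
]

MIN_SHARE = 2.0  # the "share >= 2%" filter used by both ranked tables

eligible = [t for t in topics if t[1] >= MIN_SHARE]
most_negative = sorted(eligible, key=lambda t: t[2])           # ascending sentiment
most_positive = sorted(eligible, key=lambda t: t[2], reverse=True)

print(most_negative[0][0])  # Marking criteria
print(most_positive[0][0])  # Personal development
```

The threshold keeps low-volume topics (such as course communication at 1.5%) out of the rankings so that a handful of comments cannot dominate the list.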

What this means in practice

  • Keep the foundations strong. Students notice when core learning resources work smoothly. Prioritise reliability and access (platforms, facilities, software/licences, and support routes). Where facilities are a friction point, publish clear status pages, ownership, and SLAs for fixes.

  • Make assessment expectations unmistakable. The most negative topics centre on criteria and methods. Share annotated exemplars, checklist‑style rubrics, and plain‑English briefs that tie tasks to criteria. Set and meet feedback SLAs; use quick “what to do next” summaries to improve usefulness.

  • Protect the operational rhythm. Students respond well to dependable schedules and visible ownership of course organisation. Keep a single source of truth for changes, with short weekly updates. Where communication dips, reduce channel sprawl and name accountable owners.

  • Build in collaborative structure. If peer collaboration is important, design it: clearly scoped group work, staged deliverables, and transparent contribution tracking lift both experience and perceived fairness.

Data at a glance (2018–2025)

Top topics by share are stable across the period: Learning resources (≈10.5%), Type and breadth of course content (≈8.6%), Feedback (≈7.7%), Remote learning (≈5.8%), and Scheduling/timetabling and Teaching Staff (each ≈5.2%).

  • The delivery & ops cluster (placements/fieldwork, scheduling, organisation, course comms, remote) accounts for ~16–17% of all comments and is markedly more positive than sector on key items (scheduling +39.1, remote learning +21.6 vs sector).

  • The people & growth cluster (personal tutor, student support, teaching staff, delivery of teaching, personal development, student life) holds ~26% of comments, with strong positives for Teaching Staff and Personal development, though “Delivery of teaching” sits slightly negative.

  • Assessment & Feedback topics together make up ~15% of comments, with “Marking criteria” and “Assessment methods” driving most of the negative tone.

How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is summarised as an index from −100 (more negative than positive) to +100 (more positive than negative) and compared with sector where available.
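The report does not publish the exact index formula, but a common formulation consistent with the −100..+100 description is the share of positive comments minus the share of negative comments, scaled to 100. A minimal sketch under that assumption (the function name and formula are illustrative, not the report's published method):

```python
def sentiment_index(positive: int, negative: int, neutral: int) -> float:
    """Net sentiment on a -100..+100 scale: percentage-point balance of
    positive over negative comments; neutral comments dilute but do not
    shift the balance."""
    total = positive + negative + neutral
    if total == 0:
        return 0.0
    return 100.0 * (positive - negative) / total

# With the overall mood split quoted above (53.0% / 43.4% / 3.6%),
# this definition gives 100 * (0.530 - 0.434) = +9.6.
print(round(sentiment_index(530, 434, 36), 1))
```

On this definition, a topic where every comment is positive scores +100 and one where every comment is negative scores −100.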

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text feedback into clear, prioritised actions. It tracks topics, sentiment and change by year so you can evidence improvement across the institution and within specific departments, schools and programme teams.

It also enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment by site/provider, cohort and year, generate concise anonymised summaries for stakeholders and programme teams, and export/share results for briefing packs, dashboards and governance.

Insights into specific areas of information technology education