Student Voice Analytics for Information Systems — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Information Systems (CAH11-01-03) students across academic years 2018–2025.
Volume. ~308 comments; 97.1% successfully categorised to a single primary topic.
Overall mood. 53.8% positive, 43.8% negative and 2.3% neutral (positive:negative ≈ 1.23:1).

What students are saying

The Information Systems conversation is led by the substance of the course and the people who deliver it. “Type and breadth of course content” is the single largest topic (12.4% share) and is notably positive (sentiment index +27.1), with a share well above the sector’s for the same topic. “Teaching Staff” is similarly prominent (12.0% share) and positive overall, though its tone sits below the sector benchmark.

By contrast, the “how” of teaching draws criticism. “Delivery of teaching” is the third-largest topic (9.7%) and leans negative (−22.0), significantly below the sector’s generally positive tone. Operational themes pull in the same direction: “Scheduling/timetabling” (4.3%, −42.6) is a consistent pain point and “Organisation & management of course” (4.0%) trends negative too. While smaller in volume, “Communication about course and teaching” carries particularly low sentiment (−62.4), indicating that clarity and predictability of information are recurrent issues.

Assessment & feedback mirrors sector patterns with local emphases. “Feedback” accounts for 8.0% of comments and skews negative when usefulness, timeliness or specificity are unclear. “Marking criteria” (3.7%) is also negative but less so than the sector, implying students see some progress when expectations are spelled out.

Set against this, students highlight people-centred strengths. “Student support” is a net positive (+20.2). “Student life” (2.7%) is very positive, and “Career guidance, support” stands out (+84.6), far above sector tone. “Learning resources” and “IT facilities” are broadly positive or at least better than sector baselines, and “Remote learning” is near-neutral but more positive than sector.

Some topics are less present here than sector-wide—“Module choice/variety” and “Placements/fieldwork/trips” appear relatively infrequently—suggesting the narrative is shaped more by delivery mechanics and assessment clarity than by optionality or placements.

Top categories by share (Information Systems vs sector)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Type and breadth of course content | Learning opportunities | 12.4 | 6.9 | +5.4 | +27.1 | +4.5
Teaching Staff | The teaching on my course | 12.0 | 6.7 | +5.3 | +17.9 | −17.6
Delivery of teaching | The teaching on my course | 9.7 | 5.4 | +4.2 | −22.0 | −30.8
Feedback | Assessment and feedback | 8.0 | 7.3 | +0.7 | −16.4 | −1.4
Student support | Academic support | 5.0 | 6.2 | −1.2 | +20.2 | +7.0
Scheduling/timetabling | Organisation & management | 4.3 | 2.9 | +1.5 | −42.6 | −26.1
Organisation, management of course | Organisation & management | 4.0 | 3.3 | +0.7 | −14.0 | 0.0
Marking criteria | Assessment and feedback | 3.7 | 3.5 | +0.1 | −32.2 | +13.5
Opportunities to work with other students | Learning community | 3.3 | 2.0 | +1.4 | +3.7 | +2.7
Student voice | Student voice | 3.3 | 1.8 | +1.6 | −30.7 | −11.4

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Scheduling/timetabling | Organisation & management | 4.3 | 2.9 | +1.5 | −42.6 | −26.1
Marking criteria | Assessment and feedback | 3.7 | 3.5 | +0.1 | −32.2 | +13.5
Student voice | Student voice | 3.3 | 1.8 | +1.6 | −30.7 | −11.4
Delivery of teaching | The teaching on my course | 9.7 | 5.4 | +4.2 | −22.0 | −30.8
Feedback | Assessment and feedback | 8.0 | 7.3 | +0.7 | −16.4 | −1.4
Organisation, management of course | Organisation & management | 4.0 | 3.3 | +0.7 | −14.0 | 0.0

Shares are the proportion of all Information Systems comments whose primary topic is the category. The sentiment index runs from −100 (entirely negative) to +100 (entirely positive).
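To read the Δ columns: Δ pp is the Information Systems share minus the sector share for the same category (e.g., Teaching Staff: 12.0 − 6.7 = +5.3 pp), and Δ vs sector is the category’s sentiment index minus the sector’s. For Delivery of teaching, an index of −22.0 with a Δ vs sector of −30.8 implies a sector index of roughly +8.8, which is why the sector tone on that topic is described above as generally positive.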

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Career guidance, support | Learning community | 2.3 | 2.4 | −0.1 | +84.6 | +54.6
Student life | Learning community | 2.7 | 3.2 | −0.5 | +48.0 | +15.9
Personal development | Learning community | 2.0 | 2.5 | −0.5 | +40.3 | −19.5
Type and breadth of course content | Learning opportunities | 12.4 | 6.9 | +5.4 | +27.1 | +4.5
Module choice / variety | Learning opportunities | 2.3 | 4.2 | −1.8 | +22.2 | +4.8
Learning resources | Learning resources | 3.0 | 3.8 | −0.8 | +20.9 | −0.6
Student support | Academic support | 5.0 | 6.2 | −1.2 | +20.2 | +7.0

What this means in practice

  • Strengthen the operational rhythm. The timetabling signal is clear. Set and communicate a change window, name an owner for schedules, and maintain a single “source of truth” for course updates. For communications specifically, agree response and update SLAs and use templated change notes so students understand what changed and why.

  • Rebuild delivery fundamentals. Students respond poorly when delivery feels unclear or inconsistent. Make learning outcomes visible per session, signpost structure (overview, practice, recap), and use short, active tasks to keep pace and purpose clear. Check alignment between session content and assessment demands.

  • Make assessment clarity the default. Publish annotated exemplars, adopt checklist-style rubrics, and calibrate marking across markers. Commit to realistic feedback turnaround times and describe how to use feedback in the next task.

  • Keep and scale the positives. Protect what’s working—career guidance, student life, and accessible support. Bring careers touchpoints earlier in modules and link content choices to pathways to maintain that very strong sentiment.

Data at a glance (2018–2025)

  • Top topics by share: Type & breadth of course content (≈12.4%), Teaching Staff (≈12.0%), Delivery of teaching (≈9.7%), Feedback (≈8.0%), Student support (≈5.0%).
  • Cluster view:
    • Delivery & ops cluster (placements, scheduling, organisation, course comms, remote): ≈13.7% of comments, with especially low tone on timetabling and course communications.
    • People & growth cluster (personal tutor/support, teaching staff, delivery of teaching, personal development, student life): ≈33.7% of comments, generally positive, led by careers and student life.
  • Relative to sector: students talk more than average about content, teaching staff and delivery mechanics, less about module choice and placements. Several delivery topics carry a worse tone than sector, while careers, student life, IT facilities and remote learning are better than sector baselines.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is calculated per sentence and summarised as an index from −100 to +100, then averaged at category level.
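
As a rough illustration of that arithmetic, the sketch below assumes each comment arrives with a single primary topic and per-sentence sentiment labels, then computes topic share and a simple (positive − negative) / sentences × 100 index averaged at category level. The field names and the exact index formula are assumptions for illustration, not the Student Voice Analytics pipeline itself.

```python
from collections import defaultdict

# Toy records: one primary topic per comment, plus per-sentence sentiment
# labels ("pos", "neg", "neu"). Illustrative data and field names only.
comments = [
    {"topic": "Scheduling/timetabling", "sentences": ["neg", "neg", "neu"]},
    {"topic": "Teaching Staff", "sentences": ["pos", "pos", "neg"]},
    {"topic": "Teaching Staff", "sentences": ["pos"]},
]

def comment_index(sentences):
    """Per-comment sentiment index on a −100..+100 scale (assumed formula)."""
    pos = sentences.count("pos")
    neg = sentences.count("neg")
    return 100 * (pos - neg) / len(sentences)

by_topic = defaultdict(list)
for c in comments:
    by_topic[c["topic"]].append(comment_index(c["sentences"]))

total = len(comments)
for topic, indices in sorted(by_topic.items()):
    share = 100 * len(indices) / total        # topic share of all comments
    sentiment = sum(indices) / len(indices)   # category-level average index
    print(f"{topic}: share {share:.1f}%, sentiment index {sentiment:+.1f}")
```

Share and the category-level index computed this way are the two quantities reported in the tables above.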

How Student Voice Analytics helps you

Student Voice Analytics turns open-text responses into clear priorities by tracking topics, sentiment and movement by year for every discipline, including Information Systems. It supports whole‑institution views as well as fine‑grained analysis at department and school level, producing concise, anonymised theme summaries and representative comments so teams and stakeholders can act without trawling thousands of responses.

Most importantly, it enables like‑for‑like evaluation: you can make sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), and segment by site/provider, cohort and year to target interventions where they will shift sentiment most. Export‑ready outputs (for web, decks and dashboards) make it straightforward to share priorities and progress across the organisation.
