Updated Mar 11, 2026
Most universities track satisfaction closely, but dissatisfaction often shows where the student experience is starting to break down. A recent study by A. Mark Langan and W. Edwin Harris explains why dissatisfied and neutral responses can reveal risks and improvement opportunities that headline scores miss.
Student surveys in the UK have moved well beyond simple feedback tools. As tuition fees have risen and students have increasingly been framed as consumers, survey results have become powerful drivers of university rankings, policy decisions, and institutional priorities. That makes it risky to focus only on satisfaction, because dissatisfaction and neutrality may reveal problems that headline scores fail to capture.
Langan and Harris analysed more than 2.7 million responses from the National Student Survey (NSS) across 12 years. Using machine learning techniques, they set out to identify the predictors of dissatisfaction and neutrality, two outcomes that have received far less attention than satisfaction. That matters because institutions need to know not just how students rate their experience, but where dissatisfaction is most likely to take hold.
The findings challenge a common assumption: that improving satisfaction alone gives a full picture of the student experience. For institutions trying to act on survey data, the more useful question is what dissatisfied and neutral responses reveal that positive scores hide.
The study revealed a complex interplay between dissatisfaction, neutrality, and satisfaction. Rather than treating satisfaction as a simple opposite of dissatisfaction, Langan and Harris showed that neutrality and disagreement also play an important role in shaping overall satisfaction metrics, a pattern that matters when interpreting sentiment in student feedback. The practical takeaway is that a single satisfaction score can miss meaningful variation in the student experience.
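That masking effect can be shown with a small, invented example: two questions with the same headline agreement rate can carry very different levels of dissatisfaction and neutrality. A minimal sketch, using made-up Likert response counts (not data from the study):

```python
# Hypothetical illustration: two questions with identical "% agree" scores
# can hide very different dissatisfaction levels. Counts are invented,
# keyed by Likert response 1-5 (1-2 = disagree, 3 = neutral, 4-5 = agree).

def response_profile(counts):
    """Collapse Likert counts {1..5: n} into agree/neutral/disagree shares."""
    total = sum(counts.values())
    return {
        "agree": (counts[4] + counts[5]) / total,
        "neutral": counts[3] / total,
        "disagree": (counts[1] + counts[2]) / total,
    }

question_a = {1: 5, 2: 15, 3: 5, 4: 40, 5: 35}   # 75% agree, 20% disagree
question_b = {1: 2, 2: 3, 3: 20, 4: 40, 5: 35}   # 75% agree, 5% disagree

profile_a = response_profile(question_a)
profile_b = response_profile(question_b)

# Identical headline satisfaction...
assert profile_a["agree"] == profile_b["agree"] == 0.75
# ...but question A carries four times the dissatisfaction of question B.
print(profile_a["disagree"], profile_b["disagree"])  # 0.2 0.05
```

A dashboard that reports only the agree percentage would score both questions identically, even though one signals a real problem.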
Among the study's most useful findings is the identification of key predictors of dissatisfaction and neutrality. Course organisation and teaching effectiveness emerged as significant factors, echoing known predictors of satisfaction but with important differences. For universities, that distinction matters because it points to practical levers for reducing frustration, not just lifting approval scores.
One of the most thought-provoking aspects of Langan and Harris's work is the illustrative league table based on dissatisfaction metrics. This hypothetical table shows how university rankings could shift if the focus were on minimising dissatisfaction rather than maximising satisfaction. That idea challenges current ranking logic and encourages a more useful debate about which metrics best reflect educational quality, especially when universities try to make better use of student evaluation data.
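The league-table thought experiment can be sketched in a few lines. The institution names and rates below are invented, not taken from the study; the point is only that ordering by dissatisfaction can invert an ordering by satisfaction:

```python
# Hypothetical sketch of the "what if" exercise: rank institutions by
# minimising dissatisfaction rather than maximising satisfaction.
# All names and rates are invented for illustration.

institutions = {
    "Uni A": {"satisfied": 0.82, "dissatisfied": 0.12},  # remainder neutral
    "Uni B": {"satisfied": 0.80, "dissatisfied": 0.04},
    "Uni C": {"satisfied": 0.78, "dissatisfied": 0.03},
}

# Conventional table: highest satisfaction first.
by_satisfaction = sorted(institutions, key=lambda u: -institutions[u]["satisfied"])
# Alternative table: lowest dissatisfaction first.
by_dissatisfaction = sorted(institutions, key=lambda u: institutions[u]["dissatisfied"])

print(by_satisfaction)      # ['Uni A', 'Uni B', 'Uni C']
print(by_dissatisfaction)   # ['Uni C', 'Uni B', 'Uni A']
```

In this toy example the ranking reverses completely: the institution with the highest satisfaction also has the most dissatisfied students, which a satisfaction-only table never surfaces.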
The insights from Langan and Harris's study call for a broader set of metrics in higher education. Focusing only on satisfaction overlooks the insight contained in dissatisfaction and neutrality. When universities track all three, they gain a fuller understanding of the student experience and a stronger basis for targeted improvement.
Incorporating student voice through text analysis of open-ended survey responses can deepen that understanding further. It helps institutions explore why students are dissatisfied, where neutral responses cluster, and which issues repeatedly surface across courses or services. In other words, text analysis adds context to the numbers and makes it easier to respond with precision.
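As a rough illustration of that kind of analysis, a theme lexicon can be matched against open comments to count recurring issues. The themes, keywords, and comments below are invented, and production systems would use richer NLP than keyword matching, but the shape of the pipeline is the same:

```python
# Minimal sketch of tagging open-ended comments against a theme lexicon.
# Themes, keywords, and comments are all invented for illustration.
from collections import Counter

THEMES = {
    "organisation": {"timetable", "organised", "cancelled"},
    "feedback": {"feedback", "marking", "grades"},
    "communication": {"email", "told", "informed"},
}

def tag_comment(text):
    """Return the set of themes whose keywords appear in the comment."""
    words = set(text.lower().split())
    return {theme for theme, keys in THEMES.items() if words & keys}

comments = [
    "The timetable kept changing and lectures were cancelled",
    "Feedback on marking arrived far too late",
    "Nobody informed us when the feedback deadline moved",
]

# Count how often each theme surfaces across all comments.
theme_counts = Counter(t for c in comments for t in tag_comment(c))
print(theme_counts.most_common())
```

Even this crude count shows how qualitative data becomes actionable: recurring themes can be tracked over time, broken down by course, and set alongside the numeric dissatisfaction and neutrality rates.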
The study by Langan and Harris is a useful reminder that student feedback becomes more valuable when institutions examine the full range of responses, not just the positive ones. By taking dissatisfaction and neutrality seriously, universities can identify pressure points earlier and act on them more effectively. As higher education continues to navigate the pressures of marketisation, the next practical step is to pair survey metrics with systematic analysis of NSS written feedback.
Q: How can universities effectively incorporate and act upon the qualitative data from student dissatisfaction and neutrality expressed in text comments?
A: Universities can act on qualitative feedback by using structured text analysis to surface recurring themes, pressure points, and differences between student groups. That work is most effective when a student experience or insights team combines comment analysis with survey scores and institutional context. Just as importantly, universities should close the loop by telling students what changed in response to their feedback, which helps build trust in the process and encourages future participation.
Q: What are the ethical considerations and potential biases involved in analysing and acting upon student dissatisfaction and neutrality, especially from text comments?
A: The main ethical priorities are anonymity, fair interpretation, and responsible use of the results. Universities need to protect student identities, avoid over-interpreting isolated comments, and use clear analytical frameworks so context is not lost. Diverse review teams and transparent governance can reduce the risk of bias, especially when text comments inform policy or performance decisions.
Q: How can student voice, particularly from dissatisfaction and neutrality expressed in text comments, influence policy making and strategic planning at universities?
A: Student voice can influence policy and strategy by showing where institutional decisions fail to match lived student experience. When universities analyse dissatisfaction and neutrality carefully, they can prioritise improvements in teaching, organisation, support, and communication based on evidence rather than assumption. That also helps build a more responsive culture, because students can see that their feedback shapes real decisions.
[Source] Langan, A.M., & Harris, W.E. (2024). Metrics of student dissatisfaction and disagreement: longitudinal explorations of a national survey instrument. Higher Education, 87, 249–269. DOI: 10.1007/s10734-023-01004-0
© Student Voice Systems Limited, All rights reserved.