Published Jan 24, 2022 · Updated Feb 26, 2026
A halo effect is a strong correlation between questionnaire items that goes beyond what the content of the items justifies, because an overall impression spills over into answers to specific questions.
Do your student evaluation scores move together, even when the questions are meant to measure different things? That pattern can be a halo effect, and it can blur what the numbers are actually telling you.
This paper analyses correlations in student evaluations of teaching (SETs) and uses a novel identification procedure to test for halo effects. The authors find that, even when halo effects distort some questions, responses can still be informative.
Takeaway: halo effects can flatten item-by-item scores, but they do not automatically make SET data useless. Triangulating ratings with independent variables and open-ended student comments makes it easier to interpret what is driving the feedback.
At Student Voice AI, we follow this work because it informs survey design and how institutions interpret quantitative scales alongside student comments.
Halo effects occur when responses to a questionnaire are highly correlated and reveal little more than an overall evaluation. They are difficult to identify in SETs because some correlation between different items is expected. Shortening an evaluation can reduce cross-correlation and survey fatigue, which can improve response rates; it does not, however, necessarily make halo effects easier to identify.
One method for finding halo effects is to introduce a variable that should not correlate with the other variables (the questions in the survey). Preliminary evidence suggests, for example, that students may be more likely to agree with positive statements if they are first asked an unrelated question, such as “Was the lecture room large enough?”
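As a rough illustration (a sketch, not the authors' code, with hypothetical file and column names), an analyst could check how strongly an unrelated item correlates with the teaching-quality items:

```python
import pandas as pd

# Hypothetical SET responses: one row per student response, with several
# teaching-quality items plus one item that should be unrelated to teaching.
df = pd.read_csv("set_responses.csv")  # hypothetical file

teaching_items = ["clarity", "organisation", "feedback_quality", "overall"]
unrelated_item = "room_size_ok"  # e.g. "Was the lecture room large enough?"

# Strong correlations between the unrelated item and the teaching items are a
# warning sign that an overall impression is driving the ratings.
correlations = df[teaching_items].corrwith(df[unrelated_item])
print(correlations.sort_values(ascending=False))
```

High correlations here do not prove a halo effect on their own, but they flag items worth checking against independent information.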
The study examined how halo bias distorts observed correlations between attributes, using SET data from an Italian university. In this setting, variance in a respondent's scores across attributes was taken as an indication of accuracy: a student who differentiates between items is less likely to be rating on overall impression alone. Any correlation not justified by true underlying correlations is then treated as a form of halo effect.
SETs are often used to explore relationships between different items. High correlation between items can be problematic for validity. However, it also suggests that long (and potentially burdensome or costly) questionnaires may be unnecessary, and that a smaller number of well-chosen questions can be enough.
In summary, SETs are not a perfect measure of teaching quality. They have limitations that affect how confidently you can use them to judge a teacher's effectiveness. Even so, educators can use them to identify patterns and find areas for improvement.
Halo effects are a drawback of SETs because they reduce the reliability of within-teacher distinctions by flattening the overall profile of ratings. On the other hand, halo effects can magnify differences in the mean ratings received by different teachers. The paper states that there is no reason to believe the SETs contain fundamental problems that would invalidate the dataset. Block rating (giving every item the same score) is more common in some other studies, but this study found only weak evidence that students block rate.
Across the university there is little evidence of extreme halo effects, but responses to different questions are still highly correlated, which is consistent with some degree of halo. To diagnose it properly, the authors argue, we need independent information, in this case about the rooms themselves, mapped to a corresponding question in the SET.
At the item level, independent information about rooms has more explanatory power than a general halo effect. Because SETs are often used diagnostically, not just summatively, halo effects still matter when you need specific, actionable feedback to improve teaching.
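To illustrate the kind of item-level comparison described above (a sketch under assumed variable names, not the paper's analysis), one could regress the room-related SET item on independent room data and, separately, on a halo proxy such as the mean of the other items, then compare the fits:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: a room-related SET item, independent information about the
# room (e.g. seats per enrolled student), and other teaching-quality items.
df = pd.read_csv("set_with_room_data.csv")  # hypothetical file
df["halo_proxy"] = df[["clarity", "organisation", "feedback_quality"]].mean(axis=1)

# Model 1: explain the room item with independent information about the room.
m_room = sm.OLS(df["room_size_ok"], sm.add_constant(df[["seats_per_student"]])).fit()

# Model 2: explain the same item with a general-impression (halo) proxy.
m_halo = sm.OLS(df["room_size_ok"], sm.add_constant(df[["halo_proxy"]])).fit()

# If independent room information explains the item better than the halo proxy,
# the item is telling you something specific rather than echoing an overall view.
print("R-squared (room information):", round(m_room.rsquared, 3))
print("R-squared (halo proxy):      ", round(m_halo.rsquared, 3))
```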
Q: How can Student Voice initiatives more effectively incorporate text analysis to detect and mitigate halo effects in open-ended student feedback?
A: Student Voice initiatives can apply text analysis (for example, NLP techniques such as theme and sentiment analysis) to open-ended comments to surface patterns that rating scales can hide. To detect possible halo effects, look for cases where comments about one aspect of the experience (for example, the physical classroom environment) consistently appear alongside broad praise or criticism of teaching. Comparing comment themes with item-level scores helps teams separate specific teaching feedback from impressions shaped by unrelated factors, leading to more actionable and trustworthy insights.
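A minimal sketch of this comparison, assuming a simple keyword-based theme tagger and hypothetical column names (a real pipeline would use a proper NLP model, but the comparison logic is the same):

```python
import pandas as pd

# Hypothetical comment data: each row is one open comment with the same
# student's overall teaching score attached.
df = pd.read_csv("set_comments.csv")  # columns: "comment", "overall_score"

# Very simple keyword-based theme tagging, for illustration only.
themes = {
    "environment": ["room", "lecture hall", "seating", "temperature"],
    "teaching": ["explain", "clear", "feedback", "organised"],
}

def tag_themes(text):
    text = text.lower()
    return [name for name, words in themes.items() if any(w in text for w in words)]

df["themes"] = df["comment"].apply(tag_themes)

# If comments about the environment co-occur with unusually high or low overall
# teaching scores, impressions may be bleeding across unrelated topics.
env = df["themes"].apply(lambda ts: "environment" in ts)
print("Mean overall score, environment comments:", df.loc[env, "overall_score"].mean())
print("Mean overall score, other comments:", df.loc[~env, "overall_score"].mean())
```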
Q: What specific strategies or methodologies can be employed to ensure the validity and reliability of SETs, considering the presence of halo effects?
A: To strengthen the validity and reliability of SETs when halo effects are possible, combine quantitative scales with a small number of well-designed open questions and analyse them together. Keep questionnaires short and clearly worded, and explain to students what good, specific feedback looks like and why it matters. Anonymous evaluations can also reduce social desirability pressure and encourage more honest responses.
Q: In what ways can Student Voice initiatives leverage independent variables, like the unrelated question about the lecture room size mentioned, to better understand and interpret the data collected through SETs?
A: Independent variables can act as a kind of control. If an unrelated factor, such as satisfaction with room size, correlates strongly with multiple teaching-quality items, it can signal that overall impressions are bleeding into specific ratings. Analysing these correlations alongside themes in comments helps teams interpret SET data more cautiously and identify external issues, such as room constraints, that may be shaping feedback.
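One way to use such a variable as a control (a sketch with hypothetical column names, not a method from the paper) is a simple partial-correlation check: correlate two teaching items before and after removing the part of each that is explained by the unrelated item:

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("set_responses.csv")  # hypothetical file and column names

def residualise(series, control):
    """Return the part of `series` not explained by `control`."""
    return sm.OLS(series, sm.add_constant(control)).fit().resid

control = df["room_size_ok"]  # satisfaction with room size, used as a control

# Raw correlation between two teaching items, then the same correlation after
# partialling out the unrelated control item.
raw = df["clarity"].corr(df["organisation"])
partial = residualise(df["clarity"], control).corr(residualise(df["organisation"], control))

# A large drop suggests part of the raw correlation was driven by a shared
# overall impression rather than by the items' actual content.
print(f"raw r = {raw:.2f}, partial r = {partial:.2f}")
```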
[Paper Source] Edmund Cannon & Giam Pietro Cipriani, “Quantifying halo effects in students’ evaluation of teaching”
DOI: 10.1080/02602938.2021.1888868