Student Voice

Halo Effects in the Student Voice: Unwanted Correlations

By Stuart Grey

A halo effect is a significant correlation between two items of a questionnaire that should be unrelated.

At Student Voice, we are always looking for ways to improve our understanding of how we capture and use student comments. A key issue is the validity of certain survey instruments, and in particular the use of the results of quantitative scale questions.

This paper analyzes correlations in student evaluations of teaching (SETs). The authors used a novel identification procedure to assess the presence of halo effects and found that, despite the distortion they introduce, responses to affected questions remained informative.

Overall, the results of this experiment suggest that the distortion in evaluation questionnaires caused by halo effects need not be a concern for higher education institutions.

Halo effects occur when all responses to a questionnaire are highly correlated and reveal little more than an overall evaluation. They are difficult to identify in SETs because some correlation between responses to different items is expected. Shortening the evaluation could reduce cross-correlation as well as survey fatigue, and improve response rates, but may not help with identifying halo effects.

One method of finding halo effects is to introduce a new variable that should not correlate with the other variables (the questions in the survey). Preliminary evidence suggests that students who agree with the positive statements in a survey are also more likely to respond positively to an unrelated question (e.g. "was the lecture room large enough?").
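This detection idea can be sketched with simulated data. The sketch below is illustrative only, not the paper's actual procedure: it assumes a hypothetical latent "overall impression" that drives the substantive items and, under a halo effect, also leaks into the supposedly unrelated room-size item.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of respondents

# Simulated SET responses on a 1-5 Likert scale. A latent overall
# impression drives the substantive items (clarity, pace); under a halo
# effect it also leaks into the "unrelated" room-size item.
overall = rng.normal(3.5, 0.8, n)
clarity = np.clip(np.round(overall + rng.normal(0, 0.5, n)), 1, 5)
pace    = np.clip(np.round(overall + rng.normal(0, 0.5, n)), 1, 5)
room    = np.clip(np.round(0.6 * overall + rng.normal(1.3, 0.6, n)), 1, 5)

# If the room item were truly unrelated, its correlation with the
# substantive items should be near zero; a sizeable positive value is
# corroborative evidence of a halo effect.
r_room_clarity = np.corrcoef(room, clarity)[0, 1]
r_room_pace = np.corrcoef(room, pace)[0, 1]
```

With simulated data the leak is known by construction; with real SET data, the analyst only observes the correlations and must judge whether they are "unjustified" by any true underlying relationship.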

The study looked at the effect of halo bias on true correlations between attributes at an Italian university. Variance in scores across different attributes was taken as an indication of the accuracy of the evaluation, and any correlation not justified by true underlying correlations was considered a form of halo effect.

SETs are often used to measure the strength of relationships between sets of items, and the high correlation between SET items can be problematic for validity. However, it does suggest that long (and potentially burdensome or costly) questionnaires may be unnecessary, and that it may be better to ask a much smaller number of questions.

In summary, SETs are not a perfect measure of teaching quality: they have flaws and shortcomings that affect the validity of their use in determining a teacher's effectiveness. However, they can be used by educators to help identify areas for improvement.

Halo effects are a drawback of SETs because they reduce the reliability of within-teacher distinctions by flattening the overall profile of ratings. On the other hand, halo effects can magnify differences in the mean ratings received by different teachers. The paper states that there is no reason to believe the SETs contain fundamental problems that would invalidate the data set. Block rating is more common in some other studies, but this study found only weak evidence that students commonly block rate.

For the university as a whole, there is little evidence of extreme halo effects. However, responses to questions are highly correlated with each other. This is corroborative evidence for a halo effect, but to really understand whether one exists we need to consider independent information, in this case about the rooms themselves, and map it to a question in the SET.

At the SET level, the independent information about rooms has more explanatory power than the halo effect. SETs usually have a diagnostic goal and not only a summative one, and if they are meant to provide specific feedback to improve teaching performance, halo effects represent a problem.
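One way to picture how independent information can be weighed against a halo component is a simple regression sketch. This uses made-up data and a plain OLS fit, not the authors' actual identification procedure: responses to the room question are regressed on an objective room-size measure and on a halo term (the student's general impression). If the objective measure retains most of the explanatory power, the item is informative despite the halo.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # hypothetical number of respondents

# Simulated data: an objective room-size measure and a halo component
# (the student's general impression) both feed the room-question response.
room_size = rng.normal(0, 1, n)    # standardised objective measure
impression = rng.normal(0, 1, n)   # halo / overall impression
response = 0.6 * room_size + 0.3 * impression + rng.normal(0, 0.5, n)

# OLS of the response on both regressors (plus an intercept).
X = np.column_stack([np.ones(n), room_size, impression])
beta, *_ = np.linalg.lstsq(X, response, rcond=None)
# beta[1] recovers the room-size effect (~0.6), which here dominates the
# halo coefficient beta[2] (~0.3): the item stays informative.
```

The coefficient sizes (0.6 and 0.3) are assumptions chosen for illustration; the substantive point is only that a regression with an independent regressor can separate genuine signal from halo-driven correlation.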


  1. Halo effects are present in SETs
  2. Evaluations remain informative about the various aspects being judged, despite the distortion
  3. There is no evidence of a need to design evaluations differently because of halo effects


[Paper Source] Edmund Cannon & Giam Pietro Cipriani, "Quantifying halo effects in students' evaluation of teaching"
DOI: 10.1080/02602938.2021.1888868
