Lexicon and Software Choice in Education Text Analysis

By Student Voice

The most common strategy for assessing student feedback is to use quantitative data, such as numerical ratings from standardised surveys. At Student Voice, however, we feel that qualitative free-text comments from students provide a level of detail and insight that helps lecturers and university management understand the underlying issues and areas for improvement.

Qualitative and quantitative data can both be used for reporting, giving a more thorough representation of the information. Qualitative data should not replace quantitative data, but should sit alongside it to build a fuller picture of what students are thinking.

The aim of this research is to find ways in which universities can make interpreting student feedback easier. The paper discusses a university's use of automated text analysis software to examine student comments, to better understand what students are saying and how academics can engage with this valuable information. It focuses on the journey of implementing a suitable strategy for the text analysis of student comments collected in standardised unit evaluations.

This initiative sought to develop a system that would systematically evaluate the quality of learning and teaching in a university. The project aimed to provide a more effective, timely and thorough interpretation of the qualitative feedback received from standardised unit surveys.

There are often challenges in evaluating large numbers of student comments. This study trialled the use of a text analysis tool to summarise student comments for selected units with very large enrolments. The study adopted a descriptive approach, which assisted in understanding contextual issues, identifying key elements and trialling processes aimed at addressing the challenges involved.

The value of text analysis is that it simplifies the process by which qualitative data can be summarised, enabling academics to quickly see which comments are associated with particular themes. The potential limitation is that the automatic classification may not be fully representative of what students actually said.
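To make the idea concrete, here is a minimal sketch of how lexicon-based theme tagging of comments can work. The theme names and keyword lists are illustrative assumptions for this post, not the lexicon used in the study:

```python
# Minimal sketch of lexicon-based theme tagging for student comments.
# The themes and keywords below are hypothetical examples, not the
# dictionary the paper's authors used.

LEXICON = {
    "assessment": {"exam", "assignment", "marking", "feedback", "grade"},
    "teaching": {"lecture", "lecturer", "tutor", "explain", "teaching"},
    "resources": {"slides", "textbook", "recording", "materials", "online"},
}

def tag_comment(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    words = set(comment.lower().replace(",", " ").replace(".", " ").split())
    return sorted(theme for theme, keywords in LEXICON.items()
                  if words & keywords)

comments = [
    "The lecture slides were clear but the exam was too long.",
    "More feedback on assignments would help.",
]
for c in comments:
    print(tag_comment(c), "-", c)
```

Even this toy version shows the limitation noted above: a comment mentioning "assignments" (plural) misses the keyword "assignment", which is exactly the kind of gap that makes human review and ongoing lexicon refinement necessary.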

The process of text analysis for feedback is used to find patterns in student responses about their university experiences. The decision of which software or service to use is a key decision.

"The software packages are not easily interchangeable due to the different ways in which they work with text data. The decision on which package to use would depend on the nature of the research being undertaken."

The authors concluded that as data complexity and heterogeneity grow, it becomes harder to make a simple decision about which tool is best. The better question may be what you want from the tool, rather than which tools are available.

The study found that, as the literature posits, there is considerable value in using automated text analysis to support the appraisal of comments students have made about their experiences. This particularly applies when the survey data are relatively large, such as in units which have large numbers of students.

The use of an automated process to reduce the time spent on text analysis does not, however, necessarily produce reports that are more meaningful or detailed in terms of the categories applied. Adopting a standardised dictionary may also present information in overly broad categories, so the dictionary/lexicon should be continually refined to become more accurate, thorough and comprehensive.
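One simple way to drive that refinement, sketched here with a hypothetical keyword lexicon rather than the study's own dictionary, is to surface the comments the current lexicon fails to categorise and count their most frequent words as candidate new terms for reviewers to consider:

```python
# Sketch: find comments the lexicon cannot categorise, then count the
# most frequent words in them as candidate lexicon entries.
# The lexicon below is a hypothetical stand-in, not the study's dictionary.
from collections import Counter

LEXICON = {
    "assessment": {"exam", "marking", "feedback"},
    "teaching": {"lecture", "tutor", "explain"},
}

def uncategorised(comments):
    """Yield comments containing no lexicon keyword."""
    all_keywords = set().union(*LEXICON.values())
    for c in comments:
        if not set(c.lower().split()) & all_keywords:
            yield c

def candidate_terms(comments, top_n=5):
    """Count frequent words in uncategorised comments."""
    counts = Counter(w for c in uncategorised(comments)
                     for w in c.lower().split())
    return counts.most_common(top_n)

comments = [
    "the workshop activities were engaging",
    "more workshop time please",
    "the exam was fair",
]
print(candidate_terms(comments))
```

A real pipeline would filter out stop words and keep a human in the loop to decide which candidate terms actually belong in the lexicon, but the basic feedback loop of classify, inspect the misses, and extend the dictionary is the same.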

This study aimed to explore the use of text analysis software for investigating students' experience in university courses. It found that despite some limitations, the use of software for text analysis can contribute to a fuller, more nuanced picture of students' experiences of learning and their suggestions for improving teaching practices.

The introduction of the text analysis tool for qualitative data is an innovative approach to generating reports that can be used by stakeholders in their efforts to assess and improve quality assurance and enhancement processes. Further investigation is necessary to determine the usefulness of such reports to the stakeholders of the institution.

Overall, we feel this study supports Student Voice's mission to improve the analysis of students' free-text comments, so that we measure both how effective our teaching is and the quality of the student experience.

FAQ

Q: How does student voice through qualitative feedback specifically contribute to improving teaching practices?

A: Student voice, when expressed through qualitative feedback, offers detailed insights into the students' learning experiences that quantitative data alone cannot provide. This type of feedback can highlight specific areas of teaching that are effective or need improvement, allowing educators to make targeted changes. For instance, comments on the clarity of lectures, the usefulness of course materials, or the effectiveness of interactive sessions give lecturers direct input on what aspects of their teaching methods resonate with students and which aspects could be enhanced. By integrating text analysis, universities can systematically identify common themes and concerns across large volumes of student comments, thus ensuring that the student voice directly informs decisions about teaching practices and curriculum development.

Q: What are the challenges involved in implementing automated text analysis for student feedback, and how can they be addressed?

A: Implementing automated text analysis for student feedback involves several challenges, including ensuring the accuracy of the data interpretation and managing the variability of natural language in student comments. One of the main difficulties is that automated systems may not always correctly interpret the context or sentiment of the feedback, which could lead to misrepresentations of the student voice. To address these challenges, it's essential to continually refine the algorithms and dictionaries used in text analysis to better capture the nuances of student feedback. This might involve incorporating machine learning techniques that adapt and improve over time based on the feedback they process. Additionally, combining automated analysis with a level of human review can help validate the findings and ensure they accurately reflect the students' experiences and suggestions.

Q: How can universities ensure that the integration of student voice through text analysis remains inclusive and representative of all student groups?

A: Ensuring inclusivity and representation in integrating student voice through text analysis requires deliberate strategies to capture a diverse range of student experiences. Universities should encourage feedback from all student demographics, including international students, students with disabilities, and those from various socioeconomic backgrounds, to ensure the data reflects the entire student body. This can be achieved by creating multiple channels for feedback, ensuring anonymity to encourage honesty, and actively reaching out to underrepresented groups. Additionally, the text analysis tools and processes should be designed to recognise and appreciate the diversity of student feedback, avoiding biases that might overlook certain groups. Regularly reviewing and adjusting the text analysis methodology to account for emerging themes or issues particular to specific student groups can further enhance the representativeness and inclusivity of the student voice in university decision-making.

References

[Paper Source]: Elizabeth Santhanam, Bernardine Lynch and Jeffrey Jones, "Making sense of student feedback using text analysis – adapting and expanding a common lexicon".
DOI: 10.1108/QAE-11-2016-0062
