Research on Education Text Analysis and Teaching Best Practice
By Daniel Johnston
With many universities and colleges adapting courses into blended learning models, understanding students’ engagement with different delivery mediums is increasingly important. As instructors across higher education will recognise, there are myriad reasons why students don’t engage effectively with online resources. For an interesting insight into students’ self-regulation behaviours in this context, the work of Lust et al. [1] should be consulted. Indeed, there is a significant body of literature on the development of software that can help educators understand students’ engagement. The CLAS tool [2] is one such piece of software and was the subject of a recent Student Voice Blog article. That particular study relied on student feedback about the use of the software.
However, as previous studies discuss, there is the concern that “social desirability bias” may skew the results of such investigations, whereby participants return answers in line with how they want to be perceived rather than entirely honest ones [3]. Essentially, students in some survey-based studies might have given responses that make them look like “good students”, rather than more honest (and genuinely useful) responses. The paper considered in this article circumvents this obstacle by analysing logged behaviour directly, investigating the archetypal behaviour profiles of students engaged in virtual learning.
Studying students’ interaction with a video annotation tool, the work of Mirriahi et al. [4] addresses three pertinent research questions.
Regarding the video annotation tool, the researchers elected to use CLAS (the subject of a previous article in the Student Voice Blog). Notably, the participants were all from the same higher education institution (in North America), were enrolled in the same discipline, and had no prior experience of CLAS or similar tools. Their course was, like many, divided into two distinct phases: phase 1 was concerned with the development of knowledge and understanding, and phase 2 was focused on the implementation of the phase 1 content. When the students watched a video lecture, several interaction measures were recorded for each student, such as the number of annotations made and the time spent playing the video.
The authors are careful to note that the latter measure does not guarantee the students were paying attention to the video, since the software can only track mouse-clicks. Once collected, this data was subjected to a hierarchical cluster analysis.
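To illustrate the kind of analysis described here, the sketch below runs a simple agglomerative (hierarchical) clustering over toy per-student feature vectors. The feature names, values, and the single-linkage choice are assumptions for illustration only; the study’s actual measures and clustering settings come from the CLAS logs and are not reproduced here.

```python
# Minimal sketch of agglomerative (hierarchical) clustering, assuming
# hypothetical per-student features; not the study's actual data or settings.

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def agglomerative(points, n_clusters):
    """Single-linkage agglomerative clustering down to n_clusters groups."""
    clusters = [[i] for i in range(len(points))]  # start: one cluster per student
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the closest pair of members
                d = min(euclidean(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the two closest clusters
    return clusters

# Hypothetical features per student: (annotations made, minutes of playback)
students = [(0, 5), (1, 6), (12, 55), (11, 60), (2, 40), (1, 38), (15, 70), (14, 68)]
groups = agglomerative(students, 4)
print(groups)
```

Stopping the merging at four groups mirrors the four behavioural archetypes reported in the paper; choosing a different `n_clusters` corresponds to the finer (but more overlapping) divisions the authors mention.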
The graphs produced from this analysis for each student could then be compared to the group and, subject to further analyses, revealed four distinct behavioural archetypes. These clusters (“A” to “D”) were termed “minimalists”, “task-focused”, “disenchanted”, and “intensive”, as a shorthand for characterising their learning engagement. The authors note that it is possible to divide the participants into more behavioural groups, but significant overlap appears if a higher number of divisions is chosen.
Cluster A, the “minimalists”, was the second-largest group of participants. Based on their behavioural metrics, these students engaged much less actively with the video content (particularly in terms of making annotations). The researchers posit that this is not necessarily a sign of laziness; rather, it could indicate a more social approach to learning.
Cluster B, the “task-focused” students, was larger than any of the others, accounting for 37% of the participants. By contrast to the minimalists, these students showed the highest level of engagement with the video lectures.
Encouragingly, cluster C, the “disenchanted” students, was the smallest cluster observed (at only 13% of the participant population). As the name suggests, this group showed relatively low interaction with the video content compared to the task-focused students. Interestingly, they did initially engage with the content, but this engagement faded, leading the authors to comment on their “limited sustained effort compared to students in clusters B and D”.
Cluster D, the “intensive” students, was relatively small, just slightly more populous than cluster C. The defining characteristic of this group, as the name suggests, was its exhibited effort and apparent internal drive. Most notably, these students engaged more actively with the video content than any other cluster, except on the fast-forwarding measure.
Across the different courses taken by the participants, the researchers were also able to address their second research question. They found that students who were not incentivised (via grading) to engage with the content did not align as readily with clusters B or D, instead taking a “minimalist” or “disenchanted” approach. They also observed that students who were incentivised through phase 1 of their course, but not through phase 2, tended not to maintain their phase 1 behaviour. This is consistent with earlier work (cited by Mirriahi et al. [4]) suggesting that students’ engagement relies on external factors (e.g. mode of delivery) as well as internal factors.
What, then, can be taken from this research? With regard to this latter point, it seems that students benefit (seeing a boost in academic performance) from consistent modes of delivery across their course, that is, across the various classes and throughout the year. If video lectures are here to stay, whether in entirely virtual courses or as part of a blended learning approach, then instructors would likely benefit from understanding the behavioural clusters that exist in their own classes so that their delivery and interventions can be adapted to suit. While implementing video annotation software and standardising teaching practices and processes are by no means easy tasks, there appears to be a great deal of benefit for all involved if these things are pursued.
[1] Lust, G., J. Elen, and G. Clarebout, Regulation of tool-use within a blended course: Student differences and performance effects. Computers & Education, 2013. 60(1): p. 385-395.
[2] Risko, E.F., et al., The Collaborative Lecture Annotation System (CLAS): A New Tool for Distributed Learning. IEEE Transactions on Learning Technologies, 2013. 6(1): p. 4-13.
[3] Beretvas, S.N., J.L. Meyers, and W.L. Leite, A Reliability Generalization Study of the Marlowe-Crowne Social Desirability Scale. Educational and Psychological Measurement, 2002. 62(4): p. 570-589.
[4] Mirriahi, N., et al., Uncovering student learning profiles with a video annotation tool: reflective learning with and without instructional norms. Educational Technology Research and Development, 2016. 64(6): p. 1083-1106.