Student behavioural profiles in blended learning courses

By Daniel Johnston

With many universities and colleges adapting courses into blended learning models, understanding how students engage with different delivery mediums is increasingly important. As is likely apparent to instructors across higher education, there are myriad reasons why students do not engage effectively with online resources. For an interesting insight into students’ self-regulation behaviours in this context, the work of Lust et al. [1] should be consulted. Indeed, there is a significant volume of literature on the development of software that can help educators understand students’ engagement. The CLAS tool is one example and was the subject of a recent Student Voice Blog article. That particular study [2] drew on student and user feedback on the use of the software.

However, as previous studies discuss, there is a concern that “social desirability bias” may skew the results of such investigations – whereby participants return answers in line with how they want to be perceived rather than entirely honest answers [3]. Essentially, students in some survey-based studies might have given responses that make them look like “good students”, rather than more honest (and genuinely useful) responses. The paper considered in this article circumvents this obstacle by analysing behavioural data directly, in order to investigate the archetypal behaviour profiles of students engaging in virtual learning.

Studying students’ interaction with a video annotation tool, the work of Mirriahi et al. [4] attempts to answer three pertinent questions:

  1. Are there specific archetypes that emerge from analysis of students’ engagement?
  2. Does the delivery method of a course help to develop students’ behaviours?
  3. If archetypes are found in answer to the first question, do they translate into notable differences in attainment?

Regarding the video annotation tool, the researchers elected to use the CLAS tool (the subject of a previous article in the Student Voice Blog). Notably, the participants were all from the same higher education institution (in North America), were enrolled in the same discipline, and had no prior experience of using the CLAS tool (or similar). Their course was, like many, divided into two distinct phases: phase 1 was concerned with the development of knowledge and understanding, and phase 2 focused on the implementation of the phase 1 content. When the students watched a video lecture, the following data were recorded for each student:

  • How often they fast-forwarded each video;
  • How often they rewound each video;
  • How many times they watched a video continuously (without pausing, fast-forwarding, or rewinding);
  • How many times they paused each video; and,
  • How long each video was played for.

The authors are careful to note that the last measure does not guarantee the students were paying attention to the video, since the software can only track mouse-clicks. After this data was collected, it was subjected to a hierarchical cluster analysis.
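For readers curious about what such an analysis involves, the sketch below shows one common way that per-student engagement metrics like those listed above might be standardised and grouped using hierarchical clustering. It is a minimal illustration in Python, not the authors’ code; the sample data, the choice of Ward linkage, and the four-cluster cut are assumptions for demonstration only.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical per-student engagement metrics (one row per student), in the order:
# fast-forwards, rewinds, uninterrupted views, pauses, total play time (minutes)
metrics = np.array([
    [2, 5, 1, 8, 42.0],
    [0, 1, 4, 2, 55.5],
    [7, 0, 0, 1, 12.0],
    [1, 6, 2, 9, 48.0],
    [0, 0, 5, 1, 60.0],
    [6, 1, 0, 2, 15.5],
])

# Standardise each metric so that no single scale (e.g. play time) dominates the distances
standardised = zscore(metrics, axis=0)

# Agglomerative (hierarchical) clustering with Ward linkage -- one common choice
tree = linkage(standardised, method="ward")

# Cut the tree into four clusters, mirroring the four archetypes reported in the paper
labels = fcluster(tree, t=4, criterion="maxclust")
print(labels)  # one cluster label per student
```

In practice, researchers would inspect the resulting dendrogram and validate the number of clusters against the data rather than fixing it in advance.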

The graphs produced from this analysis for each student could then be compared to the group and, subject to further analyses, revealed four distinct behavioural archetypes. These clusters (“A” to “D”) were termed “minimalists”, “task-focused”, “disenchanted”, and “intensive”, as a shorthand means of characterising their learning engagement. The authors note that it is possible to divide the participants into further behavioural groups, but significant overlap appears if a higher number of divisions is chosen.

Cluster A: The Minimalists

This was the second largest group of participants. Based on their behavioural metrics, these students engaged much less actively with the video content (particularly in terms of making annotations). The researchers posit that this is not necessarily a sign of laziness, but rather could indicate a more social approach to learning.

Cluster B: The Task-Focused Students

This cluster was larger than any of the others, accounting for 37% of the participants. In contrast to the minimalists, these students showed the highest level of engagement with the video lectures.

Cluster C: Disenchanted Students

Encouragingly, this was the smallest cluster observed (at only 13% of the participant population). As the name suggests, this group showed relatively low interaction with the video content compared to the task-focused students. Interestingly, they did initially engage with the content, but this engagement faded, leading the authors to comment on their “limited sustained effort compared to students in clusters B and D”.

Cluster D: Intensive Students

This group was relatively small, just slightly more populous than cluster C. The defining characteristic of this group, as the name suggests, was their exhibited effort and apparent internal drive. Most notably, the students in this cluster engaged more actively with the video content than any other cluster on every measure except fast-forwarding.

Across the different courses covered by the participants, the researchers were also able to address their second research question. They found that students who were not incentivised (via grading) to engage with the content did not align as readily with clusters B or D, instead taking a “minimalist” or “disenchanted” approach. They also observed that students who were incentivised through phase 1 of their course, but not through phase 2, tended not to maintain their phase 1 behaviour. This supports earlier work (cited in [4]) suggesting that students’ engagement depends on external factors (such as the mode of delivery) as well as internal factors.

What, then, can be taken from this research? With regard to this last finding, it seems that students benefit (seeing a boost in academic performance) from consistent modes of delivery across their course (across the various classes and throughout the year). If video lectures are here to stay – as entirely virtual courses, or as part of a blended learning approach – then instructors would likely benefit from understanding the behavioural clusters that exist in their own classes so that their delivery and interventions can be adapted to suit. While the implementation of video annotation software and the standardisation of teaching practices and processes are by no means easy tasks, it appears that there is a great deal of benefit for all involved if these things are pursued.

FAQ

Q: How do students perceive their own engagement and student needs in relation to the identified behavioural archetypes (minimalists, task-focused, disenchanted, intensive)?

A: Students' perceptions of their own engagement and student needs in relation to the identified behavioural archetypes might vary widely. While some students may accurately recognise their patterns of engagement, others might not be as self-aware or may view their behaviours differently due to personal biases or misunderstandings of what constitutes effective engagement. The concept of student voice is crucial here, as directly gathering students' insights and reflections on their learning behaviours could provide valuable context to the behavioural data collected through tools like the CLAS. Understanding students' perspectives can help educators to see beyond the data, acknowledging the reasons behind certain behaviours and the students' own strategies for learning.

Q: What role does text analysis play in identifying and understanding the nuances of student engagement in online learning environments?

A: Text analysis plays a significant role in identifying and understanding the nuances of student engagement in online learning environments. By analysing textual feedback from students, such as annotations, forum posts, and feedback forms, educators can gain insights into students' thoughts, concerns, and experiences that are not immediately apparent through behavioural data alone. Text analysis can reveal patterns in students' attitudes towards learning materials, their understanding of the content, and their interactions with peers and instructors. Incorporating student voice through text analysis allows for a more comprehensive understanding of engagement, highlighting areas where support might be needed and offering a deeper look into the effectiveness of different teaching strategies.
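As a purely illustrative example of the kind of lightweight text analysis described above, the sketch below counts recurring terms in a handful of made-up student annotations; the annotation text, stop-word list, and output are assumptions for demonstration rather than anything used in the studies discussed here.

```python
import re
from collections import Counter

# Hypothetical annotations students attached to a lecture video
annotations = [
    "Confused by the second derivation, rewatched it twice",
    "Great example, but the audio cuts out around minute twelve",
    "Still confused about the derivation steps",
]

# A very simple pass: tokenise, drop common words, and count what remains
stopwords = {"the", "by", "but", "it", "about", "around", "out", "and", "a"}
words = re.findall(r"[a-z']+", " ".join(annotations).lower())
themes = Counter(word for word in words if word not in stopwords)

# Frequent terms (here "confused" and "derivation") hint at recurring sticking points
print(themes.most_common(5))
```

Real deployments would use more robust natural language processing, but even simple term counts can surface themes that behavioural metrics alone would miss.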

Q: How can educators effectively use the insights from behavioural archetypes and text analysis to tailor their teaching strategies for different groups of students within the same course?

A: Educators can effectively use the insights from behavioural archetypes and text analysis to tailor their teaching strategies by first understanding the specific needs and preferences of the different groups of students within their course. For instance, minimalists may benefit from more collaborative and interactive activities that encourage active participation, while task-focused students might appreciate more challenging tasks that allow for deeper exploration of the content. Disenchanted students could require motivational strategies and more personalised feedback to re-engage them with the course material. Intensive students, on the other hand, might benefit from advanced materials or independent projects that cater to their drive for deeper learning. By incorporating student voice, educators can ensure that their adapted strategies are aligned with students' actual experiences and preferences, making the learning experience more effective and engaging for everyone involved. Tailoring teaching strategies based on a combination of behavioural data and text analysis ensures a nuanced approach that recognises the diversity of student needs within a single course.

References:

[1] Lust, G., J. Elen, and G. Clarebout, Regulation of tool-use within a blended course: Student differences and performance effects. Computers & Education, 2013. 60(1): p. 385-395.
DOI: 10.1016/j.compedu.2012.09.001

[2] Risko, E.F., et al., The Collaborative Lecture Annotation System (CLAS): A New TOOL for Distributed Learning. IEEE Transactions on Learning Technologies, 2013. 6(1): p. 4-13.
DOI: 10.1109/TLT.2012.15

[3] Beretvas, S.N., J.L. Meyers, and W.L. Leite, A Reliability Generalization Study of the Marlowe-Crowne Social Desirability Scale. Educational and Psychological Measurement, 2002. 62(4): p. 570-589.
DOI: 10.1177/0013164402062004003

[4] Mirriahi, N., et al., Uncovering student learning profiles with a video annotation tool: reflective learning with and without instructional norms. Educational Technology Research and Development, 2016. 64(6): p. 1083-1106.
DOI: 10.1007/s11423-016-9449-2
