Student Voice

The Student Voice Blog

Research on Education Text Analysis and Teaching Best Practice

Collaborative Learning: Understanding Students’ Engagement

Daniel Johnston - Aug 01, 2022

One issue that frequently presents difficulties in teaching is knowing how much students are engaging with lessons. At a time when more and more content is delivered as pre-recorded video, this is an ever-growing concern. With this in mind, E. F. Risko et al. developed a tool to help address it: the Collaborative Lecture Annotation System (CLAS) [1].

In essence, the authors designed this tool to record students' engagement throughout a video lecture and to give analytical feedback on it. Importantly, the CLAS gives this feedback to both the student and the instructor. Some of the feedback focuses on the individual, allowing instructors to understand each student's engagement with the content. The rest focuses on the group: it lets students compare their engagement with their classmates', and it gives instructors a view of more general tendencies across the class.

The CLAS works similarly to the Microsoft Research Annotation System (MRAS) and VirtPresenter. However, it builds on their functionality with a novel method of annotation tracking that exploits group statistics. The CLAS environment uses instructor-defined segmentations of the lesson to isolate noteworthy sections, or "events", with which students may engage.
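To make the idea of instructor-defined "events" concrete, here is a minimal sketch of how note timestamps could be mapped onto lecture segments. The segment boundaries, labels, and function names are all hypothetical illustrations of the concept, not the actual CLAS implementation.

```python
# Hypothetical sketch: mapping a student's note timestamps onto
# instructor-defined lesson segments ("events").

from bisect import bisect_right

# Instructor-defined segment boundaries, in seconds into the lecture.
# Segment i spans [boundaries[i], boundaries[i+1]).
boundaries = [0, 120, 300, 480, 600]
segment_labels = ["Intro", "Definition", "Worked example", "Summary"]

def segment_for(timestamp):
    """Return the label of the segment containing a note's timestamp."""
    i = bisect_right(boundaries, timestamp) - 1
    i = min(i, len(segment_labels) - 1)
    return segment_labels[i]

# One student's note timestamps (seconds into the video).
notes = [45, 130, 140, 310, 500]
per_segment = {}
for t in notes:
    label = segment_for(t)
    per_segment[label] = per_segment.get(label, 0) + 1

print(per_segment)  # notes per instructor-defined event
```

Counting notes per segment like this is what lets the system report engagement at the level of "events" rather than raw timestamps.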

This setup allows instructors to identify what has piqued students' engagement throughout a given lesson. Perhaps a particular PowerPoint slide, or worked example, prompts a flurry of note-taking across the class; or perhaps a slide that the instructor sees as crucial to the final exam prompts only a few students to take notes. In either situation, knowledge of the students' engagement is invaluable, enabling the instructor to adapt (or otherwise act) in the most productive way.

So how does a student interact with the CLAS? Sitting down to work through a pre-recorded video lecture, the student watches and takes notes as usual. The note-taking, however, takes place within the virtual environment rather than on paper. Confining note-taking to this environment allows the system to track effectively which lesson elements prompt engagement. This initial use of the CLAS is not the novel part of the development: as mentioned above, similar features are available through MRAS and VirtPresenter.

The novelty lies in the "post-lesson" stage. Once the lesson is complete, each student can see a "group graph" illustrating where their classmates took notes (the system does not share the notes themselves). For a particular student, the CLAS could highlight: "This is where your classmates took a lot of notes, but you didn't take any" - giving the student effective, rapid feedback on their engagement. In the implementation discussed in E. F. Risko et al.'s work [1], this comparison is presented visually, but the authors see reasonable scope for making it numerical instead.
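The "group graph" idea can be sketched as a simple aggregation: total the class's notes per segment, then flag segments where classmates took many notes but a given student took none. All names, data, and the threshold below are hypothetical; the real CLAS presents this comparison visually rather than through code like this.

```python
# Illustrative sketch of the "group graph" comparison, with segments
# identified by index (0, 1, 2, ...).

from collections import Counter

# Per-student note counts, keyed by segment index.
class_notes = {
    "alice": Counter({0: 1, 1: 3, 2: 2}),
    "bob":   Counter({0: 2, 2: 4}),
    "carol": Counter({1: 1, 2: 3, 3: 1}),
}

def group_graph(class_notes):
    """Total notes per segment across the whole class."""
    totals = Counter()
    for counts in class_notes.values():
        totals.update(counts)
    return totals

def missed_segments(student, class_notes, threshold=3):
    """Segments with at least `threshold` class notes but none from `student`."""
    totals = group_graph(class_notes)
    own = class_notes[student]
    return sorted(s for s, n in totals.items()
                  if n >= threshold and own.get(s, 0) == 0)

print(group_graph(class_notes))             # class-wide engagement per segment
print(missed_segments("bob", class_notes))  # segments bob might revisit
```

Note that only the aggregate counts are exposed, which mirrors the paper's point that classmates' notes themselves are never shared.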

But how does this system look from the instructor's perspective? In a nutshell, the first advantage of the CLAS is that it helps the instructor reflect on the efficacy of the resources they deliver. The system, in effect, reveals what students understand to be relevant to them - the instructor can then compare this with their own intentions.

There is also scope, subject only to data-management considerations, for instructors to keep records from year to year - allowing them to evaluate their performance and the improvement of their content. This feature could also help instructors spot potential outliers in their class populations, which leads to the second advantage of the system: it empowers instructors to intervene early to help struggling students.
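One way such outlier-spotting might work is a simple statistical check on stored engagement records. The statistic, cutoff, and data below are assumptions made purely for illustration; the paper does not specify how outliers would be detected.

```python
# Hypothetical sketch: flagging students whose total note count sits
# well below the class mean (a possible early-intervention signal).

from statistics import mean, stdev

total_notes = {
    "alice": 42, "bob": 38, "carol": 45, "dave": 40,
    "erin": 36, "frank": 5,  # frank barely engaged
}

def flag_low_engagement(totals, z_cutoff=-1.5):
    """Return students whose note count is well below the class mean."""
    values = list(totals.values())
    mu, sigma = mean(values), stdev(values)
    return sorted(name for name, n in totals.items()
                  if (n - mu) / sigma < z_cutoff)

print(flag_low_engagement(total_notes))
```

A flagged name is only a prompt for the instructor to follow up, not a judgement in itself; low note counts can have many causes.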

To evaluate the system's effectiveness, the authors asked a group of students to try it out. None of these students had previously taken part in a class using video lectures. They were asked to watch a video lecture and were then given time to study for a test on the topic (using the CLAS). The purpose of the study was to assess the user experience throughout the process, so the authors do not present the test results, although they do present the student feedback. In a majority of instances, the students reported that they:

  • Found the CLAS easy to navigate and use;
  • Used the individual statistics as well as the group statistics; and,
  • Thought the CLAS was a useful tool that they would like to use in the future.

Additionally, the students involved in the study gave useful suggestions, many of which concerned user-interface issues such as adding Pause/Rewind buttons. Even while making such suggestions for improvement, the students still thought highly of the system - a very encouraging result for the developers of the CLAS.

Finally, some issues may be worth considering. There may be a significant capital cost associated with implementing a system such as this, whether in the form of a software license fee or the development of a proprietary version. Additionally, the study does not explore the perceptions of students who HAVE taken part in classes using video lectures; in theory, in a side-by-side comparison, students might prefer - and perform better with - the "old-fashioned" method of note-taking. It is worth noting that this system (first developed in 2013) has since undergone significant evolution and will be considered in future case studies. Given the current landscape of remote learning, it appears that tools such as this are only going to thrive in the near future!

References:

[1] Risko, E. F., et al. (2013). "The Collaborative Lecture Annotation System (CLAS): A New Tool for Distributed Learning." IEEE Transactions on Learning Technologies, 6(1), 4-13.
DOI: 10.1109/TLT.2012.15