Student Voice

Collaborative Learning: Understanding Students’ Engagement

By Daniel Johnston

One issue that frequently presents difficulties in teaching is gauging how much students are engaging with lessons. As we move to online, pre-recorded content, this is an ever-growing concern. With this matter in mind, E.F. Risko et al. developed a tool to help: the Collaborative Lecture Annotation System (CLAS) [1].

In essence, the authors designed this tool to record and give analytical feedback on students' engagement throughout a video lecture. Importantly, the CLAS gives this feedback to both the student and the class instructor. Some of the feedback focuses on the individual, allowing instructors to understand each student's engagement with the content. The rest focuses on the group: empowering students to compare their engagement to their classmates', while allowing instructors to view more general tendencies across the class.

The CLAS works similarly to the Microsoft Research Annotation System (MRAS) and VirtPresenter. However, it builds on their functionality with a novel method of annotation tracking and by exploiting group statistics. The CLAS environment uses instructor-defined segmentations of the lesson to isolate noteworthy sections, or "events", with which students may engage.
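As a rough illustration of this segmentation idea (the names, boundaries, and labels below are my own assumptions, not details from the paper), one could map each note's timestamp onto the instructor-defined "event" it falls within:

```python
from bisect import bisect_right

# Hypothetical sketch: instructor-defined event boundaries, in seconds
# from the start of the lecture. Values and labels are illustrative only.
segment_starts = [0, 120, 300, 540]
segment_labels = ["Intro", "Slide 2", "Worked example", "Summary"]

def segment_for(timestamp: float) -> str:
    """Return the label of the event a note's timestamp falls into."""
    # bisect_right finds the first boundary after the timestamp;
    # the note belongs to the event that starts just before it.
    return segment_labels[bisect_right(segment_starts, timestamp) - 1]

print(segment_for(45))    # Intro
print(segment_for(310))   # Worked example
```

With notes bucketed per event like this, engagement can be counted against the instructor's own structure for the lesson rather than against raw timestamps.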

This setup allows instructors to identify what has piqued students' engagement throughout a given lesson. Perhaps a particular PowerPoint slide, or worked example, prompts a flurry of note-taking across the class; or perhaps a slide that the instructor sees as crucial to the final exam prompts only a few students to take notes. In either of these situations, knowledge of the students' engagement is invaluable: enabling the instructor to adapt (or otherwise act) in the most productive way that they can.

So how does a student interact with the CLAS? Sitting down to work through a pre-recorded video lecture, the student watches and makes notes as usual. The note-taking takes place in the virtual environment (rather than on paper). Confining note-taking to this environment allows the system to track effectively which lesson elements prompt engagement. This initial use of the CLAS, however, is not the novel part of the development: as previously mentioned, these features are also available through MRAS and VirtPresenter.

The novelty lies in the "post-lesson" stage. Once the lesson is complete, each student can see a "group graph" illustrating where their classmates took notes (the system does not share the content of the notes themselves). For a particular student, the CLAS could highlight: "This is where your classmates took a lot of notes, but you didn't take any" - giving the student effective, rapid feedback on their engagement. In the implementation discussed by E.F. Risko et al. [1], the comparison given to students is visual; however, the authors see reasonable scope for making it numerical instead.
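To make this "group graph" idea concrete, here is a minimal sketch (student names, data layout, and function names are all assumptions for illustration) of aggregating note counts per event and spotting where one student diverges from the class:

```python
from collections import Counter

# Illustrative data: each student's notes recorded as the event they
# were attached to. None of this comes from the paper itself.
class_notes = {
    "alice": ["Slide 2", "Slide 2", "Worked example"],
    "bob":   ["Worked example", "Worked example", "Summary"],
    "carol": ["Slide 2", "Worked example"],
}

def group_graph(notes_by_student):
    """Total note count per event across the class -- the 'group graph'."""
    totals = Counter()
    for notes in notes_by_student.values():
        totals.update(notes)
    return totals

def gaps_for(student, notes_by_student):
    """Events where classmates took notes but this student took none."""
    own = set(notes_by_student[student])
    others = group_graph(
        {s: n for s, n in notes_by_student.items() if s != student})
    return [event for event in others if event not in own]

print(group_graph(class_notes))
print(gaps_for("alice", class_notes))  # ['Summary']
```

Note that only the counts per event are shared here, never the note contents, matching the privacy behaviour described above.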

But how does this system look from the instructor's perspective? In a nutshell, the first advantage of the CLAS is that it helps the instructor reflect on the efficacy of the resources delivered. This is the case because the system is (in effect) determining what students understand to be relevant to them - the instructor can then take this determination and compare it to their intentions.

There is also scope, subject only to data-management issues, for instructors to keep records from year to year - allowing them to evaluate improvements in their content and delivery. This feature could also help instructors spot potential outliers in their class populations, which leads to the second advantage of the system: it empowers instructors to intervene early to help struggling students.
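One simple way such outlier-spotting might work (this is my own hedged sketch, not a mechanism described in the paper) is to flag students whose total note count sits far below the class average:

```python
from statistics import mean, pstdev

# Hypothetical data: total notes taken per student across a lesson.
note_counts = {"alice": 14, "bob": 12, "carol": 13, "dave": 2}

def flag_outliers(counts, z_cutoff=1.5):
    """Flag students whose engagement is well below the class mean."""
    mu = mean(counts.values())
    sigma = pstdev(counts.values())
    # Guard against a zero spread (everyone identical).
    return [s for s, c in counts.items()
            if sigma and (c - mu) / sigma < -z_cutoff]

print(flag_outliers(note_counts))  # ['dave']
```

Any real deployment would of course need a more careful measure than raw note counts, but the principle - an early, automatic cue for intervention - is the same.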

To evaluate the effectiveness of the system, the authors asked a group of students to try it out. None of these students had previously taken a class using video lectures. They were asked to watch a video lecture and were then given time to study for a test on the topic (using the CLAS). The purpose of the study was to assess the user experience throughout the process, so the authors don't present the test results, although they do present the student feedback. In the majority of instances, the students:

  • Found the CLAS easy to navigate and use;
  • Used the individual statistics as well as the group statistics; and,
  • Thought the CLAS was a useful tool that they would like to use in the future.

Additionally, the students involved in the study gave useful suggestions, many of which concerned user-interface issues such as adding Pause/Rewind buttons. Even while making such suggestions, the students still thought highly of the system - a very encouraging result for the developers of the CLAS.

Finally, some issues may be worth considering. There may be a significant capital cost associated with implementing a system such as this, whether in the form of a software licence fee or the development of a proprietary version. Additionally, this study does not explore the perceptions of students who HAVE taken part in classes using video lectures; in theory, a side-by-side comparison might show that students prefer - and perform better with - the "old-fashioned" method of note-taking. It is worth noting that this system (first developed in 2013) has since undergone significant evolution and will be considered in future case studies. Given the current landscape of remote learning, it appears that tools such as this are only going to thrive in the near future!


Q: How does the CLAS system specifically incorporate Student Voice in its design and feedback mechanisms?

A: The Collaborative Lecture Annotation System (CLAS) incorporates Student Voice primarily through its feedback mechanisms. By providing individual and group feedback based on engagement data, the system allows students to see how their engagement compares with their peers, encouraging reflection and self-assessment. While the original blog post does not detail specific methods of incorporating student input into the system's development, the inclusion of student feedback on the user interface and functionality suggests that Student Voice plays a role in the system's ongoing improvement. This feedback loop ensures that the system evolves in response to student needs and preferences, making it a more effective tool for enhancing engagement and learning.

Q: What are the privacy implications of tracking and comparing student engagement through note-taking in the CLAS system?

A: The privacy implications of using the CLAS system involve concerns around how student data—specifically engagement levels and note-taking habits—are handled. The system tracks individual and group engagement without sharing the content of notes, aiming to respect student privacy. However, ensuring privacy requires clear policies on consent, data storage, and access. The system must securely manage engagement data to prevent misuse and ensure that students feel safe and supported. While the blog post does not provide details on these aspects, addressing privacy concerns is crucial for maintaining trust in educational technologies that analyse and compare student engagement.

Q: How do the CLAS system's text analysis capabilities contribute to understanding and improving student engagement?

A: The text analysis capabilities of the CLAS system play a crucial role in understanding and improving student engagement by analysing the content and context of students' notes during video lectures. Although the blog post does not delve into the specifics of the text analysis process, such capabilities likely enable the system to identify key themes, questions, and areas of interest or confusion among students. By examining the depth and focus of students' notes, the system can provide nuanced feedback to both students and instructors, highlighting areas where students are most engaged or where they may need further clarification. This level of analysis helps tailor the learning experience to meet students' needs more effectively and promotes a more interactive and responsive educational environment.


[1] Risko, E. F., et al. (2013). "The Collaborative Lecture Annotation System (CLAS): A New Tool for Distributed Learning." IEEE Transactions on Learning Technologies 6(1): 4-13.
DOI: 10.1109/TLT.2012.15
