Updated Mar 16, 2026
When students watch a recorded lecture, how do you tell whether they are genuinely engaged or simply letting the video run in the background? That question has become harder to answer as more teaching moves online, especially in modules where sustaining engagement requires deliberate design, which is why Risko et al. developed the Collaborative Lecture Annotation System (CLAS) [1].
In essence, the authors designed CLAS to record note-taking activity during a video lecture and turn it into feedback on student engagement. Importantly, that feedback goes to both the student and the instructor. Some of it is individual, helping instructors understand how one student engaged with the content. Some of it is collective, helping both students and staff see broader patterns across the class.
CLAS works similarly to the Microsoft Research Annotation System (MRAS) and VirtPresenter. However, it extends that functionality through a more deliberate approach to annotation tracking and the use of group statistics. The CLAS environment uses instructor-defined segments of the lesson to isolate noteworthy sections, or "events", with which students may engage.
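The paper does not publish CLAS's internals, but the segment-based design described above can be sketched in a few lines. The segment labels, timings, and function names below are hypothetical illustrations, not the system's actual data model: each note's timestamp is simply binned into the instructor-defined segment that contains it.

```python
from bisect import bisect_right

# Hypothetical instructor-defined segments ("events"):
# (start_time_in_seconds, label) pairs, sorted by start time.
segments = [
    (0, "Introduction"),
    (120, "Worked example"),
    (360, "Key theorem"),
    (600, "Summary"),
]

def segment_for(timestamp):
    """Return the label of the segment containing a note's timestamp."""
    starts = [start for start, _ in segments]
    i = bisect_right(starts, timestamp) - 1  # last segment starting at or before it
    return segments[i][1]

print(segment_for(45))   # a note taken 45s in falls in "Introduction"
print(segment_for(400))  # a note at 400s falls in "Key theorem"
```

Binning by segment rather than raw timestamp is what lets the system report engagement against the units the instructor actually cares about, such as a particular slide or worked example.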
That setup gives instructors a practical way to spot what is actually capturing attention during a lesson. A particular PowerPoint slide or worked example might trigger a burst of note-taking across the class, while a slide the instructor considers essential might attract very little. In either case, the benefit is clear: the instructor gains evidence they can use to reinforce, revisit, or redesign the material.
So how does a student interact with CLAS? Sitting down to work through a pre-recorded video lecture, the student watches and makes notes inside the virtual environment rather than on paper. Because the note-taking happens within the system, CLAS can track which lesson elements prompt engagement. This initial use of CLAS is not the most novel part of the development. As noted above, similar functionality is already available through MRAS and VirtPresenter.
The real novelty appears in the post-lesson stage. Once the lesson is complete, each student can see a "group graph" showing where their classmates took notes, without revealing what anyone actually wrote. For a particular student, CLAS could effectively signal: "your classmates took a lot of notes here, but you did not". That creates quick, useful feedback on engagement. In the implementation discussed in Risko et al. [1], this comparison is visual, although the authors suggest there is scope to make it numerical instead.
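A minimal sketch of how such a "group graph" could be assembled, assuming notes have already been mapped to segment labels. The student names, segment labels, and helper function are invented for illustration; the key property, shared with CLAS, is that only counts per segment are aggregated and note content is never exposed.

```python
from collections import Counter

# Hypothetical per-student note locations (segment labels only, no note text).
class_notes = {
    "student_a": ["Introduction", "Worked example", "Worked example"],
    "student_b": ["Worked example", "Key theorem"],
    "student_c": ["Worked example"],
}

# The "group graph" data: how many notes the whole class made per segment.
group_counts = Counter(seg for notes in class_notes.values() for seg in notes)

def engagement_gap(student):
    """Segments where classmates took notes but this student did not."""
    own = set(class_notes[student])
    return [seg for seg in group_counts if seg not in own]

print(group_counts["Worked example"])  # 4 notes across the class
print(engagement_gap("student_c"))     # segments this student skipped
```

This is also where the authors' suggested numerical variant would slot in: the same `group_counts` could be reported as figures rather than rendered as a graph.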
But how does this system look from the instructor's perspective? In a nutshell, the first advantage of CLAS is that it helps instructors reflect on how effective their teaching resources really are. Because the system shows what students treated as relevant, instructors can compare those patterns with what they intended to emphasise.
There is also scope, subject to data management issues, for instructors to keep records from one year to the next. That would allow them to evaluate changes in performance and content, and to spot potential outliers in a cohort. This leads to the second advantage of the system: it supports earlier intervention when students may be struggling.
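One simple way an instructor could operationalise that outlier-spotting, sketched here as an assumption rather than anything CLAS ships: compare each student's total note count against the cohort using a z-score, and flag those far below the mean. The counts and threshold are illustrative.

```python
from statistics import mean, stdev

# Hypothetical total note counts per student for one cohort.
note_counts = {"a": 42, "b": 38, "c": 45, "d": 40, "e": 5}

def flag_low_engagement(counts, threshold=-1.5):
    """Return students whose note count sits well below the cohort mean."""
    mu = mean(counts.values())
    sigma = stdev(counts.values())
    return [s for s, n in counts.items() if (n - mu) / sigma < threshold]

print(flag_low_engagement(note_counts))  # ['e'] — a candidate for early support
```

A flag like this is a prompt for a conversation, not a verdict: low note-taking may reflect a different study style rather than disengagement, which is why the human-in-the-loop framing matters.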
To evaluate the effectiveness of the system, the authors asked a group of students to try it out. These students had never previously participated in a class using video lectures. They were asked to watch a video lecture and were then given time to study for a test using CLAS. The purpose of the study was to examine the user experience throughout the process. As a result, the authors do not present the test results, but they do report the student feedback, which was largely positive.
Additionally, the students involved in the study gave useful feedback. Many of their suggestions focused on interface issues, such as adding pause and rewind buttons. Even so, students still thought highly of the system. That is an encouraging result for the developers because it suggests the core idea was valuable before the interface was fully refined.
Finally, some issues are worth considering. There may be a significant capital cost associated with implementing a system such as this, whether that comes through a software licence fee or the development of a proprietary alternative. In addition, this study does not explore the perceptions of students who have already taken part in classes using video lectures. In theory, a side-by-side comparison could still show that some students prefer, and perform better with, more traditional note-taking. Even so, the broader lesson remains persuasive. If institutions want a clearer view of engagement during recorded teaching, tools like CLAS offer a practical route to that evidence. The system has also evolved since its first development in 2013, so this remains a space worth watching.
Q: How does the CLAS system specifically incorporate Student Voice in its design and feedback mechanisms?
A: The Collaborative Lecture Annotation System (CLAS) incorporates Student Voice primarily through its feedback mechanisms. By providing individual and group feedback based on engagement data, the system allows students to see how their engagement compares with their peers, encouraging reflection and self-assessment. While the original blog post does not detail specific methods of incorporating student input into the system's development, the inclusion of student feedback on the user interface and functionality suggests that Student Voice in higher education plays a role in the system's ongoing improvement. This feedback loop ensures that the system evolves in response to student needs and preferences, making it a more effective tool for enhancing engagement and learning.
Q: What are the privacy implications of tracking and comparing student engagement through note-taking in the CLAS system?
A: The privacy implications of using the CLAS system revolve around how student data, specifically engagement levels and note-taking habits, are handled. The system tracks individual and group engagement without sharing the content of notes, which helps protect privacy. However, that protection still depends on clear policies for consent, data storage, and access. The system must manage engagement data securely to prevent misuse and ensure that students feel safe and supported. While the blog post does not provide details on these areas, addressing privacy concerns is crucial for maintaining trust in educational technologies that analyse and compare student engagement.
Q: How do the CLAS system's text analysis capabilities contribute to understanding and improving student engagement?
A: The blog post focuses more on annotation tracking than on a detailed text analysis workflow, so it does not explain this capability in depth. Even so, the system's note and annotation data could still help instructors understand where students concentrate, where they hesitate, and which parts of a lecture attract the most attention. Used well, that kind of evidence can support more targeted teaching changes and more responsive learning design.
[1] Risko, E. F., et al. (2013). "The Collaborative Lecture Annotation System (CLAS): A New Tool for Distributed Learning." IEEE Transactions on Learning Technologies, 6(1), 4–13. DOI: 10.1109/TLT.2012.15