Insights and resources to support better data analysis in education
By Christine Enowmbi Tambe
The internationalisation of higher education brings together students from diverse secondary educational systems, each with its own instructional methodologies, potentially resulting in unequal opportunities: students entering a challenging quantitative methods module at the start of their bachelor's degrees may underperform because they lack critical prior knowledge or learning dispositions. Furthermore, students educated under teacher-centred pedagogies may face significant obstacles when transitioning to problem-based learning, where students carry the primary responsibility for their own learning process. Fortunately, previous research (1) has highlighted the importance of feedback in supporting underperforming students.
Instructors and students can obtain feedback from learning analytics or from exam performance. Learning analytics is the timelier and more actionable of the two because it is based on learning activity data: trace variables systematically collected from e-learning systems, such as the number of problems a student attempted, the proportion solved successfully, how many and what type of scaffolds (hints, worked examples) the student used while solving them, and time-on-task. Learning analytics systems generate performance predictions from the presence or absence of such learning activity to identify students in need of assistance, and provide both instructors and learners with relevant feedback through dashboards reporting on learning progress, study tactics, and how effective those tactics have been. Tempelaar (2) discusses how dispositional learning analytics can be used to assist students who underperform because they lack appropriate prior knowledge or because their preferred learning approaches are at odds with the problem-based learning principle. He also argues that digital learning platforms based on the mastery learning concept, which integrate assessment as, for, and of learning and provide both learners and teachers with extensive learning feedback, can play a vital role in addressing unequal opportunities.
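To make the prediction step concrete, here is a minimal sketch of how trace variables might be combined into an at-risk flag. All variable names, weights, and thresholds are illustrative assumptions for this sketch, not the actual model of any learning analytics system.

```python
# Hypothetical sketch: flagging at-risk students from e-learning trace data.
# The features mirror the trace variables named in the text; the weights and
# cut-offs below are invented for illustration.

def risk_score(problems_attempted, success_ratio, hints_used, time_on_task_min):
    """Combine trace variables into a 0-1 risk score (higher = more at risk)."""
    score = 0.0
    if problems_attempted < 10:          # little practice activity
        score += 0.4
    if success_ratio < 0.5:              # most attempts fail
        score += 0.3
    if hints_used > problems_attempted:  # heavy reliance on scaffolds
        score += 0.2
    if time_on_task_min < 30:            # very little time invested
        score += 0.1
    return score

students = {
    "A": risk_score(25, 0.8, 5, 120),  # active, mostly successful
    "B": risk_score(4, 0.3, 9, 15),    # inactive and struggling
}
at_risk = [name for name, score in students.items() if score >= 0.5]
print(at_risk)  # only student B crosses the illustrative threshold
```

A real system would learn such weights from historical data rather than hand-code them; the point here is only that simple trace variables already support an early, actionable signal.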
A dispositional learning analytics infrastructure combines learner disposition data (the experiences, values, and attitudes that shape a student's engagement with learning, measured through self-report surveys) with learning activity data gathered from online learning environments. Its goal is to make feedback more actionable by connecting it to relevant educational interventions, and to uncover the mechanisms through which learning analytics can support students of various profiles, including those who are underperforming.
Timeliness is an important feature of actionable feedback: the desire for immediate feedback does not exclude the use of assessment data; prior research has shown that early assessment data is the best predictor of final module performance. There are three different types of assessments, and they work best when combined to provide actionable learning feedback.
When learning is assessment-guided, as in many digital learning platforms founded on the notion of mastery learning, the most immediate type of assessment data is learning activity data. Assessment as learning takes the shape of formative e-tutorials, which begin a learning activity by posing a problem and challenging the student to solve it. If the student solves the problem, their mastery level is updated and a new problem is presented. If the student cannot solve the problem on their own, scaffolds are provided, such as worked-out examples or instructions for the individual steps of the solution. The learning activity data generated by this kind of learning, available from the very start of the module, thus constitute the 'assessment as learning' data.
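The adaptive loop described above can be sketched as follows. The update rule, step size, and scaffold choice are assumptions for illustration, not the platform's actual algorithm; the point is that every pass through the loop leaves behind trace data.

```python
# Illustrative sketch of an assessment-as-learning loop in a mastery-based
# e-tutorial. Parameters and the update rule are invented for this sketch.

SCAFFOLDS = ["hint", "worked example"]  # offered after a failed attempt

def update_mastery(mastery, solved, step=0.1):
    """Nudge the mastery estimate up on success, down on failure (clamped to [0, 1])."""
    mastery += step if solved else -step
    return max(0.0, min(1.0, mastery))

def run_session(attempt_results, mastery=0.5):
    """Replay a sequence of attempt outcomes, logging trace data as we go."""
    trace = []  # this log is the 'assessment as learning' data
    for solved in attempt_results:
        scaffold = None if solved else SCAFFOLDS[0]  # offer a hint on failure
        mastery = update_mastery(mastery, solved)
        trace.append({"solved": solved, "scaffold": scaffold,
                      "mastery": round(mastery, 2)})
    return trace

session = run_session([False, False, True, True])
print(session[-1])  # final mastery estimate after two failures, two successes
```

Real mastery-learning platforms use more sophisticated estimators (for example, Bayesian knowledge tracing), but the shape of the loop, and of the trace it produces, is the same.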
'Assessment for learning' data can be derived from summative quizzes administered less frequently, for example the fortnightly quizzes that carry a small weight in the final grade in Tempelaar's case study (2). Because the quizzes are directly linked to the final exam (assessment of learning), they can help identify students at risk of failing the module early on. The downside is that quizzes have a greater time lag, making them less suitable for timely intervention.
Finally, 'assessment of learning' data come from the final written examination.
In Tempelaar's case study (2), a blended or hybrid teaching method combining face-to-face workshops with technology-enhanced learning was adopted in an introductory mathematics and statistics module for students in business studies or economics degrees. Face-to-face interaction is the most important component of this blend and is mandatory: problem-based learning in small groups of 14 students, supervised by a subject-expert instructor. Following the student-centred educational principle, use of the e-tutorials in the online component is optional, leaving the primary responsibility for educational decisions with the students. Nonetheless, students with limited prior knowledge can be encouraged to use the e-tutorials intensively by making the quizzes summative, so that they affect the final grade, and by drawing quiz questions from the same pool used in the practice sessions.
The student-centred approach of problem-based learning requires, first and foremost, adequate actionable feedback so that students can monitor their study progress and topic comprehension. The digital platforms are critical in this monitoring function: at any time, students can view their performance in the practice sessions, their progress in preparing for the next quiz, and comprehensive feedback on completed quizzes, both in absolute terms and relative to their peers. The blended design was chosen as the instructional format precisely so that the learning taking place in the problem-based context could be complemented by learning formats suited to student dispositions other than those that problem-based learning demands.
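The relative ('compared to peers') part of that dashboard feedback can be illustrated with a tiny percentile computation. The scores and the percentile convention used here are illustrative assumptions.

```python
# Minimal sketch of the relative-feedback computation a dashboard might show:
# a student's quiz score expressed as a percentile of the peer group.

def percentile_rank(score, peer_scores):
    """Percentage of peers scoring at or below `score`."""
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

peers = [55, 62, 70, 74, 78, 81, 85, 90]
print(percentile_rank(74, peers))  # absolute score 74 shown as relative standing
```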
Two profiles, derived through cluster analysis of the learning approach dispositions students developed in their pre-tertiary educational systems, could be usefully compared: one more oriented towards deep learning and self-regulation, the other towards stepwise learning and external regulation. The first profile is considered adaptive in a problem-based learning programme, where students are expected to be largely self-directed; the second is considered less adaptive. Trace data from the e-tutorials were used to understand how students found their own learning paths in the online environment. Students with the less adaptive profile relied more on the external regulation (assessment) built into the digital learning platform. Despite these different learning paths, no differences in final module grade were found. This suggests that combining assessment-guided learning technology with learning analytics-based feedback is crucial in helping students adapt to a completely new learning context. Tempelaar (2) does acknowledge, however, that the findings are confined to learning that occurs within digital environments; learning outside these platforms (self-study, in-person lessons) has characteristics that can only be investigated in laboratory and field studies.
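The clustering step can be illustrated with a tiny two-means sketch over two disposition scores. The scores, scales, and choice of plain k-means are assumptions for illustration; the study's actual survey instruments and clustering method are not reproduced here.

```python
# Hedged sketch: separating two learner profiles with a minimal k-means (k=2)
# on (deep/self-regulated, stepwise/externally regulated) disposition scores.

def kmeans2(points, iters=10):
    """Tiny 2-means over tuples of disposition scores."""
    centroids = [points[0], points[-1]]  # naive initialisation
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            # assign each student to the nearest centroid (squared distance)
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # recompute each centroid as the mean of its cluster
        centroids = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

# (deep learning, external regulation) survey scores on an assumed 1-5 scale
scores = [(4.5, 1.8), (4.2, 2.0), (4.6, 1.5), (2.1, 4.3), (1.9, 4.6), (2.4, 4.1)]
adaptive, less_adaptive = kmeans2(scores)
print(len(adaptive), len(less_adaptive))
```

In practice such profiles come from validated disposition instruments and more careful cluster validation; the sketch only shows how two groups fall out of the score space.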
References
1. Pardo A. A feedback model for data-rich learning experiences. Assessment & Evaluation in Higher Education. 2018 Apr 3;43(3):428-38.
2. Tempelaar D. Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education. 2020 May 18;45(4):579-93.