Student Voice

Supporting the Less Adaptive Student

By Christine Enowmbi Tambe

1. Introduction

The internationalisation of higher education brings together students from a wide range of secondary educational systems, each with its own instructional methodologies, and this can result in unequal opportunities: students entering a challenging quantitative methods module at the start of their bachelor's degree may underperform because they lack critical prior knowledge or learning dispositions. Furthermore, students educated with teacher-centred pedagogies may face significant obstacles when transitioning to problem-based learning, where students carry the primary responsibility for their own learning process. Fortunately, previous research (1) has highlighted the importance of feedback in supporting underperforming students.

Instructors and students can obtain feedback from learning analytics or from exam performance. Compared with exam performance, learning analytics is timelier and more actionable because it is based on learning activity data: trace variables systematically collected from e-learning systems, such as the number of problems a student solved, the proportion of problems solved successfully, how many and what type of scaffolds (hints, worked examples) the student used, and the time-on-task. Learning analytics systems generate performance predictions from this learning activity data (or from its absence) to identify students in need of assistance, and provide both instructors and learners with relevant feedback through dashboards that report on learning progress, study tactics, and how effective those tactics have been. Tempelaar (2) discusses how dispositional learning analytics can be used to assist students who are underperforming because they lack appropriate prior knowledge or because their preferred learning approaches are at odds with problem-based learning principles. Tempelaar (2) also argues that digital learning platforms based on the mastery learning concept, which integrate assessment as, for, and of learning and provide both learners and teachers with extensive learning feedback, can play a vital role in addressing the problem of unequal opportunities.
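To illustrate how such trace variables can feed a performance prediction, the sketch below fits a simple logistic regression on synthetic learning activity data and flags students with a low predicted probability of passing. The feature names, the synthetic data, the model choice, and the risk threshold are all illustrative assumptions, not the modelling approach of the cited studies.

```python
# Minimal sketch: predicting at-risk status from e-tutorial trace variables.
# Feature names and data are hypothetical; real learning analytics pipelines
# would use validated data and richer prediction models.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Trace variables of the kind described above (all synthetic).
trace = pd.DataFrame({
    "problems_attempted": rng.integers(5, 120, n),
    "share_solved":       rng.uniform(0.2, 1.0, n),
    "hints_used":         rng.integers(0, 40, n),
    "worked_examples":    rng.integers(0, 20, n),
    "time_on_task_hrs":   rng.uniform(1, 30, n),
})

# Synthetic outcome: passing the module, loosely tied to activity and success.
passed = (0.5 * trace["share_solved"] +
          0.004 * trace["problems_attempted"] +
          rng.normal(0, 0.15, n)) > 0.5

model = LogisticRegression(max_iter=1000).fit(trace, passed)

# Probability of passing; students below a chosen threshold are flagged
# for feedback or an instructor intervention.
trace["p_pass"] = model.predict_proba(trace)[:, 1]
at_risk = trace[trace["p_pass"] < 0.6]
print(f"{len(at_risk)} students flagged as at risk")
```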

2. Instructional methods used in the case study

2.1. The role of dispositional learning analytics

A dispositional learning analytics infrastructure combines learning activity data gathered from online learning environments with learner dispositions: the experiences, values, and attitudes that influence engagement with learning, measured through self-report surveys. Its goal is to make feedback even more actionable by connecting it to relevant educational interventions and to uncover the mechanisms through which learning analytics can support students of various profiles, including those who are underperforming.
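A minimal sketch of how self-reported disposition scores might be joined with trace data into a single analytics table is given below; the variable names, the survey scales, and the merge-on-student-id design are illustrative assumptions, not the actual infrastructure described by Tempelaar (2).

```python
# Sketch: combining self-reported dispositions with e-tutorial trace data.
# Column names are hypothetical; in practice, validated disposition
# questionnaires would supply the survey scores.
import pandas as pd

dispositions = pd.DataFrame({
    "student_id":          [1, 2, 3],
    "deep_learning":       [4.2, 2.8, 3.5],   # survey scale scores
    "self_regulation":     [3.9, 2.5, 3.1],
    "stepwise_learning":   [2.1, 4.4, 3.0],
    "external_regulation": [2.4, 4.1, 3.3],
})

trace = pd.DataFrame({
    "student_id":       [1, 2, 3],
    "problems_solved":  [80, 35, 55],
    "hints_used":       [5, 22, 12],
    "time_on_task_hrs": [14.0, 9.5, 11.2],
})

# One row per student, combining who the learner is with what the learner did.
analytics_table = dispositions.merge(trace, on="student_id")
print(analytics_table)
```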

2.2. The role of assessment

Timeliness is an important feature of actionable feedback. The desire for immediate feedback does not exclude the use of assessment data: prior research has shown that early assessment data are the best predictor of final module performance. There are three types of assessment, and they work best when combined to provide actionable learning feedback.

2.2.1. Assessment as learning

When learning is assessment-guided, as in many digital learning platforms founded on the notion of mastery learning, the most immediate type of assessment data is learning activity data. Assessment as learning takes the shape of formative e-tutorials, which begin a learning activity by posing a problem and challenging the student to solve it. If the student solves the problem, their mastery level is updated and a new problem is presented. If the student cannot solve the problem on their own, scaffolds are provided, such as worked-out examples or step-by-step guidance for the individual phases of the solution. The learning activity data generated by this kind of learning, available from the very start of the module, thus constitute "assessment as learning" data.
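The control flow of such an e-tutorial can be sketched roughly as below. The mastery update rule (a simple moving estimate) and the logged event names are placeholders for illustration; commercial platforms implement their own adaptive logic.

```python
# Rough sketch of an assessment-as-learning loop: pose a problem, update a
# mastery estimate on success, offer a scaffold on failure, and log every
# step as learning activity data. The update rule is a deliberately simple
# placeholder, not the algorithm of any particular platform.
from dataclasses import dataclass, field

@dataclass
class MasteryTutor:
    mastery: float = 0.3            # crude mastery estimate in [0, 1]
    rate: float = 0.2               # how strongly each attempt moves the estimate
    log: list = field(default_factory=list)

    def record(self, event: str) -> None:
        self.log.append(event)      # trace data for learning analytics

    def attempt(self, solved: bool) -> None:
        if solved:
            self.mastery += self.rate * (1.0 - self.mastery)
            self.record("solved: mastery raised, next problem presented")
        else:
            self.mastery -= self.rate * self.mastery
            self.record("failed: scaffold offered (hint or worked-out example)")

tutor = MasteryTutor()
for outcome in [False, True, True, False, True]:
    tutor.attempt(outcome)

print(f"mastery estimate: {tutor.mastery:.2f}")
print("\n".join(tutor.log))
```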

2.2.2. Assessment for learning

Data on ‘assessment for learning’ can be derived from summative quizzes administered less frequently: in Tempelaar’s case study (2), fortnightly quizzes that carry a small weight in the final grade. Because the quizzes are directly linked to the final exam (assessment of learning), they can help identify students at risk of failing the module early on. The downside is that quizzes involve a greater time lag, making them less suitable for timely intervention.
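For illustration, the sketch below shows how low-stakes quiz scores could both contribute a small weight to the final grade and provide an early at-risk signal. The 20%/80% weighting and the 55-point threshold are hypothetical values, not those of the case study.

```python
# Sketch: fortnightly quiz scores as assessment-for-learning data.
# The quiz weight and the at-risk threshold are illustrative assumptions only.
def final_grade(quiz_scores: list[float], exam_score: float,
                quiz_weight: float = 0.2) -> float:
    """Weighted final grade: small quiz component plus final exam."""
    quiz_avg = sum(quiz_scores) / len(quiz_scores)
    return quiz_weight * quiz_avg + (1 - quiz_weight) * exam_score

def at_risk_after_early_quizzes(quiz_scores: list[float],
                                threshold: float = 55.0) -> bool:
    """Early warning: flag students whose quiz average falls below threshold."""
    return sum(quiz_scores) / len(quiz_scores) < threshold

early_quizzes = [48.0, 52.0]            # first two fortnightly quizzes (0-100)
print(at_risk_after_early_quizzes(early_quizzes))          # True -> follow-up feedback
print(final_grade([48.0, 52.0, 70.0, 75.0], exam_score=68.0))
```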

2.2.3. Assessment of learning

Assessment of learning takes the form of a final written examination.

2.3. The role of blended learning

In Tempelaar’s case study (2), a blended or hybrid teaching method combining face-to-face workshops with technology-enhanced learning was adopted in an introductory mathematics and statistics module for students enrolled in business studies or economics degrees. Face-to-face interaction is the most important component of this blend and is mandatory for students: problem-based learning in small groups of fourteen students, supervised by a subject-expert instructor. Following the student-centred educational principle, use of the e-tutorials in the online component is optional, leaving the primary responsibility for educational decisions with the students. Nonetheless, students with limited prior knowledge can be motivated to make intensive use of the e-tutorials by making the quizzes summative, so that they affect the final grade, and by drawing the quiz questions from the same pool used in the practice sessions.

The student-centred approach in problem-based learning requires, first and foremost, adequate actionable feedback so that students can track their study progress and topic comprehension. The digital platforms are critical in this monitoring function: at any time, students can view their performance in practice sessions, their progress in preparing for the next quiz, and comprehensive feedback on completed quizzes, both in absolute terms and relative to their peers. The blended design was therefore chosen as the instructional format to complement the learning taking place in the problem-based context with learning arrangements suited to student dispositions other than those that problem-based learning requires.
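A minimal sketch of the "relative to peers" part of such dashboard feedback is given below, assuming a simple percentile-rank computation over practice progress; the metric and the numbers are illustrative choices, not the platform's actual dashboard logic.

```python
# Sketch: absolute and relative (peer-referenced) progress feedback.
# The percentile-rank computation and the data are illustrative assumptions.
from bisect import bisect_left

def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Share of peers scoring strictly below the given score, in percent."""
    ranked = sorted(peer_scores)
    return 100.0 * bisect_left(ranked, score) / len(ranked)

peer_progress = [20, 35, 40, 55, 60, 62, 70, 75, 80, 90]  # % of e-tutorial completed
my_progress = 62

print(f"Absolute: {my_progress}% of practice problems completed")
print(f"Relative: ahead of {percentile_rank(my_progress, peer_progress):.0f}% of peers")
```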

3. Outcomes and conclusions

Based on learning approach dispositions developed in students' pre-tertiary educational systems, cluster analysis yielded two profiles that could be usefully compared: one more oriented towards deep learning and self-regulation, the other towards stepwise learning and external regulation. The first profile is considered adaptive in a problem-based learning programme, where students are expected to be largely self-directed; the second is considered less adaptive. Trace data from the e-tutorials were used to understand how students find their own learning paths in the online environment. Students with the less adaptive profile were more inclined to rely on the external regulation (assessment) built into the digital learning platforms. Despite these differences in learning paths, there were no differences in the final module grade. This suggests that combining assessment-guided learning technology with learning analytics-based feedback is crucial in the adaptation to a completely new learning context. Tempelaar (2) does acknowledge, however, that the findings are confined to learning that takes place in digital environments. Learning outside these platforms (self-study, in-person lessons) has characteristics that can only be investigated through laboratory and field studies.
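To make the profiling step concrete, the sketch below clusters synthetic disposition scores into two groups with k-means. The algorithm choice, the variables, and the data are assumptions for illustration only and do not reproduce the analysis in (2).

```python
# Sketch: deriving two learning-disposition profiles with k-means clustering.
# All data are synthetic and the clustering choice is illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Columns: deep learning, self-regulation, stepwise learning, external regulation.
adaptive      = rng.normal([4.0, 3.8, 2.2, 2.3], 0.4, size=(60, 4))
less_adaptive = rng.normal([2.5, 2.4, 4.1, 4.0], 0.4, size=(60, 4))
X = np.vstack([adaptive, less_adaptive])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))

for k in (0, 1):
    print(f"profile {k}: mean disposition scores {X[labels == k].mean(axis=0).round(2)}")
```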

FAQ

Q: How does student voice factor into the development and refinement of learning analytics tools and feedback mechanisms?

A: Student voice plays a critical role in the development and refinement of learning analytics tools and feedback mechanisms by ensuring that the systems are designed to meet the diverse needs and preferences of the student body. When student feedback and input are actively sought and incorporated, it helps in creating more user-friendly and relevant learning analytics platforms. This process can involve gathering student opinions on the effectiveness of feedback, the usability of digital platforms, and their overall learning experience. By prioritising student voice, developers and educators can make informed adjustments to the systems, making them more responsive and tailored to student needs. This approach not only enhances the learning experience but also encourages student engagement and ownership of their learning process.

Q: What role does qualitative text analysis play in understanding and improving student engagement and learning outcomes in a blended learning environment?

A: Qualitative text analysis plays a significant role in understanding and improving student engagement and learning outcomes in a blended learning environment by providing insights into student sentiments, challenges, and experiences that are not easily captured through quantitative data alone. Through the analysis of student-generated texts, such as forum posts and feedback comments, educators can identify common themes, concerns, and perceptions among students. This method allows for a deeper understanding of how students interact with both the digital and face-to-face components of blended learning, what they find beneficial, and what barriers they face. Incorporating text analysis into the evaluation of blended learning environments can lead to more nuanced and effective strategies for enhancing student engagement and tailoring learning experiences to meet the diverse needs of the student population.
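As one hedged illustration of how such text analysis might be operationalised, the sketch below surfaces rough themes from a handful of made-up student comments using a TF-IDF vectoriser and a small topic model; the comments and the method choice (NMF topic modelling) are assumptions, not a description of any study in the references.

```python
# Sketch: surfacing rough themes from student feedback comments.
# The comments are invented and NMF topic modelling is an illustrative choice.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

comments = [
    "the quiz feedback arrived too late to change how I studied",
    "worked examples in the e-tutorial really helped me catch up",
    "face to face tutorials felt rushed and the feedback was vague",
    "hints in the online platform helped but I wanted more practice problems",
    "late feedback on quizzes made it hard to know where I stood",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(comments)

nmf = NMF(n_components=2, random_state=0).fit(X)
terms = tfidf.get_feature_names_out()
for i, topic in enumerate(nmf.components_):
    top = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print(f"theme {i}: {', '.join(top)}")
```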

Q: How are the challenges of ensuring equity and inclusion addressed when implementing dispositional learning analytics and assessments in a diverse student body?

A: Ensuring equity and inclusion when implementing dispositional learning analytics and assessments in a diverse student body involves several strategies to prevent the technologies from inadvertently favouring certain groups over others. This includes designing analytics and assessment tools that are culturally sensitive and adaptable to various learning styles and backgrounds. It is crucial to involve students from diverse backgrounds in the development and testing phases of these tools to identify and mitigate any biases. Furthermore, continuous monitoring and analysis of the data collected by these systems can help identify disparities in student engagement and performance, prompting timely interventions to support underrepresented or disadvantaged students. By actively addressing these challenges, educators can create a more equitable and inclusive learning environment that recognises and supports the diverse needs of its students.

References:

[Source] Tempelaar D. Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education. 2020 May 18;45(4):579-93.
DOI: 10.1080/02602938.2019.1677855

[1] Pardo A. A feedback model for data-rich learning experiences. Assessment & Evaluation in Higher Education. 2018 Apr 3;43(3):428-38.
DOI: 10.1080/02602938.2017.1356905
