Research on Education Text Analysis and Teaching Best Practice
Marisa Graser - Nov 22, 2021
More and more courses are delivered entirely online. Accordingly, existing assessment approaches need to be adjusted to fit the virtual teaching and learning environment (Robles and Braathen, 2002). To test for higher-order skills, most courses rely on the submission of coursework such as essays. However, a number of challenges arise from this form of assessment and the online environment in general: accessibility issues, the legality of the assessment, identity security, and ensuring academic integrity (Akimov and Malin, 2020).
Akimov and Malin (2020) suggest online oral examinations to overcome some of these problems. For practical reasons, this style of assessment is most suitable for smaller, more advanced classes. Akimov and Malin (2020) followed a three-part assessment approach for their postgraduate finance course: An online quiz early in the course to track progress; an applied project with a time limit to test practical concepts taught in the course; and a final oral examination to test communication skills as well as higher-order skills like problem solving and critical thinking.
To prepare students for the oral exam, Akimov and Malin (2020) suggest distributing a list of questions at the beginning of the course that reflect the learning outcomes of every module. Students should prepare the relevant questions before every live lecture, where they should then be encouraged to discuss the questions and actively participate.
In Akimov and Malin’s case, the oral examination was run in 30-minute slots over a two-day period, with students selecting their time slots on a first-come, first-served basis. Each examination followed the same predefined structure.
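As an illustration, the slot allocation described above can be sketched in a few lines of Python. The daily start and end times, the dates, and the booking logic are assumptions for the example, not details from the paper; only the 30-minute length and two-day window come from the source.

```python
from datetime import datetime, timedelta

def generate_slots(days, start_hour=9, end_hour=17, minutes=30):
    """Generate 30-minute examination slots for the given dates (9-17 assumed)."""
    slots = []
    for day in days:
        t = day.replace(hour=start_hour, minute=0)
        end = day.replace(hour=end_hour, minute=0)
        while t + timedelta(minutes=minutes) <= end:
            slots.append(t)
            t += timedelta(minutes=minutes)
    return slots

def book(bookings, slots, student):
    """First-come, first-served: assign the earliest free slot to the student."""
    for slot in slots:
        if slot not in bookings:
            bookings[slot] = student
            return slot
    return None  # no slots left

# Two consecutive exam days (dates are illustrative)
days = [datetime(2021, 11, 22), datetime(2021, 11, 23)]
slots = generate_slots(days)
bookings = {}
first = book(bookings, slots, "student_001")
```

A real deployment would sit behind a shared booking form, but the first-come, first-served rule reduces to exactly this: each request takes the earliest slot still free.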
Akimov and Malin (2020) advise using an online form to compile the mark immediately, without communicating it to the student directly, to avoid delays; instead, marks can be released once all examinations are completed. They also suggest including the following four criteria in the marking rubric: completeness of response (worth 40%), accuracy of response (worth 40%), depth and breadth of knowledge (worth 10%), and oral communication skills (worth 10%).
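The weighted rubric can be expressed as a simple calculation. The weights below come from the rubric; the function name and the 0–100 scale per criterion are assumptions for illustration.

```python
# Rubric weights from Akimov and Malin (2020)
WEIGHTS = {
    "completeness": 0.40,
    "accuracy": 0.40,
    "depth_and_breadth": 0.10,
    "communication": 0.10,
}

def overall_mark(scores):
    """Combine per-criterion scores (0-100 assumed) into a weighted overall mark."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

mark = overall_mark({
    "completeness": 80,
    "accuracy": 70,
    "depth_and_breadth": 90,
    "communication": 85,
})
```

Embedding this calculation in the online form is what removes the delay: the mark is compiled the moment the examiner submits the scores.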
To ensure successful implementation of the oral examination, Akimov and Malin (2020) highlight a few points. Firstly, an identification process should be integrated at the beginning of the exam, for example by showing a valid student ID or government-issued document, to avoid academic misconduct.
Secondly, the communication system should be tested in advance to avoid any technical issues.
Thirdly, marking reliability should be ensured. All examinations could be recorded and a random selection of the recordings moderated by a second examiner, with the selection chosen to represent the full range of marks awarded.
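Selecting recordings that span the range of marks amounts to stratified sampling: divide students into mark bands and draw a few at random from each. The band boundaries, sample size per band, and example marks below are illustrative assumptions, not values from the paper.

```python
import random

def select_for_moderation(marks, per_band=2,
                          bands=((0, 50), (50, 70), (70, 101)), seed=0):
    """Pick a random sample of students from each mark band for second marking."""
    rng = random.Random(seed)
    selected = []
    for low, high in bands:
        band = [s for s, m in marks.items() if low <= m < high]
        # Sample at most per_band students from this band
        selected.extend(rng.sample(band, min(per_band, len(band))))
    return selected

marks = {"s1": 45, "s2": 62, "s3": 68, "s4": 75, "s5": 88, "s6": 52}
sample = select_for_moderation(marks)
```

Sampling per band rather than from the whole cohort guarantees the second examiner sees failing, middling, and strong performances alike.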
Lastly, to relieve some of the anxiety commonly associated with oral exams, mock online assessments could be set up. Akimov and Malin (2020) suggest having students provide feedback on each other’s answers, which increases the learning effect and reduces the workload on the course leader.
Although face-to-face examination is infeasible in online learning, conducting an online oral assessment still makes interaction and discussion possible. Delivery is flexible, and students receive immediate feedback. Using software for the marking process also saves cost and time and shortens the turnaround for marks.
For students, communication skills and confidence are enhanced (Haque et al., 2016), both of which are sought-after by employers (Murillo-Zamorano and Montanero, 2018). Students are also more motivated to learn and understand the subject, while cheating is discouraged (Akimov and Malin, 2020; Bhati, 2012). Overall, oral examinations are a valuable tool to integrate into online courses, especially when combined with other types of assessment.