Staff-student partnerships to enhance assessment literacy

By Anosh Butt

Deeley and Bovill (2017) found that staff-student partnerships are relatively uncommon in relation to assessment. Reviewing the literature, the authors emphasise that assessment and feedback are the weakest links in learning and teaching and the most significant contributors to student dissatisfaction (Rust et al., 2005). Students may lack clarity regarding assessment requirements and marking criteria, making it difficult for them to use feedback to improve skills applicable to future assignments (Deeley and Bovill, 2017).

Previous research has proposed various models, such as effective communication strategies in feedback so that students know (a) what the elements of good performance in a particular assessment are, (b) how they have performed in the assessment, and (c) what they need to do to bridge the gap between (a) and (b) (Sadler, 1989). The implementation of these models relies on students being conversant in the language of their academic discipline, the language of assessment, and assessment literacy (Deeley and Bovill, 2017).

These models were coupled with an increased interest in staff and students working as partners in learning and teaching. In a staff-student partnership, there may be concerns about the meaningful participation of students as partners in assessment, given the validatory nature of assessment traditionally associated with the domain of the teacher; however, assessing what students have learned is also an opportunity to design assessment for learning (Cook-Sather et al., 2014). The study explores student participation with academic staff in assessment processes to co-create different assessment and feedback elements (Deeley and Bovill, 2017).


To investigate staff-student partnerships, student perspectives on learning during such partnerships were explored. These partnerships entailed engaging students as co-designers within assessment and feedback processes (Deeley and Bovill, 2017). The participants were undergraduate MA Social Sciences students at a Scottish university, taking two optional Public Policy honours courses taught in the first and second semesters of 2013/14. The classroom environment was mixed, comprising third- and fourth-year students alongside international students visiting for one semester. The courses were assessed by a 3000-word essay, weighted at 40% of the overall grade, and a two-hour examination weighted at 60%.

The areas of assessment and feedback where a partnership approach was adopted in the first semester are as follows: (1) staff-student co-creation of students’ essay titles; (2) staff-student co-creation of essay marking criteria; (3) students’ formative self-assessment of their essays, using the co-designed marking criteria, which was later compared with the teacher’s feedback on the essays; (4) staff-student co-creation of formative and summative examination marking criteria; and (5) student peer review of the formative examination using the agreed co-designed marking criteria (Deeley and Bovill, 2017).

In the second semester, the staff-student partnership consisted of a typed summative examination followed by online feedback on the final examination, using previously agreed marking criteria consistent with the format of the formative examination feedback. Notably, providing summative examination feedback was not usual practice at the university, as marked examination papers had previously not been returned to students (Deeley and Bovill, 2017).

Measurable impact

The research findings showed a successful staff-student partnership. For some students, it was an unfamiliar approach and came across as a curious arrangement. The courses were optional, and it was made clear to students that if they did not feel comfortable with the assessment methods, they could opt for a different course (Deeley and Bovill, 2017). The research contributes substantially to the growing case for enhancing assessment literacy, which is essential for all participants in assessment practice.

More frequent staff-student partnerships, in which students are contributing partners in the educational community, would challenge conventional pedagogy and assumptions about learning, and would build a body of work ensuring that assessment is a powerful process for learning, not just a test of accumulated knowledge. Deeley and Bovill’s (2017) research was a very specific and small case study that attempted to engage students actively and meaningfully with their learning through assessment. Prolonged staff-student partnerships would enable and enhance democratic classroom practice. Co-creating marking criteria, followed by students formatively self-assessing their essays, enabled students to compare their assessment with the teacher’s, helping them to understand the assessment requirements fully. Deeley and Bovill (2017) believed this was more effective than merely asking students to self-assess their work, because it allowed them to gauge their ability against criteria they were familiar with and to calibrate their self-assessment against their peers’ and teacher’s assessments.

However, staff-student partnerships may not be appropriate in all learning situations, and they may lie outside the comfort zone of some students and teachers. The research concludes by offering practical recommendations: (1) start small, designing partnership into one assessment or one element of assessment; (2) ensure partnership is voluntary and alternatives are available for students who do not wish to take part; and (3) think proactively about ensuring inclusive approaches for all participating students (Deeley and Bovill, 2017). Staff-student partnerships that enhance students’ assessment literacy through democratic practice are innovative because they facilitate intrinsic motivation, active engagement, and deeper approaches to learning.


Q: How do students' perceptions of their own academic capabilities change as a result of engaging in staff-student partnerships?

A: Engaging in staff-student partnerships tends to positively influence students' perceptions of their academic capabilities. Through active involvement in the assessment process, including the co-creation of essay titles and marking criteria, students gain a deeper understanding of what is expected of them. This involvement helps demystify the assessment process, making it less about meeting arbitrary standards and more about personal growth and achievement. The use of student voice in these partnerships ensures that students feel heard and valued, which can boost their confidence and sense of agency. Furthermore, text analysis of students' reflective writings or feedback could reveal increases in self-efficacy, as students articulate their learning processes and outcomes more confidently and accurately. This approach not only enhances their understanding of the subject matter but also develops their assessment literacy, allowing them to better evaluate their own work and the work of their peers.

Q: What are the challenges and limitations of implementing staff-student partnerships in diverse educational settings?

A: Implementing staff-student partnerships in diverse educational settings presents several challenges and limitations. One significant challenge is ensuring that all students feel comfortable and willing to engage in these partnerships. Differences in cultural backgrounds, educational experiences, and personal preferences can affect students' willingness to participate actively. Some may find the approach too radical or outside their comfort zone, preferring more traditional, teacher-led assessment methods. Another limitation is the resource intensity of such programmes, requiring significant time and effort from both staff and students to co-create assessment criteria, titles, and feedback mechanisms. Additionally, there's the challenge of ensuring that the student voice is genuinely heard and valued, rather than the process being dominated by staff perspectives. Text analysis could help identify these challenges by analysing feedback from students across various settings, enabling educators to adapt the partnerships to better meet the needs of a diverse student body.

Q: How does the inclusion of student voice in the assessment process impact the design of future curricula and assessment strategies?

A: The inclusion of student voice in the assessment process can have a profound impact on the design of future curricula and assessment strategies. When students are actively involved in creating assessment criteria and feedback mechanisms, educators gain valuable insights into how students understand and engage with the material. This can lead to adjustments in teaching methods, the introduction of new topics, or changes in how subjects are assessed to better align with students’ needs. Over time, this collaborative approach can lead to a more dynamic and responsive curriculum that better prepares students for the challenges they will face in their academic and professional lives. Text analysis of student feedback and assessments can play a crucial role in this process, providing educators with a detailed understanding of student experiences and perceptions. This data-driven approach ensures that changes to the curriculum and assessment strategies are grounded in evidence, enhancing the overall educational experience for future students.


[Source] Deeley, S. J., & Bovill, C. (2017). Staff student partnership in assessment: enhancing assessment literacy through democratic practices. Assessment & Evaluation in Higher Education, 42(3), 463-477.
DOI: 10.1080/02602938.2015.1126551

[1] Cook-Sather, A., C. Bovill, and P. Felten. 2014. Engaging Students as Partners in Learning and Teaching. San Francisco, CA: Jossey-Bass.

[2] Rust, C., B. O’Donovan, and M. Price. 2005. “A Social Constructivist Assessment Process Model: How the Research Literature Shows us This Could be Best Practice.” Assessment & Evaluation in Higher Education 30 (3): 231–240.
DOI: 10.1080/02602930500063819

[3] Sadler, D. R. 1989. “Formative Assessment and the Design of Instructional Systems.” Instructional Science 18: 119–144.
DOI: 10.1007/BF00117714
