Updated Apr 08, 2026
Assessment and feedback are often where student dissatisfaction becomes most visible. Deeley and Bovill (2017) show that staff-student partnerships can make assessment clearer, more transparent, and more useful to learners.
In their literature review, the authors note that assessment and feedback remain weak points in learning and teaching, and major contributors to student dissatisfaction (Rust et al., 2005). When students are unclear about assessment requirements or marking criteria, they struggle to interpret feedback and apply it to future work (Deeley and Bovill, 2017).
Earlier research identified feedback models that help students understand three things: what good performance looks like in a given assessment, how their own work measures up, and what they need to do to close the gap between the two (Sadler, 1989). For institutions, that matters because these models depend on students understanding disciplinary language, the language of assessment, and the wider concept of assessment literacy (Deeley and Bovill, 2017).
That context has increased interest in student voice in assessment and feedback, and in staff and students working as partners in learning and teaching. Assessment has traditionally sat within the teacher's domain, so involving students more directly can feel challenging. Yet if assessment is also a tool for learning, not only for validation, then inviting students into its design can improve both understanding and engagement (Cook-Sather et al., 2014). Deeley and Bovill's study explores how students and academic staff can co-create elements of assessment and feedback, giving students a more active role in shaping how learning is judged and improved.
To explore this approach, the study examined how students experienced learning through staff-student partnerships in assessment. Students acted as co-designers in assessment and feedback processes, rather than passive recipients of them (Deeley and Bovill, 2017). The participants were undergraduate MA Social Sciences students at a Scottish university, enrolled on two optional Public Policy honours courses delivered across the first and second semesters of 2013/14.
The classes included third- and fourth-year students, along with international students attending for one semester. Assessment on the courses combined a 3,000-word essay worth 40% of the overall grade and a two-hour examination worth 60%. This gave the partnership work a clear practical focus: students were contributing to real assessment decisions that shaped their learning experience.
In the first semester, the partnership covered five areas of assessment and feedback: (1) co-creating essay titles; (2) co-creating essay marking criteria; (3) carrying out formative self-assessment of essays using the co-designed criteria, then comparing that with teacher feedback; (4) co-creating formative and summative examination marking criteria; and (5) conducting peer review of the formative examination using the agreed criteria (Deeley and Bovill, 2017).
In the second semester, the partnership extended to a typed summative examination followed by online feedback on the final examination, using previously agreed marking criteria aligned with the formative exercise. That was notable because, at the Scottish university in question, marked examination papers were not usually returned to students (Deeley and Bovill, 2017). The practical takeaway is clear: partnership changed not just discussion about assessment, but the assessment experience itself.
The findings suggest that the staff-student partnership was successful, although some students initially saw it as unfamiliar or unusual. Because the courses were optional, students were told they could choose a different course if they were uncomfortable with the approach (Deeley and Bovill, 2017). That voluntary basis mattered because it created space for experimentation without forcing participation.
The wider value of the study lies in what it shows about assessment literacy. More frequent partnerships, in which students contribute as genuine partners, could challenge conventional assumptions about pedagogy and build stronger evidence that assessment can support learning, not simply test accumulated knowledge. Even though this was a small and specific case study, it shows how active student involvement can make expectations clearer and feedback more usable.
One especially useful example was the co-creation of marking criteria followed by formative self-assessment. That process allowed students to compare their own judgments with the teacher's assessment, helping them understand requirements more fully. Deeley and Bovill (2017) argue that this worked better than asking students to self-assess in isolation because students could judge their work against criteria they understood and calibrate that judgment against both peers and staff. For readers working in higher education, that is the core takeaway: partnership can make assessment standards visible instead of implicit.
The study also acknowledges limits. Staff-student partnerships will not suit every learning context, and some students or teachers may find them outside their comfort zone. Deeley and Bovill (2017) end with practical recommendations: start small by redesigning one assessment or one element of assessment; keep participation voluntary and offer alternatives; and plan proactively for inclusive participation. Those recommendations make the model more realistic for institutions that want to improve assessment literacy without overhauling every course at once.
Q: How do students' perceptions of their own academic capabilities change as a result of engaging in staff-student partnerships?
A: Engaging in staff-student partnerships can improve students' confidence in their academic capabilities because it gives them a clearer view of what good performance looks like and how their work is judged. When students help shape essay titles, marking criteria, and feedback processes, assessment becomes less opaque and more developmental. That clearer line of sight can strengthen agency and self-efficacy. If institutions analyse reflective comments or open-text feedback at scale alongside these changes, they can also see whether students are describing greater confidence, clearer expectations, and better understanding of standards over time.
Q: What are the challenges and limitations of implementing staff-student partnerships in diverse educational settings?
A: Implementing staff-student partnerships in diverse educational settings brings practical challenges. Institutions need to ensure that students feel comfortable participating, especially when cultural background, prior educational experience, or personal preference makes a more traditional model feel safer. The approach also takes time, because staff and students need space to co-create criteria, titles, and feedback processes thoughtfully. Another challenge is making sure student voice genuinely shapes decisions rather than being consulted superficially. Analysing open-text feedback can help institutions see where students feel excluded, uncertain, or unconvinced, making it easier to adapt the model for a wider range of learners.
Q: How does the inclusion of student voice in the assessment process impact the design of future curricula and assessment strategies?
A: Including student voice in assessment design can reshape future curricula and assessment strategies by showing educators how students interpret tasks, criteria, and feedback in practice. Those insights can lead to changes in teaching methods, clearer assessment design, and better alignment between what staff intend and what students understand. Over time, that can produce a more responsive curriculum that supports both academic development and student confidence. Analysing open-text feedback alongside formal assessment data adds another layer of evidence, helping institutions make changes based on patterns in student experience rather than assumption alone.
[Source] Deeley, S. J., & Bovill, C. (2017). Staff student partnership in assessment: enhancing assessment literacy through democratic practices. Assessment & Evaluation in Higher Education, 42(3), 463-477.
DOI: 10.1080/02602938.2015.1126551
[1] Cook-Sather, A., C. Bovill, and P. Felten. 2014. Engaging Students as Partners in Learning and Teaching. San Francisco, CA: Jossey-Bass.
[2] Rust, C., B. O’Donovan, and M. Price. 2005. “A Social Constructivist Assessment Process Model: How the Research Literature Shows us This Could be Best Practice.” Assessment & Evaluation in Higher Education 30 (3): 231–240.
DOI: 10.1080/02602930500063819
[3] Sadler, D. R. 1989. “Formative Assessment and the Design of Instructional Systems.” Instructional Science 18: 119–144.
DOI: 10.1007/BF00117714