Not consistently. Across NSS (National Student Survey) open-text comments on marking criteria, 87.9% of 13,329 comments are negative (sentiment index −44.6), showing widespread concern about how criteria are presented and applied. In History of Art, Architecture and Design, students are broadly positive overall (51.4% positive), yet sentiment on marking criteria remains strongly negative (−49.4). These sector patterns shape the actions below: set transparent, calibrated expectations and give timely, criteria-referenced feedback so assessment supports learning rather than undermining it.
Marking criteria sit at the heart of assessment in creative programmes. Student voice analysis highlights gaps between expectations and grading practice, especially when criteria are generic, released late, or applied unevenly. The sections below analyse recurrent issues and offer practical steps for programme teams.
What helps students understand marking criteria?
Criteria should be discipline-specific and tied to each assessment type. Programmes that publish annotated exemplars at typical grade bands, and use checklist-style rubrics with weightings and common-error notes, help students see what “good” looks like. Release criteria alongside the assessment brief and run a short walk-through so students can map creativity, technical skill, conceptual depth and presentation onto the learning outcomes. Workshops and review sessions work when they focus on applying the rubric to real work rather than restating it.
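To make the idea of a checklist-style rubric with weightings concrete, here is a minimal sketch in Python. The criterion names, weights, common-error notes and marks are invented for illustration and are not drawn from any real programme's rubric.

```python
# A minimal sketch of a checklist-style rubric with weightings.
# Criterion names, weights and notes are illustrative only.

RUBRIC = {
    "creativity": 0.30,
    "technical_skill": 0.30,
    "conceptual_depth": 0.25,
    "presentation": 0.15,
}

# Common-error notes published alongside each criterion, as suggested above.
COMMON_ERRORS = {
    "creativity": "Restating the brief instead of interpreting it.",
    "technical_skill": "Unresolved craft issues left unacknowledged.",
    "conceptual_depth": "Sources cited but not engaged with.",
    "presentation": "Inconsistent labelling across boards.",
}

def weighted_mark(marks: dict[str, float]) -> float:
    """Combine per-criterion marks (0-100) into one weighted total."""
    assert abs(sum(RUBRIC.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(RUBRIC[criterion] * marks[criterion] for criterion in RUBRIC)

if __name__ == "__main__":
    marks = {"creativity": 72, "technical_skill": 65,
             "conceptual_depth": 58, "presentation": 68}
    print(f"Weighted total: {weighted_mark(marks):.1f}")  # 65.8
```

Publishing the weights and notes in this explicit form is what lets students see exactly how creativity, technical skill, conceptual depth and presentation contribute to the final judgement.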
Where do transparency gaps arise, and how do they affect students?
Ambiguity often surfaces at the start of a course and around larger projects, when students most need anchors for judgement. Without criteria-linked explanations, feedback reads as subjective, and students struggle to plan next steps. Embed an explanation of the criteria in induction, revisit them at key project milestones, and highlight any intentional differences across modules where outcomes overlap. Keep a simple VLE FAQ that captures recurring questions and the agreed answers, so the message stays consistent across a cohort.
How does inconsistency in grading play out across modules and markers?
Variation in how staff interpret the same rubric erodes trust. Regular marker calibration using a small bank of shared samples, followed by brief “what we agreed” notes to students, aligns expectations. Double marking or moderation for substantial pieces, and refined rubrics that remove ambiguous wording, reduce drift between modules and markers.
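One hedged way to make marker drift visible during calibration is to compare the marks each marker gives to the same bank of shared samples. The sketch below is a hypothetical illustration, not a sector standard: the marker names, marks and the tolerance threshold are all assumptions.

```python
from statistics import mean

# Hypothetical calibration data: marks awarded by each marker to the
# same bank of shared samples (all values invented for illustration).
SHARED_SAMPLES = {
    "sample_a": {"marker_1": 62, "marker_2": 68, "marker_3": 64},
    "sample_b": {"marker_1": 55, "marker_2": 57, "marker_3": 49},
    "sample_c": {"marker_1": 71, "marker_2": 74, "marker_3": 70},
}

def spread(marks: dict[str, int]) -> int:
    """Range of marks for one sample: a simple proxy for marker drift."""
    return max(marks.values()) - min(marks.values())

def flag_for_discussion(samples: dict, threshold: int = 5) -> list[str]:
    """Samples whose mark spread exceeds the agreed tolerance."""
    return [s for s, marks in samples.items() if spread(marks) > threshold]

if __name__ == "__main__":
    print("Mean spread:", mean(spread(m) for m in SHARED_SAMPLES.values()))
    print("Discuss at calibration:", flag_for_discussion(SHARED_SAMPLES))
```

Samples flagged this way give the calibration meeting its agenda, and the outcomes feed directly into the brief “what we agreed” notes shared with students.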
What does effective feedback look like against the criteria?
Students progress faster when feedback explicitly references rubric lines and states how the judgement was reached. Provide a short “how your work was judged” summary with each grade and focus written comments on two or three priorities for improvement. For dissertations and larger projects, introduce a structured mid-point guidance checkpoint that turns formative advice into concrete next steps.
How do we reduce subjectivity and bias in marking?
Define constructs such as “creativity”, “originality” and “technical skill” with observable indicators. Use annotated exemplars to show how these indicators appear at different grade bands. Combine double marking or moderation with routine calibration conversations, and record decisions where interpretation is contested. This builds a shared frame of reference and strengthens student confidence in the process.
How should students and tutors communicate about criteria and judgement?
Encourage students to test ideas against the rubric during tutorials and crits. Schedule brief advising sessions around submission windows to translate criteria into action on current work. Keep an open channel for assessment queries and ensure staff signpost where to find criteria, exemplars, and FAQs. This normalises dialogue about standards and helps staff tailor support to individual needs.
What should programmes do next?
Start with the highest-leverage steps above: publish discipline-specific criteria with annotated exemplars, release them alongside the brief, calibrate markers against shared samples, reference the rubric explicitly in every piece of feedback, and keep criteria visible at induction and project milestones.
How Student Voice Analytics helps you
Student Voice Analytics shows how sentiment about marking criteria moves over time by cohort, site and mode, with like-for-like comparisons for History of Art, Architecture and Design. You can target the cohorts where tone is most negative, export concise, anonymised summaries for programme teams and boards, and share progress through ready-to-use outputs for web, decks and dashboards.
Request a walkthrough
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.