Published Jun 07, 2024 · Updated Oct 12, 2025
In the marking criteria category of National Student Survey (NSS) open‑text comments, used across UK higher education to benchmark assessment practice, 87.9% of remarks are negative and the overall sentiment index sits at −44.6. Within business studies, the subject grouping used for sector comparisons, marking criteria accounts for 4.6% of all comments and carries a −43.1 tone; feedback is also prominent at 8.1% of comments. This evidence explains why students often describe criteria as unpredictable, and it sets the direction for practical fixes that strengthen transparency, calibration and communication.
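For readers who want to see the arithmetic behind figures like these, the sketch below computes a simple net sentiment score (positive share minus negative share) over hypothetical comment labels. The published index of −44.6 is derived by Student Voice Analytics' own method, so treat this as an illustrative assumption rather than a reproduction.

```python
# Illustrative only: a common "net sentiment" approach, not the product's index method.
from collections import Counter

def net_sentiment(labels):
    """Return (negative share %, net sentiment) for a list of 'pos'/'neg'/'neu' labels."""
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0, 0.0
    pos = 100 * counts["pos"] / total
    neg = 100 * counts["neg"] / total
    return neg, pos - neg  # net sentiment: positive share minus negative share

# Example on a small synthetic sample of comment labels
sample = ["neg"] * 88 + ["pos"] * 7 + ["neu"] * 5
print(net_sentiment(sample))  # roughly (88.0, -81.0) on this toy sample
```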
Where do students see clarity and consistency gaps in marking?
Students describe inconsistent interpretation of assessment briefs and variable application of rubrics, which erodes confidence and affects attainment. Programmes should publish annotated exemplars at key grade bands, use checklist‑style rubrics with weightings and common error notes, release criteria with the assessment brief, and run short marker calibration using a shared sample bank. Sharing a concise “what we agreed” note after calibration helps students understand how standards are applied across a cohort.
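As a minimal sketch of what a checklist‑style rubric with weightings can look like in practice, the example below combines per‑criterion scores into an overall mark. The criterion names and weights are hypothetical, not drawn from any particular programme.

```python
# Hypothetical weighted checklist rubric: each criterion has a weight and a 0-100 score.
RUBRIC = {
    "argument_and_analysis": 0.40,
    "use_of_evidence":       0.30,
    "structure_and_clarity": 0.20,
    "referencing":           0.10,
}

def weighted_mark(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-100) into an overall mark using the rubric weights."""
    assert abs(sum(RUBRIC.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(RUBRIC[c] * scores[c] for c in RUBRIC)

print(weighted_mark({
    "argument_and_analysis": 65,
    "use_of_evidence":       58,
    "structure_and_clarity": 72,
    "referencing":           80,
}))  # 65*0.4 + 58*0.3 + 72*0.2 + 80*0.1 = 65.8
```

Publishing the weights alongside the brief, rather than only the descriptors, makes it clear how much each criterion contributes to the final mark.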
Why does feedback timeliness matter, and what works?
Delayed feedback slows learning and undermines motivation. Set and meet a credible service level for returns, and provide a short “how your work was judged” summary that references rubric lines rather than generic comments. Use structured comment banks to maintain quality at pace, and schedule brief feed‑forward opportunities before major deadlines so students can action guidance on their next submission.
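One way to keep comment banks structured is to key entries to rubric lines and grade bands, so markers start from wording that already references the criteria. The bank below is a hypothetical sketch; entries are starting points that markers tailor, not automated feedback.

```python
# Hypothetical structured comment bank keyed by (rubric line, grade band).
COMMENT_BANK = {
    ("argument_and_analysis", "2:2"): (
        "Your argument is mostly descriptive; to move up a band, weigh competing "
        "explanations against the evidence before concluding."
    ),
    ("use_of_evidence", "2:1"): (
        "Sources are relevant and well chosen; integrate them more critically by "
        "comparing findings rather than summarising each in turn."
    ),
}

def draft_feedback(selections: list[tuple[str, str]]) -> str:
    """Assemble a first-draft feedback summary from rubric-line selections."""
    return "\n".join(COMMENT_BANK[key] for key in selections)

print(draft_feedback([("argument_and_analysis", "2:2"), ("use_of_evidence", "2:1")]))
```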
How can group work marks reflect individual contribution?
Students report uneven contribution but shared grades. Calibrated peer assessment, short group contracts and reflective logs provide evidence of individual input without excessive process. Staged milestones with quick formative checks reduce end‑stage disputes and make moderation more straightforward. Where outcomes demand a shared mark, explain the rationale and how individual performance is recognised elsewhere in the module.
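Where calibrated peer assessment is used, one common approach is to turn peer ratings into a moderation factor applied to the group mark. The sketch below assumes a WebPA‑style scaling with hypothetical ratings; the capping policy is a local design choice, not a sector standard.

```python
# WebPA-style individual weighting: each member's share of total peer ratings,
# scaled by group size, becomes a factor applied to the group mark.
def individual_marks(group_mark: float, peer_ratings: dict[str, float],
                     cap: float = 100.0) -> dict[str, float]:
    total = sum(peer_ratings.values())
    n = len(peer_ratings)
    return {
        member: min(cap, group_mark * (rating / total) * n)
        for member, rating in peer_ratings.items()
    }

# Example: group mark 62, summed peer ratings for each member (hypothetical data)
print(individual_marks(62, {"A": 12.0, "B": 10.0, "C": 8.0}))
# A: 62 * (12/30) * 3 = 74.4, B: 62.0, C: 49.6
```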
What guidance on dissertations do students need?
Anxiety often stems from opaque expectations. Provide a template structure, minimum evidence standards and a mapping from learning outcomes to rubric criteria. Use exemplars that illustrate acceptable methodological choices and levels of critical analysis. Text‑analysis tools can support consistency, but supervisors’ qualitative judgement should guide feedback on argument, synthesis and contribution.
What happens when assignment criteria are vague?
Vague criteria force students to guess what “good” looks like and leave markers to rely on subjective judgement. Replace broad descriptors with unambiguous, observable indicators and make any intentional differences between modules explicit. Align each rubric line to learning outcomes and assessment type, and distribute criteria at the point of briefing with a short Q&A or walk‑through.
How do communication gaps in criteria and feedback affect performance?
Sparse, grade‑only returns limit improvement. Require feedback that points to specific rubric descriptors and next‑step actions. After moderation, publish a brief cohort‑level explanation of frequent strengths, recurring pitfalls and how these mapped to the rubric. Track repeated queries in a simple VLE FAQ so students see issues closed and standards reinforced.
What drives perceptions of unfairness or bias in grading?
Perceptions of rushed or inconsistent grading persist when standards are opaque. Programmes should schedule rapid calibration before high‑volume marking, sample second‑marking where risk is highest, and provide short rationales tied to criteria with each grade. Routine training for new markers, including practice on borderline cases, helps align standards and reduces appeals.
What should business programmes do next?
Prioritise the basics set out above: release criteria and annotated exemplars with every brief, calibrate markers before high‑volume marking, meet feedback turnaround commitments, make individual contribution visible in group work, and close the loop with cohort‑level explanations of how standards were applied.
How Student Voice Analytics helps you
Student Voice Analytics shows how sentiment on marking criteria moves over time by cohort, site and mode, with like‑for‑like comparisons to business studies peers. It pinpoints where tone is most negative, surfaces representative comments, and exports concise summaries for programme teams and boards. Flexible segmentation and ready‑to‑share outputs make it straightforward to evidence change and keep assessment standards consistent across modules.
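For teams who want a feel for the kind of like‑for‑like segmentation described above, the sketch below aggregates a hypothetical comment‑level table by cohort and mode. The column names and the sentiment scale are assumptions for illustration, not the product's export schema.

```python
import pandas as pd

# Hypothetical comment-level data; in practice this would come from an analytics export.
df = pd.DataFrame({
    "cohort":    ["2023", "2023", "2024", "2024", "2024"],
    "mode":      ["FT", "PT", "FT", "FT", "PT"],
    "category":  ["marking criteria"] * 5,
    "sentiment": [-1, -1, 1, -1, 0],   # -1 negative, 0 neutral, +1 positive
})

# Mean sentiment and comment volume per cohort and mode, for like-for-like comparison.
summary = (
    df.groupby(["cohort", "mode"])["sentiment"]
      .agg(mean_sentiment="mean", n_comments="count")
      .reset_index()
)
print(summary)
```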
Request a walkthrough
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.
UK-hosted · No public LLM APIs · Most teams are live within a week