Do business studies students trust marking criteria?

Updated Mar 23, 2026

Tags: marking criteria, business studies

Marking criteria should remove guesswork, not create it. Yet in National Student Survey (NSS) open-text comments, which UK higher education uses to benchmark assessment practice, 87.9% of remarks in the marking criteria category are negative, and the overall sentiment index sits at -44.6. Within business studies, the subject grouping used for sector comparisons, marking criteria accounts for 4.6% of all comments and carries a -43.1 tone; feedback is also prominent at 8.1% of comments. This evidence explains why students often describe criteria as unpredictable, and it points to practical fixes that strengthen transparency, calibration, and communication.
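The post does not define how its sentiment index is computed. One common construction is a net score: the share of positive comments minus the share of negative ones, on a -100 to +100 scale. A minimal sketch under that assumption (the function name and the illustrative counts below are hypothetical, not the figures from the article):

```python
def net_sentiment_index(positive: int, negative: int, neutral: int = 0) -> float:
    """Net sentiment: percentage of positive comments minus percentage
    of negative comments, across all comments in the category."""
    total = positive + negative + neutral
    if total == 0:
        raise ValueError("no comments to score")
    return round(100 * (positive - negative) / total, 1)

# Illustrative only: 20 positive, 65 negative, 15 neutral comments
# give a net score of -45.0 on the -100..+100 scale.
print(net_sentiment_index(positive=20, negative=65, neutral=15))
```

Other implementations average a per-comment polarity score instead; either way, a strongly negative index means critical comments dominate the category.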

Where do students see clarity and consistency gaps in marking?

Students describe inconsistent interpretation of assessment briefs and variable application of rubrics, which erodes confidence and affects attainment. Programmes should publish annotated exemplars at key grade bands, use checklist-style rubrics with weightings and common error notes, release criteria with the assessment brief, and run short marker calibration using a shared sample bank. Sharing a concise "what we agreed" note after calibration helps students understand how standards are applied across a cohort, so they spend less time guessing what markers want.
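A checklist-style rubric with weightings, as recommended above, can be represented as a simple data structure that markers and students both see. A hypothetical sketch (the criteria, weights, and error notes are illustrative, not taken from any real programme):

```python
# Hypothetical checklist rubric: each criterion has a weight (summing to 1.0)
# and a common-error note that markers can cite directly in feedback.
RUBRIC = {
    "argument":    {"weight": 0.40, "common_error": "assertion without evidence"},
    "analysis":    {"weight": 0.35, "common_error": "description instead of evaluation"},
    "referencing": {"weight": 0.25, "common_error": "missing in-text citations"},
}

def weighted_mark(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-100) into a single weighted mark."""
    return round(sum(RUBRIC[c]["weight"] * s for c, s in scores.items()), 1)

# A student scoring 70/60/50 on the three criteria gets 61.5 overall.
print(weighted_mark({"argument": 70, "analysis": 60, "referencing": 50}))
```

Publishing the weights alongside the brief makes it explicit which criteria move the mark most, which is exactly the guesswork students say is missing.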

Why does feedback timeliness matter, and what works?

Delayed feedback slows learning and undermines motivation. Set and meet a credible service level for returns, and provide a short "how your work was judged" summary that references rubric lines rather than generic comments. Use structured comment banks to maintain quality at pace, and schedule brief feed-forward opportunities before major deadlines so students can act on guidance in their next submission. That turns feedback into something students can use, not just something they receive.

How can group work marks reflect individual contribution?

Students report uneven contribution but shared grades. Calibrated peer assessment, short group contracts, and reflective logs provide evidence of individual input without excessive process. Staged milestones with quick formative checks reduce late-stage disputes and make moderation more straightforward. Where outcomes demand a shared mark, explain the rationale and how individual performance is recognised elsewhere in the module. That makes group assessment feel more defensible for students and easier to moderate for staff.
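Calibrated peer assessment is often operationalised as a WebPA-style weighting factor: each member's share of the group's total peer score scales the shared group mark, so a member rated at the group average receives exactly the group mark. A minimal sketch under that assumption (the function, member names, and cap are hypothetical):

```python
def individual_marks(group_mark: float, peer_scores: dict[str, float],
                     cap: float = 100.0) -> dict[str, float]:
    """Scale a shared group mark by each member's share of total peer score.

    Above-average contributors gain relative to the group mark;
    below-average contributors lose. Marks are capped at `cap`.
    """
    total = sum(peer_scores.values())
    n = len(peer_scores)
    if total == 0:
        # No usable peer data: fall back to the shared group mark.
        return {name: group_mark for name in peer_scores}
    return {
        name: min(cap, round(group_mark * (score * n / total), 1))
        for name, score in peer_scores.items()
    }

# Hypothetical averaged peer ratings out of 5 for a group marked at 62.
print(individual_marks(62.0, {"A": 4.0, "B": 4.0, "C": 2.0}))
```

Pairing a mechanism like this with group contracts and reflective logs gives moderators evidence to stand behind when contribution is disputed.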

What guidance on dissertations do students need?

Anxiety often stems from opaque expectations. Provide a template structure, minimum evidence standards, and a mapping from learning outcomes to rubric criteria. Use exemplars that illustrate acceptable methodological choices and levels of critical analysis. Text-analysis tools can support consistency, but supervisors' qualitative judgement should guide feedback on argument, synthesis, and contribution. Clearer dissertation guidance lets students focus on building a strong argument instead of decoding hidden rules.

What happens when assignment criteria are vague?

Vague criteria push students to guess what "good" looks like and push markers toward subjective judgement. Replace broad descriptors with unambiguous, observable indicators and make any intentional differences between modules explicit. Align each rubric line to learning outcomes and assessment type, and distribute criteria at the point of briefing with a short Q&A or walk-through. That makes expectations easier to act on and reduces avoidable variation in marking.

How do communication gaps in criteria and feedback affect performance?

Sparse, grade-only returns limit improvement. Require feedback that points to specific rubric descriptors and next-step actions. After moderation, publish a brief cohort-level explanation of frequent strengths, recurring pitfalls, and how these mapped to the rubric. Track repeated queries in a simple VLE FAQ so students see issues closed and standards reinforced. This gives students a clearer route to improvement and reduces the same questions being asked each term.

What drives perceptions of unfairness or bias in grading?

Perceptions of rushed or inconsistent grading persist when standards are opaque. Programmes should schedule rapid calibration before high-volume marking, sample second-marking where risk is highest, and provide short rationales tied to criteria with each grade. Routine training for new markers, including practice on borderline cases, helps align standards and reduces appeals. These steps make grading decisions easier to defend and help students trust the process, even when marks disappoint.

What should business programmes do next?

  • Publish exemplars and checklist-style rubrics with weightings; release both alongside the assessment brief.
  • Run marker calibration on shared samples and share a short "what we agreed" note with students.
  • Return grades with a "how your work was judged" summary, referencing rubric lines and next-step actions.
  • Standardise criteria across modules where outcomes overlap, flagging any intentional differences.
  • Offer brief feed-forward access before deadlines and keep an evolving FAQ to close the loop on common queries.

How Student Voice Analytics helps you

If you want to see where trust in marking criteria is breaking down, Student Voice Analytics shows how sentiment moves over time by cohort, site, and mode, with like-for-like comparisons to business studies peers. It pinpoints where tone is most negative, surfaces representative comments, and exports concise summaries for programme teams and boards. Flexible segmentation and ready-to-share outputs make it straightforward to target changes, evidence improvement, and keep assessment standards consistent across modules.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround


The Student Voice Weekly

Research, regulation, and insight on student voice. Every Friday.

© Student Voice Systems Limited, All rights reserved.