Are economics students getting clarity and consistency in marking?

Published May 30, 2024 · Updated Oct 12, 2025

Tags: marking criteria · economics

No. In NSS (National Student Survey) open-text data on marking criteria, a sector-wide lens on how criteria are presented and applied, 87.9% of roughly 13,329 comments are negative (sentiment index −44.6). Within economics, the tone on criteria is even more negative (−48.1), and feedback accounts for 9.8% of the discipline's conversation. That pattern frames this case: students respond best when programmes publish unambiguous rubrics and exemplars, release criteria with the brief, and calibrate markers so judgements are consistent across modules.

Exam and Coursework Guidance: what do students need to see?

Exam and coursework guidance in economics education shows wide variance in how clearly marking criteria are articulated. Courses that set out expectations with detailed rubrics and exemplars tend to see higher student satisfaction and understanding. Many students, however, report ambiguity in guidance, especially when preparing for exams. Unclear criteria complicate preparation and undermine the perceived fairness of grading. When students understand precisely what markers are looking for, they can demonstrate their knowledge and skills more effectively.

Economics departments should publish checklist-style rubrics with weightings and common error notes, align them to learning outcomes, and provide annotated exemplars at key grade bands. Release criteria with the assessment brief and hold a short walk-through or Q&A; offer a brief feed-forward clinic before submission windows in high-volume modules. Use jargon-free language to maintain a level playing field and reduce anxiety so students focus on the substantive content of their courses.

What challenges do economics students face in grading?

Challenges often stem from highly specific marking criteria that can seem impenetrable to students. Students frequently observe that marks above 70 are rare, which raises concerns about stringency and fairness. Strict standards ensure that high grades reflect deeper understanding and analytical ability, yet students can feel discouraged when sustained effort does not translate into expected outcomes.

Programme teams should communicate transparently what each grade bracket demands and provide illustrative examples of high-quality work. Marker calibration using a short bank of shared samples, with brief notes on what meets versus exceeds the criteria, reduces noise about fairness and consistency. These measures align marking practices with educational goals and keep intellectual inquiry at the centre.

How timely and useful is feedback in economics?

Timeliness and relevance of feedback drive learning and satisfaction. Prompt feedback allows students to reflect while content remains fresh; delays hinder learning and reduce motivation. Relevance matters as much as speed: comments should reference the rubric lines used, with explicit pointers for improvement.

Agree a realistic service level for feedback turnaround at programme level and share it with cohorts. Provide a short “how your work was judged” summary when grades are returned, linking comments to marking criteria. Use feed-forward as standard, not exception, to steer students into the next assessment with clarity.

Are group assignments marked consistently?

Perceptions of fairness in group work often hinge on the uniformity of applied criteria. Variation in how markers interpret group outcomes fuels concerns about equity and strategic uncertainty for students.

Standardise criteria across modules where learning outcomes overlap and highlight any intentional differences up front. Calibrate markers on group assessment exemplars and publish a simple “what we agreed” note to students. Co-create or review criteria with student representatives to increase transparency and buy-in.

What happens when marking is late or inconsistent?

Late results cause anxious waits and constrain preparation for subsequent assessments. Inconsistencies across markers or assignments leave students unclear about expectations and how to improve. While large cohorts and complex content increase pressure, students require timely, consistent judgement to navigate their academic journey.

Provide regular staff development focused on the agreed criteria and moderate with shared samples to standardise judgement. Give students examples of graded work and a short decision rationale tied to rubric lines to demystify expectations. These steps support fairness and sustain trust in the assessment process.

How do communication gaps affect grade appeals?

Poor communication from examiners and a complicated appeal process discourage students from challenging grades when they have valid concerns. When criteria and rationales are opaque, students struggle to understand outcomes or pathways to improvement.

Create a single source of truth for assessment information on the VLE, publish criteria with descriptors and weightings, and maintain a short FAQ that captures recurring queries. Encourage students to request grade discussions focused on the rubric and ensure staff have guidance on effective, consistent explanations. Streamlining the route into and through the appeal process helps maintain confidence in the system.

What would transparent, cohesive criteria look like?

Transparency demystifies evaluation and improves fairness. Implement consistent criteria across exams and coursework, map them to learning outcomes, and show students how evidence aligns to grade bands. Regular dialogue with students should refine and clarify standards.

Publish annotated exemplars, release criteria with assessment briefs, hold short walk-throughs, and run feed-forward clinics before submissions. Run marker calibration and share a short “what we agreed” note. Track and close the loop on recurring questions, updating guidance where needed. Sector evidence on marking criteria shows sustained negativity (NSS category sentiment index −44.6), so visible, consistent practice is the most credible route to improvement.

How Student Voice Analytics helps you

Student Voice Analytics shows how student sentiment about marking criteria moves over time and by cohort, site or mode, with drill-downs from provider to school, programme and module. You can compare like-for-like with other disciplines and demographics to target cohorts where tone is most negative, and benchmark economics against its peers. The platform produces concise, anonymised summaries for programme teams and boards, and export-ready outputs for sharing priorities and progress across the institution.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround



© Student Voice Systems Limited, All rights reserved.