Published Jun 16, 2024 · Updated Oct 12, 2025
Mostly not. In National Student Survey (NSS) open-text analysis of the marking criteria theme across UK higher education, 87.9% of 13,329 comments are negative. In human geography, students contribute 3,159 comments overall; when they raise marking criteria, the theme accounts for 3.7% of remarks, with a sentiment index of −47.3. Across the sector this category captures concerns about clarity and consistency, while within the subject students emphasise usable criteria, exemplars and predictable application.
What is distinctive about human geography for assessment?
Its interdisciplinary design blends social science reasoning with spatial and methodological rigour, so criteria must reward conceptual synthesis alongside technique. Students often rate fieldwork, trips and staff support highly in this subject, yet the criteria themselves need to translate that strength into assessable standards that reflect both theory and applied enquiry.
How should criteria balance qualitative and quantitative work?
Weight qualitative interpretation and argumentation against quantitative accuracy and method so students can see how each contributes to the grade. Use checklist-style rubrics with descriptors for both strands, and state weightings per element. For mixed-methods tasks, publish annotated exemplars at key bands to show what good looks like across modes of evidence.
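Publishing weightings per element makes the grade calculation auditable. As a minimal sketch, the idea can be expressed as a weighted sum over rubric lines; the criterion names, weights and band scores below are hypothetical, not a real marking scheme.

```python
# Hypothetical rubric: combine 0-100 band scores per criterion using
# published weightings. All names and numbers are illustrative only.

RUBRIC_WEIGHTS = {
    "qualitative_interpretation": 0.30,
    "argumentation": 0.20,
    "quantitative_accuracy": 0.30,
    "method": 0.20,
}

def weighted_grade(scores):
    """Weighted sum of band scores; weights must total 1.0."""
    assert abs(sum(RUBRIC_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(RUBRIC_WEIGHTS[c] * scores[c] for c in RUBRIC_WEIGHTS)

example = {
    "qualitative_interpretation": 65,
    "argumentation": 58,
    "quantitative_accuracy": 72,
    "method": 60,
}
print(weighted_grade(example))  # 64.7
```

Because the weights are published with the brief, students can reproduce the arithmetic themselves and see how each strand contributed to the final mark.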
What blend of subjective and objective judgement works best?
Combine structured rubric lines (objective anchors) with space for professional judgement on synthesis, originality and use of evidence (subjective appraisal). Run marker calibration on a small bank of shared samples and publish brief “what we agreed” notes so students understand how discretion operates within the rubric.
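One way to operationalise calibration on a shared sample bank is to compare markers' scores per rubric line and flag lines whose mean absolute gap exceeds a tolerance. This is an illustrative sketch under assumed data; the rubric lines, scores and the 5-mark tolerance are hypothetical.

```python
# Illustrative calibration check: flag rubric lines where two markers'
# scores on shared samples diverge beyond a tolerance. Data is hypothetical.

from statistics import mean

# rubric line -> each marker's scores on the same shared samples
scores = {
    "synthesis": {"marker_a": [62, 55, 70], "marker_b": [58, 54, 66]},
    "use_of_evidence": {"marker_a": [60, 48, 72], "marker_b": [70, 60, 80]},
}

TOLERANCE = 5  # mean absolute gap (in marks) that triggers recalibration

def mean_gap(a, b):
    """Average absolute difference between two markers' scores."""
    return mean(abs(x - y) for x, y in zip(a, b))

flagged = [
    line for line, m in scores.items()
    if mean_gap(m["marker_a"], m["marker_b"]) > TOLERANCE
]
print(flagged)  # ['use_of_evidence']
```

Flagged lines become the agenda for the calibration meeting, and the resulting "what we agreed" notes can be shared with students.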
Where do student expectations diverge from practice?
Students expect criteria that map directly to the assessment brief, transparent weighting for fieldwork versus write-up, and consistent marking across modules. They often encounter opaque descriptors or different rules for similar tasks. Release criteria with the assessment brief, signpost any intentional differences across modules, and offer a short walk-through or Q&A so cohorts can test their understanding before submission.
What feedback practice helps students use criteria?
Provide a concise “how your work was judged” summary tied to rubric lines, with one or two actionable feed-forward points per criterion. Align comments to the language of the marking scheme so students can self-assess against it in future tasks. Where dissertations and research projects are involved, set and meet a realistic feedback service level and reference the criteria explicitly in supervision notes.
How do criteria shape learning outcomes?
Criteria direct attention and effort. When students understand how learning outcomes translate into standards and weightings, they plan data collection, analysis and argument more effectively. Standardise criteria where outcomes overlap, and make any departures explicit so students can transfer learning between modules without second-guessing the rules.
How Student Voice Analytics helps you
Student Voice Analytics surfaces where criteria cause friction across programmes and within human geography, with like-for-like comparisons by cohort, mode and domicile. You can track sentiment over time, drill from provider to school/department/programme, and export concise, anonymised briefs for boards and teaching teams. The platform points to the cohorts where tone is most negative, shows whether interventions shift sentiment, and provides ready-to-use summaries that close the loop with students.
Request a walkthrough
See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.