Economics students want assessments that are transparent, consistent and varied, with timely feedback and predictable timetabling. Across the sector, the assessment methods category in National Student Survey (NSS) open‑text analysis aggregates 11,318 comments, 66.2% of which are negative (sentiment index −18.8). Within the Common Aggregation Hierarchy discipline for economics, assessment methods carries a sentiment index of −32.5 and accounts for 3.8% of student remarks. These signals frame the student expectations discussed below and show why clarity, calibration and programme‑level coordination matter for economics.
This blog post examines the assessment methods used in economics courses from the students' perspective. Drawing on concerns raised by economics students across UK universities, it identifies recurring issues and suggests practical remedies. Student surveys and open‑text analysis amplify the student voice in academic circles, and the sheer variety of assessment methods in use raises a clear question: how effective are they? Exams have traditionally been the mainstay of assessment strategies, but a growing dialogue around alternative methods signals an appetite for approaches that better reflect both student needs and the complexities of modern economics. By drawing out the underlying themes in student feedback, we highlight why assessment techniques should adapt to foster not only academic success but also a broader understanding of the subject. Integrating student perspectives through surveys gives a more nuanced view of how assessments can evolve to reflect the real‑world applications of economics, thereby enhancing the educational process. The implications are far‑reaching, urging both staff and institutions to review and innovate current practice.
Where does communication about assessments break down?
A recurrent concern is ambiguity about what to do, how work will be marked, and when results arrive. Students highlight delayed results that hinder their learning and preparation for subsequent assessments, and ambiguous assignment and exam requirements that drive unnecessary stress. Publish a one‑page assessment brief for each task (purpose, alignment to learning outcomes, weighting, allowed resources, common pitfalls) and use checklist‑style rubrics with separated criteria and grade descriptors. Set and meet a realistic service level for feedback turnaround, and provide a short post‑assessment debrief summarising common strengths and issues before individual marks. For students new to UK HE conventions, a short orientation on formats, academic integrity and referencing with mini‑practice tasks reduces avoidable confusion. Building accessibility in from the start—plain‑language instructions and alternative formats—improves equity for disabled students.
How does time management affect assessments?
Students report that stringent time constraints can compromise depth of understanding in complex economic concepts. Staff should scrutinise whether timed tasks validly test the intended outcomes and whether timetabling concentrates deadlines. Coordinate at programme level with a single assessment calendar to avoid deadline pile‑ups and method clashes across modules. Offer predictable submission windows and early release of briefs, which particularly supports mature and part‑time learners. Where appropriate, pilot take‑home projects or extended essays that better reflect applied analysis and reduce dependence on speeded recall.
What goes wrong in group work?
Group projects can deliver collaboration and real‑world problem‑solving, but students report uneven workload and concerns about grading fairness. Structure groups deliberately, run brief coordination checks, and assess individual contribution alongside the group product. Calibrate markers with 2–3 anonymised exemplars at grade boundaries, and use targeted spot checks where variance is highest. Provide asynchronous alternatives for oral components so participation does not hinge on a single time and place.
Which assessment formats work for economics?
Students favour a balanced mix—essays, problem sets, presentations and applied projects—so they can demonstrate analytical reasoning and application, not just memorisation under time pressure. Map each assessment to the relevant learning outcomes and avoid duplication of methods within the same term. Use annotated exemplars to show what “meets” versus “exceeds” looks like, and explain how session content links to assessed outcomes. For international cohorts, state expectations around evidence, use of data and independence of work explicitly.
Are assessment methods relevant and fair?
Students question exam‑heavy approaches for their limited capture of practical knowledge. Blend application‑based methods (portfolios, structured problem sets, continuous assessments) with rigorous moderation. Record concise moderation notes, sample double‑mark where variance is likely, and ensure marking criteria and feedback reference the rubric. Design inclusive options that do not disadvantage any group, and publish the rationale for chosen methods to improve perceived fairness.
How does feedback quality affect learning?
Timely, substantive feedback strengthens students’ analytical development. Where feedback lacks depth or actionable guidance, motivation dips. Agree programme‑wide expectations on turnaround times and minimum components (feed‑forward advice, reference to criteria, example of how to improve). Use brief debriefs to close the loop after each major assessment and offer staff development on effective feedback approaches. Provide worked examples and short “what to do next” guides that direct effort productively.
Should in-person exams continue during periods of high Covid-19 risk?
Students raise health risks around crowded exam settings, particularly for those who are immunocompromised or live with vulnerable family members. Many institutions evaluate digital or take‑home formats. Consider open‑book or extended essays that allow demonstration of understanding without large in‑person gatherings, while maintaining academic integrity through clear parameters and expectations. Evaluate the trade‑offs module by module.
What should economics departments do next?
Prioritise clarity, parity and flexibility. In economics, students’ comments concentrate on assessment expectations and the usefulness and timeliness of feedback, so departments should publish unambiguous briefs and rubrics, calibrate markers, coordinate an assessment calendar, and provide early orientation for diverse cohorts. These adjustments address the sector‑wide pattern in assessment methods and align with what students value in the discipline—coherence, choice and visible links between teaching and assessment.
How Student Voice Analytics helps you
Request a walkthrough
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.
© Student Voice Systems Limited, All rights reserved.