Mostly not. Across the National Student Survey (NSS), analysis of comments on assessment methods shows a negative skew (an index of −18.8 across 11,318 comments), with engineering and technology more critical than most subjects (−25.5). The category captures how programmes across the sector design, weight and communicate assessment tasks. Within civil engineering, a discipline used for sector benchmarking, students call for clearer marking criteria and a sustainable workload, with sentiment at −47.3 on marking criteria and −44.1 on workload. These insights shape the improvements below.
How should coursework be weighted against exams?
In civil engineering courses across the UK, the balance between coursework and examinations often sparks lively discussion among students and staff. The question is not merely how hours spent on coursework compare with exam preparation, but what depth of understanding each method promotes. Coursework typically involves practical, ongoing assessment that mirrors real-world challenges, while examinations test the integration of theory under time constraints. Given the sector evidence that engineering students trend more negative on assessment methods, programme teams should calibrate for consistency and fairness: run short marker-calibration sessions with exemplars at grade boundaries, publish a one-page assessment brief for each task, and balance assessment types across modules so the mix aligns with learning outcomes rather than habit.
Do students receive enough information to complete assessments?
A frequent critique from civil engineering students centres on the lack of specific detail provided for assessments. Students report needing additional readings just to grasp the requirements, which can lead to confusion and anxiety. Unclear guidance undermines equitable assessment, especially for students with less access to support. Prioritise unambiguous instructions: provide checklist-style rubrics; state the purpose, weighting, allowed resources and common pitfalls; and give a short orientation to formats and referencing conventions. Engaging the student voice in reviewing assessment briefs helps keep instructions accessible and comprehensive.
Why do uncoordinated assessment dates undermine performance?
Students often find themselves overwhelmed when critical exams and project deadlines converge. Deadline pile-ups magnify stress in a demanding curriculum and risk obscuring actual learning. Simultaneous deadlines may simulate professional pressures, but they can depress work quality and distort outcomes. Programme-level coordination helps: maintain a single assessment calendar, avoid duplicating assessment methods within a term, and provide predictable submission windows so cohorts can plan effectively.
How can group projects assess both team and individual contributions?
Group projects offer authentic practice in teamwork, communication and project management, alongside technical application. They also risk uneven workload distribution and variable commitment. Make roles visible, assess both collective outputs and individual contributions, and use structured peer input. Brief post-project reflective activities help students evidence their learning and inform targeted feed-forward.
How do exam errors affect fairness?
Errors in exams mislead students and undermine the integrity of assessment. While identifying anomalies is part of engineering judgement, exam mistakes cause avoidable stress and can penalise students who would otherwise excel. Strengthen pre-publication checks with a named proofreader and a short sign-off record, and provide a route for students to flag suspected errors during or immediately after the exam so issues can be corrected promptly.
When does time pressure test proficiency rather than learning?
Time limits can approximate professional constraints, but excessive pressure pushes surface learning and memorisation. Where outcomes emphasise problem solving or design judgement, consider formats that allow depth, such as extended time, open-book tasks, or staged submissions. Offer asynchronous alternatives for oral components where appropriate and release briefs early so mature and part-time learners can plan.
What does limited use of technology leave out of assessment?
Underusing tools and workflows such as BIM, AutoCAD and 3D modelling reduces authenticity and can slow the transition to practice. Pilot technology-enhanced assessments with clear criteria and supported training for staff and students. Phase changes in, scaffold the learning curve, and ensure equitable access to software and hardware so performance reflects learning, not tool familiarity.
Where do coordination and support break down?
Students often experience fragmented communication around deadlines, changes and academic support. Establish a single source of truth for assessment information, name an owner for timetabling and course organisation, and issue concise weekly updates during peak periods. Brief post-assessment debriefs that summarise common strengths and issues improve perceived fairness and transparency even before individual marks are released.
How Student Voice Analytics helps you
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements. Request a walkthrough.