Published Jun 21, 2024 · Updated Oct 12, 2025
Mostly, they remain sceptical. Across the National Student Survey (NSS), comments tagged to costs and value for money are 88.3% negative, with a sentiment index of −46.7; within economics, the value-for-money signal is similarly negative at −55.0. Full-time students, who account for 78.7% of these comments, trend more negative (−50.4) than their part-time peers. The category aggregates open-text feedback on affordability across providers, while the economics subject area groups discipline-level comments across UK provision. That sector pattern frames how economics students apply cost–benefit thinking to fees, living costs and outcomes.
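To make the figures above concrete, here is a minimal sketch of one way a sentiment index could be derived from labelled comments. It assumes the index is the percentage-point gap between positive and negative comments; that convention, and the toy data, are illustrative assumptions, not the methodology behind the numbers reported here.

```python
# Hypothetical sketch, not the actual Student Voice Analytics method.
# Assumes: sentiment index = (% positive comments) - (% negative comments).

def sentiment_index(labels: list[str]) -> float:
    """Percentage-point gap between positive and negative labels."""
    n = len(labels)
    pos = sum(1 for label in labels if label == "positive") / n * 100
    neg = sum(1 for label in labels if label == "negative") / n * 100
    return round(pos - neg, 1)

# Toy cohort: 7 negative, 2 positive, 1 neutral comment
labels = ["negative"] * 7 + ["positive"] * 2 + ["neutral"]
print(sentiment_index(labels))  # -50.0
```

Under this convention, a heavily negative share of comments pulls the index well below zero even when some feedback is neutral or positive.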
How do tuition fees shape value decisions?
Economics students typically evaluate tuition as an investment, weighing price against expected returns. As of 2022, domestic undergraduates pay up to £9,250 a year in fees, a figure that invites scrutiny of affordability and accessibility. Students analyse what fees buy (access to academics, resources and peer networks) and compare these with potential long-term gains in salary and progression. This sharper return-on-investment lens shapes expectations of programme quality, assessment standards and employability support.
How do economics students manage living expenses while studying?
Students apply their training to budgeting for accommodation, utilities, food, transport, and materials. They compare on‑campus options with private renting, testing which choice minimises total spend and volatility in bills. Many prioritise cost‑effective food and travel and plan leisure spend to protect study time and savings. This everyday optimisation builds financial capability while balancing study demands.
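The comparison described above, total spend versus volatility in bills, can be sketched in a few lines. All the monthly figures below are illustrative assumptions, not sector data; the point is the shape of the comparison, not the numbers.

```python
import statistics

# Hypothetical monthly outgoings (GBP) for two accommodation options.
on_campus = [650] * 12  # all-inclusive hall rent, no bill volatility
utilities = [90, 85, 110, 70, 60, 55, 50, 55, 80, 100, 120, 115]
private = [480 + u for u in utilities]  # rent plus variable utilities

for name, costs in [("on-campus", on_campus), ("private", private)]:
    total = sum(costs)
    volatility = statistics.pstdev(costs)  # spread of monthly bills
    print(f"{name}: total £{total}, monthly std dev £{volatility:.0f}")
```

In this toy example the private option is cheaper in total but carries month-to-month variation, which is exactly the trade-off between minimum spend and predictable bills that the paragraph describes.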
What drives perceptions of value in the curriculum and delivery?
Perceived value rests on the relationship between cost, curriculum relevance, and teaching quality. Students look for explicit links between sessions, assessment briefs and marking criteria, strong coherence across modules, and timely feedback they can use. They notice breadth and choice in content and appreciate opportunities to apply theory to real‑world problems. They judge remote or poorly structured delivery more harshly, and they expect the VLE and handbooks to specify what is included, what is optional, and when extra costs arise.
How do career prospects influence return‑on‑investment judgements?
Opportunity cost and expected earnings sit at the centre of decision‑making. Some students prioritise programmes with strong pipelines into high‑remuneration roles; others value portable analytical skills and diverse career routes. Students weigh the total cost of study against projected earnings and the likelihood of securing roles that use economic analysis, and they expect careers support and employer links to be visible and effective throughout the programme.
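The weighing of total study costs against projected earnings can be expressed as a simple net-present-value sketch. Every input here (fees, living costs, earnings uplift, career horizon, discount rate) is a hypothetical assumption for illustration, not a claim about actual graduate outcomes.

```python
# Hypothetical ROI sketch for a degree decision; all figures are assumptions.

def degree_npv(annual_fee: float, living_cost: float, years: int,
               annual_uplift: float, career_years: int, rate: float) -> float:
    """Discounted earnings uplift minus discounted study costs."""
    cost = sum((annual_fee + living_cost) / (1 + rate) ** t
               for t in range(years))
    benefit = sum(annual_uplift / (1 + rate) ** t
                  for t in range(years, years + career_years))
    return benefit - cost

npv = degree_npv(annual_fee=9_250, living_cost=12_000, years=3,
                 annual_uplift=6_000, career_years=30, rate=0.03)
print(f"NPV: £{npv:,.0f}")
```

With these toy inputs the discounted uplift outweighs the cost of study, but the result is sensitive to the uplift and discount rate, which is why students scrutinise careers support and employer links as evidence for the earnings side of the calculation.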
How does economics training shape cost–benefit judgements?
Familiarity with cost‑benefit analysis, marginal thinking and risk informs how students assess both direct and indirect costs. They test whether assessments and learning activities contribute to employability and skills acquisition, and they interrogate fairness and consistency in assessment briefs and marking criteria. This orientation encourages them to seek transparency from staff and to ask for evidence that programme design and delivery choices improve outcomes.
How do economics students’ views differ from other disciplines?
Compared with peers who emphasise intrinsic intellectual growth, economics students more readily quantify trade‑offs and expected returns. They tend to be more demanding about assessment clarity and the predictability of costs and timetables. Staff who engage this perspective—by making value propositions explicit, mapping assessments to outcomes, and minimising out‑of‑pocket spend—see stronger perceptions of value.
How Student Voice Analytics helps you
Student Voice Analytics pinpoints where value‑for‑money concerns bite hardest and why. It tracks topics and sentiment by mode, age, subject area and cohort, drills from institution to programme level, and provides like‑for‑like comparisons across subject groupings and demographics. Teams get concise, export‑ready summaries to brief programme leads and operations, monitor movement over time, and evidence whether changes to assessment clarity, delivery and cost management improve student perceptions.
Request a walkthrough
See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.