Are design studies students getting the right mix of course content and structure?

By Student Voice Analytics

Yes: breadth is welcomed, and students want programme organisation to keep pace. Across the National Student Survey (NSS), our type and breadth of course content analysis tracks the scope and variety students describe across UK higher education; between 2018 and 2025 it covers 25,847 comments, 70.6% of them positive. Within design studies, the standard subject grouping used for programme-level comparison, around 5,238 comments show a more mixed overall mood at 55.9% positive. The picture for design studies is therefore one of valued breadth that can under‑realise its potential when delivery mechanics falter.
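For readers who want to sanity-check headline shares like these against their own labelled comments, here is a minimal Python sketch. The records, field layout and positive_share helper are illustrative assumptions for this post, not our production pipeline.

```python
from typing import Iterable

def positive_share(labels: Iterable[str]) -> float:
    """Return the percentage of sentiment labels that equal 'positive'."""
    labels = list(labels)
    if not labels:
        return 0.0
    return 100.0 * sum(1 for label in labels if label == "positive") / len(labels)

# Hypothetical records: (subject_group, sentiment_label); invented for illustration.
comments = [
    ("design studies", "positive"),
    ("design studies", "negative"),
    ("history", "positive"),
    ("design studies", "positive"),
]

# Sector-wide share versus the share within one subject grouping.
sector = positive_share(label for _, label in comments)
design = positive_share(
    label for subject, label in comments if subject == "design studies"
)
print(f"sector: {sector:.1f}% positive; design studies: {design:.1f}% positive")
```

The same two-level comparison, run over the full comment set, is what separates the 70.6% sector figure from the 55.9% design studies figure above.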

This post examines the range of course content available to design studies students and how those offerings align with industry demands and teaching methods. At the core is the student voice: text analysis of open-text survey comments gives a grounded picture of current experiences and priorities for enhancement, so staff can refine courses to meet the evolving needs of students and the design sector.

How do students weigh engaging content against practical depth?

Students want both. Many value lively, creative sessions that sustain motivation, while others prioritise technical depth and transferable methods that shape confident practice. Fast‑moving, applied areas push programmes to evidence currency and application across modules. The task is to design for breadth and progression, anchoring creative exploration in iterative, assessed practice so graduates leave with ideas, methods and robust outputs.

What teaching resources unlock better learning in design?

Students ask for more sustained, expert guidance on projects and portfolios. Better staff-to-student ratios support formative critique, timely advice and stronger assessment literacy. Programmes that protect contact for tutorials and studio feedback, and make staff availability predictable, help students convert breadth into coherent portfolios and professional narratives.

How should programmes integrate industry practice?

Students respond well to embedded industry methods: design sprints, live briefs and workshops led by practitioners. Bringing professionals into modules provides current insight and situates theory in use. Structured placements and external projects develop critical judgement and client‑ready delivery. Institutions that timetable these engagements reliably and assess them transparently increase perceived relevance and readiness for work.

Does access to facilities change outcomes?

Yes. Design thrives on making, and students notice when workshops, studios and digital suites are available, well maintained and easy to book. Portfolio quality benefits from predictable access and basic IT reliability; gaps here disrupt project flow. Investment in booking transparency, uptime communication and technician-led inductions compounds benefits for cohort learning and output quality.

Where do software and specialised training fit?

Students want more than introductions to CAD and Adobe tools—they want structured, hands‑on mastery linked to project briefs. Dedicated modules and clinics, exemplars that show how software decisions affect outcomes, and assessment that rewards process as well as product help students connect tools to practice.

What role do networking and industry exposure play?

Structured exposure widens horizons and contextualises classroom learning. Internships, visiting speakers and industry crits help students test their ideas against current practice and meet potential collaborators or employers. Programmes that curate these touchpoints across the year, not only at capstone stages, build confidence and career clarity.

How should course structure and organisation support learning?

Students ask for timetabling that protects studio time, avoids option clashes and sequences assessment sensibly. In design studies comments, scheduling/timetabling sentiment sits at -25.1, signalling that operational friction undermines otherwise valued content. A coherent “breadth map” across years, stable option pathways, concise assessment briefs with calibrated marking criteria, and a single source of truth for changes remove avoidable noise so students can focus on making and learning.
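This post does not spell out how the -25.1 score is constructed, so treat the following as an assumption: one common construction for a net theme score is the percentage of positive comments minus the percentage of negative comments. The sketch below shows that arithmetic with invented counts.

```python
def net_sentiment(positive: int, negative: int, neutral: int = 0) -> float:
    """Net score: percentage positive minus percentage negative.

    Assumed formula for illustration; not necessarily the exact
    construction behind the -25.1 figure cited above.
    """
    total = positive + negative + neutral
    if total == 0:
        return 0.0
    return 100.0 * (positive - negative) / total

# Invented counts: 30 positive, 55 negative, 15 neutral timetabling comments.
print(round(net_sentiment(positive=30, negative=55, neutral=15), 1))  # -25.0
```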

How Student Voice Analytics helps you

  • Track movement in type and breadth of course content over time and by segment, and compare design studies with like‑for‑like peers.
  • Drill from institution to school/department and subject group to see where facilities, timetabling and assessment clarity drive sentiment.
  • Generate concise, anonymised briefs that show what changed, for whom, and where to act next—ready for Boards of Study, APRs and student‑staff committees.
  • Evidence improvement with exportable summaries that link actions to shifts in NSS open‑text sentiment within design studies.

Request a walkthrough

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready governance packs.
  • Benchmarks and BI-ready exports for boards and Senate.
