What do media studies students say about remote learning?

Published Jan 19, 2024 · Updated Oct 12, 2025

Tags: remote learning, media studies

Across the National Student Survey (NSS), comments on remote learning skew negative overall, and media studies students reflect that pattern while praising staff. In the sector‑wide view, 42.0% of remote learning comments are positive and 53.8% negative, with media among the most negative subject groups (net score −15.3). Within media studies, students are notably positive about Teaching Staff (net +40.7), yet full‑time cohorts feel the strain (net −11.2). Remote learning cuts across subjects and modes; media studies, as a practice‑led grouping, shows where digital delivery meets production pedagogy and why engagement, assessment clarity, and operational rhythm matter most.
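As a rough sketch of how figures like these combine, a net sentiment score is commonly computed as the percentage of positive comments minus the percentage of negative comments (the remainder being neutral or mixed). The function name and the comment counts below are illustrative assumptions, not NSS data; they simply reproduce the sector‑wide arithmetic quoted above.

```python
def net_sentiment(positive: int, negative: int, total: int) -> float:
    """Net sentiment in percentage points: %positive minus %negative.

    Assumes comments are coded positive, negative, or neutral/mixed;
    neutral comments count toward the total but neither term.
    """
    return round(100 * (positive - negative) / total, 1)


# Illustrative counts matching the sector-wide split quoted in the article
# (42.0% positive, 53.8% negative of 1,000 hypothetical comments):
print(net_sentiment(positive=420, negative=538, total=1000))  # -11.8
```

On this definition the sector‑wide remote learning view nets out at −11.8 points, consistent with the article's description of the topic as net‑negative overall.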

Why does the sector picture matter for media studies students?

The shift to remote learning alters more than logistics: it reshapes how students engage, collaborate and sustain wellbeing. Media studies students adapted quickly to learning from home, but their feedback shows the balance between digital access and practice‑based learning remains delicate. Using student voice from surveys and text analysis, we focus on what changes staff introduce, how these land with different cohorts, and where programmes prioritise improvements.

How did course delivery and engagement shift online?

Delivery moved to virtual platforms and demanded new engagement strategies. Media studies’ digital orientation helped, but sustained participation required predictable structures. Live discussions, short practical online sessions and hybrid models worked best when programmes standardised a weekly rhythm, provided a single, stable link hub per module, and signposted tasks clearly. Seeking and acting on student feedback made engagement a shared endeavour rather than a passive broadcast.

What happened to mental health and wellbeing?

Isolation and loss of community affected motivation and study habits. Staff responded with virtual meet‑ups, online wellbeing sessions and targeted signposting to support. Building a digital community through forums and cohort groups helped students sustain momentum. Institutions that normalised check‑ins and low‑stakes social spaces alongside academic content saw better continuity of engagement because wellbeing and academic progress are intertwined.

How did strikes and communication gaps disrupt learning?

Industrial action and uneven communications delayed feedback, materials and timetabling, amplifying uncertainty. Regular email digests, scheduled virtual office hours and a single source of truth for updates stabilised expectations. For project‑heavy modules, timely guidance and interim check‑points reduced drift when schedules slipped.

Which course-specific issues surfaced?

Hands‑on components such as editing, sound design and branding workshops did not translate neatly to home settings. Staff negotiated academic licences, used cloud platforms and captured demonstrator videos to model techniques. For studio‑style learning, digital galleries, critique templates and step‑by‑step submission specs narrowed the gap between physical labs and online practice.

Where did flexibility and accessibility improve inclusion?

Asynchronous access broadened participation for students balancing work or caring responsibilities. Captioned recordings, transcripts, alt‑text and low‑bandwidth versions enabled parity for those with limited connectivity or variable schedules. Consistent structures for discussion and resource access allowed students to plan, revisit material and contribute at different times without losing coherence.

How did assessments and guidelines adapt?

Moving assessments online required tighter alignment between outcomes, briefs and marking criteria. Students reported uncertainty when objectives felt opaque. Programmes responded with checklist‑style rubrics, annotated exemplars and short “what we look for” walkthroughs. A predictable feedback turnaround and brief debriefs enabled students to act on advice, reducing frustration about criteria and standards.

What about equipment and resources at home?

Specialist hardware and software are hard to replicate outside campus. Loan schemes, remote access to labs, and discounted software reduced barriers, while enhanced online libraries supported research. Staff used student feedback to target provision where it mattered most, prioritising portability, compatibility and the tasks students needed to complete.

How did the wider university experience change?

Social interaction moved online, altering how cohorts build networks and collaborate. Virtual societies, discussion groups and informal ‘coffee chats’ sustained peer support, but spontaneous encounters were harder to reproduce. Blended approaches that combine scheduled online communities with occasional in‑person touchpoints better preserved the sense of belonging students value.

Why did disorganisation and delays matter?

Dispersed materials and slow responses cost students time and eroded confidence. Consolidating readings, assessment briefs and recordings in coherent spaces, and publishing response norms, reduced friction. Programmes that protected an operational rhythm and minimised late changes saw fewer complaints about disorganisation and timetabling.

How did Wi‑Fi problems derail study?

Unreliable home connectivity disrupted live teaching and collaboration. Downloadable materials, audio‑only or low‑bandwidth versions, and offline access mitigated loss of access. Where feasible, institutions provided data support or portable devices; staff designed sessions so students could re‑enter learning without penalty.

What should providers take from this?

Media studies students affirm the value of strong teaching and responsive support while highlighting the limits of digital substitution for practice‑based learning. The sector evidence explains why: remote learning views are net‑negative overall, and media subjects feel this more acutely. Providers that standardise online delivery, protect asynchronous parity, and make assessment expectations explicit help cohorts sustain engagement despite disruption.

How Student Voice Analytics helps you

Student Voice Analytics shows where remote learning and media studies intersect: it tracks topic volume and sentiment over time, compares like‑for‑like cohorts and subject groupings, and surfaces the operational friction students report. Teams can drill from institution to school and programme, produce concise, anonymised briefings for programme teams and committees, and export charts and tables to support continuous improvement. Weekly monitoring of access, audio, link churn and timetable slips helps you close the loop with short “what we fixed” updates that students notice.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround



© Student Voice Systems Limited, All rights reserved.