What did politics students say about the pandemic response?
Published May 21, 2024 · Updated Oct 12, 2025
Politics students reported that pandemic-era delivery felt uncertain and often weak on communications, echoing the sector-wide pattern for COVID-19 in National Student Survey (NSS) comments, where 68.6% of comments are negative (index −24.0). Within politics in the sector’s Common Aggregation Hierarchy, the COVID‑19 lens remains strongly negative (−40.5), with communication about course and teaching particularly weak (−43.9). Against that backdrop, political science students here acknowledge staff commitment and curricular breadth, but point to variable online delivery, uneven access to resources, and unresolved questions about assessment and value for money (−56.5).
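For readers who want to sanity-check figures like these against their own comment data, the sketch below shows one way a net sentiment index might be computed. It assumes the index is simply (share of positive comments − share of negative comments) × 100; the actual formula behind the figures above is not documented in this post, so the formula and the label values are illustrative assumptions only.

```python
# Minimal sketch: a net sentiment index over sentiment-labelled comments.
# Assumption: index = (share positive - share negative) x 100; the exact
# weighting behind the figures quoted above is not specified here.
from collections import Counter

def net_sentiment_index(labels):
    """labels: iterable of 'positive' / 'neutral' / 'negative' strings."""
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    share_pos = counts["positive"] / total
    share_neg = counts["negative"] / total
    return round((share_pos - share_neg) * 100, 1)

# A mostly negative set of comments yields a strongly negative index.
print(net_sentiment_index(["negative"] * 7 + ["positive"] * 2 + ["neutral"]))  # -50.0
```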
During the COVID-19 pandemic, political science students navigated a radically changed higher education landscape. Lockdowns and the shift to remote learning created immediate, multifaceted challenges for students and staff alike, and the pandemic's effects on academic communication, access to resources, and virtual learning environments put several aspects of the educational experience under scrutiny. Some policies introduced during the crisis worked well; others created substantial hurdles that affected academic and personal lives simultaneously.
Where did communication break down, and what worked better?
Across the early months, updates about course adjustments, health guidelines and operations often arrived late, through multiple channels and with mixed messages, generating uncertainty and frustration. Students valued institutions that named a single source of truth, issued short weekly “what changed and why” notes, and provided rapid Q&A sessions at programme level. Departments that adopted virtual town halls and kept FAQs current saw improvements, but inconsistency between modules undermined trust. The pattern points to a need for disruption-ready communications: simple playbooks, accountable owners for course-level updates, and explicit statements about any temporary adjustments and how they will be reviewed.
How did access to resources change, and who adapted fastest?
Loss of physical libraries and study spaces exposed licensing limits and patchy digital infrastructure. Where universities expanded e‑book licences, negotiated broader database access and stood up stable remote desktop services, students could sustain research momentum; where they did not, delays compounded assessment pressure. Institutions that mirrored stronger sector practices from clinically oriented disciplines (continuity of learning, clarity about assessment changes, and upfront resource lists) tended to stabilise the student experience more quickly. Political science benefits when digital reading lists are guaranteed, with alternatives or scans pre‑authorised for high-demand texts.
Did online learning meet politics students’ interactive needs?
Debate and deliberation are core to political studies. Video platforms sustained engagement, but the absence of non-verbal cues and reduced immediacy limited the quality of discussion for some cohorts. Students appreciated flexibility and recorded content; they asked for more structured small‑group seminars, clearer expectations for participation, and assessment briefs designed for online discussion formats. Programmes that combined concise pre‑work with shorter, focused live sessions and explicit feed‑forward in feedback preserved interaction better than long lectures on video.
What happened to internships and experiential learning?
Internships, placements and fieldwork were curtailed or moved online, with mixed educational value. Virtual alternatives maintained continuity but rarely replicated networking and context-rich learning. Students responded best when departments articulated mitigations—how learning outcomes would be evidenced differently, what substitution tasks would count, and how professional skills would be recognised. Building simulations, practitioner-led live cases and structured, work-integrated rhythms helped sustain motivation while physical opportunities were paused.
How did students judge value for money?
As delivery moved online and campus facilities closed, students questioned whether fees aligned with the experience available. Concerns were sharper where communication faltered or where teaching replacements and assessment adjustments were unclear. In politics feedback, the Costs/Value for money topic trends strongly negative (−56.5). Students were more accepting where providers documented mitigations, set out what was and was not included in fees during disruption, and evidenced that learning outcomes and standards were protected.
Were support systems and wellbeing provision sufficient?
Feelings of isolation and stress increased as routines dissolved. Online counselling and workshops helped, but coverage and responsiveness varied. Students wanted predictable academic advising, visible escalation routes through personal tutoring, and coherent signposting to specialist services. Providers that published service standards for tutoring and support, maintained a single point of entry for queries, and made disability-related adjustments explicit when arrangements changed reduced friction and repeat contacts.
What are the lessons and immediate improvements?
Three priorities stand out. First, make assessment clarity non‑negotiable: align marking criteria across modules, issue annotated exemplars, and give feedback that explains what to do next. Second, tighten the operational rhythm: maintain one up‑to‑date source of truth for timetables, assessment changes and rooming; issue weekly change logs; and assign an owner for course communications. Third, make subject‑level pain points visible: run short, time‑bound reviews of workload pacing, access to specialist activities and assessment briefs, and publish the fixes. Document mitigations for pandemic or industrial action, and show how continuity of learning and standards are maintained.
How Student Voice Analytics helps you
Student Voice Analytics turns open‑text survey comments into structured insight for Politics and COVID‑19 topics. It tracks topic volume and sentiment over time, lets you compare like‑for‑like across subject groups and demographics, and surfaces the delivery and assessment issues that move NSS results. Programme and school teams can generate concise, anonymised summaries, segment by cohort or site, and export shareable outputs to brief committees and external partners. The result is faster prioritisation, targeted fixes, and an auditable record of improvements.
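As a concrete illustration of the trend tracking described above, the sketch below aggregates sentiment-labelled comments into a monthly net-sentiment series per topic. The record layout, topic names, and dates are hypothetical stand-ins for whatever schema your survey export uses, and the net-sentiment formula is the same illustrative assumption as before, not the product's documented method.

```python
# Illustrative sketch: monthly net sentiment per topic from a list of
# (date, topic, sentiment) records. Schema and topic names are assumptions.
from collections import defaultdict
from datetime import date

def monthly_topic_index(records):
    """records: iterable of (date, topic, sentiment) tuples,
    sentiment in {'positive', 'neutral', 'negative'}."""
    buckets = defaultdict(lambda: {"positive": 0, "negative": 0, "total": 0})
    for d, topic, sentiment in records:
        key = (d.strftime("%Y-%m"), topic)
        buckets[key]["total"] += 1
        if sentiment in ("positive", "negative"):
            buckets[key][sentiment] += 1
    return {
        key: round(100 * (b["positive"] - b["negative"]) / b["total"], 1)
        for key, b in buckets.items()
    }

records = [
    (date(2020, 5, 4), "communication about course and teaching", "negative"),
    (date(2020, 5, 12), "communication about course and teaching", "negative"),
    (date(2020, 5, 20), "costs/value for money", "negative"),
    (date(2020, 6, 3), "communication about course and teaching", "positive"),
]
print(monthly_topic_index(records))
# {('2020-05', 'communication about course and teaching'): -100.0,
#  ('2020-05', 'costs/value for money'): -100.0,
#  ('2020-06', 'communication about course and teaching'): 100.0}
```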
Request a walkthrough
Book a Student Voice Analytics demo
See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.
- All-comment coverage with HE-tuned taxonomy and sentiment.
- Versioned outputs with TEF-ready governance packs.
- Benchmarks and BI-ready exports for boards and Senate.