Tracking student belonging over time: why first-generation trajectories matter

Updated Feb 24, 2026

At Student Voice AI, we work with universities that want to understand student experience as it unfolds, not only as an end-of-year score. In a new Studies in Higher Education article, Gilani, Thomas and McArthur examine how student belonging changes over time for first-generation students. The paper is a useful prompt for UK teams because it highlights a practical measurement risk: a single belonging item, asked once, can be treated as stable evidence even though belonging may fluctuate in response to specific academic and social touchpoints.

Context and research question

Belonging is often discussed as a foundation for engagement, continuation, and wellbeing. But institutions typically measure it in blunt ways: once per term, once per year, or only indirectly through proxy questions (for example, satisfaction, support, or community).

For UK universities working on access and participation, the first-generation lens matters because it is closely connected to transitions, confidence navigating institutional systems, and the extent to which students feel they have people and processes they can rely on. Gilani and colleagues focus on a core question with practical consequences: if belonging changes over time, when and how should we measure it, and what might that imply for how we support first-generation students?

Key findings

The central message is that belonging is dynamic, not a fixed trait. In operational terms, that means a single data point can be misleading. A cohort can look stable in averages while individuals (and subgroups) experience sharp shifts across weeks that never show up in end-of-module reporting.

The paper also points towards a more time-sensitive way of thinking about equity gaps. If first-generation students experience different belonging trajectories, it is not enough to compare annual means. What matters is the pattern: where belonging lifts, where it dips, and which moments in the student journey are acting as triggers.

For practitioners, one of the most useful implications is that belonging is often produced by systems. Students experience belonging through repeated interactions with teaching, assessment, peer culture, communications, and support processes. This is why belonging work cannot sit only with one team. The evidence needs to be triangulated across academic and professional services, with a shared view of which touchpoints are most likely to move the dial.

Finally, the paper reinforces an analysis point we see repeatedly in student voice work: the reasons behind belonging shifts are rarely visible in a single scale item. Quantitative belonging measures can tell you direction, but open-text feedback is often what tells you mechanism, for example, whether the issue is assessment clarity, peer connection, tutor responsiveness, timetabling friction, or a sense that "people like me" do not fit.

Practical implications

For UK higher education teams, three practical moves follow from this dynamic view of belonging.

First, treat belonging like a time series rather than an annual metric. Add a short, repeated belonging measure (even one item) to existing pulse surveys at predictable points in the term, such as early teaching weeks, pre-assessment periods, and after results. The goal is not more data for its own sake, but earlier signals you can act on.
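As a minimal sketch of what "belonging as a time series" can surface, the following uses pandas on entirely hypothetical pulse data (student IDs, weeks, and a 1–5 belonging item are invented for illustration). It shows the core point from the paper: a cohort mean can look broadly stable while an individual student's score drops sharply between time points.

```python
import pandas as pd

# Illustrative pulse-survey data: one belonging item (1-5) asked at
# three predictable points in the term. All values are hypothetical.
pulses = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "week":       [3, 8, 12, 3, 8, 12, 3, 8, 12],
    "belonging":  [4, 4, 4, 4, 2, 3, 5, 5, 4],
})

# Cohort average per time point: looks broadly stable...
cohort = pulses.groupby("week")["belonging"].mean()

# ...but a per-student view surfaces sharp individual dips that the
# cohort mean hides (here, student 2 drops 2 points before assessment).
pulses = pulses.sort_values(["student_id", "week"])
pulses["change"] = pulses.groupby("student_id")["belonging"].diff()
dips = pulses[pulses["change"] <= -2]

print(cohort.round(2).to_dict())                     # {3: 4.33, 8: 3.67, 12: 3.67}
print(dips[["student_id", "week"]].to_dict("records"))  # [{'student_id': 2, 'week': 8}]
```

The dip threshold (a 2-point drop) is an arbitrary choice for the sketch; in practice you would calibrate flags against your own scale and response rates.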

Second, pair belonging measures with targeted open-text prompts. A simple prompt like "What has most helped you feel part of your course and university in the last two weeks?" can produce operationally useful data when analysed at scale. This is where Student Voice Analytics can help: categorising and benchmarking open comments so teams can see what is changing, for whom, and where to focus improvement work.

Third, segment and act with clarity. If you are using first-generation status (or related widening participation markers), define it consistently and ensure students can self-identify in a way that feels safe and meaningful. Then use that segmentation to prioritise fixes at moments that matter, for example, transition support, academic expectations, assessment communications, and access to peer networks. Close the loop visibly so students see that feedback leads to change, which itself is a driver of belonging.
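The segmentation step above can be sketched in the same way. This hypothetical example (all data invented) compares mean belonging trajectories for first-generation and continuing-generation students at two time points, then computes the gap at each point, which is the pattern-level comparison the paper argues matters more than annual means.

```python
import pandas as pd

# Hypothetical segmented pulse data: the same belonging item (1-5),
# with a self-identified first-generation flag. Values are illustrative.
df = pd.DataFrame({
    "week":      [3, 3, 3, 3, 8, 8, 8, 8],
    "first_gen": [True, True, False, False, True, True, False, False],
    "belonging": [4, 3, 4, 5, 2, 3, 4, 5],
})

# Mean trajectory per group, then the equity gap at each time point.
traj = df.pivot_table(index="week", columns="first_gen",
                      values="belonging", aggfunc="mean")
traj["gap"] = traj[False] - traj[True]

print(traj)  # the gap widens from 1.0 at week 3 to 2.0 at week 8
```

A widening gap at a specific point in the term (here, the pre-assessment pulse) is exactly the kind of signal that tells you where to prioritise transition support or assessment communications.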

FAQ

Q: How can we track belonging for first-generation students without running yet another survey?

A: Use the touchpoints you already have. Add a short belonging item to existing pulse surveys, module evaluation midpoints, or induction check-ins, then use one open-text prompt to capture the reasons behind changes. Sampling can also work well: rotate a small set of questions across weeks and keep the cadence predictable. The key is consistency over time, not length.

Q: What should we be cautious about when interpreting changes in belonging across a term?

A: Timing effects and response bias can distort apparent trends. Treat short-term changes as signals to investigate, not definitive proof of causality, and triangulate with other evidence such as attendance, help-seeking, support demand, and student comments. Where you are comparing groups, keep the question wording stable and check whether your measures behave similarly across subgroups and time points.

Q: What does this imply for NSS and internal survey analysis in the UK?

A: It suggests NSS and end-of-year measures are better seen as endpoints than as diagnostic tools. If belonging is dynamic, then waiting for annual results can mean you discover problems after the period when interventions are most effective. Pair outcome surveys with time-sensitive pulses and systematic open-text analysis, so you can identify the moments and mechanisms driving belonging, and verify whether changes are working.

References

[Paper Source]: Gilani, D., Thomas, L. and McArthur, D. "Dynamic belonging – how student belonging changes over time for first-generation students". Studies in Higher Education. DOI: 10.1080/03075079.2026.2631772

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.
Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround


© Student Voice Systems Limited, All rights reserved.