Updated Apr 03, 2026
Many universities measure belonging once in the first few weeks and treat that result as a stable picture of student experience. This paper shows why that can create false confidence, especially when first-generation gaps emerge later rather than earlier. In "Dynamic belonging – how student belonging changes over time for first-generation students", David Gilani, Liz Thomas and Daniel McArthur follow students across the first year using questionnaires and online diaries. For UK Student Experience, Access and Participation, and Market Insights teams, the takeaway is practical: if you rely on one snapshot or one headline average, you can miss the point at which belonging starts to weaken.
Belonging is now a familiar term in higher education strategy, but it is often treated as if it were stable: something institutions can capture in an induction survey, then improve with a welcome-week programme and a few generic engagement activities. That assumption creates a practical risk. If belonging changes over time, and changes differently for different groups, universities can draw the wrong conclusions from early data and miss the moment when support matters most. Related work on conditional belonging for minority ethnic STEM students points in the same direction.
Gilani and colleagues ask a sharper question: how does students' sense of belonging change through the first year, and does that trajectory differ for first-generation students? Their study used a mixed-method longitudinal design with 101 students at two English universities. Students completed questionnaires at multiple points in the academic year and also submitted online diaries, which the authors analysed to explain the quantitative patterns. That makes the paper useful for teams that want both a signal of change and an explanation of what is driving it.
Overall belonging fell across the first year. The paper reports that average sense of belonging declined as the year progressed, rather than steadily strengthening after induction. That matters for UK institutions because many belonging interventions are frontloaded into the first few weeks. If the experience weakens later, a positive early result can create false reassurance and delay action until problems are harder to shift.
As the authors put it, "overall students' sense of belonging declines over the first year of study."
The first-generation gap appeared later, not at the start. Students whose parents or caregivers had not attended university did not begin the year with clearly lower belonging than their peers, but by the end of the first academic year their scores were significantly lower. For Student Experience and Access teams, that is a crucial timing signal: an early pulse survey might show little difference, while a later measure reveals a meaningful divergence that needs a different response.
The diary data explains why the gap widened. First-generation students more often described fragile peer relationships on their course, cultural barriers, reluctance to talk about finances, and a weaker sense that they mattered to staff. This is where the paper becomes especially useful for student voice practice in higher education. The quantitative trend shows that belonging changed; the written accounts show why it changed, which gives teams a clearer basis for intervention.
The mixed-method approach is one of the study's biggest practical strengths. The authors did not stop at reporting a movement in scores. They used coding matrix queries on students' detailed diary responses to interpret the pattern over time. For universities, that is a strong reminder that scaled items and open-text evidence should not compete. They answer different parts of the same question, and together they make belonging data far more actionable.
For UK higher education teams, the first implication is simple: stop treating belonging as a welcome-week metric. Measure it at several points across the first year, especially after the first assessment period and later in the second term, when strain around friendship groups, finances, and confidence can surface more clearly. That matters even if welcome-week attendance boosts peer belonging, because an early lift does not guarantee durable connection. Repeated measurement gives teams a better chance of spotting deterioration before it hardens into withdrawal, disengagement, or lower confidence.
Second, segment belonging evidence by student group and combine scales with open text. A short pulse survey with one or two belonging items can tell you whether there is movement, but it will not tell you what is driving it. Add an open-text prompt such as "What has most helped you feel part of your course this month?" or "What has made it harder to feel you belong?". Then read the responses by first-generation status, commuting pattern, or other widening participation indicators where available. The benefit is simple: you move from noticing a gap to understanding what is causing it.
Third, treat "mattering to staff" as an operational issue, not a soft extra. If students only begin to feel peripheral later in the year, the response is not another induction message. It is better course-level contact, visible academic support, personal tutoring and tutor check-ins, stronger peer connection after the early weeks, and clearer signposting when financial or cultural pressures start to affect engagement. That is where a vague wellbeing ambition becomes a concrete retention and student experience plan.
This is where Student Voice Analytics fits naturally. The paper's diary analysis highlights themes that many institutions already collect in free-text comments but struggle to compare over time or across cohorts. Analysing those comments systematically helps teams see whether belonging issues are concentrated in particular student groups, which themes are driving them, and whether interventions are changing the pattern. That makes it easier to move from anecdotal concern to decision-grade evidence.
Q: How can universities monitor belonging across the first year without creating survey fatigue?
A: Use a small number of repeat pulse points rather than a large standalone instrument. Two or three short check-ins across the year, each with one or two belonging items and one open-text question, are often enough to show whether the trajectory is improving or weakening. Where possible, align these with existing module, school, or Access and Participation feedback cycles so students are not asked to complete a separate survey every time.
Q: What should we be cautious about when interpreting early first-year belonging data?
A: Early data can be misleading if it is treated as the whole story. This paper shows that first-generation students did not necessarily look different at the start, but the gap emerged later. A strong induction result should therefore be read as provisional, not conclusive. Repeat measurement and qualitative follow-up are essential if you want to know whether early belonging is durable.
Q: What does this mean for student voice practice more broadly?
A: It reinforces that belonging is dynamic and that open text is often where the actionable explanation sits. Students may not describe "belonging" directly, but they do describe friendships, staff approachability, confidence, financial pressure, and whether they feel noticed. Bringing those comments together over time gives universities a stronger basis for action than a single average score.
[Paper Source]: David Gilani, Liz Thomas and Daniel McArthur, "Dynamic belonging – how student belonging changes over time for first-generation students", DOI: 10.1080/03075079.2026.2631772