Updated May 06, 2026
International taught postgraduate (PGT) students can be surrounded by support and still struggle to use it when the pressure points arrive. We see the same pattern in feedback about induction, academic support, and communication, especially where universities only hear the detail after a cohort has already hit friction. That is why "Stepping Up, Standing Out: A Case Study on engaging with international student voices through co-creation", Alison Leslie and Clare Wright's paper in the Student Engagement in Higher Education Journal, is useful. It shows what changes when international PGT students are not just asked for feedback but involved in shaping the response across the year. The same concern sits behind recent work on how digital transition shapes international students' belonging.
International PGT students matter enormously to UK higher education, but they often study on compressed one-year programmes that leave little room to recover from a weak start. Universities may provide extensive study materials, workshops, personal tutoring, and wellbeing support, yet students can still feel overwhelmed by unfamiliar academic expectations, hidden institutional rules, and fragmented information. That creates a practical student voice problem: if institutions only ask broad questions at one point in time, they can miss when support becomes usable, when it feels inaccessible, and what students actually need next.
Leslie and Wright address that problem through a three-year mixed-methods case study at a UK university, focused on international PGT students' engagement with academic support. The project used a baseline survey and focus group in year one, a larger survey plus two task-based focus groups and a co-creation workshop in year two, and a further co-creation workshop in year three. The first survey drew 24 responses, the second 43, with workshop participation ranging from five to eight students across later stages. The research question is practical rather than abstract: how can a university engage international PGT students over time, learn from their changing experiences, and turn those insights into support that students will actually use?
The first finding is that timing matters as much as provision. In the early stage of the project, students said they spent much of semester one simply settling in and learning how to manage their time. Many reported feeling overwhelmed by the volume of guidance and by the number of places they had to search for help. That matters for UK teams because a well-stocked support offer can still fail if students encounter it as a maze rather than a route.
Students valued human explanation more than resource volume. The case study found that one-to-one staff support was especially valued when staff took time to explain expectations clearly. At the same time, some students felt nervous about approaching academic staff or personal tutors individually. The practical implication is clear: international PGT support is not only about producing more material. It is also about making staff contact feel approachable, timely, and worth the effort.
The second-year work showed that support needs change across the academic calendar. When students co-created a timeline of their academic support journey, they emphasised the need for guidance early in the year, but also for repeated prompts at later pressure points such as dissertation preparation and the summer period. Students recognised that plenty of information existed, but often described it as hard to find, too fragmented, or poorly timed. For universities, that is a warning against treating induction as a one-off event rather than a sequence of moments when different kinds of support become relevant.
The paper describes the real value of this iterative approach neatly:
"we have opened the loop by involving, and therefore empowering, students in understanding how their voices matter in the feedback process."
Co-creation turned feedback into something more concrete than diagnosis. In response to survey and focus group findings, the project generated practical student-facing and staff-facing resources, including timeline-based guidance, peer-to-peer infographics, and materials on communicating with lecturers and making meaningful connections. These were then used in induction, recruitment, tutor training, and wider staff practice. That is an important distinction. The project did not stop at "students said X". It moved into "students helped design Y", which makes impact more visible to the cohort.
The study also found that institutions need to adapt, not only students. In later workshops, students stressed the value of support that recognised the individuality of international student experiences rather than treating them as one homogeneous group. They also highlighted the need for staff development, especially around intercultural awareness and inclusivity. For UK universities, that matters because support becomes more credible when it asks not only how students should adjust to the institution, but also how the institution should adjust to students.
For UK higher education teams, the first implication is to replace the one-shot international student survey with a staged feedback cycle. Collect evidence before arrival, after the first few weeks, around the first major assessment, and again near dissertation or summer pressure points. For taught postgraduate teams already thinking about how to use PTES more credibly, Nottingham's recent example of linking postgraduate survey collection to visible follow-up is a useful companion. The benefit is earlier action on support friction before it starts to affect belonging, wellbeing, or attainment.
Second, universities should organise support around the student journey, not around internal service boundaries. This paper suggests that students often need simpler pathways, clearer signposting, and support that is contextualised to where they are in the year. Resource maps, timed check-ins, named owners for key questions, and concise guides written in plain language can all help. The payoff is that students spend less time decoding the institution and more time using the help that already exists.
Third, institutions should treat co-creation as part of the support method, not just as a consultation exercise. Bring students into the design of guides, timelines, communications, and staff development materials. Make that work safe enough for honest participation, and visible enough that students can see the result. That is one of the most practical ways to avoid the dead end described in our earlier piece on why it is important to close the loop in student voice initiatives. The takeaway is simple: when students can see their input changing something concrete, trust in the process rises.
Finally, universities should pair partnership work with systematic analysis of open comments and support feedback. A workshop with student partners can surface issues quickly, but it should sit alongside wider evidence from PTES, local surveys, rep feedback, service comments, and induction responses. A structured workflow such as our student comment analysis governance checklist helps institutions route recurring concerns about communication, staff contact, hidden curriculum, and support access to the right owners. The result is a clearer, more defensible basis for action across schools, international offices, and student support teams.
Q: How should a university apply this paper to its international PGT support work?
A: Start by mapping the taught postgraduate journey from pre-arrival to dissertation submission, then identify three or four points where students are likely to need different kinds of support. Collect a small amount of targeted feedback at each point, and use one co-creation activity to turn that evidence into a resource, communication change, or staff prompt. The aim is not to run a large project every time. It is to make student voice timely enough that students can still benefit from the response while they are on the programme.
Q: What are the methodological limits of this study?
A: This is a single-institution case study with relatively small numbers at each stage, so it is not a sector benchmark. Its strength lies elsewhere: the authors followed the issue over three years, combined surveys with focus groups and workshops, and showed how findings fed into practical changes. UK universities should read it as strong practice-oriented evidence about process design, especially for international PGT support, rather than as a claim that one exact model will work unchanged everywhere.
Q: What does this change about student voice more broadly?
A: It shifts attention from collection to iteration. The most useful student voice systems do not ask once, report once, and move on. They create repeated opportunities for students to explain where support is breaking down, test whether changes helped, and shape the next response. For universities using free-text feedback, that makes student voice more than a snapshot. It becomes a route to better-timed action.
[Paper Source]: Alison Leslie and Clare Wright, "Stepping Up, Standing Out: A Case Study on engaging with international student voices through co-creation", Student Engagement in Higher Education Journal. DOI: 10.66561/sehej.v7i1.1395
© Student Voice Systems Limited, All rights reserved.