International students' digital transition shapes belonging, not just convenience

Updated Apr 25, 2026

International students can arrive academically prepared and still lose momentum quickly if the host institution's digital environment feels unreadable. At Student Voice AI, we see a recurring pattern in student voice data: confusion about apps, channels, and basic information often surfaces before formal complaints about teaching, support, or belonging. That is why Rahmat Fadhli, Shanton Chang and Antonette Mendoza's Higher Education paper, "Charting the initial digital journeys of students in a new digital environment: a qualitative study", matters. It shows that early digital adjustment is part of academic and social integration, not a side issue for IT teams.

Context and research question

Much of the literature on international student transition concentrates on language, culture, or academic expectations. That work matters, but this paper argues that something important is often missed: students also have to learn a new digital environment, with its own platforms, habits, communication norms, and information routes. If they cannot work out where official information lives, which apps matter, or how local digital behaviour operates, their broader adjustment can stall.

The authors examine that gap through a phenomenological qualitative study of international students in Indonesia during their first six months. The study used 51 semi-structured interviews, conducted between August and October 2024, with full-degree international students across multiple Indonesian cities. The participants came from a wide range of countries, and the researchers used inductive qualitative content analysis to identify patterns in how students moved through this early digital transition. For UK higher education teams, the exact setting is not the point. The useful question is whether international students at your institution are also having to decode a local digital culture before they can participate confidently in academic and social life.

Key findings

The paper's core finding is that digital adjustment follows a recognisable process rather than happening automatically. Students moved through what the authors describe as stages of digital shock, transition, and possible disengagement. In the early phase, many struggled to understand which platforms were locally dominant, how to find reliable information, and how to navigate unfamiliar digital habits around communication, services, and academic tasks.

The abstract captures that sequence clearly:

"students go through a process of digital shock, transition, and (dis)engagement"

Students did not respond passively to that shock. They actively experimented with the digital environment around them. The paper describes students downloading local apps for getting around and accessing everyday services, changing how they used social media, and adopting new tools for study support such as translation software. These details matter because they show that digital transition is not only about access to a VLE. It is also about the digital routines that make daily student life workable enough for academic participation to follow.

The transition was social as well as technical. Students learned that certain platforms or behaviours mattered for joining activities, finding updates, or getting included. In the Indonesian context, students described how local norms around Instagram and messaging shaped access to information and relationships. The broader institutional lesson transfers easily: when students do not understand the unwritten digital rules of a course, campus, or city, they can miss opportunities for belonging even when support technically exists.

The paper also shows that adaptation does not always end in confident participation. Some students adjusted by exploring new tools and habits, but others disengaged because of overload, fatigue, or frustration. The result was uneven: some gained better local language skills or wider information access, while others remained only passively connected to local information or felt swamped by too many channels. For institutions, that is a warning against assuming that more digital provision automatically means better transition.

Support conditions made a decisive difference. Peer help, clearer university guidance, and stronger digital skills smoothed the transition, while language barriers, poor usability, fragmented information, and limited device capacity made it harder. That is a practical finding for UK teams because those barriers rarely sit with one department. They cross international offices, digital teams, academic schools, libraries, and student support.

Practical implications

For UK universities, the first implication is to treat digital onboarding as student experience design, not just systems induction. International students need more than login details. They need to know which platforms matter for teaching, support, and everyday participation, where official updates are likely to appear, and which channels are socially important for joining the life of a programme or university. When that guidance is explicit early on, institutions reduce avoidable friction before it turns into isolation or missed support.

Second, universities should collect targeted open-text feedback on digital transition in the first weeks and months. The right prompts are practical: What was hardest to access? Which systems felt confusing? Where did students struggle to find trusted information? What digital habits or channels were unfamiliar? This is exactly the kind of evidence gap highlighted in Jisc's work on digital equity in transnational education. The benefit is that institutions get specific, fixable evidence rather than a generic satisfaction score about "digital experience".

Third, institutions should separate digital problems by type and owner before they start acting on the comments. Platform usability, language accessibility, message overload, translation needs, digital confidence, device constraints, and social-channel norms are different issues. They need different responses. A documented workflow such as our student comment analysis governance checklist helps universities route those signals to the right owners instead of collapsing them into one broad "international student support" theme. That makes action more precise and easier to track.

Fourth, universities should track digital transition over time rather than treating induction as one completed stage. This paper focuses on the first six months, and that matters. The comments students give in week two are unlikely to be the same as the comments they give after the first assignment, the first group project, or the first attempt to access wider support services. For institutions building a repeatable feedback loop, our NSS open-text analysis methodology is useful because it shows how to turn narrative comments into evidence that can be monitored across time and ownership areas. That is where Student Voice Analytics fits naturally: it helps teams compare recurring themes such as information access, support, digital overload, and belonging across surveys without losing the detail that makes the comments actionable.

FAQ

Q: How should a UK university collect actionable feedback on international students' digital transition?

A: Ask early, and ask specifically. A short pulse survey during the first few weeks, followed by another after the first assessment point, is usually more useful than waiting for an end-of-year survey. Include one or two open-text questions about finding information, using key systems, and understanding which digital channels matter. That makes it easier to distinguish between problems of platform design, support ownership, and local digital culture before those issues become harder to unwind.

Q: What are the limits of applying this study directly to UK higher education?

A: This is a qualitative study of international students in Indonesia, published on 28 March 2026, so it is best read as evidence about mechanisms rather than prevalence. The value lies in showing how digital transition can shape academic adjustment and social integration, not in claiming that every UK institution will see the same platform-specific issues. UK teams should use the findings to sharpen what they ask and listen for locally, then test those patterns against their own survey comments, support cases, and international student feedback.

Q: What does this change about how universities should think about student voice?

A: It widens the frame. Student voice on international student transition should not focus only on teaching quality, induction satisfaction, or broad wellbeing questions. Digital routines, information pathways, platform norms, and online support access are part of the lived student experience too. If universities ignore that layer, they risk missing the reasons students feel disconnected before the institution has even started interpreting the bigger survey themes.

References

[Paper Source]: Fadhli, R., Chang, S. and Mendoza, A. (2026) "Charting the initial digital journeys of students in a new digital environment: a qualitative study", Higher Education. DOI: 10.1007/s10734-026-01657-7

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
