Updated Apr 02, 2026
Doctoral schools are increasingly asked to evidence “skills development” and “research culture” in ways that go beyond satisfaction scores. The paper reviewed here offers a practical way to do that: measuring specific PhD competencies and testing how research style shapes confidence in them. At Student Voice AI, we work with universities to turn student feedback into evidence you can act on. For postgraduate research (PGR) in particular, leaders need clarity on which capabilities students feel they are developing, and on where the support environment is helping or hindering progress, especially as UKRI sharpens its expectations for doctoral support and PGR feedback evidence.
This recent paper in Higher Education tackles a common gap in doctoral education research. Many studies discuss “competencies” in broad terms, but without a shared, practical framework that covers both academic and non-academic career trajectories. The authors ask two linked questions: what does a defensible, comprehensive competency framework for PhD students look like, and do students’ research styles predict their perceived competencies?
The study develops and validates a new measurement instrument, the PhD Students’ Perceived Competencies Inventory (PSPCI). Using survey data from 1,105 PhD students in Hong Kong (split into calibration and cross-validation samples), the authors establish a 23-item inventory across seven competency scales: core competencies, job performance (novice academics), job performance (novice knowledge professionals), understanding the work environment (novice academics), understanding the work environment (novice knowledge professionals), identity development, and professional habits.
“Most empirical studies on PhD students’ competencies lack a conceptual framework…”
The second contribution is explanatory rather than descriptive: research styles statistically predict perceived competencies. The authors assess six research styles (legislative, conservative, liberal, executive, hierarchical, and monarchic) and find they predict scores across the PSPCI’s seven scales. Put simply, how students prefer to plan and carry out research work is meaningfully associated with how confident they feel about key doctoral and professional competencies. That gives doctoral schools a more useful question to ask: not just whether support exists, but which students are most likely to benefit from it as currently designed.
For institutions, the takeaway is practical: doctoral “skills development” is not only about offering workshops. Students’ research styles and working preferences appear to shape how they experience and interpret development opportunities, including professional habits and identity formation. That makes this more than a measurement paper: it suggests doctoral schools should review how skills support is structured, communicated, and adapted for different working styles.
For UK universities, this research is a useful prompt to tighten the loop between what you say you develop in doctoral training and what students actually experience. That matters because broad claims about “skills development” are hard to defend in PGR strategy, programme review, and supervisor development, particularly when institutions are already reviewing PRES 2025 and what it means for PGR feedback practice. The practical route in is triangulation: pairing competency scores with the open comments that explain them.
Student Voice Analytics is designed for exactly this kind of triangulation. It helps institutions analyse open comments from PTES/PRES and internal PGR surveys at scale, using a themes-and-categories structure for postgraduate research student comments (for example supervision, research culture, and skills and professional development), so teams can connect narrative feedback to measurable themes and track movement over time. Even where the underlying survey instrument is strong, this is often where the practical insight sits: what students say happened, and what they believe would improve it. If you want a reproducible way to analyse the comments behind PGR competency scores, explore Student Voice Analytics.
Q: How can a doctoral school apply this research without introducing a whole new survey?
A: Start by mapping your existing PGR questions (PRES, local pulse surveys, supervision evaluations) onto a small set of competency areas: core skills, work environment understanding, identity development, and professional habits. Where gaps appear, add a small number of targeted items next cycle and use open-text prompts to capture the context behind the scores.
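The mapping exercise above can start as something as simple as a lookup table. A minimal sketch, assuming item codes and area names of our own invention (they are illustrative, not from PRES or the PSPCI):

```python
# Hypothetical mapping of existing PGR survey items onto the four
# competency areas named above. Item codes are illustrative only.
ITEM_TO_AREA = {
    "PRES_Q3_research_skills": "core skills",
    "PRES_Q9_research_culture": "work environment understanding",
    "pulse_Q2_confidence_as_researcher": "identity development",
    "supervision_eval_Q5_working_habits": "professional habits",
}


def area_coverage(mapping):
    """Count mapped items per competency area, to reveal thinly
    covered areas that may need targeted items next cycle."""
    counts = {}
    for area in mapping.values():
        counts[area] = counts.get(area, 0) + 1
    return counts
```

Areas with only one or two mapped items are the natural candidates for the small number of targeted additions suggested above.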
Q: What should we watch out for methodologically if we adapt the PSPCI or a similar inventory?
A: Treat it as a measurement instrument, not just a list of questions. Pilot changes, check reliability, and avoid mixing too many constructs into single items. If you use the instrument across disciplines or cohorts, test whether it behaves similarly (and interpret differences cautiously if it does not). Also remember that it captures perceived competencies: pairing it with open comments can help you understand what sits behind low confidence in particular areas.
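The reliability check mentioned above can be run without specialist software. A minimal sketch of Cronbach's alpha, assuming responses are held as one list of scores per item (columns) with one entry per student (rows); this is a generic formula, not code from the paper:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale, given a list of item-score
    columns (each column holds one score per respondent)."""
    k = len(item_scores)           # number of items in the scale
    n = len(item_scores[0])        # number of respondents

    def sample_variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of per-item variances, and variance of each respondent's total.
    item_var_sum = sum(sample_variance(col) for col in item_scores)
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    total_var = sample_variance(totals)

    return (k / (k - 1)) * (1 - item_var_sum / total_var)
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, though the threshold should be judged per scale and cohort rather than applied mechanically.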
Q: What does this mean for “student voice” in doctoral education more broadly?
A: It reinforces that student voice is strongest when it links experience to outcomes. A competency framework creates a clearer conversation with students about development; open comments explain how supervision, culture and resources shape that development. Together, they support evidence-based decisions about doctoral training, research culture, and supervisor practice.
[Paper Source]: Sook Mun Sim and Li-fang Zhang, “Do PhD students’ research styles predict their perceived competencies?”, Higher Education.
DOI: 10.1007/s10734-026-01629-x
Request a walkthrough
See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.
© Student Voice Systems Limited, All rights reserved.