Updated Apr 11, 2026
Misconduct cases tell you what happened. Student reflections can tell you why it keeps happening. A 2025 paper in Studies in Higher Education by Kasia Owczarek, Thomas Lancaster and Helena Wels, "Examining student reflections on academic misconduct: insights for academic integrity in higher education", shows why that distinction matters for UK universities reviewing misconduct cases, assessment design, or integrity training, especially if they are also rethinking academic integrity in online assessments. Read in aggregate, the reflections do more than support a disciplinary process; they reveal recurring weaknesses in policy, communication, and support.
Academic integrity is often framed as a compliance issue: define the rules clearly, investigate breaches, and apply sanctions consistently. That matters, but it is only part of the picture. Universities also need to understand why students enter misconduct processes in the first place, especially when the same explanations recur across modules, schools, or student groups. When that happens, institutions are looking at a design and support problem as well as a conduct problem.
Owczarek and colleagues address that gap by analysing 3,070 student reflections written as part of educational interventions following misconduct cases. Their question is practical and unusually useful for Student Experience and Academic Quality teams: what do these reflections tell us about the underlying causes of misconduct, and how can institutions use that student voice to improve academic integrity policy and practice? The answer matters because it gives institutions a route from case-by-case reaction to earlier, better-targeted prevention.
The first finding is that misconduct reflections are not just administrative paperwork. They provide a large-scale, structured source of student voice about where academic integrity systems are breaking down. Rather than treating each reflection as an isolated case, the paper shows the value of reading them collectively, as evidence about recurring patterns in student understanding and behaviour. That makes them useful for institutional improvement, not only individual decision-making.
"Some educational interventions require students to produce reflections, which can serve as valuable feedback to academic integrity policies and procedures."
The second finding is that the most common explanations are not limited to deliberate dishonesty. Students repeatedly pointed to a lack of understanding about academic integrity, poor time management, and low self-efficacy. That matters for UK institutions because it shifts at least part of the response from punishment alone to prevention: clearer guidance, better assessment preparation, and more realistic support around workload and confidence. If the same themes appear repeatedly, the practical takeaway is to fix the conditions around the assessment, not only the consequences after the breach.
The third finding is that reflections can help institutions distinguish between different kinds of risk. If students are confused about citation, overwhelmed by deadlines, or unsure whether they can complete a task independently, those are different problems and they need different interventions. A single academic integrity workshop will not fix all three. The paper therefore strengthens the case for using qualitative analysis to separate integrity literacy issues from wider assessment and support issues, so interventions can be matched to the actual problem.
Finally, the authors argue that patterns vary by academic level and discipline. Misconduct is not experienced, explained, or prevented in exactly the same way across the institution. For UK teams, that is a reminder not to rely on one institution-wide narrative. Analysing reflections, appeals, and related student comments by faculty or cohort can reveal where policy language, assessment design, or skills support need to be adapted first.
For UK higher education institutions, the paper points towards a more mature use of academic integrity data. First, treat misconduct reflections as feedback, not only as evidence in an individual case. If hundreds of students are repeating the same explanations, that is an institutional signal about clarity, assessment pressure, or support design. Used well, reflections become an early-warning source for improvement.
Second, connect integrity work to the wider student voice system. Module evaluations, open-text NSS comments, and local survey text often contain early warnings about confusing briefs, bunching of deadlines, inaccessible support, or weak feedback loops. Read alongside misconduct reflections, those comments can help universities spot where risk is building before it turns into formal cases. That gives teams a better chance to intervene before students cross a formal threshold.
Third, use text analysis carefully and with governance. Reflections and misconduct-related comments can contain sensitive material, so institutions need privacy controls, clear inclusion rules, and defensible reporting, which our student comment analysis governance checklist sets out in practical terms. But when handled properly, they can help academic integrity teams, educational developers, and student experience leaders move from reacting to cases to improving the conditions that produce them. That is the wider opportunity: turning reflective text into decision-ready evidence.
Q: How can universities use misconduct reflections without turning them into another punitive dataset?
A: Start with aggregation and improvement, not individual surveillance. Review reflections in themed, anonymised batches to identify recurring issues such as citation confusion, deadline pressure, or low confidence in completing assessments independently. The goal is to improve briefing, support, and assessment design, not to widen disciplinary monitoring.
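The batched review described above can be sketched in code. The snippet below is a minimal illustration, not the paper's method: it assumes reflections have already been anonymised, and it uses simple keyword matching as a stand-in for proper qualitative coding. The theme names and keyword lists are hypothetical examples.

```python
from collections import Counter

# Hypothetical theme lexicon: keyword matching stands in for the
# qualitative coding a real analysis would use. Theme names and
# terms are illustrative, not taken from the study.
THEMES = {
    "integrity literacy": ["citation", "referencing", "paraphras", "plagiar"],
    "time pressure": ["deadline", "time", "late", "workload"],
    "low self-efficacy": ["confiden", "struggle", "ability", "overwhelm"],
}

def tag_reflection(text: str) -> set[str]:
    """Return the themes whose keywords appear in one anonymised reflection."""
    lowered = text.lower()
    return {theme for theme, terms in THEMES.items()
            if any(term in lowered for term in terms)}

def theme_counts(reflections: list[str]) -> Counter:
    """Aggregate theme frequencies across a batch of reflections."""
    counts = Counter()
    for text in reflections:
        counts.update(tag_reflection(text))
    return counts

# Illustrative batch: each string is one anonymised reflection.
batch = [
    "I did not understand how citation and referencing worked.",
    "The deadline clashed with two other submissions and I ran out of time.",
    "I was not confident I could complete the work on my own.",
]
print(theme_counts(batch))
```

The point of the sketch is the output shape: counts per theme across a batch, never a per-student record, which keeps the exercise on the improvement side of the line rather than the surveillance side.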
Q: What are the methodological limits of learning from student reflections written after a misconduct case?
A: Reflections are shaped by context. Some students may minimise responsibility, write strategically, or emphasise factors they think staff want to hear. That means the reflections should not be treated as neutral truth. They are still useful because repeated patterns across thousands of cases can reveal where institutional messaging and support are failing, especially when triangulated with survey comments and case data.
Q: What does this change about how we think about student voice in academic integrity work?
A: It broadens student voice beyond module evaluations and representative systems. Students also speak through complaints, appeals, reflections, and other process-generated text. If universities analyse those sources well, they can understand not just whether a policy is being enforced, but whether students recognise expectations, trust the process, and feel equipped to meet academic standards.
[Paper Source]: Kasia Owczarek, Thomas Lancaster and Helena Wels (2025), "Examining student reflections on academic misconduct: insights for academic integrity in higher education", Studies in Higher Education. DOI: 10.1080/03075079.2025.2591419