Question and answer sessions in online tutorials

By Marisa Graser

Updated Mar 27, 2026

Live Q&A sessions can give students the feedback they need at the moment confusion appears. The challenge is turnout: in many STEM subjects, in-person tutorials struggle to keep students engaged. Moving Q&A online may ease that problem, especially when tutors borrow tactics for increasing student engagement in online modules. Studies have reported higher attendance in online tutorials (Campbell et al., 2019) and similar gains in final grades when compared with face-to-face tutorials (Rennar-Potacco et al., 2019). Jansson et al. (2021) therefore examined how online Q&A sessions support students' inquiry processes, and what educators can learn from the way students use them.

How to implement text-based online Q&A

In their case study, Jansson et al. (2021) examined online tutoring with integrated Q&A in a master's course on advanced machine learning. Teachers created a dedicated Q&A space within the university's online environment. It included a chat area, an interactive whiteboard where students could add content such as freehand drawings, and tools for sharing files and images. Because all content was saved automatically, students could return to earlier exchanges, which supported both synchronous and asynchronous engagement.

To get the benefit of that setup, students need a clear introduction to the Q&A room before teaching starts. Jansson et al. (2021) also suggested setting a defined time when the teacher is online to answer questions. At the same time, students should be encouraged to use the space outside those hours, so the discussion does not depend entirely on staff availability.

Assessing online Q&A sessions with the Relationship of Inquiry Framework

To assess whether this approach supports learning, Jansson et al. (2021) analysed chat transcripts using the Relationship of Inquiry (RoI) framework (Stenbom et al., 2016). The framework argues that four elements shape a strong educational experience: teaching presence, cognitive presence, social presence, and emotional presence. Together, they help explain not just whether students join the discussion, but how they learn through it.

Teaching presence can come from the teacher or tutor, but students can also contribute to it. It includes the design and organisation of the session, the way discourse is facilitated, and direct instruction when needed.

Cognitive presence reflects whether students are encouraged to reflect, question, and take part in discussion that gives meaning to the material.

Social presence helps students identify with their course and with each other, a pattern that also appears in emotional engagement in online forums. That matters because collaboration, higher-order thinking, and open participation are easier when students feel safe speaking up.

Emotional presence is closely related, but distinct. It captures the emotions and feelings students express as they engage.

When analysing these dimensions, Jansson et al. (2021) highlighted several patterns. First, students often showed teaching presence themselves, mainly by steering their own inquiry. For example, they asked for help, explained what they had already tried, and built on each other's answers. This reduced the teacher's dominance compared with a standard in-person Q&A. Even so, students were still more active when they knew the teacher was available, and synchronous participation rose when staff visibly joined the space.

Another useful finding was that some students benefited without posting much at all. Even when they did not join the discussion directly, they still logged in to follow other students' questions and answers.

Social presence also appeared clearly through the language of "we" in conversations. Emotional presence showed up through emojis and through students expressing worry about upcoming assessments, among other cues.

Benefits of online text-based Q&As

Overall, Jansson et al.'s (2021) study suggests that online Q&A benefits both students who ask questions and those who quietly read the answers without contributing much themselves (Bozkurt et al., 2020).

The study also showed that online Q&A sessions were largely student-driven (Smith IV et al., 2020). Students started and ended conversations, kept them moving, and often helped one another progress. That matters because it builds responsibility for both personal learning and peer learning, while also strengthening metacognitive skills.

For instructors, scheduled online Q&A sessions can also reduce time-consuming office visits (Kolluru et al., 2017). They can ease staff workload in another way too: students can answer repeated questions about deadlines, course structure, or material that has already been shared. If you want online tutorials to work harder, look closely at the questions students ask, the answers they return to, and the gaps peers are filling for each other.

FAQ

Q: How does student voice impact the effectiveness of online Q&A sessions in promoting deeper learning and engagement?

A: Student voice in assessment and feedback strongly shapes how useful online Q&A sessions become. When students ask questions, share examples, and surface concerns in their own words, teachers can respond to real gaps in understanding rather than guessed ones. That makes the session more relevant, more engaging, and more likely to support deeper learning.

Student participation also changes the balance of the session. Instead of relying on a teacher-led format, students begin to direct the conversation toward the points that matter most to them. That shift can increase motivation, strengthen ownership of learning, and give students more practice articulating problems, evaluating responses, and thinking critically.

Q: What are the challenges and limitations of text-based online Q&A sessions in capturing and analysing student voice?

A: Text-based online Q&A sessions do have limits. Tone, hesitation, and nuance can be harder to read in text than in face-to-face discussion, which means teachers may miss how strongly a student feels or how uncertain they are. Students who are less confident writers may also contribute less, so their experience can be underrepresented.

Analysis adds another layer of difficulty. Text analysis can identify recurring topics or sentiment, but it may miss context, humour, or subtle emotional cues. If the discussion is partly asynchronous, delayed replies can also slow the conversation and reduce the immediacy that makes Q&A valuable in the first place.

Q: How can text analysis techniques be improved to better understand and support student inquiry processes in online Q&A sessions?

A: Better text analysis starts with models that can do more than count keywords. Text analysis software for education teams can surface themes, spot common questions, and detect sentiment at scale, which helps educators see where students are confused, confident, or disengaged. But the strongest systems also need educational context, so that technical terms, subject-specific language, and different types of questions are interpreted accurately.
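As a minimal sketch of the keyword-counting baseline that stronger systems build on, recurring terms in student questions can be surfaced with the standard library alone. The sample questions and stopword list here are invented for illustration; real educational text analysis layers subject vocabulary, question typing, and sentiment on top of counts like these.

```python
# Minimal baseline: surface recurring content words in student questions.
# Real systems add educational context (subject-specific vocabulary,
# question types, sentiment); this sketch only counts tokens.
import re
from collections import Counter

STOPWORDS = {"the", "a", "is", "how", "do", "i", "in", "to", "what", "my"}

def top_terms(questions: list[str], n: int = 3) -> list[tuple[str, int]]:
    """Count non-stopword tokens across all questions, most frequent first."""
    tokens = []
    for q in questions:
        tokens += [t for t in re.findall(r"[a-z']+", q.lower())
                   if t not in STOPWORDS]
    return Counter(tokens).most_common(n)

questions = [
    "How do I submit the assignment?",
    "What is the assignment deadline?",
    "How is the gradient computed in backpropagation?",
]
print(top_terms(questions))
# "assignment" surfaces as the most frequent term (count 2)
```

Even a count this crude shows why frequency alone misleads: "assignment" could mean a deadline query or a conceptual struggle, which is exactly the context a stronger model has to supply.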

It also helps to close the loop with students and staff. If educators can test whether the analysis matches what students meant, they can refine categories and interpretations over time. The result is a clearer picture of inquiry processes, and better evidence for improving teaching, feedback, and support.

References:

A. Campbell, A.M. Gallen, M.H. Jones, A. Walshe. (2019) The perceptions of STEM tutors on the role of tutorials in distance learning. Open Learning: The Journal of Open, Distance and e-Learning, 34 (1), pp. 89-102.
DOI: 10.1080/02680513.2018.1544488

D. Rennar-Potacco, A. Orellana, P. Chen, A. Salazar. (2019) Rethinking academic support: Improving the academic outcomes of students in high-risk STEM courses with synchronous videoconferencing. Journal of College Student Retention: Research, Theory & Practice, 20 (4), pp. 455-474.
DOI: 10.1177/1521025116678854

M. Jansson, S. Hrastinski, S. Stenbom, F. Enoksson. (2021) Online question and answer sessions: How students support their own and other students' processes of inquiry in a text-based learning environment. The Internet and Higher Education, 51, p. 100817.
DOI: 10.1016/j.iheduc.2021.100817

S. Stenbom, M. Cleveland-Innes, S. Hrastinski. (2016) Emotional presence in a relationship of inquiry: The case of one-to-one online math coaching. Online Learning, 20 (1), p. 41.
DOI: 10.24059/olj.v20i1.563

A. Bozkurt, A. Koutropoulos, L. Singh, S. Honeychurch. (2020) On lurking: Multiple perspectives on lurking within an educational community. The Internet and Higher Education, 44, p. 100709.
DOI: 10.1016/j.iheduc.2019.100709

D. Smith IV, Q. Hao, V. Dennen, M. Tsikerdekis, B. Barnes, et al. (2020) Towards understanding online question & answer interactions and their effects on student performance in large-scale STEM classes. International Journal of Educational Technology in Higher Education, 17, p. 20.
DOI: 10.1186/s41239-020-00200-7

S. Kolluru, J.T. Varughese. (2017) Structured academic discussions through an online education-specific platform to improve Pharm.D. students' learning outcomes. Currents in Pharmacy Teaching and Learning, 9 (2), pp. 230-236.
DOI: 10.1016/j.cptl.2016.11.022

