Two types of student voice

Wellbeing data vs instructional data: both matter, but only one feeds the loop.

Student voice collects two fundamentally different things depending on what you ask. “Did you enjoy today's lesson?” produces wellbeing and satisfaction data. “What part of today's topic are you most uncertain about?” produces instructional data. The first tells you how students feel. The second tells you what to teach differently tomorrow.

Both are legitimate and valuable. A school that ignores how students feel about learning produces technically informed but humanly disconnected instruction. But in the context of agile teaching, student voice's most underused function is as a precision data source that complements exit tickets — giving you qualitative information about what students think is happening in their learning, which exit tickets alone cannot capture.

💡 When to use student voice instead of exit tickets
Exit tickets tell you what students produced. Student voice tells you what students think about what they produced. Use both together: the exit ticket reveals the gap; student voice reveals why the student thinks the gap exists. The combination is more diagnostic than either alone — especially for identifying whether students are aware of their own misconceptions.
Five questions

Five student voice prompts that generate agile data, not satisfaction scores.

1. Uncertainty mapping: What exactly don't you know?

"Identify the one step in today's process where you stopped feeling confident. Describe what you understood up to that point, and write the specific question you'd need answered to feel sure." This format forces specificity — students must name the exact step, not give a general impression. "12 students named Step 3" tells you exactly where to open tomorrow's lesson.

Why it works better than "any questions?"
"Any questions?" invites students who already understand to show off. This prompt invites every student to identify their actual uncertainty without the social risk of public confusion.
2. Method preference: Which approach would have worked better for you?

"Would you have understood this better with more worked examples, a diagram, a different explanation, or more time to practice?" Giving students a structured set of options produces more reliable data than open-ended "how should I teach this better?" — students can identify which option they needed even when they couldn't articulate a better approach from scratch.

What this data informs
This prompt tells you how to introduce the next new concept of this type — not just how to re-teach this one. It builds a profile of your class's preferred entry points over time.
3. Readiness check: Are you ready to move on?

"Rate your readiness to work on the next topic (not your enjoyment of this one): (a) ready — I understand the core concept; (b) almost — one more practice would help; (c) not yet — I still have a significant gap." Use this before introducing a dependent concept. Students are often accurate predictors of their own readiness when the question is specific and the stakes are low.

Validation approach
Compare readiness ratings to exit ticket performance for the same class. Over time you'll develop a reliable sense of which students' self-reports are well calibrated and which over- or underestimate their readiness.
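The comparison above can be sketched as a simple calibration check. This is an illustrative sketch only, not part of any SprintUp tool: the readiness labels, the 70% pass threshold, and the data shape are assumptions made for the example.

```python
# Illustrative calibration check: does a student's self-reported readiness
# predict their exit ticket result? Labels and threshold are assumptions.

READY_LABELS = {"ready", "almost"}  # self-reports treated as "expects to pass"
PASS_THRESHOLD = 0.7                # exit ticket score counted as a pass

def calibration(records):
    """records: list of (self_report, exit_score) pairs, one per student.
    Returns the fraction of students whose self-report matched their result."""
    matches = 0
    for self_report, exit_score in records:
        predicted_pass = self_report in READY_LABELS
        actual_pass = exit_score >= PASS_THRESHOLD
        if predicted_pass == actual_pass:
            matches += 1
    return matches / len(records) if records else 0.0

# Example: four students' readiness ratings vs next-day exit ticket scores
week = [("ready", 0.9), ("almost", 0.5), ("not yet", 0.4), ("ready", 0.8)]
print(f"Calibration: {calibration(week):.0%}")  # prints "Calibration: 75%"
```

Run per topic and the pattern emerges quickly: students who repeatedly land in the mismatched group are the ones whose self-reports need a second data point before you trust them.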
4. Connection prompt: Where else does this idea appear?

"Where have you encountered this concept or something like it — inside or outside school?" This surfaces prior knowledge the lesson didn't activate, real-world connections that strengthen retention, and analogies that might be more accessible than the ones you used. Student responses to this prompt regularly produce better analogies than the teacher's prepared examples.

When to use it
Mid-lesson or at a reflection point — while lesson content is still fresh. The responses feed back into your next lesson's opening example, creating a personalised entry point grounded in something students already know.
5. Teaching-back test: Can you explain it to someone who wasn't here?

"In two sentences, explain today's central concept to a classmate who missed the lesson. Use the specific vocabulary we covered. Don't just define the term — explain how it works." Students who understood can do this. Students who didn't will avoid the causal language and describe peripheral details.

The avoidance pattern
When a student's two sentences describe where the concept happens or what it looks like, but never explain why or how it works, the avoidance of mechanism language is itself the diagnostic signal. That is the specific gap to address tomorrow.
When to use it

The timing that makes voice data actionable rather than archival.

Student voice is most actionable when it is collected at the moment the learning is fresh — end of lesson, end of week, or at the natural boundary between topics. Voice collected in an end-of-term survey is too delayed to be actionable: by the time you read it, the class has moved on. Voice collected at the end of a lesson can inform tomorrow's teaching directly.

| Prompt | Best timing | Why this timing |
| --- | --- | --- |
| Uncertainty mapping | End of lesson | Acts as an exit ticket — directly informs tomorrow's opening |
| Method preference | After introducing a new concept type | Informs how you'll introduce the next new concept of that type |
| Readiness check | Before moving to a dependent concept | Prevents building on gaps by revealing them at the right moment |
| Connection prompt | Mid-lesson or at a reflection point | Generates real-world examples while lesson content is fresh |
| Teaching-back test | End of lesson | Produces the most diagnostic data when material is at peak recall |
AI-generated voice prompts

Voice prompts alongside the exit quiz. Automatically.

🤖 Using AI to generate voice prompts
SprintUp Education's exit quiz tool includes a student voice prompt generator alongside the standard 3-question check. Input your learning objective — get a set of calibrated voice prompts alongside the formative questions. The combination produces both what students understood (exit ticket) and why they think they understood it (voice). Free on every school account.