The Question Behind the Question: Why Philosophy Belongs at the Heart of Schools in the Age of AI
- Adam Sturdee


When ChatGPT emerged in late 2022, the public conversation in education quickly fixated on the wrong question. Will students cheat? Will essays survive? Will teachers be replaced? These are surface anxieties. The deeper question, the one that should now be reshaping curriculum and pedagogy, is this: in an age when machines can generate plausible answers to almost anything, what is the value of knowing how to ask the right questions?
This is not a new problem. It is a philosophical one. And it is why I believe philosophy, taught explicitly and practised orally, is one of the most important subjects schools can offer in the next decade.
The bias we bring to the prompt
Every interaction with a large language model is shaped by the user. The prompt is a mirror. If a teacher asks an AI tool, “Was my lesson good?” they will get a confirmation-shaped answer. If they ask, “What evidence in this transcript suggests my questioning was uneven across the class?” they will get something genuinely useful. The quality of the output is bounded by the quality of the question, and the quality of the question is bounded by the user’s awareness of their own assumptions.
This is where most of us get stuck. We do not naturally see our own biases. We default to the questions that flatter us, that confirm what we already suspect, that frame the world in the categories we already use. Philosophy, properly taught, is the discipline of stepping back from those defaults. It asks: what am I assuming here? What would someone with a different stake in this say? What is the question I am really trying to answer?
Without those habits, AI becomes an echo chamber with extra steps.
Why schools must teach this explicitly
Children are not born with the ability to interrogate their own thinking. It is taught. The most rigorous tradition for teaching it in schools is Philosophy for Children, Colleges and Communities, or P4C, developed by Matthew Lipman and now practised across the world. Its central claim is that genuine thinking is collaborative, critical, creative and caring. The four Cs are not decorative. They are the conditions for thought that goes beyond reaction.
Collaborative thinking pushes a young person to listen properly, to build on what someone else has said, to disagree without dismissing. Critical thinking presses for reasons, evidence, and the willingness to change one’s mind. Creative thinking opens the space of possible answers before narrowing down. Caring thinking insists that the people we are reasoning with, and about, matter.
Run that list back against AI. Every one of those four habits is precisely what is missing when a student copies a prompt response without scrutiny. And every one of those four habits is precisely what makes the difference between a teacher using AI as a shortcut and a teacher using AI as a thinking partner.
Teachers need this too
It is tempting to frame all this as a curriculum question. It is also a professional development question. Teachers are being handed tools of extraordinary power with very little guidance on how to use them well. The instinct is to ask the AI, “Plan me a lesson on photosynthesis.” The discipline is to ask, “What do my Year 9s already misunderstand about photosynthesis, and what sequence of questions would surface those misconceptions?” The first prompt produces serviceable output. The second produces something worth using.
This is the same shift that mature reflective practice requires. When I look at a transcript of one of my own lessons through Starlight, the platform we are building at STAR21, the value is not really in the report telling me what happened. It is in what I learn to ask of it. “Where did the conversation narrow?” “Whose voice carried the explanation, mine or the students’?” “When I thought I was checking understanding, what was I actually checking?” The transcript is neutral. The questions are not. Teachers who bring philosophical habits to that material draw far more insight from the same evidence.
A few practical disciplines help here, whether the AI in question is a coaching tool, a planning assistant, or a chatbot in a student’s pocket:
• Ask the question you would least like the answer to. If you are about to ask “what went well?”, first ask “what did I miss?”
• Surface the assumption before you prompt. Write down what you already think the answer is. Then ask the AI for evidence that would disconfirm it.
• Reframe twice. Run the same question from two different stances: the student who is quietest in your class, then the student who finds the work easy.
• Refuse the first answer. Treat the initial output as a draft to interrogate, not a conclusion to accept.
These are not technical skills. They are philosophical ones. They can be taught.
Oracy as the assessment frontier
There is a final reason philosophy matters now. As written work becomes harder to authenticate, schools will need to lean far more heavily on what cannot be outsourced to a machine: the capacity to think out loud, to defend a position under pressure, to change one’s mind in real time, and to reason with others. Oracy, in other words. The discussion, the viva, the live conversation between teacher and student about what the student actually knows and how they got there.
This is not a workaround for AI. It is a return to something education arguably should never have moved away from. Socrates did not set written exams. He talked. And the talk revealed not just what people thought, but the structure of why they thought it. Oracy assesses what generative AI cannot fake: the moment of hesitation, the working memory under load, the genuine reconsideration. If we want young people who can use AI well, we need young people who can talk well. Who can be questioned and respond with reasons. Who can examine a chatbot’s output and notice what is missing.
That capacity is built in classrooms where philosophical dialogue is part of the weekly rhythm, not a once-a-term enrichment activity.
A practical direction
For schools wanting to begin, the route is straightforward. Build philosophical enquiry into the curriculum from primary onwards. Train teachers in P4C facilitation. Invest in oracy as a core skill, assessed and developed deliberately. And encourage staff to bring those same habits of questioning to their own use of AI, including the tools they use to reflect on their own teaching.
This is why I am proud to be working with the team at Thoughtful, whose new platform is designed to support exactly this kind of work in schools and the wider communities they serve. Their commitment to philosophical dialogue as a foundation for learning is the right answer to a question many schools have not yet thought to ask.
The AI revolution in education will not be won by the schools with the best tools. It will be won by the schools that teach children, and their teachers, to ask better questions.
Philosophy is how that is done.
Adam Sturdee is a senior leader and co-founder of Starlight, the UK’s teacher-first, AI-powered, transcript-based coaching platform for educators. His work sits at the intersection of dialogic practice, instructional leadership and responsible AI strategy for schools and trusts.
He will be presenting his research on AI-supported coaching at the BERA TEAN Conference 2026: https://www.bera.ac.uk/conference/bera-tean-conference-2026
If you would like to explore these ideas further:
Learn more about Starlight: https://www.starlightmentor.com
Read more on AI and coaching: https://www.coaching.software
Connect on LinkedIn: https://www.linkedin.com/in/adam-sturdee-b0695b35a/
Enquire about speaking or consultancy: https://www.adamsturdee.com/consulting


