
Who Owns a Teacher’s Second Brain? The Question Schools Haven’t Asked Yet


There is a quiet shift happening in staffrooms across the country, and most senior leadership teams have not yet noticed it.


Teachers are building second brains. Not metaphorically. Literally.


They are feeding lesson plans, schemes of work, pastoral notes, parental correspondence, behaviour reflections, feedback drafts, assessment rubrics, safeguarding ponderings, and fragments of their own teaching philosophy into AI tools. Day after day. Prompt after prompt. Many of them are paying for these tools themselves. A tenner a month for ChatGPT Plus here, twenty for Claude Pro there, various subscriptions for specialist tools. Add it all up and the average engaged teacher is spending £300 to £600 a year out of their own pocket on the most significant shift in cognitive tooling since the arrival of the internet.


And nobody, at school level, is really talking about what this means.


The French philosopher Bernard Stiegler borrowed a term from cognitive science to describe tools that extend human thought: the exocortex. An external memory and reasoning layer that becomes, over time, genuinely inseparable from the person using it. A notebook is a primitive exocortex. A well-curated Evernote is a more sophisticated one. A large language model with access to two years of your teaching notes, your personal voice, your feedback style, your favourite explanations, your hard-won pedagogical instincts, is something else entirely.


It is, in a meaningful sense, part of the teacher.


So here is the uncomfortable question I want to put on the table.


Who should pay for it?


The teacher’s case


The instinctive answer from most teachers I speak to is straightforward. This is a professional tool. Professional tools are paid for by the employer. Schools buy the laptops, the textbooks, the photocopier toner, the CPD courses, the subject association memberships. Extending that logic to AI tools is not a stretch. It is simply keeping up.


There is a workload argument too. If AI genuinely saves teachers five, six, ten hours a week, as the early evidence suggests it can, then the return on a £240-a-year licence is obvious even at the most minimal level of cost-benefit analysis. Asking a teacher to personally fund the tool that makes their statutory workload manageable feels uncomfortably close to asking them to buy their own chair.


And there is a fairness argument. Teachers who cannot afford to pay, or who do not yet know how to use these tools, fall further behind the early adopters in their department. Over time, that gap widens. If schools leave AI adoption to the market, they are quietly tolerating a two-tier staffroom.


All of this is correct. But it is not the whole picture.


The school’s case


Sit on the other side of the desk for a moment.

A headteacher walks into the staffroom and sees forty teachers, each with a different AI tool, each with a different subscription tier, each with a different set of data-sharing settings, each uploading different slices of school data into different American and Chinese servers. Some have paid accounts. Some are on free tiers that train on user data. Some are using tools their department head recommended. Some are using tools the IT team has never heard of.

This is shadow IT at a scale that would give any data protection officer a quiet panic attack.


Under the UK GDPR, the school is the data controller. The school is liable for how pupil data is processed. The school has a legal duty of care to ensure that the personal information of minors, their safeguarding notes, their SEND profiles, their referral forms, is not being fed into consumer-grade AI tools under personal accounts with no data processing agreement in place.


A teacher paying for their own AI tool is, from a governance perspective, a ticking clock. Not because the teacher is acting in bad faith. Almost nobody is. But because consumer accounts were never designed for school data, the school has no audit trail, no ability to revoke access when someone leaves, and no visibility over what is being processed.


There is also a strategic argument schools should be making more confidently. Team accounts and enterprise licences offer something personal accounts cannot: connected intelligence. When a whole department uses the same tool, under the same governance, fed by the same curriculum materials and the same assessment data, the collective output becomes exponentially more useful. The quality of what the AI can do for the next teacher improves because of what this teacher just did. The school becomes smarter. The department becomes smarter. CPD becomes smarter.


You cannot achieve that with forty individual subscriptions.


The exocortex problem


Now for the question that nobody has really sat with yet.


What happens when the teacher leaves?


If the school paid, the account belongs to the school. The teacher walks out of the building having lost access to two years of accumulated thinking, prompting, drafts, reflections, and personalised AI memory. Years of their own professional development, walled off the moment their contract ends. That is not just inconvenient. For some teachers, it will feel like having their notebook confiscated on the way out.


If the teacher paid, the account belongs to the teacher. They take it with them to their new school. They keep building. But the school that funded much of the raw material, the lesson plans generated on school time, the assessment insights drawn from school data, has no rights over what leaves with them. And the school they move to inherits a teacher whose AI is trained on a previous employer’s context.


Neither answer is clean.


This is genuinely new territory. When a teacher leaves, we do not claim their handwritten notes, their muscle memory, their instincts. But an AI exocortex is a stranger hybrid. Partly the teacher’s. Partly the school’s. Partly the tool vendor’s. Partly, uncomfortably, the training data of whatever model comes next.


The law has not caught up. The unions have not caught up. Most school policies have not even acknowledged that the question exists.


Three things I would suggest leaders start doing now


I am not going to pretend I have the answer. I want this piece to open a debate, not close one. But I do have a view on what the first honest steps look like.


First, know what your staff are actually using. Not to police them. To understand the real shape of AI adoption in your school. Most leaders will be surprised. The gap between the official IT estate and what teachers are really doing is already wide and widening by the month.


Second, pay for something. Even a modest investment in a properly governed tool signals two things: that AI is part of the school’s professional infrastructure, and that the school takes its data protection responsibilities seriously. A team account with clear governance beats forty shadow accounts every time. Start somewhere.


Third, have the exocortex conversation explicitly. When a teacher leaves, what happens to their AI workspace? Can they export their prompts and history? Does the school retain any rights over AI-generated materials created on school time? These are the questions we will be answering in five years whether we prepare now or not.


The bit that is a little bit provocative


I want to end with something slightly uncomfortable, because I think the debate deserves it.


Teachers have, quite reasonably, expected schools to pay for their professional tools for a long time. But AI is not quite a professional tool in the way a textbook or a laptop is. It is something stranger and more intimate. It learns your voice. It remembers what you were worried about in September. It drafts in your rhythm. It knows which of your students you find hardest to reach and why.


Given that, part of me wonders whether there is a version of the future where teachers actually want to pay for their own AI. Not because schools should get out of it, but because the tool that holds your teaching soul probably should not belong to your employer. A school can buy you a laptop. It is harder, and maybe inadvisable, for a school to buy you a second brain.


I am not arguing for that position. I am putting it on the table.


What I am arguing for, unambiguously, is that this conversation needs to happen in every school leadership team, every staffroom, every union branch, and every DfE working group. Not in five years. Now. Because teachers are already building their exocortexes, whoever is paying for them. And the longer we avoid asking who owns them, the messier the answer will be when we finally have to.

The ink is still wet on this one. Let us at least start writing.


Adam Sturdee is a senior leader and co-founder of Starlight, the UK’s teacher-first AI-powered transcript-based coaching platform for educators. His work sits at the intersection of dialogic practice, instructional leadership and responsible AI strategy for schools and trusts.


He will be presenting his research on AI-supported coaching at the BERA TEAN Conference 2026: https://www.bera.ac.uk/conference/bera-tean-conference-2026


If you would like to explore these ideas further:

Learn more about Starlight: https://www.starlightmentor.com

Read more on AI and coaching: https://www.coaching.software

Enquire about speaking or consultancy: https://www.adamsturdee.com/consulting

