Practical AI strategies for schools
Frameworks, case studies, and opinion on making AI work in education. Written from the classroom up.
AI in Practice · Using AI as a Thinking Partner: How Maths Teachers Can Use LLMs Effectively
Generic AI lesson planning fails for maths because LLMs hallucinate calculations. But AI as a thinking partner for anticipating misconceptions, clarifying goals, and sparking delivery ideas actually works.
AI Strategy · Why AI Rules Alone Don't Change Student Behaviour in the Classroom
Your school has an AI policy. Your students still copy-paste from ChatGPT. The problem isn't rules — it's culture. Here's how Crew Resource Management principles can transform AI use in classrooms.
AI Literacy · AI Hallucinations in Schools: How to Teach Students to Verify AI Output
80% of students miss deliberate AI errors. Here's a research-backed framework with three checkpoints to develop metacognitive awareness and teach students to catch AI hallucinations.
Emerging Tech · Open-Source AI for Schools: Building Ethical Data Infrastructure
Commercial AI platforms dominate schools. But open-source alternatives offer transparency, data sovereignty, and ethical governance that closed systems cannot. Here's what's emerging.
Newsletter · Schools are Legacy Systems
Every sector eventually faces the same question: do we modernise, or do we keep patching what we've got and hope it holds? Education hasn't had its modernisation moment yet. But it's coming.
Newsletter · Directors of Intelligence
AI has not broken the creativity equation. It has clarified it. Knowledge is cheap. Execution is accelerating. Direction is scarce.
AI Safety · AI Chatbot Safety in Schools: 7 Prompt Design Mistakes Teachers Make
Well-intentioned AI chatbot prompts can inadvertently threaten and isolate students. Here are 7 common prompt design mistakes and how to fix them using the DfE Product Safety Standards.
Student Voice · Why Students Trust AI Less But Rely On It More
Students rate their own understanding as 'yellow' while rating AI as 'green'. The gap between trust and reliance reveals something important about classroom culture, cognitive load, and social safety.
AI in Practice · Bespoke vs Personalised Learning: Why Teacher-Led AI Support Beats Automated Dashboards
EdTech loves 'personalised learning'. But there's a difference between bespoke support designed by a teacher and automated recommendations from a platform. Here's why intentionality matters.
AI in Practice · AI-Powered Assessment: Using RAG Ratings to Surface Student Knowledge Gaps
Most students don't know what they don't know. Here's how to use AI to generate self-assessment tools that reveal hidden knowledge gaps and drive targeted revision at scale.
Student Voice · What Students Really Think About AI: 7 Arguments Against AI Friendship
When asked 'Can AI be your friend?', every student in this Year 7-9 debate argued no. Their reasons reveal sophisticated thinking about vulnerability, privacy, and what makes relationships real.
AI in Practice · Using AI to Build Better Lessons: Simulations, Scaffolding, and Preventing Shortcuts
AI can generate answers instantly. But the best AI-powered lessons make shortcuts impossible. Here's how to build simulations, scaffolds, and challenges that force genuine thinking.
AI Safety · Autonomous AI Agents Explained: What Teachers Need to Know
AI agents can now browse the web, write code, and control computers independently. Here's what autonomous AI means, why it matters for schools, and how to update your acceptable use policy.
AI Literacy · AI Literacy in the Classroom: Why Timing Your Transparency Matters
When teaching AI literacy, revealing how deepfakes work too early kills engagement. Too late breeds distrust. Here's how timed transparency turns an AI lesson into genuine critical thinking.
AI Strategy · Directors of Intelligence: Why AI Needs Human Direction, Not Competition
Students don't need to compete with AI. They need to direct it. The Directors of Intelligence framework reframes AI literacy around creativity, attitude, and human judgement.
AI in Practice · In Defence of Vibe Coding in Education
Vibe coding gets a bad reputation. But for teachers building small, purposeful AI tools for their classrooms, it's a superpower. Here's where the line sits between useful prototyping and dangerous shortcuts.
AI Safety · Red Teaming AI in Schools: A Practical Guide to Testing Tools Before Deployment
Most schools adopt AI tools without testing them. Red teaming borrows from cybersecurity to give teachers a structured way to evaluate AI for bias, safety, and pedagogical value before it reaches students.
AI Ethics · Why Students Use AI to Cheat (And It's Not What You Think)
AI hasn't created a cheating epidemic. Research shows rates haven't changed much. The real drivers are systemic pressure, burnout, and assessment design that rewards outputs over thinking.
AI Safety · AI Risk Management for Schools: How to Build a Practical Risk Matrix
Stop evaluating AI tools by gut feeling. Here's how to build a structured risk matrix covering bias, privacy, safeguarding, and pedagogical impact for every AI tool in your school.
AI in Practice · AI in the Classroom: 5 Practices to Keep Humans in the Loop
Agentic AI can draft, plan, and decide on behalf of students. But cognitive offloading and skill erosion are real risks. Here are five evidence-based practices to keep human agency at the centre.
AI Ethics · Human Agency vs AI Agents: Why Student Choice Still Matters
Bandura's framework shows us what real agency looks like: intentionality, forethought, self-reactiveness, and self-reflection. AI agents have none of these. Here's why that matters for education.
AI Literacy · Cognitive Offloading: Why AI Makes Students Lazy (And How to Fix It)
Human brains naturally seek shortcuts. AI supercharges this tendency. Here's the research on cognitive offloading and practical strategies to use AI as a trampoline, not a hammock.
AI Literacy · The Peter Principle Meets AI: When Smart Tools Mask Real Competence
AI tools produce polished outputs without corresponding understanding. Here's how the Peter Principle accelerates in the age of AI and what schools can do about it.
AI Ethics · Authenticity in the Age of AI: How to Stay Real When Everything's Simulated
Deepfakes, AI-generated content, and algorithmic curation are blurring the line between real and artificial. Here's how educators and students can maintain authenticity.
AI Ethics · Can AI Replace Human Empathy? The Limits of AI Emotional Support
AI can say 'I care' convincingly. But Carl Rogers showed us that real growth requires unconditional positive regard from another human. Here's why AI sympathy isn't enough.
Emerging Tech · Quantum Computing in Schools: Why Your Curriculum Needs to Catch Up
Quantum computing will transform AI, medicine, and cybersecurity within a decade. Here's what schools need to teach now to prepare students for a quantum-powered world.
Student Voice · When Year 13 Students Start Using AI Independently: A Teacher's Observations
Something shifted with this year's Year 13 cohort. They're using AI for research differently from how we taught them. Here are my early observations on independent AI use in the classroom.