AI can generate answers to almost any question a student is likely to ask. That is precisely why the most valuable thing you can do with AI right now is build lessons where getting the answer is not the point.
I've been spending my time building simulations, scaffolds, and deliberately constrained learning environments. Not AI that helps students answer questions, but AI that helps me build spaces where thinking is unavoidable. The difference matters more than it first appears.
The distinction that changes everything
There is now a very clear line in how AI can be used in schools. You cannot have AI features that face students under 18 directly. You can use AI to build things for students.
That distinction reshapes the entire conversation about AI in the classroom. Instead of asking "how can AI help my students?", the better question is: "how can AI help me design learning experiences that force genuine engagement?"
Once I started thinking about it that way, the design space opened up.
Simulations that make abstract concepts visible
A lot of my recent work has gone into building interactive simulations for topics that students find abstract or difficult to visualise.
Declarative programming is a good example. It is an abstract topic in Computer Science that students struggle with because the concepts do not map neatly to anything they can see or touch. I built a simulation using Google AI Studio that lets students practise past paper questions while actually seeing what is happening as the program runs. Instead of trying to imagine the behaviour from static text, they can watch it unfold.
Linked lists got a similar treatment. They are one of those data structures where students often appear to follow right up until they do not. I built two simulations: one that walks through the algorithm line by line, explaining what is happening at each step, and another that is more exam-focused, where students work through past paper-style questions. They can see how the data changes, get feedback, and choose when to reveal model answers.
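To make the idea concrete, here is a minimal Python sketch of the kind of step-by-step trace such a simulation surfaces. This is an illustration of the general pattern, not the simulation itself: the point is that the pointer update, which students normally have to imagine, becomes a visible state change at each step.

```python
class Node:
    """A single linked-list node: a value and a pointer to the next node."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def trace_traversal(head):
    """Walk the list, recording what the 'current' pointer sees at each step.

    Returns the values in visit order: the trace a simulation
    would display line by line as the pointer moves.
    """
    steps = []
    current = head
    while current is not None:
        steps.append(current.value)  # the state a student watches change
        current = current.next       # the pointer update made visible
    return steps

# Build the list 3 -> 1 -> 4 and trace it.
head = Node(3, Node(1, Node(4)))
print(trace_traversal(head))  # → [3, 1, 4]
```

A simulation wraps exactly this loop in a pause-and-explain interface; the logic students are reasoning about is no more than those two lines inside the while.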
No AI giving them answers. Just structure and visibility.
Boolean algebra was probably the one I needed most myself. It has always felt like a topic that should click more easily than it does. So I started with maths and Venn diagrams, letting students interact with the ideas and see the laws in action. Then I added learning around the laws themselves, and finally past paper-style questions with solutions students can reveal once they have had a go.
The pattern is the same every time: build the environment, make the thinking visible, and keep the AI behind the scenes.
You do not need fancy tools to do this
If Google AI Studio feels like too much, Canva Code does this really well too. Much lower barrier to entry, same basic idea. You do not need anything fancy to build meaningful interaction. What matters is the design principle: the student does the thinking, the tool makes the thinking visible.
Designing chatbots that refuse to take shortcuts
With Year 7, the focus has been slightly different. Students have been working in ThingLink, building 3D scenes linked to sustainability and adding their own tags. They also have a chatbot to support project planning.
Here is the critical design decision: I had already learned how quickly these tools become a shortcut if you are not careful. So I deliberately designed the chatbot to block that off.
Students picked things they genuinely cared about. Animals came up a lot. When they tried to get the chatbot to summarise text for them, it refused. Instead, it asked questions. What would you say about this? What actually matters here? It pushed them back to the source material and made them do the extracting themselves.
Some students said the chatbot felt rude. I am not convinced it is. I think it is just neutral. I stripped out all the personality to meet the DfE's generative AI standards. No praise, no warmth, no "great question." I think students are used to AI being overly friendly. Take that away and it feels abrupt, even when it is doing exactly what it should.
What happens when shortcuts are not available
The most revealing test of this approach has come from reviewing entries for the Create. Code. Change. competition. When students are given AI tools that are safe, constrained, and clearly framed, the thinking does not disappear. It gets sharper.
There is a real difference between tools that help students avoid effort and tools that give students space to do better work. Going through those entries made that painfully clear.
Three principles for AI-enhanced lesson design
If you are building AI-powered learning experiences, these are the principles I keep coming back to.
1. AI builds the environment, not the answers. Use AI to create simulations, interactive exercises, and structured practice spaces. The student interacts with the environment you built, not with the AI itself. This sidesteps both the regulatory concerns and the pedagogical ones.
2. Make the thinking visible. The reason simulations work better than static text is not because they are more engaging (though they are). It is because students can see the process, not just the outcome. When a linked list simulation shows the pointer updating step by step, the student is watching their own understanding form.
3. Design for the student who will try to skip. Every chatbot, simulation, or scaffold needs to be tested against the student who sends a one-word answer just to move on. If your tool gives them a shortcut, they will take it. Build the refusal into the design, not as punishment, but as redirection. Ask questions. Push them back to the source. Make the shortcut harder than doing the actual thinking.
AI can either flatten the work or sharpen it. The difference is entirely in the design.
Where this leaves us
The most interesting use of AI in education right now is not the AI itself. It is what we build with it. Spaces where shortcuts do not work. Environments where abstract concepts become visible. Tools that push students back toward their own thinking rather than away from it.
This is not about being anti-AI. It is about being deliberate. The technology is powerful enough to make thinking optional. Our job is to make sure it does not.
Matthew Wemyss is an AIGP-certified AI in Education consultant and practising school leader. Book a discovery call to discuss AI-enhanced lesson design for your school.

