LEGO/The Alan Turing Institute: Understanding GenAI Impact on Children
A new study reveals how children aged 8–12 are already using tools like ChatGPT, highlighting benefits, risks, and the urgent need for child-centred AI design and literacy.
Kids Are Already Prompting ChatGPT—Are We Ready?
LEGO and The Alan Turing Institute's report, "*[Understanding the Impacts of Generative AI Use on Children](https://www.turing.ac.uk/sites/default/files/2025-05/combined_briefing_-_understanding_the_impacts_of_generative_ai_use_on_children.pdf)*," combines survey data from thousands of children, parents, and teachers with hands-on workshops in schools. The headline: nearly one in four children aged 8–12 has tried generative AI, most commonly ChatGPT. Yet today's AI ecosystems were never built with young users' rights, needs, or wellbeing at the centre.
Who's Using AI—and Who's Being Left Behind?
- Usage disparities: 52% of private-school pupils have dabbled in generative AI versus just 18% in state schools—a stark digital divide.
- Demographic nuances: Age, gender, and additional learning needs all influence how—and why—kids engage with AI.
- Learning benefits: Teachers see real promise, especially for students with special educational needs who gain adaptive scaffolding and confidence boosts.
Shared Concerns Across the Board
Parents, carers, and educators converge on several worries:
- Exposure to harmful or false content (82% of parents).
- Erosion of critical-thinking skills (76% of parents, 72% of teachers).
- Environmental costs of compute-heavy models.
- Bias in outputs and the temptation for students to pass off AI-generated work as their own.
Recommendations for a Child-Centred AI Future
1. Design with kids, not just for them: engage children in feature ideation, testing, and feedback cycles to ensure tools respect their developmental needs.
2. Provide age-appropriate curricula and resources so children, parents, and teachers understand capabilities, limits, and ethics.
3. Ensure state schools and underserved communities receive funding and infrastructure to prevent an AI-driven learning divide.
4. Hard-wire content filters, transparent data practices, and robust oversight into kid-facing AI products.
5. Encourage model efficiency and green data-centre practices, teaching students about sustainability in tech.
How Learning Platforms Can Help
Solutions like [ibl.ai's AI Mentor](https://ibl.ai/product/mentor-ai-higher-ed) echo the report's call to embed ethics, critical thinking, and adaptive support into AI experiences. By aligning prompts and feedback with curricular goals—and by giving educators control dashboards—mentor platforms can foster deeper learning while safeguarding young users.
Final Thoughts
Generative AI holds undeniable potential to spark creativity and personalize learning, but LEGO and The Alan Turing Institute remind us that kids need more than clever chatbots—they need safe, inclusive, and empowering digital spaces. Policymakers, industry, and educators must collaborate now to ensure every child benefits from AI without sacrificing their agency, privacy, or cognitive growth.