LEGO/The Alan Turing Institute: Understanding GenAI Impact on Children

Jeremy Weaver · June 20, 2025

A new study reveals how children aged 8–12 are already using tools like ChatGPT, highlighting benefits, risks, and the urgent need for child-centred AI design and literacy.


Kids Are Already Prompting ChatGPT—Are We Ready?

LEGO and The Alan Turing Institute’s report, “*[Understanding the Impacts of Generative AI Use on Children](https://www.turing.ac.uk/sites/default/files/2025-05/combined_briefing_-_understanding_the_impacts_of_generative_ai_use_on_children.pdf)*,” combines survey data from thousands of children, parents, and teachers with hands-on workshops in schools. The headline: nearly one in four children aged 8–12 has tried generative AI, most commonly ChatGPT. Yet today’s AI ecosystems were never built with young users’ rights, needs, or wellbeing at the centre.

Who’s Using AI—and Who’s Being Left Behind?

  • Usage disparities: 52% of private-school pupils have dabbled in generative AI versus just 18% in state schools—a stark digital divide.
  • Demographic nuances: Age, gender, and additional learning needs all influence how—and why—kids engage with AI.
  • Learning benefits: Teachers see real promise, especially for students with special educational needs who gain adaptive scaffolding and confidence boosts.

Shared Concerns Across the Board

Parents, carers, and educators converge on several worries:
  • Exposure to harmful or false content (82% of parents).
  • Erosion of critical-thinking skills (76% of parents, 72% of teachers).
  • Environmental costs of compute-heavy models.
  • Bias in outputs and the temptation for students to pass off AI-generated work as their own.

Despite these fears, teachers using AI themselves report higher productivity and more time for student interaction—evidence that thoughtful integration can pay dividends.

Recommendations for a Child-Centred AI Future

1. Design with Kids, Not Just for Them
  • Engage children in feature ideation, testing, and feedback cycles to ensure tools respect their developmental needs.
2. Boost AI Literacy for All Stakeholders
  • Provide age-appropriate curricula and resources so children, parents, and teachers understand capabilities, limits, and ethics.
3. Close the Access Gap
  • Ensure state schools and underserved communities receive funding and infrastructure to prevent an AI-driven learning divide.
4. Address Bias and Safety by Default
  • Hard-wire content filters, transparent data practices, and robust oversight into kid-facing AI products.
5. Mitigate Environmental Impact
  • Encourage model efficiency and green data-center practices, teaching students about sustainability in tech.

How Learning Platforms Can Help

Solutions like [ibl.ai’s AI Mentor](https://ibl.ai/product/mentor-ai-higher-ed) echo the report’s call to embed ethics, critical thinking, and adaptive support into AI experiences. By aligning prompts and feedback with curricular goals—and by giving educators control dashboards—mentor platforms can foster deeper learning while safeguarding young users.

Final Thoughts

Generative AI holds undeniable potential to spark creativity and personalize learning, but LEGO and The Alan Turing Institute remind us that kids need more than clever chatbots—they need safe, inclusive, and empowering digital spaces. Policymakers, industry, and educators must collaborate now to ensure every child benefits from AI without sacrificing their agency, privacy, or cognitive growth.