ibl.ai Agentic AI Blog

Insights on building and deploying agentic AI systems. Our blog covers AI agent architectures, LLM infrastructure, MCP servers, enterprise deployment strategies, and real-world implementation guides. Whether you are a developer building AI agents, a CTO evaluating agentic platforms, or a technical leader driving AI adoption, you will find practical guidance here.



Pearson: Asking to Learn

Jeremy Weaver | June 18, 2025

Pearson’s analysis of 128,000 student queries to an AI study tool uncovers a surprising share of higher-order questions—evidence that thoughtful AI integration can push learners beyond rote memorization.


What Happens When Students “Ask to Learn”?

Pearson’s paper, “Asking to Learn: What student queries to Generative AI reveal about cognitive engagement,” examines 128,725 queries posed by 8,681 students using the “Explain” feature of an AI-powered eTextbook. Each free-form question offers a window into learners’ thought processes—and the results challenge the assumption that AI tools encourage only surface-level inquiry.

Key Findings at a Glance

  • 80% Basic Recall & Conceptual Understanding: Most questions sought definitions or straightforward explanations—expected for an introductory biology course.

  • One in Five Showed Advanced Complexity: About 20% reached Analyze or higher on Bloom’s Taxonomy, indicating genuine higher-order thinking.

  • Active vs. Passive Learning Signals: The presence of sophisticated questions suggests many students are proactively framing problems rather than passively consuming information.
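The kind of distribution reported above can be reproduced from labeled query data with a few lines of analysis. Below is a minimal sketch; the sample queries, their labels, and the helper names (`bloom_distribution`, `higher_order_share`) are illustrative assumptions, not Pearson’s actual data or pipeline, and in a real study the labels would come from trained coders or a classifier:

```python
# Hypothetical sketch: compute the share of student queries at each
# Bloom's Taxonomy level, and the share at "Analyze" or above.
from collections import Counter

# Bloom's levels in ascending cognitive order
BLOOM_ORDER = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Illustrative (query, level) pairs standing in for a labeled dataset
labeled_queries = [
    ("What is a mitochondrion?", "Remember"),
    ("Explain how osmosis works.", "Understand"),
    ("Why would a cell placed in salt water shrink?", "Apply"),
    ("How does this result conflict with the earlier model?", "Analyze"),
    ("What is ATP?", "Remember"),
]

def bloom_distribution(pairs):
    """Return each level's share of all queries, keyed in Bloom order."""
    counts = Counter(level for _, level in pairs)
    total = len(pairs)
    return {level: counts.get(level, 0) / total for level in BLOOM_ORDER}

def higher_order_share(pairs, threshold="Analyze"):
    """Share of queries at or above the threshold Bloom level."""
    cutoff = BLOOM_ORDER.index(threshold)
    hits = sum(1 for _, level in pairs if BLOOM_ORDER.index(level) >= cutoff)
    return hits / len(pairs)

print(higher_order_share(labeled_queries))  # 0.2 for this toy sample
```

At scale, the same two functions applied to 128,725 labeled queries would yield the headline percentages the paper reports.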

From Insights to Innovation: The “Go Deeper” Feature

Armed with these findings, Pearson developed “Go Deeper,” a new capability that recommends follow-up prompts designed to nudge students toward analysis, synthesis, and evaluation. By scaffolding curiosity, the tool aims to shift more queries into higher cognitive territory.
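The core mechanic of such a feature can be sketched simply: estimate the Bloom level of the current query, then recommend a follow-up prompt one rung higher. This is a hypothetical illustration of the idea, not Pearson’s implementation; the `go_deeper` function and its prompt templates are invented for the example:

```python
# Hypothetical "Go Deeper"-style recommender: given a query's estimated
# Bloom level, suggest a follow-up prompt one level up the taxonomy.
BLOOM_ORDER = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Illustrative prompt templates, one per target level above "Remember"
FOLLOW_UP_TEMPLATES = {
    "Understand": "In your own words, how would you explain {topic}?",
    "Apply": "Where might {topic} show up in a real-world situation?",
    "Analyze": "How does {topic} relate to or differ from a concept you already know?",
    "Evaluate": "What are the strengths and weaknesses of this explanation of {topic}?",
    "Create": "Can you design your own example or experiment involving {topic}?",
}

def go_deeper(topic, current_level):
    """Return a follow-up prompt one Bloom level above the current query,
    or None if the query is already at the top level."""
    idx = BLOOM_ORDER.index(current_level)
    if idx + 1 >= len(BLOOM_ORDER):
        return None
    next_level = BLOOM_ORDER[idx + 1]
    return FOLLOW_UP_TEMPLATES[next_level].format(topic=topic)

print(go_deeper("osmosis", "Remember"))
# In your own words, how would you explain osmosis?
```

A production system would, of course, generate follow-ups dynamically rather than from fixed templates, but the scaffolding logic—always nudging one level up—is the same.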

Implications for Educators and EdTech Designers

1. AI Can Foster Depth, Not Just Speed

  • When thoughtfully integrated, generative AI can encourage students to explore relationships, critiques, and applications—not merely memorize facts.

2. Prompt Engineering Is a Teachable Skill

  • Training students to craft better questions may be as important as teaching content itself.

3. Feedback Loops Drive Feature Evolution

  • Continuous analysis of real-world usage fuels data-informed improvements—exactly how Pearson arrived at Go Deeper.

4. Alignment with Pedagogical Goals

  • Platforms should embed structures that push learners up Bloom’s ladder, much like mentor systems (e.g., ibl.ai’s AI Mentor) that guide users from recall to reasoning.

Takeaways for Institutions

  • Monitor Question Quality to gauge engagement and adjust instruction.

  • Embed AI Tools that prompt reflection, not just delivery of answers.

  • Support Faculty Development so instructors can leverage AI analytics to tailor lessons.


Final Thoughts

Pearson’s study reinforces a critical insight: AI’s educational value depends on how students interact with it. By designing features that prompt deeper inquiry, EdTech can transform generative models from answer engines into catalysts for critical thinking. As AI continues to permeate classrooms, the real differentiator will be tools that help learners not only ask questions—but ask better ones.

See the ibl.ai AI Operating System in Action

Discover how leading universities and organizations are transforming education with the ibl.ai AI Operating System. Explore real-world implementations from Harvard, MIT, Stanford, and users from 400+ institutions worldwide.
