
Pearson: Asking to Learn

Jeremy Weaver · June 18, 2025

Pearson’s analysis of 128,000 student queries to an AI study tool uncovers a surprising share of higher-order questions—evidence that thoughtful AI integration can push learners beyond rote memorization.


What Happens When Students “Ask to Learn”?

Pearson’s paper, “*[Asking to Learn: What student queries to Generative AI reveal about cognitive engagement](https://plc.pearson.com/sites/pearson-corp/files/asking-to-learn.pdf)*,” examines 128,725 queries posed by 8,681 students using the “Explain” feature of an AI-powered eTextbook. Each free-form question offers a window into learners’ thought processes—and the results challenge the assumption that AI tools encourage only surface-level inquiry.

Key Findings at a Glance

  • 80% Basic Recall & Conceptual Understanding: Most questions sought definitions or straightforward explanations—expected for an introductory biology course.
  • One in Five Showed Advanced Complexity: About 20% of queries reached Analyze or higher on Bloom’s Taxonomy, indicating genuine higher-order thinking.
  • Active vs. Passive Learning Signals: The presence of sophisticated questions suggests many students are proactively framing problems rather than passively consuming information.
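The paper does not publish its coding rubric, but the kind of tally behind these findings can be sketched with a naive keyword heuristic. Everything below—the cue words, the level names, the function names—is an illustrative assumption, not Pearson's actual classification method:

```python
# Toy Bloom's-level tagger. The keyword cues are illustrative
# assumptions only, not Pearson's rubric.
HIGHER_ORDER = {"analyze", "evaluate", "create"}

def bloom_level(query: str) -> str:
    """Guess a Bloom's level from simple cue words (assumed heuristic)."""
    q = query.lower()
    if any(w in q for w in ("compare", "why does", "what if")):
        return "analyze"
    if any(w in q for w in ("judge", "which is better", "critique")):
        return "evaluate"
    if any(w in q for w in ("design", "propose", "build")):
        return "create"
    if any(w in q for w in ("how does", "explain")):
        return "understand"
    return "remember"

def higher_order_share(queries: list[str]) -> float:
    """Fraction of queries tagged Analyze or above on Bloom's Taxonomy."""
    levels = [bloom_level(q) for q in queries]
    return sum(level in HIGHER_ORDER for level in levels) / len(levels)
```

In practice a study at this scale would use trained human coders or a validated model rather than keyword matching, but the aggregate metric—share of queries at Analyze or above—is the same shape.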

From Insights to Innovation: The “Go Deeper” Feature

Armed with these findings, Pearson developed “Go Deeper,” a new capability that recommends follow-up prompts designed to nudge students toward analysis, synthesis, and evaluation. By scaffolding curiosity, the tool aims to shift more queries into higher cognitive territory.
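The mechanics of "Go Deeper" are not described beyond recommending follow-up prompts, but the scaffolding idea—nudge the learner one rung up Bloom's ladder from wherever their question landed—can be sketched as follows. The ladder ordering and prompt templates here are assumptions for illustration, not Pearson's implementation:

```python
# Illustrative "Go Deeper"-style recommender: suggest a follow-up
# prompt one Bloom's level above the student's current query.
# Ladder and templates are assumptions, not Pearson's implementation.
BLOOM_LADDER = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

FOLLOW_UPS = {
    "understand": "Explain how {topic} works in your own words.",
    "apply": "How would {topic} apply to a real-world scenario?",
    "analyze": "How does {topic} relate to or differ from a similar concept?",
    "evaluate": "What are the strengths and limits of {topic}?",
    "create": "Design a question or experiment that tests {topic}.",
}

def go_deeper(current_level: str, topic: str) -> str:
    """Recommend a follow-up prompt one rung above the current Bloom level."""
    i = BLOOM_LADDER.index(current_level)
    next_level = BLOOM_LADDER[min(i + 1, len(BLOOM_LADDER) - 1)]
    return FOLLOW_UPS[next_level].format(topic=topic)
```

For example, a recall-level question about osmosis would yield a comprehension-level follow-up: `go_deeper("remember", "osmosis")` suggests "Explain how osmosis works in your own words."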

Implications for Educators and EdTech Designers

1. AI Can Foster Depth, Not Just Speed
  • When thoughtfully integrated, generative AI can encourage students to explore relationships, critiques, and applications—not merely memorize facts.
2. Prompt Engineering Is a Teachable Skill
  • Training students to craft better questions may be as important as teaching content itself.
3. Feedback Loops Drive Feature Evolution
  • Continuous analysis of real-world usage fuels data-informed improvements—exactly how Pearson arrived at Go Deeper.
4. Alignment with Pedagogical Goals
  • Platforms should embed structures that push learners up Bloom’s ladder, much like mentor systems (e.g., [ibl.ai’s AI Mentor](https://ibl.ai/product/mentor-ai-higher-ed)) that guide users from recall to reasoning.

Takeaways for Institutions

  • Monitor Question Quality to gauge engagement and adjust instruction.
  • Embed AI Tools that prompt reflection, not just delivery of answers.
  • Support Faculty Development so instructors can leverage AI analytics to tailor lessons.

Final Thoughts

Pearson’s study reinforces a critical insight: AI’s educational value depends on how students interact with it. By designing features that prompt deeper inquiry, EdTech can transform generative models from answer engines into catalysts for critical thinking. As AI continues to permeate classrooms, the real differentiator will be tools that help learners not only ask questions—but ask *better* ones.