Digital Education Council: Global AI Faculty Survey 2025
The survey reveals that most faculty have experimented with AI in teaching, though they tend to use it sparingly. Many worry about students' over-reliance on AI and their ability to critically assess its output, and note that institutions lack clear AI guidance. A significant number advocate reforming student assessments, and a strong majority remain optimistic about the future integration of AI in teaching.
Summary of https://www.digitaleducationcouncil.com/post/digital-education-council-global-ai-faculty-survey
The Digital Education Council's Global AI Faculty Survey 2025 explores faculty perspectives on AI in higher education. The survey, gathering insights from 1,681 faculty members across 28 countries, investigates AI usage, its impact on teaching and learning, and institutional support for AI integration.
Key findings reveal that a majority of faculty have used AI in teaching, mainly for creating materials, but many are concerned about student over-reliance on AI and students' ability to critically evaluate its output. Faculty also call on their institutions for clearer guidelines, better AI literacy resources, and training.
The report also highlights the need for redesigning student assessments to address AI's impact. The survey data is intended to inform higher education leaders in their AI integration efforts and complements the DEC's Global AI Student Survey.
Here are the five most important takeaways:
- Faculty have largely adopted AI in teaching, but use it sparingly. 61% of faculty report having used AI in teaching; however, most of these faculty members say they use it only sparingly.
- Many faculty express concerns regarding students' AI literacy and potential over-reliance on AI. 83% of faculty are concerned about students' ability to critically evaluate AI output, and 82% worry that students may become too reliant on AI.
- Most faculty feel that institutions need to provide more AI guidance. 80% of faculty feel that their institution's AI guidelines are not comprehensive. A similar percentage of faculty feel there is a lack of clarity on how AI can be applied in teaching within their institutions.
- A significant number of faculty are calling for changes to student assessment methods. 54% of faculty believe that current student evaluation methods require significant changes, and half believe that current assignments need to be redesigned to be more AI-resistant.
- The majority of faculty are positive about using AI in teaching in the future. 86% of faculty see themselves using AI in their teaching practices in the future. Two-thirds of faculty agree that incorporating AI into teaching is necessary to prepare students for future job markets.