The Pilot Trap
Every campus AI story starts the same way. A small team runs a pilot. Fifty students. One course. A chatbot that answers questions about the syllabus and nudges students before deadlines.
The results look great. Engagement up 30%. Student satisfaction scores through the roof. The dean is excited. The provost wants to scale it.
Then the vendor sends the enterprise pricing.
Suddenly that per-seat model — so reasonable at fifty students — projects to $4.5 million annually for the full student body.
And the pilot data? It lives in the vendor's cloud. The integrations? Proprietary connectors the vendor maintains. The model? Whatever the vendor chose, at whatever price the vendor negotiated with the LLM provider.
The pilot didn't prove ROI. It proved dependency.
Why Pilot ROI Misleads
Pilot results are structurally biased. This isn't a criticism of the researchers running them — it's a feature of the pilot design itself.
Pilots select for motivated participants. The faculty member who volunteers to test an AI tutoring tool is already interested in innovation. The students in the pilot section know they're getting something special. The Hawthorne effect (people behave differently when they know they're being observed) is alive and well in higher ed.
Pilots also operate at a scale where cost is invisible. At fifty users, even an expensive per-seat model costs less than a faculty member's monthly parking pass. The economics change by orders of magnitude at institutional scale.
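To make that jump concrete, here is the arithmetic, using the $15/student/month figure from the cost scenario later in this piece as an assumed price:

```python
# Per-seat pricing from pilot to campus scale. The $15/student/month
# price is an assumption borrowed from the scenario later in this piece.
PRICE_PER_SEAT = 15  # dollars per student per month

pilot_annual = 50 * PRICE_PER_SEAT * 12       # 50-student pilot
campus_annual = 30_000 * PRICE_PER_SEAT * 12  # 30,000-student campus

print(f"Pilot (50 seats):  ${pilot_annual:,}/year")       # $9,000/year
print(f"Campus (30,000):   ${campus_annual:,}/year")      # $5,400,000/year
print(f"Scale factor: {campus_annual // pilot_annual}x")  # 600x
```

The pilot number is a rounding error in a department budget. The campus number is a line item the board will ask about.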
Most critically, pilots don't capture dependency costs. They measure the value of using the tool. They don't measure the cost of being unable to leave the tool.
The Costs Nobody Calculates
When a provost or CIO evaluates AI ROI, they typically look at direct costs (licensing, infrastructure) versus direct benefits (time saved, retention improved, enrollment protected).
That framework misses three categories of cost that often dwarf the licensing fee.
Dependency Cost
Every month your institution uses a proprietary AI platform, switching becomes harder. Student interaction data accumulates in the vendor's format. Faculty build workflows around the vendor's interface. IT builds integrations with the vendor's API.
After two years, the cost of switching isn't just finding a new vendor. It's migrating data, retraining faculty, rebuilding integrations, and managing the disruption to student services during the transition.
Dependency cost compounds silently. By the time it's visible, it's often the largest line item in the total cost of ownership.
Exit Cost
What happens when you need to leave? Maybe the vendor raises prices. Maybe they get acquired by a company whose values don't align with your institution's mission. Maybe their security practices fail an audit.
Exit cost includes data extraction (if it's even possible), parallel operation during migration, retraining, integration rebuilding, and the political cost of telling the board that the AI investment needs to be replaced.
Institutions running platforms they own — with source code access and data in their own infrastructure — have near-zero exit costs. The platform keeps running regardless of the vendor relationship.
Lock-In Cost
Lock-in cost is the opportunity cost of architectural decisions you can't undo.
When your AI platform only supports one LLM provider, you pay whatever that provider charges. When a competitor launches a model that's 40% cheaper and 20% better for your use case, you can't switch. That delta, compounded over years, is lock-in cost.
When your platform uses proprietary integration connectors, you can't add new data sources without the vendor's involvement. That delay — measured in months and professional services fees — is lock-in cost.
When your platform stores conversation data in a proprietary format, your institutional research team can't analyze it without the vendor's analytics module. That additional license fee is lock-in cost.
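To put a rough number on the first of those examples, the sketch below compounds the 40% price delta over five years. The inference spend and the growth rate are assumptions, not benchmarks:

```python
# Illustrative sketch: the cumulative premium of being locked into a
# pricier model. All inputs are assumptions, not vendor benchmarks.

def lockin_premium(annual_inference_cost: float,
                   cheaper_by: float,
                   years: int,
                   spend_growth: float = 0.10) -> float:
    """Cumulative extra spend versus a hypothetical cheaper model.

    cheaper_by: fractional discount of the alternative (0.40 = 40% cheaper)
    spend_growth: assumed annual growth in AI usage
    """
    premium, spend = 0.0, annual_inference_cost
    for _ in range(years):
        premium += spend * cheaper_by  # what switching would have saved this year
        spend *= 1 + spend_growth      # usage, and therefore spend, grows
    return premium

# Assuming $400K/year inference spend and a 40% cheaper alternative:
print(f"${lockin_premium(400_000, 0.40, 5):,.0f}")  # ~$977,000 over five years
```

Nearly a million dollars, and the institution never sees an invoice for it.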
What Provosts and CIOs Need to Understand
The core insight is this: AI in higher education is infrastructure, not a tool.
A tool is something you buy, use, and replace when something better comes along. A hammer. A projector. A polling app.
Infrastructure is something that other things depend on. Your SIS. Your LMS. Your identity management system.
AI is becoming infrastructure. Student advising, enrollment management, retention interventions, faculty support, administrative workflows — once these run through AI, the AI layer is infrastructure whether you planned for it or not.
And infrastructure has different economics than tools.
Tool economics optimize for features per dollar. Infrastructure economics optimize for control per dollar. The cheapest AI tool might be the most expensive AI infrastructure if it locks you into a single vendor, a single model, and a single architecture.
This is the conversation provosts and CIOs need to have before signing year one of a multi-year agreement.
The Expanded ROI Framework
Here's the framework that accounts for what pilot ROI ignores.
Direct Value (What the Pilot Measured)
Time savings for advisors, faculty, and staff. Improvements in student engagement, retention, and satisfaction. Enrollment protection through better yield management and student support.
These are real. They matter. But they're table stakes — every vendor claims them.
Architecture Value (What the Pilot Didn't Measure)
Model flexibility savings. If you can route queries to the most cost-effective model for each task, how much do you save versus a single-model platform? For many institutions, this is 40-60% of inference costs.
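What does that routing look like in practice? A minimal sketch; the model names and per-million-token prices are placeholders, not recommendations:

```python
# Minimal sketch of cost-aware model routing. Model names and prices
# are hypothetical placeholders; real routing would also weigh quality,
# latency, and data-handling requirements.

ROUTES: dict[str, tuple[str, float]] = {
    # task type       -> (model, assumed $ per 1M tokens)
    "syllabus_faq":     ("small-fast-model", 0.50),
    "advising_draft":   ("mid-tier-model",   3.00),
    "policy_analysis":  ("frontier-model",  15.00),
}

def route(task_type: str) -> tuple[str, float]:
    """Send each task to the cheapest adequate model; default to frontier."""
    return ROUTES.get(task_type, ROUTES["policy_analysis"])

model, price = route("syllabus_faq")
print(f"Routing to {model} at ${price:.2f} per 1M tokens")
```

A single-model platform collapses that whole table to one row, at whatever price that row happens to carry.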
Integration velocity. How quickly can you connect a new data source — a new SIS module, a new CRM field, a new learning tool? Platforms using open protocols like MCP measure this in days. Proprietary platforms measure it in quarters and statements of work.
Compliance confidence. What's the cost of a FERPA incident? What's the risk reduction of running AI in your own infrastructure versus a multi-tenant SaaS? Your CISO can quantify this — and the number is usually significant.
Avoidance Value (What You Don't Pay Later)
Zero exit cost. Platforms you own don't charge you to leave. The code keeps running. The data stays in your systems.
Zero lock-in cost. LLM-agnostic platforms let you follow the cost-performance frontier as it moves. This year's best model isn't next year's.
Zero dependency cost. When you own the source code and run in your infrastructure, vendor business decisions — pricing changes, acquisitions, pivots — don't become your emergency.
Putting Numbers to It
Let's run a simplified comparison for a university with 30,000 students.
Proprietary SaaS model: $15/student/month = $5.4M/year. Three-year contract = $16.2M. Plus $500K in integration professional services. Plus unknown exit costs at contract end.
Owned platform model: Infrastructure costs of $300-500K/year for compute and storage. LLM API costs at direct API rates of $200-400K/year, depending on usage. Platform licensing or partnership at a fraction of per-seat pricing.
The owned model typically runs 50-70% less in direct costs. But the real savings are in the avoidance category. No exit costs. No lock-in premium. No dependency risk.
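Here is that comparison as a script you can rerun with your own figures. The platform licensing line is an assumption (agreements vary); the rest comes from the scenario above:

```python
# Three-year cost comparison for a 30,000-student institution.
# Figures follow the scenario above; platform licensing is an ASSUMPTION.

STUDENTS, YEARS = 30_000, 3

# Proprietary SaaS: per-seat pricing plus integration professional services
saas_annual = 15 * 12 * STUDENTS            # $15/student/month -> $5.4M/year
saas_total = saas_annual * YEARS + 500_000  # three years + $500K integration

# Owned platform: infrastructure + LLM API + assumed licensing/partnership
infra_annual = 400_000        # midpoint of the $300-500K range
llm_api_annual = 300_000      # midpoint of the $200-400K range
licensing_annual = 1_500_000  # ASSUMPTION: "a fraction of per-seat pricing"
owned_total = (infra_annual + llm_api_annual + licensing_annual) * YEARS

print(f"Proprietary SaaS, 3 years: ${saas_total / 1e6:.1f}M")     # $16.7M
print(f"Owned platform, 3 years:   ${owned_total / 1e6:.1f}M")    # $6.6M
print(f"Direct-cost savings: {1 - owned_total / saas_total:.0%}") # ~60%
```

Adjust the licensing assumption and the savings shift, but the structural gap doesn't close.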
ibl.ai works with institutions on this math regularly. The numbers vary by institution size and usage patterns, but the structural advantage of ownership is consistent.
The Decision Framework for Campus Leaders
Before your next AI procurement, run every option through these questions.
Year one: What does this cost to deploy across our full student body, not just the pilot cohort?
Year three: What does this cost if the vendor raises prices by 30%? What does it cost if we need to switch?
Year five: Does this platform still work if the vendor doesn't exist? Does our data still belong to us? Can we still run and modify the system?
If the answer to any year-five question is "no," you're not buying a platform. You're renting a dependency.
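The year-three question lends itself to a quick stress test. A minimal sketch, using the scenario's $5.4M annual fee; the 30% hike and the $2M switching cost are illustrative assumptions:

```python
# Year-three stress test. The price hike and switching cost are
# illustrative assumptions, not estimates for any specific vendor.

def year_three_exposure(annual_fee: float,
                        price_hike: float = 0.30,
                        switching_cost: float = 2_000_000) -> None:
    """Compare staying through a price hike with paying to leave."""
    print(f"Stay (after {price_hike:.0%} hike): ${annual_fee * (1 + price_hike):,.0f}/year")
    print(f"Leave (one-time switch): ${switching_cost:,.0f}")

year_three_exposure(5_400_000)
# Stay (after 30% hike): $7,020,000/year
# Leave (one-time switch): $2,000,000
```

Run it with your own contract numbers before the negotiation, not after.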
ROI Is a Governance Question
The real ROI of AI in higher education isn't a number. It's a posture.
Institutions that own their AI infrastructure can experiment freely, scale deliberately, and adapt continuously. Their ROI improves over time because they control the variables.
Institutions that rent their AI capabilities are always one pricing change, one acquisition, or one contract negotiation away from starting the ROI conversation over.
The provost who asks "what's the ROI of this AI tool?" is asking the right question with the wrong framing. The better question is: what's the ROI of owning our AI infrastructure versus renting it?
The math always favors ownership. It just takes longer to see it than a twelve-week pilot.