# Air-Gapped AI Deployment

> Source: https://ibl.ai/resources/capabilities/air-gapped-deployment

*Run a full-stack AI platform inside your security perimeter — no data egress, no telemetry, no external dependencies. Ever.*

Air-gapped AI deployment means the entire ibl.ai platform — agents, models, data pipelines, APIs, and audit systems — runs exclusively inside your infrastructure. Nothing leaves your environment. For classified agencies, regulated enterprises, and security-conscious organizations, this is not a feature. It is a prerequisite. Most AI vendors cannot offer it. ibl.ai is built for it.

With 1.6M+ users across 400+ organizations and production deployments on infrastructure ranging from government data centers to private cloud enclaves, ibl.ai delivers enterprise-grade AI capability without ever requiring a connection to the outside world.

## The Challenge

Most enterprise AI platforms are SaaS-first. Your data travels to vendor clouds for inference, fine-tuning, and logging. Even platforms that claim "private deployment" often embed telemetry, license checks, or model API calls that reach external endpoints. For organizations operating under ITAR, FedRAMP, HIPAA, or classified mandates, this is a disqualifying risk.

The result is a painful tradeoff: accept the security exposure and use modern AI, or lock down your environment and fall behind. ibl.ai eliminates that tradeoff entirely. The platform is architected from the ground up to operate with zero external dependencies — no phone-home, no vendor telemetry, no cloud model APIs required.

## How It Works

1. **Full Platform Delivery to Your Infrastructure:** ibl.ai delivers the complete platform — including source code — to your environment. Deployment targets include on-premises bare metal, private VMware or OpenStack clusters, air-gapped AWS GovCloud, Azure Government, or classified enclaves. No SaaS components are required.
2. **Local Model Hosting and Inference:** Open-weight models (Llama, Mistral, and others) are deployed locally using your GPU or CPU infrastructure. The platform is model-agnostic — you choose which models run, where they run, and how they are updated. No external model API calls are made.
3. **Internal Data Pipeline Configuration:** All data ingestion, vectorization, retrieval, and storage occurs within your perimeter. Document stores, vector databases, and knowledge bases are hosted on your infrastructure. MCP connectors link to internal data sources only — no external endpoints required.
4. **Agent Orchestration Without External Calls:** Autonomous AI agents reason, plan, and execute entirely within your environment. Code execution sandboxes, API calls, and tool use are scoped to internal systems. Every agent action is logged to your internal audit infrastructure.
5. **Multi-Tenant Isolation and Access Control:** Role-based access control, tenant isolation, and identity management are configured against your existing directory services (LDAP, Active Directory, SAML). No external identity providers are required.
6. **Ongoing Operations Without Vendor Contact:** Because customers receive full source code ownership, the platform continues operating indefinitely without vendor involvement. Updates are delivered as versioned packages that you validate and deploy on your own schedule.

## Features

### Zero External Dependencies

No telemetry, no license pings, no external API calls. The platform operates in fully disconnected environments indefinitely. Every component — inference, storage, orchestration, and audit — runs inside your perimeter.

### Full Source Code Ownership

Customers receive the complete ibl.ai codebase. You own it. You can inspect it, modify it, and operate it without any ongoing vendor relationship. True sovereignty over your AI infrastructure.
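To make the local-inference pattern concrete, here is a minimal Python sketch of building an OpenAI-compatible chat request against an internal vLLM endpoint and rejecting any URL that points outside the perimeter. The host suffixes, endpoint URL, and model name are illustrative assumptions, not part of the ibl.ai API:

```python
from urllib.parse import urlparse

# Hosts treated as "inside the perimeter" -- illustrative values only.
INTERNAL_SUFFIXES = (".internal.example", ".corp.local")

def assert_internal(url: str) -> str:
    """Reject any inference URL that does not resolve to an internal host."""
    host = urlparse(url).hostname or ""
    if not host.endswith(INTERNAL_SUFFIXES):
        raise ValueError(f"external endpoint blocked: {host}")
    return url

def build_chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat payload aimed at a local vLLM server."""
    return {
        "url": assert_internal(base_url) + "/v1/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("https://vllm.ml.internal.example",
                         "llama-3-70b", "Summarize this policy.")
print(req["url"])
```

The guard illustrates the architectural claim: inference traffic can only reach hosts you explicitly designate as internal, so "no external model API calls" becomes a property the code enforces rather than a policy it merely promises.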
### Local Model Hosting

Deploy and serve open-weight LLMs (Llama, Mistral, and others) on your own GPU infrastructure. Switch models, fine-tune locally, and version control your model registry — all without touching the public internet.

### Internal Audit Trail

Every agent action, user interaction, model call, and system event is logged to your internal infrastructure. Audit logs are queryable, exportable, and integrated with your existing SIEM or compliance tooling.

### Air-Gapped MCP Connectors

Model Context Protocol connectors link AI agents to internal data sources — databases, document repositories, internal APIs — without requiring any external network access. Data stays inside.

### Autonomous Agents, Fully Contained

ibl.ai deploys reasoning agents that execute code, call internal APIs, and take multi-step actions — all scoped to your internal environment. No agent capability requires external connectivity.

### Flexible Deployment Targets

Deploy on bare metal, private VMware clusters, air-gapped GovCloud regions, or classified enclaves. The platform is containerized and infrastructure-agnostic, supporting Kubernetes and standalone deployments.

## With vs. Without

| Aspect | Without | With |
|--------|---------|------|
| Data Residency | Prompts, documents, and user data routed through vendor cloud infrastructure for inference and logging. Data residency is a contractual promise, not a technical guarantee. | All data — prompts, documents, embeddings, logs — remains exclusively within your infrastructure. Data residency is enforced architecturally, not contractually. |
| Operational Continuity | Platform availability depends on vendor uptime, license server connectivity, and external API availability. A vendor outage or connectivity loss disables your AI operations. | Platform operates indefinitely without any external connectivity. No license checks, no API dependencies, no single points of failure outside your control. |
| Model Access in Disconnected Environments | Platforms tied to GPT, Claude, or Gemini APIs are completely non-functional in air-gapped environments. No internet means no AI. | Open-weight models run locally on your GPU infrastructure. Full LLM capability in completely disconnected environments, including classified networks. |
| Audit and Oversight | Agent actions and model interactions logged to vendor systems. You receive summaries or exports — not ownership of the raw audit trail. | Every agent action, API call, and user interaction logged to your internal infrastructure in real time. Full audit trail ownership, queryable and exportable on your terms. |
| Vendor Dependency and Exit Risk | Workflows, fine-tuned models, and integrations are locked inside vendor platforms. Switching vendors means rebuilding everything from scratch. | Full source code ownership means the platform is yours permanently. No vendor relationship required to keep it running. Exit risk is zero. |
| Compliance Posture | Security teams must negotiate BAAs, DPAs, and security addenda with AI vendors — and accept residual risk that vendor practices may not meet your regulatory requirements. | Compliance is achieved architecturally. HIPAA, ITAR, FedRAMP, NERC CIP, and similar frameworks are satisfied by the fact that data never leaves your environment. |
| Customization and Control | Platform behavior, model selection, and update cadence are controlled by the vendor. You consume what they offer, on their timeline. | Full source code ownership means you control every aspect of the platform — model selection, update timing, feature configuration, and infrastructure choices. |

## FAQ

**Q: What does 'air-gapped' mean in the context of ibl.ai's deployment?**

Air-gapped means the entire ibl.ai platform — including AI agents, model inference, data storage, APIs, and audit systems — runs exclusively inside your infrastructure with zero network connections to external systems.
No data leaves your environment, no telemetry is transmitted, and no external APIs are called. The platform is fully self-contained.

**Q: Can ibl.ai run without any internet connectivity at all?**

Yes. ibl.ai is designed to operate in fully disconnected environments, including classified networks with no internet access. There are no license validation calls, no telemetry endpoints, and no external model API dependencies. Once deployed, the platform runs indefinitely without any external connectivity.

**Q: Which AI models can be used in an air-gapped ibl.ai deployment?**

ibl.ai is model-agnostic and supports any open-weight model that can be hosted locally, including Llama, Mistral, and custom fine-tuned models. Models are served using local inference backends such as vLLM, Ollama, or llama.cpp on your own GPU infrastructure. No external model provider APIs are required.

**Q: Does ibl.ai send any telemetry or usage data back to the vendor?**

No. ibl.ai transmits zero telemetry, usage analytics, error reports, or behavioral data to any external system. This is enforced architecturally — there are no telemetry endpoints in the platform. Customers who receive full source code can independently verify this.

**Q: How does ibl.ai handle software updates in an air-gapped environment?**

Updates are delivered as versioned release packages — container images and source code — that customers validate and deploy on their own schedule using their internal processes. There is no auto-update mechanism and no requirement to connect to vendor infrastructure to receive or apply updates.

**Q: Is ibl.ai suitable for classified government and defense deployments?**

Yes. ibl.ai is architected for deployment in classified environments including IL4, IL5, and TS/SCI networks. The platform has no external dependencies, supports FIPS 140-2 compliant cryptographic modules, integrates with internal identity providers, and delivers complete audit trails — meeting the technical requirements for classified AI deployments.

**Q: What does 'full source code ownership' mean for air-gapped deployments?**

Customers receive the complete ibl.ai codebase as part of their deployment. This means your security team can audit every line of code, verify the absence of telemetry or backdoors, and operate the platform permanently without any ongoing vendor relationship. The platform is yours — it keeps running regardless of what happens to ibl.ai as a vendor.

**Q: Can AI agents in an air-gapped deployment still access internal data sources?**

Yes. ibl.ai's MCP (Model Context Protocol) connectors are configured to connect agents to internal data sources — databases, document repositories, internal APIs, and internal S3-compatible storage — without requiring any external network access. Agents have full data access capability within your perimeter.
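The connector-plus-audit pattern described in the answers above can be sketched as follows. This is an illustrative model only, not the actual ibl.ai connector interface: the class names, the in-memory stores, and the log schema are all assumptions made for the example. The point it demonstrates is that agents can reach only sources explicitly registered inside the perimeter, and that every access leaves an internal audit record:

```python
import time
from dataclasses import dataclass, field

@dataclass
class AuditLog:
    """Append-only audit trail kept on internal storage (here: in memory)."""
    entries: list = field(default_factory=list)

    def record(self, actor: str, action: str, resource: str) -> None:
        self.entries.append({"ts": time.time(), "actor": actor,
                             "action": action, "resource": resource})

class InternalConnector:
    """Sketch of an MCP-style connector: agents can query only registered
    internal sources, and every fetch is written to the audit trail."""

    def __init__(self, audit: AuditLog):
        self.audit = audit
        self.sources = {}  # source name -> {doc_id: content}

    def register(self, name: str, documents: dict) -> None:
        """Declare an internal data source inside the perimeter."""
        self.sources[name] = documents

    def fetch(self, agent: str, source: str, doc_id: str) -> str:
        """Return a document; unregistered (external) sources are refused."""
        if source not in self.sources:
            raise KeyError(f"unregistered source: {source}")
        self.audit.record(agent, "fetch", f"{source}/{doc_id}")
        return self.sources[source][doc_id]

audit = AuditLog()
connector = InternalConnector(audit)
connector.register("policy-db", {"ITAR-001": "Export control summary ..."})
text = connector.fetch("research-agent", "policy-db", "ITAR-001")
print(audit.entries[0]["resource"])
```

Because the connector holds an explicit registry of internal sources and raises on anything else, "no external network access" is a consequence of the design rather than a configuration flag, and the audit log ownership stays entirely on your side.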