Warp Goes Open Source — OpenAI Sponsors the Lock-In
Warp open-sourced its Rust terminal under AGPL-3.0 with OpenAI as founding sponsor. The client is free. The agent runtime that makes it useful is not.
Warp dropped its Rust-based terminal client as a public repository on May 7, 2026, licensed under AGPL-3.0 — with OpenAI as the founding sponsor and GPT models powering the agentic contribution workflows that manage the repo itself. The repo hit 56,700+ stars and over 4,200 forks within days, landing at #2 on GitHub trending. I have been watching Warp for two years waiting for a reason to recommend it over Ghostty or Kitty. Now there is a reason — and a catch that matters more than the code itself.
TL;DR
- What: Warp open-sourced its terminal client under AGPL-3.0; OpenAI is the founding sponsor
- The split: Client is AGPL (copyleft); the Oz cloud agent orchestration platform is the monetization layer
- The pattern: Same open-source capture pipeline as OpenAI’s Astral acquisition — sponsor the tooling, route developers to your APIs
- Action: Use the terminal if you want. But understand what you are opting into before the agent layer becomes load-bearing.
Warp Goes Open Source — What Happened
Warp, the AI-first terminal used by over 700,000 developers, released its full client source code under the GNU Affero General Public License v3. The underlying UI framework ships under MIT. OpenAI is listed as the founding sponsor of the repository, and the contribution model is built around Oz — Warp’s proprietary cloud agent orchestration platform — which runs on GPT models.
This is not a conventional open-source release. The contribution workflow is agent-managed: human contributors propose ideas and verify behavior, while Oz agents handle issue triage, spec writing, implementation, and pull request review. Warp set up build.warp.dev as a live dashboard showing agent sessions, available tasks, and contribution status in real time. The company’s blog post states explicitly: “We now have a lot of confidence in code that is generated by Oz with our rules, context and verification.”
Alongside the open-source release, Warp expanded multi-agent terminal support. The terminal now runs Claude Code, Codex, Gemini CLI, and OpenCode in vertical tabs with native notifications, code review UI, rich input, and remote control. For developers running multiple CLI agents in parallel — a workflow that is rapidly becoming standard — this is a genuine differentiator. Warp also added support for open-source models including Kimi, MiniMax, and Qwen, plus a new “auto (open)” routing option.
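The coordination problem Warp is solving here exists outside any particular terminal: fan several agent CLIs out in parallel, then collect and compare their output. A minimal sketch of that pattern, with placeholder `echo` commands standing in for real agent binaries such as `claude` or `codex` (the command list and the thread-pool approach are illustrative assumptions, not Warp's implementation):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Placeholder commands stand in for real agent CLIs; swap in the
# actual binaries (and working directories) on your machine.
AGENTS = {
    "agent-a": ["echo", "patch from agent-a"],
    "agent-b": ["echo", "patch from agent-b"],
}

def run_agent(name: str, cmd: list[str]) -> tuple[str, str]:
    """Run one agent command and capture its stdout."""
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return name, out.stdout.strip()

# Launch every agent concurrently and gather results by name.
with ThreadPoolExecutor() as pool:
    results = dict(pool.map(lambda kv: run_agent(*kv), AGENTS.items()))

for name, output in results.items():
    print(f"{name}: {output}")
```

What Warp adds on top of this bare pattern is the review surface: notifications, diff UI, and routing, which is exactly the part that is hard to replicate with a script.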
Why This Matters
The licensing structure is the story. AGPL-3.0 on the client means anyone who modifies it and offers it to users over a network must publish their source as well. This effectively prevents cloud providers and competitors from forking Warp into a proprietary product — the same defensive move Grafana made in 2021, and the license MongoDB relied on before switching to SSPL. For the community, this is protective. For Warp, it ensures the open-source client stays a funnel to their commercial layer rather than a gift to Amazon.
But the commercial layer is where the real architecture lives. Oz, the agent orchestration platform, is not open-sourced under AGPL. It is the monetization engine: enterprise features like bring-your-own-LLM, advanced security, and dedicated support all route through Oz. The client is the storefront. Oz is the register.
AGPL protects the client from proprietary forks. It does not make the agent orchestration layer open. The open license and the closed platform serve different masters: one protects the community, the other protects Warp’s revenue model.
This matters because of who is funding it. OpenAI as founding sponsor is not philanthropy. It is the same pattern we tracked when OpenAI acquired Astral — the company that built uv and Ruff. The playbook: sponsor the tooling layer developers depend on daily, then route their workflows toward OpenAI’s model APIs. With Warp, the mechanism is concrete — the default contribution workflow runs on GPT models through Oz. You can use other coding agents to contribute, but Warp’s own documentation states their “preference is using Oz.”
The community noticed. Within days of the open-source launch, OpenWarp appeared — a community fork maintained by zerx-lab that strips Warp’s mandatory cloud dependency while preserving the terminal experience. OpenWarp enables any AI provider — DeepSeek, Ollama, local models — with keys staying local. Warp’s founder Zach Lloyd responded publicly, and the company has since added open-source model support and the “auto (open)” routing option. But the existence of the fork tells you what the defaults were designed to be: OpenAI first, alternatives later.
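OpenWarp’s “keys stay local” pitch maps to the OpenAI-compatible endpoint pattern that local runtimes expose; Ollama, for example, serves one at `localhost:11434/v1` by default. A sketch of building a chat request against such an endpoint (the endpoint, model name, and helper function are assumptions about a local setup, not OpenWarp’s actual code):

```python
import json
import urllib.request

# Assumption: a local runtime (e.g. Ollama) is serving an OpenAI-compatible
# API at this address, with the model below already pulled.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request; no key or prompt leaves the machine."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Local runtimes typically ignore the token entirely.
            "Authorization": "Bearer local-only",
        },
        method="POST",
    )

# Build (but do not send) a request to a hypothetical local model.
req = build_chat_request("qwen2.5", "Summarize AGPL-3.0 in one sentence.")
print(req.full_url)
```

Sending it with `urllib.request.urlopen(req)` returns the same JSON shape a hosted provider would, which is why swapping providers is mostly a matter of changing `BASE_URL` — and why a fork like OpenWarp can exist at all.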
Compare this to how Cursor’s SDK is evolving. The pattern is identical: open the client surface, close the agent infrastructure, make switching costly once teams build workflows on the proprietary orchestration layer. Warp is just more explicit about it — the AGPL license is honest about preventing proprietary forks, and the OpenAI sponsorship is public. But honesty about the structure does not change the structure.
If you are evaluating Warp for a team, the terminal itself is excellent. The question is not “is it good” but “do you want your contribution and agent workflows to default to OpenAI’s infrastructure?” That is a separate decision from choosing a terminal emulator.
The multi-agent terminal capabilities deserve credit on their own merits. Running Claude Code, Codex, and Gemini CLI in vertical tabs with native notifications and code review is a workflow that does not exist in Ghostty, Kitty, or any other terminal. For developers who have moved to agent-heavy CLI workflows, Warp solves a real coordination problem. The terminal is not just a shell anymore — it is an agent management surface. That is a legitimate product category, and Warp is the first to own it.
But “first to own it” with OpenAI as sponsor exerts its own gravitational pull. The agent-first contribution model — where Oz handles implementation and humans validate — is both impressive and concerning. Impressive because it works: the build.warp.dev dashboard shows real agent sessions producing real code. Concerning because it normalizes a workflow where the orchestration layer is more important than the contributor. If Oz is the bottleneck for contribution quality, then whoever controls Oz controls the project’s velocity. Right now, that is Warp, running on OpenAI.
Ghostty remains the minimalist open-source terminal for developers who want performance without opinions. Kitty remains the configurable powerhouse for those who want control. Warp is now the AI-first terminal for developers who want agent orchestration built into their shell — but the “AI-first” part comes with a specific vendor attached. The open-source release directly addresses the transparency objection that blocked enterprise adoption of Warp for years. Enterprises can now audit the client code. They still cannot audit Oz.
The Take
I keep coming back to the same architectural pattern: the client is free, the runtime is not. We saw it with Cursor’s editor versus their inference layer. We saw it when OpenAI acquired Astral and suddenly the Python tooling ecosystem had a single corporate owner. Now we see it with Warp: a genuinely good terminal, honestly licensed under AGPL, funded by a company whose business model requires developers to route inference through their APIs.
The terminal itself? Use it. The Rust codebase is real, the multi-agent support is genuinely useful, and AGPL prevents the worst corporate fork scenarios. But do not confuse “open source client” with “open ecosystem.” The moment your team builds contribution workflows on Oz, you have a dependency on Warp’s orchestration layer and OpenAI’s models. That dependency is the product.
The OpenWarp fork exists because the community understood this immediately. If you want Warp’s terminal without the cloud gravity, start there. If you want the full agent-managed experience and accept the vendor trade-off, Warp is the best implementation of that vision available today. Just know what you are choosing — and what it will cost to unchoose it later.