On April 16, 2026, MZLA Technologies — the for-profit subsidiary of the Mozilla Foundation best known for maintaining the Thunderbird email client — announced Thunderbolt, an open-source, self-hostable AI client aimed squarely at enterprises that don’t want their internal data flowing through Microsoft Copilot, ChatGPT Enterprise, or Claude Enterprise. The project is licensed under MPL 2.0, ships clients for every major desktop and mobile platform, and positions itself as the “sovereign” alternative to the handful of proprietary AI stacks that currently dominate the enterprise market.
Thunderbolt is the front-end application an organization's users interact with for chat, search, research, and task-based automation; the back end is wired to whatever models and systems the organization chooses. Out of the box it supports Anthropic, OpenAI, Mistral, and OpenRouter as cloud providers, and it runs local models through Ollama, llama.cpp, or any OpenAI-compatible API. Enterprises can deploy it on-premises via Docker Compose or Kubernetes.
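The "any OpenAI-compatible API" design is what makes the model-agnostic pitch work in practice: the same client code can target a cloud provider or a local Ollama server (which exposes the OpenAI-style `/v1/chat/completions` endpoint on port 11434 by default) just by swapping a base URL. The sketch below illustrates that contract in general terms; it is not Thunderbolt's own code, and the model names are placeholders.

```python
# Sketch of the OpenAI-compatible request contract that lets one client
# target cloud or self-hosted back ends by changing only a base URL.
import json
import urllib.request


def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request for any OpenAI-compatible server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Same client code, different sovereignty posture:
cloud = chat_request("https://api.openai.com", "gpt-4o", "Summarize this thread.")
local = chat_request("http://localhost:11434", "llama3", "Summarize this thread.")
```

Because both requests share one shape, "switching providers" reduces to a configuration change rather than a code change, which is the property that makes on-premises deployment a drop-in option.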
The product is built around four modes: a conversational Chat, an information-retrieval Search, and two preview features — Research and Tasks. It also ships with OIDC authentication, integrations for Google and Microsoft workspaces, and preview support for the Model Context Protocol (MCP) and the newer Agent Client Protocol (ACP) for workflow orchestration. Native apps are available for Web, Linux, Windows, macOS, iOS, and Android.
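MCP support matters because it gives Thunderbolt a vendor-neutral way to reach tools and data sources. MCP is a JSON-RPC 2.0 protocol, and invoking a tool on a connected server uses a `tools/call` request like the one sketched below; the tool name and arguments here are invented for illustration, not taken from Thunderbolt.

```python
# Sketch of the JSON-RPC 2.0 framing MCP defines for invoking a tool on a
# connected server. "search_tickets" and its arguments are hypothetical.
import json


def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request per the protocol's JSON-RPC framing."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


msg = mcp_tool_call(1, "search_tickets", {"query": "login failures", "limit": 5})
```

Any MCP server — an internal ticket system, a document store, a database gateway — answers the same wire format, which is what lets a single client front arbitrarily many back-end integrations.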
MZLA CEO Ryan Sipes framed the launch as a fight over control rather than features. “The problem we are solving today is one of sovereignty and control,” he told The Register, warning that enterprises should not have “internal company data flowing through” proprietary platforms. The pitch deliberately echoes Firefox’s early positioning against Internet Explorer’s 95% market share: the tagline on the GitHub repo reads, “AI You Control: Choose your models. Own your data. Eliminate vendor lock-in.”
To get there, Mozilla partnered with deepset, the Berlin-based company behind the open-source Haystack agent framework. Haystack handles the retrieval and orchestration layer that connects Thunderbolt to an organization’s internal data sources.
Thunderbolt lands in a market where most enterprise AI clients are tightly coupled to a single vendor’s hosted models. By shipping an open, model-agnostic front end with serious on-prem deployment tooling, MZLA is targeting organizations in regulated industries, public-sector buyers, and any business that has been told by legal or security that chat logs cannot leave the premises. The combination of MPL 2.0 licensing, Ollama support, and MCP/ACP compatibility also makes Thunderbolt a plausible reference client for the broader open-source AI ecosystem — especially for teams already running local inference on their own hardware.
It’s still early. The project is under active development, a security audit is in progress, and the “offline-first” experience isn’t quite there yet — authentication and search still require connectivity. But with 557 GitHub stars inside its first few days, a clear commercial model built on support, professional services, and a planned managed-hosting tier, and Mozilla’s long history of shepherding open-source alternatives to dominant incumbents, Thunderbolt is the most credible enterprise challenger to Copilot and ChatGPT Enterprise to emerge so far in 2026.
