The MCP desktop client
ChatFrame is a cross-platform desktop chatbot that unifies access to multiple LLM providers and connects them with MCP (Model Context Protocol) servers.

Built with ❤️
for developers and power users who want direct access to AI models
and maximum value from every token
Artifacts
Unleash your creativity: render HTML / React / Mermaid / SVG

Extensible via MCP
Finish tasks in natural language and empower AI with real context

Knowledge in one place
Upload PDFs, code files, and more. Set up instructions so the AI can help you better.

Experience the perfect balance of privacy, economy, freedom, and speed
Privacy
All data stays on your local machine. Your conversations, files, and settings never leave your computer.
Economy
Use your own API keys. One subscription to GitHub Copilot gives you access to everything.
Freedom
Create your own AI workspace with different LLM contexts and tools. Switch between providers freely.
Fast
Built on a Rust core for lightweight performance. Save your time with blazing-fast responses.
You receive a single, polished desktop application that lets you chat with multiple large-language-model providers (OpenAI, Anthropic, Groq, etc.) from one interface.
Out of the box you also get:
- Local RAG (Retrieval-Augmented Generation) for your own PDF, text, and code files—no data ever leaves your machine.
- Support for MCP (Model Context Protocol) servers so you can plug in custom tools and databases.
- Cross-platform installers for macOS (Apple Silicon & Intel) and Windows (x86_64).
Only the prompts and context you explicitly send to an LLM provider are transmitted. All file parsing, vector indexing, and RAG operations happen locally on your computer.
Launch ChatFrame, open the Providers tab, and paste the keys for the services you use (e.g., OpenAI, Anthropic). The table in the documentation lists direct links to each provider’s key portal.
MCP (Model Context Protocol) is an open standard that lets language models call external tools—databases, web search, custom scripts, etc.—in a secure, standardized way. If you need your chatbot to query a Postgres database or run internal APIs, you can add an MCP server and ChatFrame will expose those tools inside any conversation.
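Many MCP clients register servers through a JSON configuration following the common `mcpServers` convention. As an illustrative sketch only, an entry for a Postgres MCP server might look like the following; the server package name, connection string, and exact file location are assumptions, not ChatFrame specifics:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost/mydb"
      ]
    }
  }
}
```

With an entry like this, the client launches the server as a subprocess and exposes its tools (e.g., read-only SQL queries) inside any conversation.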
Only if you choose to run MCP servers that require them (for example, a Node.js-based Postgres MCP server needs Node.js installed). ChatFrame does not bundle runtimes; you control your own environment, which avoids version conflicts and bloat.
No. ChatFrame is closed-source and built on Tauri and the Vercel AI SDK.
Updates download automatically in the background. When a new version is ready, an “Install” button appears in the app; one click applies the update.