What AI Providers Does OpenClaw Support?
OpenClaw supports virtually any AI model with an OpenAI-compatible API. Here's an overview of the supported providers and their key models.
Cloud providers:
- Anthropic: Claude Opus 4, Sonnet 4, Haiku (highest quality, best reasoning)
- OpenAI: GPT-4o, GPT-4o-mini, GPT-4.1 (strong all-around)
- Google: Gemini 2.5 Pro, Flash (massive context, multimodal)
- DeepSeek: V3, R1 (best value, excellent quality/price)
- Mistral: Large, Small, Nemo (European, multilingual)
- Moonshot: Kimi (ultra-long context up to 2M tokens)

Local providers:
- Ollama: Llama, Qwen, Mistral, Phi, Gemma (free, offline)
- LM Studio: the same models with GUI management

Aggregators:
- OpenRouter: 200+ models via a single API key (easy switching)
Any service that implements the OpenAI Chat Completions API format works with OpenClaw's openai-compatible provider mode. This includes self-hosted solutions like vLLM, TGI, and LocalAI.
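To make the "OpenAI-compatible" idea concrete, here is a minimal sketch of the Chat Completions request format that all of these services accept. The base URL and model name are illustrative assumptions (vLLM's default local port is used as the example), not OpenClaw configuration:

```python
import json

# Assumption: a self-hosted server (e.g. vLLM) exposing the OpenAI
# Chat Completions API at its default local address.
BASE_URL = "http://localhost:8000/v1"  # illustrative, not an OpenClaw setting

def build_chat_request(model: str, user_message: str) -> dict:
    """Build a minimal OpenAI-style Chat Completions request body."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }

payload = build_chat_request("my-local-model", "Hello!")
# Any compatible client would POST this JSON to f"{BASE_URL}/chat/completions";
# the same body works unchanged against OpenAI, vLLM, TGI, LocalAI, etc.
body = json.dumps(payload)
```

Because every provider in the lists above speaks this same request shape, switching providers is mostly a matter of changing the base URL, API key, and model name.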
Choosing a provider depends on your priorities: quality (Claude), speed (GPT-4o), price (DeepSeek), privacy (Ollama), or flexibility (OpenRouter). You can configure different providers per workspace.
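As a rough illustration of per-workspace configuration, a setup might look like the following sketch. The field names and structure here are hypothetical, chosen only to show the idea of mapping workspaces to providers; consult the OpenClaw documentation for the actual schema:

```json
{
  "workspaces": {
    "research": { "provider": "anthropic", "model": "claude-opus-4" },
    "quick-tasks": { "provider": "deepseek", "model": "deepseek-chat" },
    "offline": {
      "provider": "openai-compatible",
      "baseUrl": "http://localhost:11434/v1",
      "model": "qwen2.5"
    }
  }
}
```

This pattern lets you pair each workspace with the provider that matches its priority, for example quality for research, price for routine tasks, and privacy for offline work.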