Connect Ollama Cloud to OpenClaw
Use Ollama Cloud to power OpenClaw Setup with open-source models. Ollama Cloud provides hosted access to popular open models such as DeepSeek, Llama, and Qwen. Note: OpenClaw Setup supports Ollama Cloud (the hosted service); local Ollama instances are not supported.
Create an Ollama account
- Go to ollama.com
- Click Create account and sign up
- Verify your email address
With an account, you get access to Ollama Cloud's hosted hardware to run open-source models faster than on local machines.
Get your Ollama Cloud API key
- After signing in, open your Account Settings
- Generate an API key for Ollama Cloud access
- Copy the key
- In the OpenClaw Setup wizard, select Provider: Ollama and paste the key as the credential
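If you want to sanity-check the key outside the wizard, you can call Ollama Cloud directly. The sketch below is a minimal, hedged example: the endpoint URL (https://ollama.com/api/chat) and the Bearer-token header shape are assumptions based on Ollama's standard API conventions, so confirm them against Ollama's current documentation before relying on them.

```python
import json
import urllib.request

# Assumed hosted endpoint; verify against Ollama's docs.
OLLAMA_CLOUD_URL = "https://ollama.com/api/chat"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Construct an authenticated chat request for Ollama Cloud."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        OLLAMA_CLOUD_URL,
        data=json.dumps(payload).encode(),
        headers={
            # The key you copied from Account Settings goes here.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("your-api-key", "deepseek-v3.2", "Say hello")
# urllib.request.urlopen(req) would send it; omitted so the sketch runs offline.
```

A 401 response from the real endpoint would indicate the key was copied incorrectly.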
Select your model
After adding Ollama as a provider, choose a default model. Popular options include:
- ollama/deepseek-v3.2: DeepSeek reasoning model
- ollama/llama3.3: Meta Llama 3.3
- ollama/qwen2.5: Alibaba Qwen 2.5
The wizard loads the full model catalog from Ollama Cloud automatically, or you can type any model ID manually using the format ollama/model-name.
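The ollama/model-name convention above is easy to validate before typing an ID manually. This is a small illustrative sketch, not OpenClaw's actual validation code; the only rule it encodes is the prefix format described in this section.

```python
def parse_model_id(model_id: str) -> str:
    """Check that an ID follows the ollama/model-name format and
    return the bare model name after the provider prefix."""
    provider, sep, name = model_id.partition("/")
    if provider != "ollama" or not sep or not name:
        raise ValueError(f"expected 'ollama/model-name', got {model_id!r}")
    return name

print(parse_model_id("ollama/deepseek-v3.2"))  # deepseek-v3.2
```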
Cloud vs local Ollama
OpenClaw Setup integrates with Ollama Cloud (the hosted service at ollama.com). If you're looking to run a local Ollama instance on your own hardware, that requires a self-hosted OpenClaw setup — see our managed vs self-hosted comparison for details.
About pricing
Ollama Cloud charges for hosted compute time. Check ollama.com for current pricing tiers. Open-source models themselves are free to use — you pay only for the cloud hardware that runs them.
Ready to launch?
Once you have your Ollama Cloud API key, go to the dashboard to complete setup.