Connect Ollama Cloud to OpenClaw

Use Ollama Cloud to power OpenClaw Setup with open-source LLMs. Ollama Cloud provides hosted access to popular open models such as DeepSeek, Llama, and Qwen. Note: OpenClaw Setup supports Ollama Cloud (the hosted service) — local Ollama instances are not supported.

Create an Ollama account

  1. Go to ollama.com
  2. Click Create account and sign up
  3. Verify your email address

With an account, you get access to Ollama Cloud's hosted hardware to run open-source models faster than on local machines.

Get your Ollama Cloud API key

  1. After signing in, go to your Account Settings
  2. Generate an API key for Ollama Cloud access (or copy an existing one)
  3. In the OpenClaw Setup wizard, select Provider: Ollama and paste the key as the credential
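Before pasting the key into the wizard, you can sanity-check it with a direct request. The sketch below is a minimal example, not part of OpenClaw: the endpoint URL, payload shape, and Bearer-token header are assumptions based on Ollama's standard chat API, so confirm them against the current docs on ollama.com.

```python
import json
import os
import urllib.request

# Assumed Ollama Cloud chat endpoint -- verify against ollama.com's API docs.
OLLAMA_CLOUD_URL = "https://ollama.com/api/chat"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an authenticated chat request against Ollama Cloud."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response, not a stream
    }
    return urllib.request.Request(
        OLLAMA_CLOUD_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # The key you copied from Account Settings.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    # Requires a real key in the OLLAMA_API_KEY environment variable.
    # A 200 response means the key is valid and ready for the wizard.
    req = build_request(os.environ["OLLAMA_API_KEY"], "deepseek-v3.2", "ping")
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
```

A successful response confirms the credential works before you commit it to the setup wizard.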

Select your model

After adding Ollama as a provider, choose a default model. Popular options include:

  • ollama/deepseek-v3.2 — DeepSeek reasoning model
  • ollama/llama3.3 — Meta Llama 3.3
  • ollama/qwen2.5 — Alibaba Qwen 2.5

The wizard loads the full model catalog from Ollama Cloud automatically, or you can type any model ID manually using the format ollama/model-name.
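The ollama/model-name format splits into a provider prefix and a model ID. A small helper (hypothetical, for illustration only — OpenClaw does its own parsing internally) shows the convention:

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a wizard model ID like 'ollama/deepseek-v3.2' into
    (provider, model). Raises ValueError if either part is missing."""
    provider, sep, model = model_id.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model-name', got {model_id!r}")
    return provider, model

# The catalog entries above all follow this shape:
print(parse_model_id("ollama/deepseek-v3.2"))  # ('ollama', 'deepseek-v3.2')
```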

Cloud vs local Ollama

OpenClaw Setup integrates with Ollama Cloud (the hosted service at ollama.com). If you're looking to run a local Ollama instance on your own hardware, that requires a self-hosted OpenClaw setup — see our managed vs self-hosted comparison for details.

About pricing

Ollama Cloud charges for hosted compute time. Check ollama.com for current pricing tiers. Open-source models themselves are free to use — you pay only for the cloud hardware that runs them.

Ready to launch?

Once you have your Ollama Cloud API key, go to the dashboard to complete setup.
