
OpenClaw - Self-Hosted AI Smart Assistant Platform

OpenClaw Tutorial — install OpenClaw, integrate it with CrowLLM, and quickly set up a self-hosted AI assistant. OpenClaw is an open-source project supporting multi-channel integration with Telegram, Discord, WhatsApp, and more.

Project Introduction

OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. Designed for developers and advanced users, it allows you to have an autonomous AI assistant without giving up control of your data.

OpenClaw is completely open source. You can browse the source code, submit issues, or contribute at OpenClaw's GitHub repository. This tutorial covers the complete steps for installation, configuration, and integrating OpenClaw with CrowLLM.

🌟 Core Features

Multi-Channel Integration

  • Multi-channel integration: Supports messaging channels such as Telegram, Discord, WhatsApp, and iMessage, and can be extended to more platforms via plugins.
  • Single Gateway: Unified management of all channels through a single Gateway process.
  • Voice Support: Supports macOS/iOS/Android voice interaction.
  • Canvas Interface: Capable of rendering interactive Canvas interfaces.

Self-Hosting and Data Security

  • Fully Self-Hosted: Runs on your own machine or server.
  • Open Source & Transparent: MIT open-source license, fully transparent code.
  • Data Localization: Context and skills are stored on your local computer, not in the cloud.

Smart Agent Capabilities

  • Continuous Operation: Supports persistent background operation with long-term memory.
  • Scheduled Tasks: Supports cron-based scheduled tasks.
  • Session Isolation: Isolates sessions by agent/workspace/sender.
  • Multi-Agent Routing: Supports collaborative work among multiple agents.
  • Tool Calling: Native support for tool calling and code execution.

📦 Pre-integration Preparation

Preparation Information

  • Node.js 22 or higher
  • An available CrowLLM address (usually ending with /v1)
  • An available CrowLLM API Key
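
Before touching OpenClaw at all, you can confirm that the CrowLLM endpoint itself responds. The URL, key, and variable names below are placeholders, not real values; the /models route is the standard OpenAI-compatible listing endpoint that gateways like CrowLLM typically expose:

```shell
# Placeholder endpoint and key - substitute your own CrowLLM values.
CROWLLM_BASE_URL="https://crowllm.example.com/v1"
CROWLLM_API_KEY="sk-your-crowllm-key"

# Guard against the most common mistake: a base URL missing the /v1 suffix.
case "$CROWLLM_BASE_URL" in
  */v1) echo "base URL looks OK" ;;
  *)    echo "warning: base URL should usually end with /v1" ;;
esac

# List the models the gateway exposes (standard OpenAI-compatible route):
curl -sS "$CROWLLM_BASE_URL/models" \
  -H "Authorization: Bearer $CROWLLM_API_KEY" \
  || echo "request failed - check the URL, key, and network"
```

If this request returns a model list, any later failure is on the OpenClaw side rather than the gateway.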

Before integrating with CrowLLM, it's recommended to first get the Gateway and Control UI running using OpenClaw's official setup process. That way, if something goes wrong later, you can tell whether OpenClaw itself failed to start or the model provider configuration is incorrect.

1. Install OpenClaw (macOS/Linux)

curl -fsSL https://openclaw.ai/install.sh | bash

For other installation methods, refer to the OpenClaw official documentation: Getting Started.

2. Run the Onboarding Wizard

openclaw onboard --install-daemon

This wizard completes basic authentication, Gateway setup, and optional channel initialization. The goal here is to get OpenClaw running first, then switch the default model to CrowLLM later.

3. Check Gateway and Control UI

openclaw gateway status
openclaw dashboard

If your browser can open the Control UI, OpenClaw's basic setup is working. At this stage, there's no need to configure messaging channels like Telegram, Discord, or Feishu yet.

4. Locate the Configuration File

OpenClaw's configuration file is usually located at ~/.openclaw/openclaw.json. You can continue to modify it based on what the onboarding wizard generates.
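
Before hand-editing, it's worth backing up the file the wizard generated. A minimal sketch (the path changes if you set OPENCLAW_CONFIG_PATH, covered below):

```shell
CONFIG="$HOME/.openclaw/openclaw.json"

# Keep a timestamped backup before hand-editing the wizard's output.
if [ -f "$CONFIG" ]; then
  cp "$CONFIG" "$CONFIG.bak.$(date +%Y%m%d%H%M%S)"
  echo "backup written"
else
  echo "no config found yet - run the onboarding wizard first"
fi
```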

Path-Related Environment Variables

If you run OpenClaw under a dedicated service account, or wish to customize the configuration/state directory, you can use:

  • OPENCLAW_HOME
  • OPENCLAW_STATE_DIR
  • OPENCLAW_CONFIG_PATH

For detailed explanations, see the official environment variables documentation: Environment Variables.
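
For example, a dedicated service account might keep everything under a single directory. The paths below are purely illustrative, not OpenClaw defaults:

```shell
# Illustrative layout for a dedicated 'openclaw' service account.
export OPENCLAW_HOME="/srv/openclaw"
export OPENCLAW_STATE_DIR="/srv/openclaw/state"
export OPENCLAW_CONFIG_PATH="/srv/openclaw/openclaw.json"
```

Set these in the environment of whatever process launches the Gateway, not just in your interactive shell.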

🚀 Using CrowLLM as a Model Provider

OpenClaw supports integrating custom or OpenAI-compatible model gateways via models.providers. For CrowLLM, the most common approach is to add it as a custom provider to the configuration, then point the default model to newapi/MODEL_ID.

Integration Approach

  1. Declare a newapi provider under models.providers.
  2. Point baseUrl to your CrowLLM address, ensuring it includes /v1.
  3. Set api to openai-completions.
  4. List the model IDs you want OpenClaw to use in models.
  5. Switch the default model to newapi/... in agents.defaults.model.primary.

First, provide your CrowLLM key in the current shell, service environment, or a .env file readable by OpenClaw:

export CrowLLM_API_KEY="sk-your-newapi-key"

Then, add or modify the following snippet in openclaw.json:

{
  models: {
    mode: "merge",
    providers: {
      newapi: {
        baseUrl: "https://<your-newapi-domain>/v1",
        apiKey: "${CrowLLM_API_KEY}",
        api: "openai-completions",
        models: [
          { id: "gemini-2.5-flash", name: "Gemini 2.5 Flash" },
          { id: "kimi-k2.5", name: "Kimi K2.5" },
        ],
      },
    },
  },

  agents: {
    defaults: {
      model: {
        primary: "newapi/gemini-2.5-flash",
        fallbacks: ["newapi/kimi-k2.5"],
      },
      models: {
        "newapi/gemini-2.5-flash": { alias: "flash" },
        "newapi/kimi-k2.5": { alias: "kimi" },
      },
    },
  },
}

This is not a complete configuration to be copied verbatim, but rather the most critical part for integrating CrowLLM. As long as the provider, model ID, and default model references are correctly matched, OpenClaw will be able to call the model resources you expose via CrowLLM.

Key Configuration Explanation

| Configuration Item | Description |
| --- | --- |
| models.mode | Recommended to set to merge, which appends newapi while retaining OpenClaw's built-in providers. |
| models.providers.newapi.baseUrl | Your CrowLLM address; usually needs to include /v1. |
| models.providers.newapi.apiKey | CrowLLM key; recommended to inject via ${CrowLLM_API_KEY}. |
| models.providers.newapi.api | For OpenAI-compatible gateways like CrowLLM, use openai-completions. |
| models.providers.newapi.models | The model IDs listed here must match the model names actually exposed by your CrowLLM. |
| agents.defaults.model.primary | Default primary model; the format must be provider/model-id. |
| agents.defaults.model.fallbacks | Fallback model list; OpenClaw switches automatically if the primary model fails. |
| agents.defaults.models | Optional; creates aliases for models, convenient for referencing in the UI or conversations. |

Verify Successful Integration

After completing the configuration, return to or reopen the Control UI:

openclaw dashboard

If you can initiate conversations normally in OpenClaw and the default model has become newapi/..., then the integration is successful. You can also use:

openclaw models list

to confirm that models with the newapi/ prefix appear in the selectable list.
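
If the model doesn't appear or conversations fail, you can also test CrowLLM directly, bypassing OpenClaw entirely. The domain placeholder is kept as-is from the configuration above, and the model id must match one declared in models.providers.newapi.models:

```shell
BASE_URL="https://<your-newapi-domain>/v1"   # placeholder - use your CrowLLM URL
PAYLOAD='{"model":"gemini-2.5-flash","messages":[{"role":"user","content":"ping"}]}'

# A plain OpenAI-compatible chat completion request, independent of OpenClaw.
curl -sS "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $CrowLLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" \
  || echo "request failed - check the URL, key, and network"
```

A valid JSON response here narrows the problem down to the OpenClaw configuration; an error here means the gateway, key, or URL is the issue.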

Common Issues

  • baseUrl without /v1: This is one of the most common integration errors.
  • Incorrect model ID: primary and fallbacks must correspond to the id in models.providers.newapi.models.
  • Key only effective in the current terminal: If Gateway runs as a background service, ensure the service process can also read CrowLLM_API_KEY.
  • Foreground troubleshooting: Run openclaw gateway --port 18789 in the foreground (the officially documented method) to observe logs and errors directly.
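
For the "key only effective in the current terminal" issue: if the daemon runs under systemd on Linux, a drop-in override can supply the key to the service process. The unit name below is an assumption; check what openclaw onboard --install-daemon actually installed on your system:

```ini
# /etc/systemd/system/openclaw-gateway.service.d/override.conf
# (hypothetical unit name - verify what the onboarding wizard created)
[Service]
Environment=CrowLLM_API_KEY=sk-your-newapi-key
```

After adding the drop-in, run systemctl daemon-reload and restart the service. On macOS, the launchd equivalent is an EnvironmentVariables entry in the service's plist.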
