Deploying OpenClaw with Docker Compose


Deploying with docker-compose is one of the easiest and fastest ways to get a taste of OpenClaw.

That said, OpenClaw running inside Docker can do far less than on the host, so in the end I still recommend deploying it directly on the host machine.

Straight to the point, then.

Create a directory and put the following two files in it:

.env

OPENCLAW_IMAGE=ghcr.io/openclaw/openclaw:latest
# Leave this empty for now; fill it in from openclaw.json after the QuickStart
OPENCLAW_GATEWAY_TOKEN=
OPENCLAW_DATA_DIR=./data
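
Before the first start it can help to create the data directory yourself, so the bind mounts don't end up owned by root. A minimal sketch, assuming the image runs as the usual node user with UID/GID 1000; skip the chown if your image differs:

mkdir -p ./data/workspace
# Assumption: the node-based image runs as UID/GID 1000 ("node")
sudo chown -R 1000:1000 ./data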

docker-compose.yml

services:
  # --- Main service: WebUI and Gateway (always running) ---
  openclaw-gateway:
    image: ${OPENCLAW_IMAGE}
    container_name: openclaw-gateway
    restart: unless-stopped
    init: true
    environment:
      # Required settings
      OPENCLAW_GATEWAY_MODE: local
      OPENCLAW_GATEWAY_TOKEN: ${OPENCLAW_GATEWAY_TOKEN}
      # Basic environment
      HOME: /home/node
      TERM: xterm-256color
      # Bind settings
      OPENCLAW_GATEWAY_BIND: "lan"
      OPENCLAW_GATEWAY_PORT: 18789
    volumes:
      - ${OPENCLAW_DATA_DIR}:/home/node/.openclaw
      - ${OPENCLAW_DATA_DIR}/workspace:/home/node/.openclaw/workspace
    ports:
      - "18789:18789"
      - "18790:18790"
    command:
      [
        "node",
        "dist/index.js",
        "gateway",
        "--bind",
        "lan",
        "--port",
        "18789",
      ]
  # --- Toolbox: CLI (not started by default) ---
  openclaw-cli:
    profiles: ["tools"]
    image: ${OPENCLAW_IMAGE}
    container_name: openclaw-cli
    init: true
    stdin_open: true # -i, keep stdin open
    tty: true        # -t, allocate a pseudo-TTY
    environment:
      OPENCLAW_GATEWAY_MODE: local
      OPENCLAW_GATEWAY_TOKEN: ${OPENCLAW_GATEWAY_TOKEN}
      HOME: /home/node
      TERM: xterm-256color
      BROWSER: echo # prevent attempts to open a browser on the host
    volumes:
      # Mount the same config directory so the CLI edits the same files as the gateway
      - ${OPENCLAW_DATA_DIR}:/home/node/.openclaw
      - ${OPENCLAW_DATA_DIR}/workspace:/home/node/.openclaw/workspace
    entrypoint: ["node", "dist/index.js"]
networks:
  default:
    enable_ipv6: true
    driver: bridge
    ipam:
      driver: default
      config:
        - subnet: 172.18.0.0/16
        - subnet: fd00:dead:beee::/48
    driver_opts:
      com.docker.network.bridge.enable_ip_masquerade: "true"
      com.docker.network.bridge.name: "openclaw"
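
Before starting anything, you can have Compose render the final configuration to confirm the variables from .env are interpolated the way you expect:

cd /path/to/your/compose/dir    # the directory containing docker-compose.yml and .env
docker compose config           # prints the fully resolved compose configuration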

A few things to note:

There are plenty of pre-built OpenClaw images out there; pick one according to your needs.

After changing into the directory that holds docker-compose.yml, run docker compose run --rm openclaw-cli onboard to start the interactive onboarding wizard.

Then bring up the gateway with docker compose up -d openclaw-gateway.
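
To check that the gateway actually started, a quick sketch (the curl line assumes the WebUI answers plain HTTP on the mapped port 18789, which the compose file above suggests but I haven't verified for every image):

docker compose ps openclaw-gateway        # container should be "running"
docker compose logs -f openclaw-gateway   # follow the startup logs
# Assumes the WebUI answers plain HTTP on the port mapped above; prints the status code
curl -sS -o /dev/null -w '%{http_code}\n' http://127.0.0.1:18789/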

If IPv6 is not enabled in your Docker daemon, delete the IPv6-related lines from docker-compose.yml (enable_ipv6: true and the fd00:dead:beee::/48 subnet entry).

If you want to turn off device pairing for the Control UI, add the following inside the gateway section of openclaw.json:

  "gateway": {
    "controlUi": {
      "allowInsecureAuth": true,
      "dangerouslyDisableDeviceAuth": true
    },
    "trustedProxies": ["*"],
  ...
  },
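
openclaw.json lives in the mounted data directory (./data here), so you can edit it from the host. I restart the gateway afterwards so the change is definitely picked up; if OpenClaw happens to hot-reload its config, the restart is simply redundant:

# openclaw.json should appear in the mounted data dir, e.g. ./data/openclaw.json
docker compose restart openclaw-gateway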

NVIDIA offers several open-source large models for free. They can be added under the models.providers section of openclaw.json; below is a reference configuration:

  "models": {
    "providers": {
      "nvidia": {
        "baseUrl": "https://integrate.api.nvidia.com/v1",
        "apiKey": "nvapi-填入你的apiKey",
        "api": "openai-completions",
        "models": [
          {
            "id": "z-ai/glm5",
            "name": "GLM-5",
            "api": "openai-completions",
            "reasoning": false,
            "input": [
              "text"
            ],
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "minimaxai/minimax-m2.1",
            "name": "MMinimax-M2.1",
            "api": "openai-completions",
            "reasoning": false,
            "input": [
              "text"
            ],
            "contextWindow": 195000,
            "maxTokens": 8192
          },
          {
            "id": "moonshotai/kimi-k2.5",
            "name": "Kimi-K2.5",
            "api": "openai-completions",
            "reasoning": false,
            "input": [
              "text"
            ],
            "contextWindow": 250000,
            "maxTokens": 8192
          },
          {
            "id": "qwen/qwen3.5-397b-a17b",
            "name": "Qwen3.5-397B-A17B",
            "api": "openai-completions",
            "reasoning": false,
            "input": [
              "text"
            ],
            "contextWindow": 250000,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
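
Since the provider is declared with the openai-completions API style, you can sanity-check your key directly against the baseUrl with an OpenAI-style request before wiring it into OpenClaw. A rough sketch; the model id is taken from the list above:

curl -sS https://integrate.api.nvidia.com/v1/chat/completions \
  -H "Authorization: Bearer nvapi-YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "minimaxai/minimax-m2.1", "messages": [{"role": "user", "content": "ping"}], "max_tokens": 16}'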

Then add the following configuration in the agents.defaults section:

  "agents": {
    "defaults": {
      "model": {
        "primary": "nvidia/minimaxai/minimax-m2.1"
      },
      "models": {
        "nvidia/z-ai/glm5": {},
        "nvidia/minimaxai/minimax-m2.1": {},
        "nvidia/moonshotai/kimi-k2.5": {},
        "nvidia/qwen/qwen3.5-397b-a17b": {}
      },
  ...
    }
  },