05-install-openclaw.txt

From: Running OpenClaw Locally with Ollama on Apple Silicon

INSTALLING OPENCLAW
===================

OpenClaw is the agent framework that sits on top of your model. Ollama is the engine; OpenClaw is everything else: it handles conversations, tool execution, memory, sessions, and optionally connects to messaging platforms. Think of it as the brain and personality layer.

There are three installation methods; all produce the same result.

Method A: The One-Liner (Recommended)
-------------------------------------

curl -fsSL https://openclaw.ai/install.sh | bash

This is the simplest path. The script detects your system, installs Node.js (version 22.12+) if needed, and installs the latest OpenClaw release, handling dependencies automatically. After it finishes, verify:

openclaw --version

Method B: npm Direct Install
----------------------------

If you already have Node.js 22.12+ installed:

npm install -g openclaw@latest

Check your Node version first:

node -v

If it's below 22.12, you need to update Node before this will work. The one-liner method handles this for you, which is why it's the recommended approach.

Method C: Homebrew Cask
-----------------------

brew install --cask openclaw

This installs OpenClaw as a native macOS application. It requires macOS 15 (Sequoia) or newer; if you're on Sonoma, use Method A or B instead.

The macOS Gatekeeper Problem
----------------------------

If you use Method C (the Homebrew cask), you might hit the Gatekeeper warning: "OpenClaw cannot be opened because Apple cannot check it for malicious software." This is a standard macOS security prompt for apps downloaded outside the App Store. Two fixes:

Fix 1: Right-click the app, click Open, then click Open again on the confirmation dialog.

Fix 2: Remove the quarantine attribute from the command line:

xattr -cr /Applications/OpenClaw.app

Both do the same thing. The quarantine attribute is macOS's way of flagging downloaded apps; once you clear it, the app opens normally.
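If you script your setup, the Node version floor for Method B is easy to check automatically. Here is a minimal sketch; the `version_ok` helper is my own, not part of OpenClaw or npm, and it relies only on POSIX sh plus `sort -V` for version-aware ordering:

```shell
#!/bin/sh
# Sketch: compare the installed Node version against the 22.12 floor.
# version_ok is a hypothetical helper, not an OpenClaw command.
version_ok() {
  # True (exit 0) when $1 >= $2, e.g. version_ok "22.14.0" "22.12.0".
  # sort -V orders version strings; if the minimum sorts first, we pass.
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

if command -v node >/dev/null 2>&1; then
  current="$(node -v | sed 's/^v//')"   # node -v prints e.g. "v22.14.0"
  if version_ok "$current" "22.12.0"; then
    echo "Node $current is new enough for Method B"
  else
    echo "Node $current is too old; update Node or use Method A"
  fi
else
  echo "Node is not installed; use Method A, which installs it for you"
fi
```

The same comparison works for any "minimum version" check, since `sort -V` understands multi-part version numbers that plain string comparison gets wrong.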
Running The Onboarding Wizard
-----------------------------

After installation, run the onboarding:

openclaw onboard --install-daemon

The wizard walks through several setup steps:

1. AI Provider: Select "Ollama" when it asks. Do not enter any cloud API keys (Anthropic, OpenAI, OpenRouter). We want fully local.

2. Gateway settings: Accept the defaults or customize the port. The gateway is the local web server that handles OpenClaw's UI and API.

3. Messaging platforms: Skip these for now. You don't need WhatsApp or Telegram to use OpenClaw; we'll cover the built-in interfaces that require no external services.

4. Daemon: The --install-daemon flag sets up a background service so OpenClaw's gateway starts automatically. You can skip this and start it manually each time if you prefer.

The wizard creates the configuration directory at ~/.openclaw/ and writes an initial openclaw.json config file.

Alternative: Ollama Integrated Launch
-------------------------------------

If you have Ollama v0.17.0 or newer, there's a shortcut:

ollama launch openclaw

This handles installation and initial configuration in one step, pre-configured to use your local Ollama instance. However, the manual configuration in the next section gives you more control and is easier to debug when things go wrong. For a meetup or learning scenario, I recommend the manual approach.

Verify The Installation
-----------------------

openclaw --version
openclaw doctor

The doctor command checks the overall health of your OpenClaw installation: configuration validity, provider connectivity, gateway status, and dependency versions. It's the first thing to run when something seems off.

+----------------------------------------------------------+
| After installing, do NOT send your first message yet.    |
| The default configuration may try to reach cloud APIs.   |
| Configure local-only operation first (next section).     |
+----------------------------------------------------------+

If the installation succeeded and doctor shows no critical errors, move on to configuration. The next section is where we make sure OpenClaw talks only to your local Ollama instance and never phones home to any cloud provider.
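Before that first message, it's also worth a quick check that no cloud API keys are lurking in your shell environment. A small sketch under stated assumptions: the variable names are the conventional ones those providers use, and `cloud_keys_present` is my own helper, not an OpenClaw command (OpenClaw's actual key lookup may differ):

```shell
#!/bin/sh
# Sketch: warn if cloud-provider API keys are set in the environment.
# ANTHROPIC_API_KEY / OPENAI_API_KEY / OPENROUTER_API_KEY are the
# providers' conventional variable names; cloud_keys_present is a
# hypothetical helper, not part of OpenClaw.
cloud_keys_present() {
  found=1   # exit status: 0 = at least one key found, 1 = none
  for v in ANTHROPIC_API_KEY OPENAI_API_KEY OPENROUTER_API_KEY; do
    if printenv "$v" >/dev/null 2>&1; then
      echo "warning: $v is set; unset it to stay fully local"
      found=0
    fi
  done
  return $found
}

cloud_keys_present || echo "no cloud API keys in the environment"
```

An empty environment here doesn't guarantee local-only operation (keys can also live in config files), but it closes off the most common accidental path to a cloud provider.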
