INSTALLING OPENWEBUI
This is the main event. One Docker command and you've got a full ChatGPT-like interface running on your Mac.
The Docker Compose File
Instead of a long docker run command, we're going to use Docker Compose. It's a simple YAML file that describes exactly how to run the container. Create a file called docker-compose.yml somewhere convenient, like a folder called open-webui in your home directory:
mkdir ~/open-webui
cd ~/open-webui
Create the file with these contents:
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
    extra_hosts:
      - "host.docker.internal:host-gateway"
    restart: always

volumes:
  open-webui:
That's the whole thing. Save it, then run:
docker compose up -d
Docker pulls the image (about 2GB) and starts OpenWebUI in the background. Done.
What Each Part Does
Let's break that compose file down so you know what you just wrote.
image: ghcr.io/open-webui/open-webui:main
The official OpenWebUI image from GitHub Container Registry. The :main tag tracks the project's latest release build; the project also publishes versioned tags if you'd rather pin a specific release.
ports: "3000:8080"
Maps port 3000 on your Mac to port 8080 inside the container. OpenWebUI listens on 8080 internally, but you'll access it at localhost:3000 in your browser.
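If something else on your Mac is already using port 3000, only the host side of the mapping needs to change. A small variation (8081 here is just an example; use any free port):

```yaml
ports:
  - "8081:8080"   # host port 8081 -> container port 8080
```

OpenWebUI would then be at http://localhost:8081 instead.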
volumes: open-webui:/app/backend/data
Creates a named volume for persistent storage. Your conversations, settings, uploaded files, and user accounts all live here. If you stop and restart the container, nothing is lost.
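One wrinkle worth knowing: Docker Compose prefixes named volumes with the project name, which defaults to the folder name. With the ~/open-webui folder from above, the volume shows up as open-webui_open-webui. A quick, non-destructive way to check that it exists (assumes the docker CLI is on your PATH):

```shell
# Look up the named volume; inspect prints its details (mountpoint,
# driver) when it exists, and fails quietly when it does not.
if docker volume inspect open-webui_open-webui >/dev/null 2>&1; then
  volume_status="found"
else
  volume_status="not found (has the container started yet?)"
fi
echo "Volume open-webui_open-webui: $volume_status"
```

The volume is created on the first docker compose up -d, so it's normal for this to say not found before then.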
extra_hosts: "host.docker.internal:host-gateway"
This is the Colima compatibility line. Docker Desktop provides host.docker.internal automatically, but with Colima we need to tell Docker how to reach the host machine. This lets OpenWebUI find Ollama running on your Mac.
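If auto-detection gives you trouble later, the Ollama address can also be pinned explicitly in the same compose file. OLLAMA_BASE_URL is the environment variable OpenWebUI reads for this; the value below assumes Ollama is on its default port, 11434:

```yaml
services:
  open-webui:
    # ...image, ports, volumes, extra_hosts as above...
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
```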
restart: always
Automatically restarts the container if it crashes or when Docker starts up. Set it and forget it.
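One common variation, if you'd rather a manually stopped container stay stopped: unless-stopped restarts after crashes, but respects an explicit docker compose stop.

```yaml
restart: unless-stopped   # auto-restart on crashes, but not after a manual stop
```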
Verify It's Running
docker compose ps
You should see your open-webui container listed with status "Up." If it shows "Restarting" or isn't listed, something went wrong. Check the logs:
docker compose logs
Open It Up
In your browser, go to:
http://localhost:3000
You should see the OpenWebUI welcome screen. If you get a connection error, give it 30 seconds. The first startup takes a moment while it initializes the database and loads everything.
First Account Setup
The first time you visit, you'll see a registration form. Enter your name, email, and a password.
+-------------------------------------------------------+
| IMPORTANT                                             |
|                                                       |
| The first account you create automatically becomes    |
| the Administrator. This controls who else can use     |
| the system and what features are available.           |
+-------------------------------------------------------+
This is entirely local. The email doesn't need to be real. It's just a login credential stored on your machine. Use whatever you want.
After signing up, you'll land on the main chat interface. It should look familiar if you've ever used ChatGPT.
Connecting to Ollama
OpenWebUI needs to know where Ollama is running. The good news is that it usually figures this out automatically.
For our Docker setup, OpenWebUI connects to Ollama through a special Docker networking address:
http://host.docker.internal:11434
This is Docker's way of saying "the host machine" from inside a container. Since Ollama runs directly on your Mac (not in Docker), this bridge lets them talk.
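Before digging into container networking, it's worth confirming that Ollama itself is answering on the host. A quick check from a Mac terminal, assuming Ollama's default port 11434 (/api/tags is the endpoint that lists installed models):

```shell
# Ask Ollama for its model list; with -sf, curl fails quietly on a
# connection error instead of printing an error page.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  ollama_status="reachable"
else
  ollama_status="not reachable"
fi
echo "Ollama is $ollama_status"
```

If this says not reachable, start Ollama first. OpenWebUI can't list models from a server that isn't running.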
If auto-detection works, you'll see your Ollama models appear in the model dropdown at the top of the chat interface. If the dropdown is empty, we need to configure the connection manually.
Manual Connection (If Needed)
- Click your profile icon in the bottom left
- Go to Admin Panel
- Click Settings, then Connections
- Under Ollama, click the wrench icon (Manage)
- Enter: http://host.docker.internal:11434
- Click the check mark to verify the connection
If verification succeeds, you'll see a green check. Your models should now appear in the chat dropdown.
Why Docker Compose Instead of Docker Run?
You might see tutorials online that use a long docker run command with a bunch of flags. Docker Compose is better for a few reasons:
- The compose file is readable. You can see all the settings at a glance without decoding command-line flags.
- It's repeatable. Run docker compose up -d anywhere you have the file and you get the same result every time.
- Updates are easier. Pull the new image and recreate the container in one motion.
- The file lives in your project folder. You can version it, back it up, or share it.
Alternative Install: Python/pip
If you prefer not to use Docker, you can install OpenWebUI directly with Python:
pip install open-webui
open-webui serve
This runs on port 8080 (not 3000) and connects to Ollama at localhost:11434 directly. No Docker networking needed.
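If you go this route, a virtual environment keeps OpenWebUI's sizable dependency tree out of your system Python. A sketch, assuming python3 is on your PATH (the project's docs call for Python 3.11):

```shell
# Create an isolated environment for OpenWebUI
python3 -m venv "$HOME/open-webui-venv"
echo "Created venv at $HOME/open-webui-venv"

# Then activate it and install/run inside it:
#   . "$HOME/open-webui-venv/bin/activate"
#   pip install open-webui
#   open-webui serve       # serves on http://localhost:8080
```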
We recommend Docker Compose because it's cleaner and easier to manage, but the pip method works fine too.
Managing the Container
A few Docker Compose commands you'll need later. Run these from the directory where your docker-compose.yml file lives:
Stop OpenWebUI (this removes the container, but your data is safe in the named volume):
docker compose down
Start it again:
docker compose up -d
See the logs:
docker compose logs -f
Update to latest version:
docker compose pull
docker compose up -d
That last one is the beauty of Compose. Pull the new image and recreate the container in two commands. Your data persists because it lives in the named volume, not the container itself.
Next Up
OpenWebUI is running. Time to pull some models and start chatting.