In this video we get started with Open WebUI, an extensible, self-hosted AI interface. Basically, a web interface for your locally or remotely hosted LLMs.
Here are all the resources for the video and the commands I typed in the command line, so you can easily copy and paste them. This is not a step-by-step guide; you need to follow the video to get the whole content.
Requirements: To follow this video I recommend you are already familiar with Docker and the command line interface. If you are not, check out these videos:
– Docker / Docker compose (video)
– Familiarity with the command line interface (video)
If you already have Docker, the steps followed in the video are:
- Install Ollama
- Pull some Ollama models
- Run Open WebUI using Docker Compose
- Clone the LiteLLM repository
- Run LiteLLM docker compose
- Grab your external API Keys (Grok, OpenAI, Gemini,…)
- Configure a LiteLLM virtual key
- Add the Key to Open WebUI
- Create an account at OpenRouter
- Create an OpenRouter key
- Add the OpenRouter key to Open WebUI
Resources:
- Docker
- Ollama
- Open WebUI
- LiteLLM
- OpenRouter
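The first step in the video is installing Ollama. On Linux this is usually done with the official install script (a sketch; on macOS or Windows use the installer from the Ollama website instead):

curl -fsSL https://ollama.com/install.sh | sh

You can verify the install with ollama --version.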
Pull some Ollama models, in this case tinyllama:
ollama pull tinyllama
To run the model we use the run command:
ollama run tinyllama
We can also use the run command directly: if the model hasn't been pulled yet, it pulls it first and then runs it, all in one go.
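As a quick sanity check you can also list the models available locally, or send a one-off prompt without entering the interactive chat (the prompt here is just an example):

ollama list
ollama run tinyllama "Why is the sky blue?"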
Docker Compose file called docker-compose.yml to run Open WebUI. This is a very basic Compose file; you can extend it from here as much as you want. Check the Open WebUI Git repository for more information.
version: '3'
services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
volumes:
  open-webui:
From the directory where the docker compose file is located, run the following command to start Open WebUI:
docker compose up -d
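Open WebUI should now be reachable at http://localhost:3000, the host port mapped in the compose file. If you run Ollama directly on the host instead of in a container, one common extension (a sketch, adjust to your setup) is to point Open WebUI at it with the OLLAMA_BASE_URL environment variable under the openwebui service:

    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"

The host-gateway mapping is needed on Linux so the container can reach the host; Docker Desktop handles it automatically.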
To install LiteLLM, clone the repository into a local folder:
git clone git@github.com:BerriAI/litellm.git
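The SSH URL above assumes you have SSH keys configured with GitHub; cloning over HTTPS works just as well:

git clone https://github.com/BerriAI/litellm.git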
Move into that folder and create a .env file with the following content. For the password, keep the sk- prefix at the beginning and then add whatever password you want.
LITELLM_MASTER_KEY="sk-<your password>"
LITELLM_SALT_KEY="sk-<your password>"
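The master key is the key you will later use to authenticate against LiteLLM, for example when configuring the virtual key mentioned in the steps above. If you don't want to invent a password by hand, something like this generates a random one with the required sk- prefix (assuming openssl is installed):

echo "sk-$(openssl rand -hex 24)"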
Run docker compose inside the LiteLLM directory:
docker compose up -d
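You can check that the LiteLLM stack came up correctly by listing its containers:

docker compose ps

By default the proxy listens on port 4000, and its admin UI is typically available at http://localhost:4000/ui, which is one place to create the virtual keys mentioned above.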
This concludes all the commands and resources you need. For the remaining steps, check the video.