Get started with Open WebUI

In this video we get started with Open WebUI, an extensible, self-hosted AI interface: basically a web interface for your locally or remotely hosted LLMs.

Here are all the resources for the video and the commands I typed in the command line, so you can easily copy and paste them. This is not a step-by-step guide; you need to follow the video to get the whole content.

Requirements: To follow this video I recommend you are already familiar with Docker and the command line interface. If you are not, check out these videos:
– Docker / Docker compose (video)
– Familiarity with the command line interface (video)

If you already have Docker, the steps followed in the video are:

  1. Install Ollama
  2. Pull some Ollama models
  3. Run Open WebUI using Docker Compose
  4. Clone the LiteLLM repository
  5. Run LiteLLM docker compose
  6. Grab your external API Keys (Grok, OpenAI, Gemini,…)
  7. Configure a LiteLLM virtual key (see the example after this list)
  8. Add the Key to Open WebUI
  9. Create an account at OpenRouter
  10. Create an OpenRouter key
  11. Add the OpenRouter key to Open WebUI
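
For step 7, a LiteLLM virtual key can also be created from the command line once LiteLLM is running (step 5). This is only a hedged sketch: the port 4000, the master key placeholder, and the model name are assumptions based on a default LiteLLM proxy setup, so adjust them to your configuration.

curl -X POST http://localhost:4000/key/generate \
  -H "Authorization: Bearer sk-<your master key>" \
  -H "Content-Type: application/json" \
  -d '{"key_alias": "open-webui", "models": ["gpt-4o"]}'
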

Docker
Ollama
Open WebUI
LiteLLM
OpenRouter

Pull Ollama models, in this case tinyllama

ollama pull tinyllama
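

To confirm the model was downloaded, Ollama can list the models it has stored locally:

ollama list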


To run the model, we use the run command:

ollama run tinyllama


We can also use the run command on a model that has not been pulled yet; it will download it and start it in one go.
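
For example, you can also pass a one-off prompt directly on the command line; Ollama prints the answer and exits (the prompt here is just an illustration):

ollama run tinyllama "Why is the sky blue?"
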

Here is the Docker Compose file, called docker-compose.yml, used to run Open WebUI. This is a very basic compose file; you can extend it from here as much as you want. Check the Open WebUI Git repository for more information.

version: '3'
services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # host port 3000 -> container port 8080 (Open WebUI)
    volumes:
      - open-webui:/app/backend/data   # persist Open WebUI data between restarts
volumes:
  open-webui:


From the directory where the docker compose file is located, run docker compose to start Open WebUI:

docker compose up -d
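

Once the container is up, Open WebUI should be reachable in your browser at http://localhost:3000 (the host port mapped in the compose file). If something does not work, these standard Docker Compose commands help to check the container status and logs:

docker compose ps
docker compose logs -f openwebui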


To install LiteLLM, clone the repository into a local folder:

git clone git@github.com:BerriAI/litellm.git
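

The command above clones over SSH; if you do not have SSH keys set up with GitHub, cloning over HTTPS works just as well:

git clone https://github.com/BerriAI/litellm.git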


Move to that folder and create a .env file with the following content. For the values, you only need to keep the sk- prefix at the beginning and then add whatever password you want after it.

LITELLM_MASTER_KEY="sk-<your password>"
LITELLM_SALT_KEY="sk-<your password>"
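

If you prefer not to invent the passwords yourself, one option (assuming openssl is available on your system) is to generate a random hex string and prefix it with sk-:

echo "sk-$(openssl rand -hex 32)"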


Run docker compose inside the LiteLLM directory:

docker compose up -d
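

Assuming the repository's default compose file, which exposes the LiteLLM proxy on port 4000, you can check that it is up with a quick request to its health endpoint (adjust the port if your setup differs):

curl http://localhost:4000/health/liveliness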


This concludes all the commands and resources you need. For the remaining steps, check the video.
