Open WebUI

Open WebUI running the llava:34b image analysis model.

Open WebUI simplifies embedding AI capabilities into web environments. It lets developers build dynamic interfaces where users can enter data, view AI-generated outputs, and visualize results in real time. Because it is open source, it can be customized, adapted to various AI project requirements, or integrated with popular machine learning libraries. (This summary was written, in part, using the deepseek-r1:32b model.)

Open WebUI Installation

We installed Open WebUI as a Docker container using the approach outlined in the video below.
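If you just want to try the container before setting up the full stack, a minimal docker run equivalent looks like the sketch below. The port mapping and data path are illustrative; adjust them for your environment.

# Minimal sketch: run Open WebUI as a standalone container.
# Assumes Ollama is already serving on the host at its default port, 11434.
docker run -d \
  --name openwebui \
  --restart unless-stopped \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://<your server IP>:11434 \
  -v /mnt/nfs/docker/open-webui/data:/app/backend/data \
  ghcr.io/open-webui/open-webui:0.5.20

Standalone docker run sketch for Open WebUI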

Encryption and reverse proxying are handled by Traefik. A docker-compose template for our installation follows; it can be deployed as a Portainer stack.

docker-compose.yml
services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:0.5.20
    container_name: openwebui
    restart: unless-stopped
    
    environment:
      # Ollama Config
      - OLLAMA_BASE_URL=http://<your server IP>:11434

    # Map configuration to our Docker persistent store
    volumes:
      - /mnt/nfs/docker/open-webui/data:/app/backend/data:rw
    
    # Connect to our Traefik proxy network
    networks:
      - proxy
    
    # Configure for Traefik reverse proxy
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.openwebui.rule=Host(`openwebui.anita-fred.net`)"
      - "traefik.http.middlewares.openwebui-https-redirect.redirectscheme.scheme=https"
      - "traefik.http.routers.openwebui.middlewares=openwebui-https-redirect"
      - "traefik.http.routers.openwebui-secure.entrypoints=https"
      - "traefik.http.routers.openwebui-secure.rule=Host(`openwebui.anita-fred.net`)"
      - "traefik.http.routers.openwebui-secure.tls=true"
      - "traefik.http.routers.openwebui-secure.service=openwebui"
      - "traefik.http.services.openwebui.loadbalancer.server.port=8080"
      - "traefik.docker.network=proxy"
      - "traefik.http.services.openwebui.loadBalancer.server.port=8080"


networks:
  proxy:
    external: true
Docker Compose template for Open WebUI
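After deploying the stack (from Portainer, or with docker compose from the directory holding the file), it is worth confirming that the container started cleanly and that Open WebUI can reach Ollama. A quick check, assuming Ollama's default API port:

# Bring the stack up and watch the startup logs
docker compose up -d
docker logs openwebui

# Ask Ollama for its installed models; Open WebUI's model
# selector should show the same list
curl http://<your server IP>:11434/api/tags

Verifying the deployment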

Once the stack is running, you can set up your user account. The final step is to allow access to the WebUI via a proxy host (in our case, Nginx Proxy Manager).
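To verify the proxy chain end to end, request the public hostname configured in the Traefik labels and check for a successful HTTPS response:

# Expect a 200 response and a valid certificate once the proxy is working
curl -I https://openwebui.anita-fred.net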

What Can I Do With Open WebUI and LLMs?

How-To Tutorials

This tool can be used for a variety of applications involving local and cloud-based LLMs. A great place to start is the Tutorials page.
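For local models, the first step is pulling them into Ollama so they appear in Open WebUI's model selector. For example, the two models mentioned on this page can be fetched as follows (run on the Ollama host; models of this size need tens of gigabytes of disk and a correspondingly large GPU or plenty of RAM):

# Pull the models referenced on this page
ollama pull llava:34b        # multimodal model used for image analysis above
ollama pull deepseek-r1:32b  # reasoning model that helped draft the summary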

Using Paid APIs

You can use Open WebUI with paid services like ChatGPT. The video above explains how to set this up.
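As a sketch, connecting Open WebUI to an OpenAI-compatible endpoint is done through environment variables added to the compose file. The variable names below follow Open WebUI's documented OpenAI support; the key value is a placeholder for your own account:

    environment:
      - OLLAMA_BASE_URL=http://<your server IP>:11434
      # OpenAI-compatible API access (key value is a placeholder)
      - OPENAI_API_BASE_URL=https://api.openai.com/v1
      - OPENAI_API_KEY=<your API key>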
