[GH-ISSUE #11224] issue: Embedding model set: sentence-transformers/all-MiniLM-L6-v2 THEN exited with code 0 (DOCKER) #16151
Originally created by @toddpage on GitHub (Mar 5, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/11224
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
ghcr.io/open-webui/open-webui:main
Ollama Version (if applicable)
ollama/ollama:latest
Operating System
Ubuntu 22.04
Browser (if applicable)
No response
Confirmation
I have read and followed all the instructions provided in the README.md.
Expected Behavior
Open WebUI should start up and run.
Actual Behavior
Start up ollama and open-webui via docker compose, using the same YAML as another server, where it works. It starts but hangs at `Embedding model set: sentence-transformers/all-MiniLM-L6-v2`, then exits with `open-webui exited with code 0`.
Steps to Reproduce
docker compose up
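For completeness, a sketch of the reproduction plus commands to confirm the recorded exit code (the container name `open-webui` is taken from the compose file below):

```sh
# Bring the stack up in the foreground so the exit is visible
docker compose up

# In another shell, once the container has stopped, confirm the exit code
docker inspect open-webui --format '{{.State.ExitCode}}'

# Show the last log lines around the point where startup stopped
docker logs --tail 50 open-webui
```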
Logs & Screenshots
```
open-webui  | /app/backend/open_webui
open-webui  | /app/backend
open-webui  | /app
open-webui  | INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
open-webui  | INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
open-webui  | INFO  [open_webui.env] 'DEFAULT_LOCALE' loaded from the latest database entry
open-webui  | INFO  [open_webui.env] 'DEFAULT_PROMPT_SUGGESTIONS' loaded from the latest database entry
open-webui  | WARNI [open_webui.env]
open-webui  |
open-webui  | WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.
open-webui  |
open-webui  | INFO  [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
open-webui exited with code 0
```
```yaml
  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - /opt/machineLearning/ollama:/app/backend/data
    depends_on:
      #- ollama
      - ollama-server
    ports:
      - ${OPEN_WEBUI_PORT-3010}:8080
      #- "3011:8081"
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
```
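To see where startup stops after the embedding-model line, one option is to rerun with a higher backend log level. A sketch of the `environment:` section under the assumption that the image honors a `GLOBAL_LOG_LEVEL` variable:

```yaml
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
      # Assumption: GLOBAL_LOG_LEVEL raises backend verbosity so the
      # step after "Embedding model set" is logged before the exit
      - 'GLOBAL_LOG_LEVEL=DEBUG'
```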
Additional Information
No response