File uploads not working #3485

Closed
opened 2025-11-11 15:32:40 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @dxkyy on GitHub (Jan 29, 2025).

Bug Report

When I upload a file and try to ask any LLM something about it, nothing happens. The LLMs don't even get loaded into memory.

![Image](https://github.com/user-attachments/assets/ec359694-7b7e-43be-a159-88c41bcd1f88)

and the browser console gets spammed with this:

```json
{
    "models": [
        "llama3:latest",
        "artifish/llama3.2-uncensored:latest"
    ]
}
```

![Image](https://github.com/user-attachments/assets/7e6c7e78-f990-4ecb-bbb9-a719125cc0d0)


Installation Method

Docker compose

Environment

  • Open WebUI Version: v0.5.7

  • Ollama (if applicable): v0.5.7

  • Operating System: Windows 11 (client), but Open WebUI is hosted on a Debian VPS

  • Browser (if applicable): Brave v1.74.50

Expected Behavior:

I should be able to upload a file and the LLM should be able to read the file.

Actual Behavior:

Nothing happens.

Description

Bug Summary:
File uploads aren't working.

Reproduction Details

Steps to Reproduce:

  • install Open WebUI with Docker Compose:

```yml
services:
  open-webui:
    image: 'ghcr.io/open-webui/open-webui:main'
    container_name: open-webui
    restart: always
    depends_on:
      - tailscale
    network_mode: host
    environment:
      - 'OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-}'
      - 'PORT=${PORT:-8181}'
    volumes:
      - 'open-webui:/app/backend/data'
volumes:
  open-webui: null
```
  • connect to Open WebUI
  • connect Open WebUI to Ollama
  • try to upload a file and chat with it
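One detail in the compose file above that may be worth ruling out: `OLLAMA_BASE_URL` is set via `${OLLAMA_BASE_URL:-}`, which defaults to an empty string. Unless the variable is exported in the shell or defined in a `.env` file, the container starts with an empty Ollama URL. A minimal sketch of how that expansion behaves (the variable name is taken from the compose file; whether this is the actual cause of the upload failure is an assumption):

```shell
# Demonstrates the compose default: with OLLAMA_BASE_URL unset,
# ${OLLAMA_BASE_URL:-} expands to an empty string, so the container
# would receive OLLAMA_BASE_URL= (empty) from this environment entry.
unset OLLAMA_BASE_URL
echo "OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-}"
```

If the variable really is empty at container start, explicitly exporting it (e.g. `export OLLAMA_BASE_URL=http://127.0.0.1:11434` before `docker compose up -d`, assuming Ollama runs on the same host) would rule this out.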
Reference: github-starred/open-webui#3485