multi user access issue #418

Closed
opened 2025-11-11 14:20:48 -06:00 by GiteaMirror · 0 comments

Originally created by @tkmamidi on GitHub (Mar 6, 2024).

Bug Report

Description

Bug Summary:
I hosted ollama and open-webui on a server, and each works fine on its own. But when my colleague and I queried different models at the same time, one of us had to wait until the other's generation finished before getting any response.

Steps to Reproduce:
We deployed both ollama and open-webui on our server with Terraform and Ansible; the services communicate and work fine individually.
Using open-webui from two sessions in parallel does not work, however. We want to open this service up to everyone in our group.

Expected Behavior:
Two users can query/chat at the same time without either having to wait on the other.

Actual Behavior:
One query has to finish before the other one starts.
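The timing we observe is consistent with requests being serialized behind a single lock or queue somewhere in the stack. A toy Python sketch of that failure mode (this is NOT Open WebUI's actual code; names and durations are invented for illustration):

```python
import threading
import time

# If the backend serves inference behind a single lock, two concurrent
# user requests are handled back-to-back instead of in parallel.
inference_lock = threading.Lock()
results = []

def query(user, duration=0.2):
    with inference_lock:        # only one generation runs at a time
        time.sleep(duration)    # stand-in for model generation
        results.append(user)

threads = [threading.Thread(target=query, args=(u,)) for u in ("me", "colleague")]
start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start   # ~0.4 s when serialized, vs ~0.2 s if parallel
```

With the lock, total wall time is roughly the sum of both generations, which matches what we see: the second user's response only begins after the first one completes.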

Environment

  • Operating System: debian-12-generic-amd64
  • Browser (if applicable): Google Chrome

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I have reviewed the troubleshooting.md document.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

I'm deploying with the following docker-compose.yaml:

```yaml
version: '3.8'

services:
  ollama:
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:0.1.27

  open-webui:
    image: ghcr.io/open-webui/open-webui:git-eb51ad1
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - ${OPEN_WEBUI_PORT-3000}:8080
    environment:
      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
```
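One possibly relevant detail: the compose file pins ollama/ollama:0.1.27, which predates Ollama's built-in request concurrency. Newer Ollama releases expose OLLAMA_NUM_PARALLEL and OLLAMA_MAX_LOADED_MODELS environment variables (documented in the Ollama FAQ). A hypothetical fragment showing how the ollama service could set them; the tag and values here are illustrative, not something I have tested:

```yaml
  ollama:
    image: ollama/ollama:latest   # assumed newer tag; 0.1.27 lacks these knobs
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    restart: unless-stopped
    environment:
      - OLLAMA_NUM_PARALLEL=2        # concurrent requests per loaded model
      - OLLAMA_MAX_LOADED_MODELS=2   # models allowed in memory at once
```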

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


Reference: github-starred/open-webui#418