[GH-ISSUE #2408] Unable to use open-webui without setting up "Memory" #51536

Closed
opened 2026-05-05 12:35:35 -05:00 by GiteaMirror · 5 comments

Originally created by @Kisaragi-ng on GitHub (May 20, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2408

Bug Report

Description

Bug Summary:
In v0.1.125, open-webui does not pass the chat to ollama, so the chat just hangs. However, enabling Memory and creating one personality makes the chat work properly.

Steps to Reproduce:

  1. using the project's docker-compose.yml (https://github.com/open-webui/open-webui/blob/main/docker-compose.yaml), run docker compose up
  2. install phi3:latest from the ollama library
  3. begin a chat using a template
  4. ollama does not respond to the chat
  5. enable Memory, create one personality, and restart the chat
  6. the chat works as intended
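
For reference, the setup in steps 1–2 boils down to a compose file along these lines. This is a minimal sketch, not the project's exact file (see the linked docker-compose.yaml); the image tags, port mapping, and environment variable name are assumptions that may differ by version.

```yaml
# Illustrative sketch only — consult the linked docker-compose.yaml
# for the authoritative configuration.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    depends_on:
      - ollama
    ports:
      - "3000:8080"
    environment:
      # Variable name assumed; older v0.1.x releases used a different name.
      - OLLAMA_BASE_URL=http://ollama:11434
volumes:
  ollama:
```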

Expected Behavior:
open-webui should still work without setting up Memory, as this feature is marked Experimental (and, to my understanding, not mandatory).

Actual Behavior:
open-webui doesn't respond to chat.

Environment

  • Open WebUI Version: v0.1.125

  • Ollama (if applicable): 0.1.38

  • Operating System: docker

  • Browser (if applicable): Firefox 115.11.0esr, Edge 125.0.2535.51 (Official build) (64-bit)

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:

firefox:

Backend config: 
Object { status: true, name: "Open WebUI", version: "0.1.125", auth: true, default_locale: "en-US", images: false, default_models: null, default_prompt_suggestions: (6) […], trusted_header_auth: false, admin_export_enabled: true }
0.3690807d.js:1:37714
submitPrompt <empty string> 4.dc01e3b1.js:5:2577
modelId phi3:latest 4.dc01e3b1.js:5:3857
Uncaught (in promise) TypeError: l.userContext is null
    Immutable 38
    <anonymous> https://mydomain.com/:79
    promise callback* https://mydomain.com/:78
4.dc01e3b1.js:8:3
    we Immutable
    AsyncFunctionThrow self-hosted:857
    Immutable 17
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    (Async: async)
    F Immutable
    map self-hosted:221
    Immutable 6
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    (Async: async)
    F Immutable
    map self-hosted:221
    Immutable 6
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    Immutable 3
    <anonymous> https://mydomain.com/:79
    (Async: promise callback)
    <anonymous> https://mydomain.com/:78

edge:

Backend config: 
Object { status: true, name: "Open WebUI", version: "0.1.125", auth: true, default_locale: "en-US", images: false, default_models: "phi3:latest", default_prompt_suggestions: (6) […], trusted_header_auth: false, admin_export_enabled: true }
0.3690807d.js:1:37714
setting default models globally index.4df4e131.js:93:23328
submitPrompt <empty string> 4.dc01e3b1.js:5:2577
modelId phi3:latest 4.dc01e3b1.js:5:3857
Uncaught (in promise) TypeError: l.userContext is null
    Immutable 5
4.dc01e3b1.js:8:3
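
The identical `l.userContext is null` trace in both browsers suggests the submit path reads a property of the Memory context without a null check. The following is a minimal sketch of that failure mode and a defensive guard — the function and field names are hypothetical, not Open WebUI's actual code:

```javascript
// Hypothetical reconstruction: reading a property of a null `userContext`
// inside an async function rejects the returned promise; with no .catch(),
// the browser reports "Uncaught (in promise) TypeError".
async function buildSystemPrompt(session) {
  // Throws TypeError when Memory is disabled and userContext is null.
  return session.userContext.join("\n");
}

async function buildSystemPromptGuarded(session) {
  // Defensive guard: treat a missing Memory context as "no extra prompt".
  const ctx = session.userContext ?? [];
  return ctx.join("\n");
}

// The unguarded call rejects; the guarded one resolves to an empty prompt.
buildSystemPrompt({ userContext: null })
  .catch((e) => console.log("rejected:", e instanceof TypeError)); // rejected: true
buildSystemPromptGuarded({ userContext: null })
  .then((p) => console.log("prompt:", JSON.stringify(p))); // prompt: ""
```

A guard like this would explain why enabling Memory (which populates the context) unblocks the chat.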

Docker Container Logs:
Unfortunately, nothing interesting:

open-webui  | INFO:     127.0.0.1:57778 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:55764 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:58104 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:59696 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:38172 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET / HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/config HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/changelog HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
open-webui  | INFO:apps.openai.main:get_all_models()
ollama      | [GIN] 2024/05/20 - 04:58:30 | 200 |     416.606µs |      172.21.0.3 | GET      "/api/tags"
open-webui  | INFO:     ****:0 - "GET /ollama/api/tags HTTP/1.1" 200 OK
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:     ****:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /openai/api/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
ollama      | [GIN] 2024/05/20 - 04:58:31 | 200 |     393.705µs |      172.21.0.3 | GET      "/api/tags"
open-webui  | INFO:     ****:0 - "GET /ollama/api/tags HTTP/1.1" 200 OK
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:     ****:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /openai/api/models HTTP/1.1" 200 OK
ollama      | [GIN] 2024/05/20 - 04:58:31 | 200 |        23.7µs |      172.21.0.3 | GET      "/api/version"
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /ollama/api/version HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:39706 - "GET /health HTTP/1.1" 200 OK

Screenshots (if applicable):
Memory disabled:
https://github.com/open-webui/open-webui/assets/88741696/3201dccd-6e87-4f20-bc89-da7afe90d8eb

Memory enabled:
https://github.com/open-webui/open-webui/assets/88741696/7bd827fe-9d28-41e3-ae4c-482d3dd8b0d1

Memory entry:
https://github.com/open-webui/open-webui/assets/88741696/b8f1e694-9277-47ce-985a-592402d5f6b3

Installation Method

Docker version 26.1.2, build 211e74b

Additional Information

I'm willing to provide additional information if required. During testing I made sure no ad blocker was active, and I also tested in a private window with no add-ons enabled, with the same result. I first saw this error after updating my docker image from adb86c02cf4b to b0ef2a0e3744 (the current main docker image).


@mptyl commented on GitHub (May 20, 2024):

Same problem, plus an error connecting with OpenAI: an error pop-up window appears with a join error. An OpenAI model can only be used after deleting the system prompt.

If you enable Memory, you can use the system prompt with OpenAI models such as gpt-3.5.

To replicate:

  1. remove the memory
  2. write something in the system prompt, whatever you want
  3. use an OpenAI model (an exception is thrown) or a local model (it hangs)

@tjbck commented on GitHub (May 20, 2024):

Please pull the current latest main, and let me know if the issue persists!

https://github.com/open-webui/open-webui/assets/25473318/c799dfaa-304d-445b-960d-71f5e5324209


@CultusMechanicus commented on GitHub (May 20, 2024):

It's still not working properly: disabling the Memory toggle now also disables the system prompt.


@tjbck commented on GitHub (May 20, 2024):

@CultusMechanicus Added a fix with #2427, let me know if the issue persists!


@JamesBedwell commented on GitHub (May 21, 2024):

This seems related to my issue #2441. Enabling Memory lets me send messages again, but the system prompt is ignored. I have not yet tried the latest build with fix #2427.
