[GH-ISSUE #2334] Feedback on Ollama+Ollama web ui issues #51510

Closed
opened 2026-05-05 12:33:18 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @LyingDoc on GitHub (May 17, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2334

I installed the Docker image and pointed the WebUI at my local Ollama server. After deploying it successfully, I pulled llama3-7b from the Ollama library and asked questions in the Web-UI interface. It takes a long time before a response starts, and the generation itself is slow. What could be the problem?


@pkeffect commented on GitHub (May 17, 2024):

I noticed this as well. My solution was to install Ollama locally and OpenWebUI in docker. This fixed that issue for me. I do not know what causes this issue other than just Ollama not liking the extra layer of virtualization in some setups.
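The setup described above (Ollama installed natively on the host, Open WebUI alone in Docker) can be sketched roughly as follows. This is an assumption-laden example, not an official recipe: the port, volume name, and image tag follow the common Open WebUI defaults, and `host.docker.internal` only resolves inside the container when mapped via `--add-host` (Linux) or on Docker Desktop.

```shell
# Ollama runs natively on the host and listens on its default port 11434.
# Open WebUI runs in Docker and reaches the host's Ollama through
# host.docker.internal, mapped to the host gateway for Linux hosts.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

With this layout the model inference never passes through Docker's virtualization layer, which is what the comment above suggests resolved the slowness.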


Reference: github-starred/open-webui#51510