Feedback on Ollama+Ollama web ui issues #948

Closed
opened 2025-11-11 14:34:09 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @LyingDoc on GitHub (May 17, 2024).

I installed the Docker image and connected the WebUI to the local Ollama server. After deploying successfully, I pulled llama3-7b from the Ollama library and asked questions in the Web UI. Responses take a long time to start and the generation process is slow. What is the problem?

@pkeffect commented on GitHub (May 17, 2024):

I noticed this as well. My solution was to install Ollama locally and OpenWebUI in docker. This fixed that issue for me. I do not know what causes this issue other than just Ollama not liking the extra layer of virtualization in some setups.
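For reference, a minimal sketch of the hybrid setup pkeffect describes: Ollama installed natively on the host, with only Open WebUI in Docker pointing at it. The port `11434` is Ollama's default; `host.docker.internal` lets the container reach the host and may need the `--add-host` flag on Linux. Adjust paths and ports for your environment.

```shell
# Install and start Ollama natively on the host (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &

# Pull a model directly on the host (no container layer involved)
ollama pull llama3

# Run only Open WebUI in Docker, pointing it at the host's Ollama
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

This keeps inference on bare metal (avoiding the container overhead the commenter suspects, particularly around GPU passthrough) while the UI stays containerized.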

Reference: github-starred/open-webui#948