Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-08 04:16:03 -05:00)
[GH-ISSUE #8382] OpenWebUI not following Ollama config parameters? #15103
Originally created by @alpilotx on GitHub (Jan 7, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/8382
After some experiments and observations, I have realized that Open WebUI seems to "decide" on its own which parameters to use when starting the Ollama server, and does not really seem to follow the settings I try to dial in!
Here is an example.

I set a few parameters in the model settings, like the following (screenshot not preserved in this mirror):

And then I also set these in the chat's model settings (screenshot not preserved):

But when I fire off my chat (with a knowledge set), I can see, while the Ollama process is running, that it was started with parameters like these (screenshot not preserved):

Or in text (listing not preserved):
So it effectively ignored almost all of my settings; only the thread count seems to have been "respected" (or was set correctly by chance, since I have 64 CPU cores).
Am I maybe overlooking some basic workings of how Open WebUI interacts with Ollama, or is something missing or not working in my configuration?
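As an aside, Ollama parameters such as `num_ctx` or `num_thread` can also be pinned as defaults in the model's Modelfile, independently of what any client sends; per-request options can still override them. A minimal sketch (the base model name and values here are placeholders, not taken from this issue):

```
FROM llama3
PARAMETER num_ctx 8192
PARAMETER num_thread 64
PARAMETER temperature 0.7
```

If the Modelfile defaults show up in the running process but the UI-set values do not, that would point at the client-to-server handoff rather than at Ollama itself.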
Additional info: both Open WebUI and Ollama run in their own Docker containers.
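One way to rule Open WebUI out of the equation is to call Ollama's HTTP API directly with an explicit `options` object; the `/api/generate` endpoint accepts per-request overrides like `num_ctx` and `num_thread`. A minimal diagnostic sketch, assuming the default Ollama port and a placeholder model name (adjust the URL for Docker networking, e.g. the container name instead of `localhost`):

```python
import json
import urllib.request

# Default Ollama port; from another container you would typically use the
# Ollama container's hostname instead of localhost.
OLLAMA_URL = "http://localhost:11434/api/generate"

# Per-request parameter overrides; these are standard Ollama option names.
# Model name and values are placeholders for illustration.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {
        "num_ctx": 8192,     # context window size
        "num_thread": 64,    # CPU threads to use
        "temperature": 0.7,
    },
}


def send(url: str = OLLAMA_URL) -> dict:
    """POST the payload to Ollama and return the parsed JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If the Ollama process picks up these values when called directly but not when called through Open WebUI, the parameters are being dropped somewhere in Open WebUI's request construction.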