mirror of
https://github.com/open-webui/open-webui.git
synced 2026-03-17 20:43:32 -05:00
[Bug] Custom system prompt and Modelfile system prompt not reaching Ollama #998
Originally created by @Serbaf on GitHub (May 21, 2024).
Bug Report
Description
Bug Summary:
The system prompt set at Settings -> General -> System Prompt has recently been ignored entirely (it worked correctly in my setup before). I also have some Modelfiles defined with their own embedded system prompts; these sometimes work, but not always.
Steps to Reproduce:
My environment consists of an Open WebUI instance on one server and Ollama served from a different machine.
In my user account I have the following content in Settings -> General -> System Prompt:
In addition, I have some modelfiles with a system prompt defined. E.g.:
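The Modelfile contents are not included in the report; for reference, an Ollama Modelfile with an embedded system prompt typically looks like the following (the base model name and prompt text here are placeholders, not the reporter's actual files):

```
FROM llama3
SYSTEM """You are a helpful assistant. Always answer in Spanish."""
```

Such a file is registered as a named model with `ollama create my-model -f ./Modelfile`, after which the embedded SYSTEM text should be applied to every chat with that model.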
Starting a new chat with any model triggers the first bug (user-defined system prompt ignored) for me. The second bug (Modelfile-defined system prompt ignored) is triggered when prompting the model instance derived from that Modelfile.
Expected Behavior:
I expect my user-defined system prompt to always be passed to the Ollama model and to receive a response coherent with it. I also expect models defined with a Modelfile to take the specified instructions into account.
Actual Behavior:
Models are responding as if completely unaware of both system prompts. In fact, the Ollama logs show that the system prompts never reach it (details in the logs and screenshots below).
Environment
Open WebUI Version: 0.1.125
Ollama (if applicable): 0.1.38
Operating System: Ubuntu 22.04
Browser (if applicable): Firefox 126.0
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
Docker Container Logs:
Screenshots (if applicable):
Case 1: conversation with a regular llama3-3B model
Then I make the following interaction:

The request as shown in the browser console (no trace of the system prompt):
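To see what a correct request should contain, independent of the WebUI, one can build the same kind of payload the frontend should be sending to Ollama's /api/chat endpoint. The sketch below is an assumption about the expected shape (model name, prompt text, and host are placeholders); the point is that a "system" entry should appear in the messages array, and its absence in the browser's network tab means the prompt is dropped before reaching Ollama.

```python
import json

# Assumption: Ollama listening on its default port on the serving machine.
OLLAMA_URL = "http://ollama-host:11434/api/chat"

def build_chat_payload(model: str, system_prompt: str, user_message: str) -> dict:
    """Build an Ollama /api/chat payload with an explicit system message.

    If the WebUI were forwarding the configured system prompt, the request
    body captured in the browser console should include a message with
    role "system" like the one constructed here.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }

payload = build_chat_payload("llama3", "Always answer in Spanish.", "Hello!")
print(json.dumps(payload, indent=2))

# To test against a live Ollama instance (not run here):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Sending this payload directly with curl or urllib and getting a Spanish reply would confirm that Ollama itself honors system messages, isolating the bug to the WebUI side.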
Case 2: conversation with a model instantiated from a Modelfile
Installation Method
We are using Docker Compose to set up Open WebUI.
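The Compose file itself is not shown; a minimal service for this kind of split setup might look like the sketch below (the image tag, published port, volume name, and Ollama host URL are assumptions, not the reporter's actual configuration):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Assumption: Ollama runs on a separate machine, as described above.
      - OLLAMA_BASE_URL=http://ollama-host:11434
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```

With a setup like this, picking up a hotfix pushed to the `main` image tag is a matter of `docker compose pull` followed by `docker compose up -d` to recreate the container.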
Additional Information
It stopped working some days ago. To my knowledge, the only relevant changes I made since then were updates to the Ollama and Open WebUI versions.
Thank you!
@justinh-rahb commented on GitHub (May 21, 2024):
As usual with any new release, we often push fixes afterwards directly to main and build new Docker images. If you haven't already done so as of 3 hours ago, please try re-pulling.

@Serbaf commented on GitHub (May 22, 2024):
Great, that fixed it, thanks!