mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-07 03:18:23 -05:00
[GH-ISSUE #1295] Ollama Base URL will not save #27963
Originally created by @dtsoden on GitHub (Mar 25, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1295
Bug Report
Description
Entering "Ollama Base URL" = http://localhost:11434/ and saving reports "settings saved successfully", but the setting is never persisted: models won't show and requests error out, while Postman can connect just fine.
Bug Summary:
Entering "Ollama Base URL" = http://localhost:11434/ and saving reports "settings saved successfully", but the setting is never persisted: models won't show and requests error out, while Postman can connect just fine.
Steps to Reproduce:
This is a brand-new MANUAL installation, which worked perfectly back when the project was Ollama-WebUI:
git clone https://github.com/open-webui/open-webui.git
cd path/to/open-webui/
npm install
npm run build
cd backend
pip install -r requirements.txt -U
bash start.sh
Everything runs as normal and creates the first admin account, but Ollama is not configured by default and I cannot save a configuration using the WebUI interface.
I even manually renamed ".env - Copy.example" to .env AND copied it into the backend folder - nothing saves the Ollama URL (neither the web UI nor the .env files). It seems this new version no longer works with anything except Docker, as there are ZERO instructions to install this manually -- only instructions to update existing manual installs: https://docs.openwebui.com/getting-started/updating
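For context, on a manual install the backend reads its configuration from backend/.env (copied from .env.example). A minimal sketch of the relevant line, assuming the variable is named OLLAMA_BASE_URL as in later comments in this thread (the exact variable name differs between releases, so check your own .env.example before copying this):

```
# backend/.env — hypothetical minimal example, not the full file
OLLAMA_BASE_URL='http://localhost:11434'
```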
Expected Behavior:
When I enter the URL for the local Ollama server in the UI, I expect it to STICK and actually save like it used to.
Actual Behavior:
It never saves, despite saying it has saved successfully.
Environment
Reproduction Details
Confirmation:
@slash-proc commented on GitHub (Mar 25, 2024):
I can confirm this. It says successfully saved but it only saves if I verify the connection. In fact, it saves it without clicking save if the server is verified.
@tjbck commented on GitHub (Mar 25, 2024):
Have you verified the connection first before saving? That seems to be causing the problem on your end.
@dtsoden commented on GitHub (Mar 25, 2024):
Where is this verification button/mechanism in Open WebUI?
In a browser, http://localhost:11434 reads: "Ollama is running".
Works in postman too http://localhost:11434/api/tags
{
"models": [
{
"name": "llama2:latest",
"modified_at": "2023-12-12T13:18:43.442731906-05:00",
"size": 3825819519,
"digest": "fe938a131f40e6f6d40083c9f0f430a515233eb2edaa6d72eb85c50d64f2300e"
},
{
"name": "mistral:latest",
"modified_at": "2024-01-03T08:53:40.229738067-05:00",
"size": 4109865159,
"digest": "61e88e884507ba5e06c49b40e6226884b2a16e872382c2b44a42f2d119d804a5"
},
{
"name": "openchat:latest",
"modified_at": "2024-01-03T08:53:56.319736704-05:00",
"size": 4109876386,
"digest": "aa6d10add428bf93660c6c27daedd48934f62c36a554101557d67a52a79de76b"
},
{
"name": "yarn-mistral:latest",
"modified_at": "2024-01-03T08:55:20.569730092-05:00",
"size": 4108916676,
"digest": "8e9c368a0ae42f5c29f59eacd6ad3c20685e5525066727ebc10ee62321a60999"
}
]
}
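As a sanity check independent of Postman, the /api/tags response shown above can be parsed with a few lines of Python. This is an illustrative editorial sketch, not part of Open WebUI; the helper only parses a response body of the shape shown above, and the fetch itself (urllib against the base URL) is left out.

```python
import json

def list_model_names(api_tags_body: str) -> list[str]:
    """Extract model names from an Ollama /api/tags JSON response body."""
    payload = json.loads(api_tags_body)
    return [m["name"] for m in payload.get("models", [])]

# Example with a trimmed-down response of the shape shown above:
sample = '{"models": [{"name": "llama2:latest"}, {"name": "mistral:latest"}]}'
print(list_model_names(sample))  # ['llama2:latest', 'mistral:latest']
```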
@dtsoden commented on GitHub (Mar 25, 2024):
https://github.com/open-webui/open-webui/assets/36846872/ba5aff7d-dcc3-459b-9e89-c2abbd8756af
I attached a video of the error / issue
@tjbck commented on GitHub (Mar 25, 2024):
@dtsoden You should click on the refresh button to verify the connection. I admit that doesn't seem that intuitive/obvious, I'll modify the code so that when you save it'll alert you if you haven't verified the connection.
@dtsoden commented on GitHub (Mar 26, 2024):
@tjbck I have - nothing happens when I click this button. I can click it a million times: no indicator on click, no confirmation, nothing, and when I press save it's the same ole same ole as shown in the video above. 😒
@G4Zz0L1 commented on GitHub (Mar 26, 2024):
I'll also add my own problem to this: I can get my local Ollama to connect, use it, and modify the models, but when I restart the container (I installed it with Docker) the URL is no longer saved and I have to set it again.
EDIT: it also seems that the OLLAMA_BASE_URL variable is ignored, whether I use localhost or 127.0.0.1.
@dtsoden commented on GitHub (Mar 27, 2024):
So in this ticket I describe my install as manual, and just today I gave up and installed Docker - and I am getting the same issue inside the container image as I did with my manual install. This app is sadly just broken for new installs using Ollama running locally. Shame - I'll have to switch to a new UI for Ollama (I loved this before it changed to open-webui from ollama-webui, which IME was super stable until this change).
@justinh-rahb commented on GitHub (Mar 27, 2024):
@dtsoden I see trailing slashes on the end of your URLs in the original post, are you putting those in your config or run command as well? This is how one should decide what the URL is supposed to be:
- WebUI in Docker, Ollama running directly on the host: http://host.docker.internal:11434
- WebUI and Ollama both in Docker on a shared network: http://ollama:11434
- Ollama in its own container where -p 11434:11434 is used: http://host.docker.internal:11434
- WebUI run with --network=host: http://127.0.0.1:11434

Further context and explanation around hostnames and localhost in Docker:
https://docs.openwebui.com/faq#q-why-cant-my-docker-container-connect-to-services-on-the-host-using-localhost
https://docs.openwebui.com/getting-started/troubleshooting
Ollama FAQ regarding environment variables for server usage:
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server
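The URL choices above can be sketched as a small helper. This is illustrative only: the scenario-to-hostname mapping follows the list earlier in this comment, and host.docker.internal assumes Docker Desktop (or an added host-gateway entry on Linux).

```python
def ollama_base_url(webui_in_docker: bool,
                    ollama_in_docker: bool,
                    host_network: bool = False) -> str:
    """Pick the Ollama base URL for the deployment scenarios described above."""
    if not webui_in_docker or host_network:
        # Same network namespace as the host: plain loopback works.
        return "http://127.0.0.1:11434"
    if ollama_in_docker:
        # Both containers on a shared Docker network: use the service name.
        return "http://ollama:11434"
    # WebUI in Docker, Ollama on the host (or published with -p 11434:11434).
    return "http://host.docker.internal:11434"

print(ollama_base_url(webui_in_docker=True, ollama_in_docker=False))
# http://host.docker.internal:11434
```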
@dtsoden commented on GitHub (Mar 27, 2024):
I have followed all of these nuances and variations in a desperate attempt to get this working again as it did before the changes to open-webui. I had to use the last option, "WebUI (Docker) + Ollama (Docker or Non-Docker, Distro Docker Package)", for my Docker attempt, which worked and operates EXACTLY like the manually installed version. This, to me, seems to be a bug. Albeit I have not documented every attempt... There are lots of great and useful suggestions (like yours), but one thing no one seems to have done is validate the reported issue. Sometimes there is just a bug in the matrix 🙃
If someone on the contribution/development team could please set up a new instance using the latest version (with Docker or manually), folks should easily be able to replicate what I have described herein.
All of this worked perfectly for me until the change and rebranding occurred. Now it's just broken because I cannot connect to my local LLMs: the configuration won't save or stick.
@tjbck commented on GitHub (Mar 27, 2024):
@G4Zz0L1 The settings will not persist if you restart the container, so you'd have to set the OLLAMA_BASE_URL env var manually. Could you share your installation command with us? Persistent config is in the works with #1022, so stay tuned for that.

@dtsoden Setting up the dev environment uses the exact same method as the manual installation instructions. I've personally tested this by removing and fresh-installing it on all three of my machines (macOS, Ubuntu, Windows) and was unable to reproduce the issue at all, as have thousands of other people. We have not changed anything regarding saving the URL in the code; I strongly encourage you to check our docs again to see if you have missed anything.

The minimum you could do to help us diagnose the issue is to include the browser console logs and backend logs (container logs). Please follow the instructions and share them with us if you want better assistance from the community, instead of just saying it doesn't work, which is very counterproductive. Keep in mind, our project is maintained by volunteers who juggle their day jobs with their passion for contributing here. We're all human here, swamped with messages round the clock, and this support work doesn't pay the bills.
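The persistence behaviour described above (UI settings lost on container restart until persistent config lands, with the env var as the workaround) amounts to a read-with-fallback at startup. A hypothetical sketch, not the actual Open WebUI code; the default URL and helper name are made up for illustration:

```python
import os

DEFAULT_OLLAMA_BASE_URL = "http://localhost:11434"

def resolve_ollama_base_url() -> str:
    """Prefer OLLAMA_BASE_URL from the environment; strip a trailing slash,
    which earlier comments in this thread flag as a common pitfall."""
    url = os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_BASE_URL)
    return url.rstrip("/")

os.environ["OLLAMA_BASE_URL"] = "http://127.0.0.1:11434/"
print(resolve_ollama_base_url())  # http://127.0.0.1:11434
```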
When Ollama is reachable: [screenshot]

When Ollama is unreachable: [screenshot]
@dtsoden One other thing I've noticed is your issue post on the LiteLLM repo: https://github.com/BerriAI/litellm/issues/2681 They're NOT related to our project at all, so PLEASE close your issue there. We do not condone spam-like behaviour on someone else's repo. Thanks for your understanding.
@justinh-rahb commented on GitHub (Mar 27, 2024):
@dtsoden I understand that you may be feeling frustrated by the issues you are experiencing with your installation. It is possible that there may be some underlying issues that we have not yet identified. I encourage you to continue seeking assistance from the community and providing any relevant information that may help us diagnose the problem more effectively. We can only be aware of what you share with us, and there is certainly some key detail that is being left out. If you don't want to cooperate with those that have been trying to assist you, that's your prerogative but don't take the community for granted. Remember that our project is maintained by volunteers who also have day jobs, so your patience and understanding during this process is greatly appreciated.
@dtsoden commented on GitHub (Mar 27, 2024):
ver
Microsoft Windows [Version 10.0.22631.3296]

wsl --version
WSL version: 2.1.5.0
Kernel version: 5.15.146.1-2
WSLg version: 1.0.60
MSRDC version: 1.2.5105
Direct3D version: 1.611.1-81528511
DXCore version: 10.0.25131.1002-220531-1700.rs-onecore-base2-hyp
Windows version: 10.0.22631.3296

lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.3 LTS
Release: 22.04
Codename: jammy

sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

On my Windows PC I go to http://localhost:8080
The app runs, and this is where the hell begins and never ends, regardless of whether this is Docker or manual.
MANUAL

git clone https://github.com/open-webui/open-webui.git
cd open-webui/

Copying required .env file:
cp -RPp .env.example .env

Building Frontend Using Node:
npm i
npm run build

Serving Frontend with the Backend:
cd ./backend
pip install -r requirements.txt -U
bash start.sh

In the manual variation, in the "open-webui" folder I run...
nano .env
...and the contents are correct.
This is madness - nothing works to verify or save any URL.
Not sure how else to communicate that this is broken: a bug, or bad docs.
Either way it's not working, and it's extremely frustrating, as this all worked when it was Ollama-WebUI; now it's been rebranded and is totally broken.
@tjbck commented on GitHub (Mar 27, 2024):
We need browser console logs and backend logs.
@dtsoden commented on GitHub (Mar 27, 2024):
@tjbck are you saying this is just isolated to my machine and that this code runs fine elsewhere using a fresh Linux instance?
I can get the browser console logs but what logs are you wanting from the system
give me whatever commands you want run on each of the layers to ensure I get exactly what you want
@dtsoden commented on GitHub (Mar 27, 2024):
@tjbck console logs
localhost-1711557661910.log
@justinh-rahb commented on GitHub (Mar 27, 2024):
@dtsoden check your Ollama configuration, we believe you may have misconfigured a service override (if dealing with Ollama installed on Linux).
@dtsoden commented on GitHub (Mar 27, 2024):
@G4Zz0L1
I ran this at the WSL Linux level:
curl https://ollama.ai/install.sh | sh
OMG, this worked - unbelievable... thank goodness. Sigh of relief - I was seriously losing my mind. Thanks for the assist!!!
@dtsoden commented on GitHub (Mar 27, 2024):
Note to self and all: if you pull the UI, make sure you upgrade your backend too.
Unless the backend breaks the front end - then not sure what to do for ya 😂
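The root cause here turned out to be an outdated local Ollama behind an updated WebUI. The "upgrade your backend too" advice can be framed as a quick compatibility check: fetch the installed version (Ollama exposes /api/version) and compare it against a minimum. The fetch is omitted, and the comparison below is an editorial sketch with made-up version numbers:

```python
def version_at_least(installed: str, minimum: str) -> bool:
    """Compare dotted version strings, e.g. the version reported by Ollama."""
    parse = lambda v: [int(part) for part in v.strip().lstrip("v").split(".")]
    return parse(installed) >= parse(minimum)

# Hypothetical check: is a locally installed 0.1.20 new enough for a UI
# that needs at least 0.1.29? (Numbers are illustrative only.)
print(version_at_least("0.1.20", "0.1.29"))  # False
```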
@tjbck commented on GitHub (Mar 27, 2024):
This was our requirement for the issue report. Please review our template thoroughly next time.
@justinh-rahb commented on GitHub (Mar 27, 2024):
I'm glad we've finally managed to solve the problem you were facing. However, I must say I'm a little disappointed in how you've acted throughout this conversation. Blaming others when things don't go as planned is not the most effective way to get the help you need, and many other projects would have refused to help you further long ago.
I understand that everyone has bad days, but it's important to remember that we're all human beings here to help each other. In the future, please try to approach situations like this with a more positive attitude. Remember, we're all in this together.
@dtsoden commented on GitHub (Mar 27, 2024):
@tjbck
Noted, and I suggest you update your installation docs with this helpful information, as that is where I spent the majority of my time reading thoroughly. Guilty as charged that I only tersely read the issue template before filling it out.
@dtsoden commented on GitHub (Mar 27, 2024):
@justinh-rahb
Not sure I agree with this assessment. The only blame I dealt was that it worked before the upgrade. If you took that personally, I cannot control the feelings of others. All I can say is that it all did work until the upgrade, and that is a fact. I appreciate the help, but all the back and forth trying to explain the issue was getting this ticket nowhere, and for that, YES, I was frustrated. You cannot control the feelings of others either. Open source or paid, there is an implied persona role here: service provider and consumer. It's important to understand that a project is not just coding. As mentioned to @tjbck, the helpful resolution was in the ticket instructions that I overlooked, when it should also be in the installation docs.
None the less thank you for your help @tjbck @justinh-rahb.
@justinh-rahb commented on GitHub (Mar 27, 2024):
Then respectfully, your assessment is incomplete. You had success using an older codebase with an older codebase (the one it was designed to interface with). Your troubles began after you updated one codebase (WebUI) but not the other (Ollama).
If you feel that our docs are incomplete or not clear enough, please by all means submit a PR to the docs repo, we'll be happy to take it:
https://github.com/open-webui/docs/compare