[GH-ISSUE #1295] Ollama Base URL will not save #27963

Closed
opened 2026-04-25 02:43:47 -05:00 by GiteaMirror · 24 comments
Owner

Originally created by @dtsoden on GitHub (Mar 25, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1295

Bug Report

Description

Entering "Ollama Base URL" = http://localhost:11434/ and saving reports "settings saved successfully", but the value is never persisted; models won't show and requests error out. Postman can connect to the same URL just fine.

Bug Summary:
Entering "Ollama Base URL" = http://localhost:11434/ and saving reports "settings saved successfully", but the value is never persisted; models won't show and requests error out. Postman can connect just fine.

Steps to Reproduce:
This is a brand new MANUAL installation, which worked perfectly back when the project was Ollama-WebUI:

```shell
git clone https://github.com/open-webui/open-webui.git
cd path/to/open-webui/

npm install
npm run build

cd backend
pip install -r requirements.txt -U
bash start.sh
```

Everything runs as normal and creates the first admin account, but Ollama is not configured by default and I cannot save a configuration using the WebUI interface.

I even manually renamed `.env - Copy.example` to `.env` AND copied it into the `backend` folder - nothing persists the Ollama URL (neither the web UI nor the `.env` files). It seems this new version no longer works with anything except Docker, as there are ZERO instructions for installing it manually -- only instructions to update existing manual installs: https://docs.openwebui.com/getting-started/updating

Expected Behavior:
When I enter the URL for my local Ollama server in the UI, I expect it to STICK and actually save, like it used to.

Actual Behavior:
It never saves, despite the UI saying it has saved successfully.

Environment

  • Operating System: Windows 11 WSL
  • Browser (if applicable): Chrome Version 122.0.6261.131 (Official Build) (64-bit)

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [N/A for manual install] I have included the Docker container logs.
Author
Owner

@slash-proc commented on GitHub (Mar 25, 2024):

I can confirm this. It says successfully saved but it only saves if I verify the connection. In fact, it saves it without clicking save if the server is verified.

Author
Owner

@tjbck commented on GitHub (Mar 25, 2024):

Have you verified the connection first before saving? That seems to be causing the problem on your end.

Author
Owner

@dtsoden commented on GitHub (Mar 25, 2024):

Where is this verification button/mechanism in Open WebUI?

in a browser http://localhost:11434 (reads: Ollama is running)
Works in postman too http://localhost:11434/api/tags

```json
{
  "models": [
    {
      "name": "llama2:latest",
      "modified_at": "2023-12-12T13:18:43.442731906-05:00",
      "size": 3825819519,
      "digest": "fe938a131f40e6f6d40083c9f0f430a515233eb2edaa6d72eb85c50d64f2300e"
    },
    {
      "name": "mistral:latest",
      "modified_at": "2024-01-03T08:53:40.229738067-05:00",
      "size": 4109865159,
      "digest": "61e88e884507ba5e06c49b40e6226884b2a16e872382c2b44a42f2d119d804a5"
    },
    {
      "name": "openchat:latest",
      "modified_at": "2024-01-03T08:53:56.319736704-05:00",
      "size": 4109876386,
      "digest": "aa6d10add428bf93660c6c27daedd48934f62c36a554101557d67a52a79de76b"
    },
    {
      "name": "yarn-mistral:latest",
      "modified_at": "2024-01-03T08:55:20.569730092-05:00",
      "size": 4108916676,
      "digest": "8e9c368a0ae42f5c29f59eacd6ad3c20685e5525066727ebc10ee62321a60999"
    }
  ]
}
```
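The `/api/tags` payload above can also be checked from a short script rather than Postman; a minimal Python sketch (the payload here is an abbreviated copy of the sample response above, embedded purely for illustration):

```python
import json

# Abbreviated copy of the /api/tags response shown above.
raw = """
{
  "models": [
    {"name": "llama2:latest", "size": 3825819519},
    {"name": "mistral:latest", "size": 4109865159}
  ]
}
"""

payload = json.loads(raw)

# Collect just the model names, as the WebUI model dropdown would display them.
names = [m["name"] for m in payload["models"]]
print(names)  # → ['llama2:latest', 'mistral:latest']
```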

Author
Owner

@dtsoden commented on GitHub (Mar 25, 2024):

https://github.com/open-webui/open-webui/assets/36846872/ba5aff7d-dcc3-459b-9e89-c2abbd8756af

I attached a video of the error / issue

Author
Owner

@tjbck commented on GitHub (Mar 25, 2024):

@dtsoden You should click on the refresh button to verify the connection. I admit that doesn't seem intuitive/obvious; I'll modify the code so that when you save, it'll alert you if you haven't verified the connection.

Author
Owner

@dtsoden commented on GitHub (Mar 26, 2024):

@tjbck I have; nothing happens when I click this button. I can click it a million times - no indicator on click, no confirmation, nothing - and when I press save, it's the same ole same ole as shown in the video above. 😒

Author
Owner

@G4Zz0L1 commented on GitHub (Mar 26, 2024):

I'll add my own problem to this: I can get my local Ollama to connect, use it, and modify the models, but when I restart the container (I installed it with Docker) the URL is no longer saved and I have to enter it again.
EDIT: it also seems that the OLLAMA_BASE_URL variable is ignored, whether set to localhost or 127.0.0.1.

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

So in this ticket I describe my install as manual, and just today I gave up and installed Docker - I am getting the same issue inside the container image as I did with my manual install. This app is sadly just broken for new installs using Ollama running locally. A shame; I'll have to switch to a new UI for Ollama (I loved this before it changed from ollama-webui to open-webui, which IME was super stable until this change).

Author
Owner

@justinh-rahb commented on GitHub (Mar 27, 2024):

@dtsoden I see trailing slashes on the end of your URLs in the original post, are you putting those in your config or run command as well? This is how one should decide what the URL is supposed to be:

| Setup | Base URL Setting Required | Notes |
|-------|---------------------------|-------|
| WebUI (Docker) + Ollama (Non-Docker) | Default `http://host.docker.internal:11434` | Ollama running natively on the host; the default should work for official Docker installations. |
| WebUI (Docker) + Ollama (Docker, Same Stack) | `http://ollama:11434` | Both services in the same Docker Compose stack; uses Docker's internal DNS for communication. |
| WebUI (Docker) + Ollama (Docker, Separate) | `http://host.docker.internal:11434` | Ollama in a separate Docker container not in the same stack; ensure `-p 11434:11434` is used. |
| WebUI (Docker) + Ollama (Docker or Non-Docker, Distro Docker Package) | `http://127.0.0.1:11434` | If using Docker from the distro's repository, or Ollama is not in the same Docker network as WebUI; set the URL explicitly and use `--network=host` for WebUI. |

Further context and explanation around hostnames and localhost in Docker:
https://docs.openwebui.com/faq#q-why-cant-my-docker-container-connect-to-services-on-the-host-using-localhost
https://docs.openwebui.com/getting-started/troubleshooting

Ollama FAQ regarding environment variables for server usage:
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server
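Whichever row of the table applies, a candidate base URL can be probed from the same host (or container) Open WebUI runs on before entering it in the UI. A hedged Python sketch - the helper name `probe_ollama` is illustrative, not part of Open WebUI; note it strips the trailing slash seen in the original report:

```python
import json
import urllib.error
import urllib.request

def probe_ollama(base_url: str, timeout: float = 5.0):
    """Return the model names served at base_url, or None if unreachable.

    Illustrative helper only. Strips a trailing slash before appending
    /api/tags, since doubled slashes can yield malformed request paths.
    """
    url = base_url.rstrip("/") + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
    except (urllib.error.URLError, OSError):
        return None
    return [m["name"] for m in data.get("models", [])]

if __name__ == "__main__":
    # Prints a list of model names, or None if Ollama is not reachable.
    print(probe_ollama("http://localhost:11434/"))
```

If this prints None where the WebUI backend runs, the backend will not be able to reach that URL either, regardless of what the browser or Postman can see from the host.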

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

I have followed all of these nuances and variations in a desperate attempt to get this working again as it did before the changes to open-webui. I had to use the last row, "WebUI (Docker) + Ollama (Docker or Non-Docker, Distro Docker Package)", for my Docker attempt, which worked and operates EXACTLY like the manually installed version. This, to me, seems to be a bug. Albeit I have not documented every attempted nuance... There are lots of great and useful suggestions (like yours), but one thing no one seems to have done is validate the reported issue. Sometimes there is just a bug in the matrix 🙃

If someone on the contribution/development team could please set up a new instance using the latest version (with Docker or manually), folks should easily be able to replicate what I have described herein.

All of this worked perfectly for me until the change and rebranding occurred. Now it's just broken, because I cannot connect to my local LLMs - the configuration won't save or stick.

Author
Owner

@tjbck commented on GitHub (Mar 27, 2024):

> it also seems that the OLLAMA_BASE_URL variable is ignored, either by putting localhost or 127.0.0.1

@G4Zz0L1 The settings will not persist if you restart the container so you'd have to set OLLAMA_BASE_URL env var manually. Could you share your installation command with us? Persistent config is in the works with #1022, so stay tuned for that.

> If someone on the contribution/development team could please setup a new instance using the latest version (with Docker or Manually) folks should be easily able to replicate what I have described herein.

@dtsoden Setting up the dev environment uses the exact same method as the manual installation instructions. I've personally tested this by removing and fresh-installing it on all three of my machines (macOS, Ubuntu, Windows) and was unable to reproduce the issue at all, as have thousands of other people. We have not changed anything in the code regarding saving the URL; I strongly encourage you to [check our docs again](https://docs.openwebui.com/getting-started/#how-to-install-without-docker) to see if you have missed anything.

> [x] I have included the browser console logs.

The minimum you could do to help us diagnose the issue is to include the browser console logs and backend logs (container logs). Please follow the instructions and share them with us if you want better assistance from the community, instead of just saying it doesn't work, which is very counterproductive. Keep in mind, our project is maintained by volunteers who juggle their day jobs with their passion for contributing here. We're all human here, swamped with messages round the clock, and this support work doesn't pay the bills.

<img width="586" alt="image" src="https://github.com/open-webui/open-webui/assets/25473318/c4169885-78f8-406f-a071-462e5676e2ce">

When Ollama is reachable:

<img width="834" alt="image" src="https://github.com/open-webui/open-webui/assets/25473318/5d7d1390-d5d5-4ecf-8f8b-973a831e8d6e">

When Ollama is unreachable:

<img width="804" alt="image" src="https://github.com/open-webui/open-webui/assets/25473318/554105ae-9d51-4b42-b030-68546fedb248">

@dtsoden One other thing I've noticed is your issue post on the LiteLLM repo: https://github.com/BerriAI/litellm/issues/2681 They're NOT related to our project at all, so PLEASE close your issue there. We do not condone spam-like behaviour on someone else's repo. Thanks for your understanding.

Author
Owner

@justinh-rahb commented on GitHub (Mar 27, 2024):

@dtsoden I understand that you may be feeling frustrated by the issues you are experiencing with your installation. It is possible that there may be some underlying issues that we have not yet identified. I encourage you to continue seeking assistance from the community and providing any relevant information that may help us diagnose the problem more effectively. We can only be aware of what you share with us, and there is certainly some key detail that is being left out. If you don't want to cooperate with those that have been trying to assist you, that's your prerogative but don't take the community for granted. Remember that our project is maintained by volunteers who also have day jobs, so your patience and understanding during this process is greatly appreciated.

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

`ver`
Microsoft Windows [Version 10.0.22631.3296]

`wsl --version`
WSL version: 2.1.5.0
Kernel version: 5.15.146.1-2
WSLg version: 1.0.60
MSRDC version: 1.2.5105
Direct3D version: 1.611.1-81528511
DXCore version: 10.0.25131.1002-220531-1700.rs-onecore-base2-hyp
Windows version: 10.0.22631.3296

`lsb_release -a`
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.3 LTS
Release: 22.04
Codename: jammy

```shell
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc

echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

On my Windows PC I go to http://localhost:8080
The app runs, and this is where the hell begins and never ends, regardless of whether this is Docker or manual.

MANUAL

```shell
git clone https://github.com/open-webui/open-webui.git
cd open-webui/

# Copying required .env file
cp -RPp .env.example .env

# Building Frontend Using Node
npm i
npm run build

# Serving Frontend with the Backend
cd ./backend
pip install -r requirements.txt -U
bash start.sh
```

In the manual variation, in the "open-webui" folder, I run `nano .env`.

The contents are correct:

```
# The path '/ollama' will be redirected to the specified backend URL
OLLAMA_BASE_URL='http://localhost:11434'

OPENAI_API_BASE_URL=''
OPENAI_API_KEY=''

# AUTOMATIC1111_BASE_URL="http://localhost:7860"

# DO NOT TRACK
SCARF_NO_ANALYTICS=true
DO_NOT_TRACK=true
```
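For reference, a backend typically reads variables like these with a default fallback; a hypothetical Python sketch (the function name and default are illustrative, not Open WebUI's actual implementation):

```python
import os

def resolve_ollama_base_url(env=None):
    """Resolve OLLAMA_BASE_URL from the environment, with a default.

    Hypothetical helper mirroring the variable names in the .env above;
    not Open WebUI's real code.
    """
    env = os.environ if env is None else env
    url = env.get("OLLAMA_BASE_URL", "http://localhost:11434")
    # Drop any trailing slash so paths like /api/tags append cleanly.
    return url.rstrip("/")

print(resolve_ollama_base_url({"OLLAMA_BASE_URL": "http://127.0.0.1:11434/"}))
# → http://127.0.0.1:11434
```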

This is madness - nothing works to verify or save any URL.
Not sure how else to communicate that this is broken, a bug, or bad docs.
Either way it's not working, and it is extremely frustrating, as this worked when it was Ollama-WebUI; now it's been rebranded and is totally broken.

Author
Owner

@tjbck commented on GitHub (Mar 27, 2024):

> Not sure how else to communicate this is broken, a bug or bad docs.

We need browser console logs and backend logs.

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

@tjbck are you saying this is isolated to my machine, and that this code runs fine elsewhere on a fresh Linux instance?

I can get the browser console logs, but which logs do you want from the system:

  • Windows Host
  • WSL Linux (host to Docker; Ollama running at this level, not in Docker)
  • Docker
  • Docker running container

Give me whatever commands you want run on each of the layers to ensure I get exactly what you want.

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

@tjbck console logs:
[localhost-1711557661910.log](https://github.com/open-webui/open-webui/files/14776471/localhost-1711557661910.log)

Author
Owner

@justinh-rahb commented on GitHub (Mar 27, 2024):

@dtsoden check your Ollama configuration, we believe you may have misconfigured a service override (if dealing with Ollama installed on Linux).

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

@G4Zz0L1

I ran this at the WSL Linux level:
`curl https://ollama.ai/install.sh | sh`

OMG this worked - unbelievable... thank goodness. Sigh of relief - I was seriously losing my mind. Thanks for the assist!!!

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

Note to self and all... if you pull the UI, make sure you upgrade your backend.
Unless the backend breaks the front end; then not sure what to do for ya. 😂

Author
Owner

@tjbck commented on GitHub (Mar 27, 2024):

> I am on the latest version of both Open WebUI and Ollama.

This was our requirement for the issue report. Please review our template thoroughly next time.

Author
Owner

@justinh-rahb commented on GitHub (Mar 27, 2024):

> I ran this at the WSL Linux level: `curl https://ollama.ai/install.sh | sh`
>
> OMG this worked - unbelievable... thank goodness. Sigh of relief - I was seriously losing my mind. Thanks for the assist!!!

I'm glad we've finally managed to solve the problem you were facing. However, I must say I'm a little disappointed in how you've acted throughout this conversation. Blaming others when things don't go as planned is not the most effective way to get the help you need, and many other projects would have refused to help you further long ago.

I understand that everyone has bad days, but it's important to remember that we're all human beings here to help each other. In the future, please try to approach situations like this with a more positive attitude. Remember, we're all in this together.

Author
Owner

@dtsoden commented on GitHub (Mar 27, 2024):

@tjbck

> > I am on the latest version of both Open WebUI and Ollama.
>
> This was our requirement for the issue report. Please review our template thoroughly next time.

Noted, and I suggest you update your installation docs with this helpful information, as that was where I spent the majority of my time reading thoroughly. Guilty as charged that I only tersely read the issue template before filling it out.


@dtsoden commented on GitHub (Mar 27, 2024):

@G4Zz0L1
I ran this at the WSL Linux level: `curl https://ollama.ai/install.sh | sh`
OMG this worked - unbelievable... thank goodness. Sigh of relief - I was seriously losing my mind. Thanks for the assist!!!

I'm glad we've finally managed to solve the problem you were facing. However, I must say I'm a little disappointed in how you've acted throughout this conversation. Blaming others when things don't go as planned is not the most effective way to get the help you need, and many other projects would have refused to help you further long ago.

I understand that everyone has bad days, but it's important to remember that we're all human beings here to help each other. In the future, please try to approach situations like this with a more positive attitude. Remember, we're all in this together.

@justinh-rahb

Not sure I agree with this assessment. The only blame I dealt was that it worked before the upgrade. If you took that personally, then I cannot control the "feelings" of others. All I can say is that it all did work until the upgrade, and that is and was a fact. I appreciate the help, but all the back and forth trying to explain the issue was getting this ticket nowhere, and for that, yes, I was frustrated. You cannot control the feelings of others either. Open source or paid, there is an implied persona role here: service provider and consumer. It's important to understand that a project is not just coding. As mentioned to @tjbck, the helpful resolution was in the ticket instructions that I overlooked, when it should be in the installation docs in and around the area

"If Ollama is on your computer, use this command:"

Nonetheless, thank you for your help @tjbck @justinh-rahb.
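For anyone landing here with the same symptom on Windows 11 + WSL: the actual fix was installing (or reinstalling) Ollama inside WSL with `curl https://ollama.ai/install.sh | sh`, so that the base URL saved in Open WebUI points at a server that is actually running. As a minimal sketch (the helper below is hypothetical, not part of Open WebUI's code), one small thing worth checking when a saved URL "doesn't stick" is a trailing slash, since `http://localhost:11434/` and `http://localhost:11434` can be treated differently by some clients:

```shell
# Hypothetical helper (NOT Open WebUI's actual code): normalize an
# Ollama base URL before saving or testing it.
normalize_ollama_url() {
  # ${1%/} strips a single trailing slash, if present
  printf '%s\n' "${1%/}"
}

normalize_ollama_url "http://localhost:11434/"
```

After running the installer, a quick connectivity check from inside WSL is `curl http://localhost:11434`, which should respond with "Ollama is running" when the server is up.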


@justinh-rahb commented on GitHub (Mar 27, 2024):

Not sure I agree with this assessment. The only blame I dealt was that it worked before the upgrade.

Then respectfully, your assessment is incomplete. You had success using an older codebase, with an older codebase (which it was designed to interface with). Your troubles began after you updated one codebase (Open WebUI) but not the other (Ollama).

If you feel that our docs are incomplete or not clear enough, please by all means submit a PR to the docs repo; we'll be happy to take it:
https://github.com/open-webui/docs/compare

Reference: github-starred/open-webui#27963