[GH-ISSUE #17435] issue: Changing localhost => somehost in Admin Settings is not saved #18283

Closed
opened 2026-04-20 00:28:57 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @tigran123 on GitHub (Sep 14, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/17435

Originally assigned to: @tjbck on GitHub.

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.6.28

Ollama Version (if applicable)

0.11.10

Operating System

Ubuntu Linux 22.04.05

Browser (if applicable)

Chrome 140.0.7339.80

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

In Admin Settings, changing localhost to any other hostname accessible via the LAN should work, as long as there is an instance of Ollama listening on port 11434 on that host. It does not.

Actual Behavior

The new hostname is reverted back to localhost even though I click on "Save" in the Connections dialog.

Steps to Reproduce

  1. Install the latest ollama and open-webui. Install gpt-oss:20b and gpt-oss:120b models.
  2. Set OLLAMA_HOST=0.0.0.0 so that Ollama accepts connections from other LAN IPs, not just localhost
  3. Run ollama serve on the hostname (creating the admin account)
  4. Run open-webui serve on that same hostname (i.e. access Ollama via localhost) just to verify that everything works. It does. Now kill open-webui via Ctrl-C and switch to a different machine.
  5. On that different machine do pip install open-webui (but not Ollama)
  6. Run open-webui serve (unsetting OPENAI_KEY variable to prevent it from using my OpenAI credits)
  7. Create the admin account on this machine inside open-webui
  8. Go to Admin Settings/Connections and try changing the URL http://localhost:11434 to http://super:11434 (super is the name of the machine on which ollama serve is running)
  9. Click on the verify connection icon -- it works fine! I.e. a green message is displayed saying everything is OK
  10. Now click on "Save"
  11. Observe that the URL is reverted to http://localhost:11434 instead of http://super:11434
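Before suspecting the UI, it can help to confirm that the remote Ollama instance is actually reachable from the machine running Open WebUI. A minimal sketch (hostname `super` and port 11434 taken from the steps above; it probes Ollama's `GET /api/tags` endpoint, which lists installed models):

```python
import json
import urllib.request


def ollama_tags_url(host: str, port: int = 11434) -> str:
    """Build the URL for Ollama's model-list endpoint (GET /api/tags)."""
    return f"http://{host}:{port}/api/tags"


def list_remote_models(host: str, port: int = 11434, timeout: float = 5.0) -> list:
    """Fetch and return the names of the models served by a remote Ollama instance."""
    with urllib.request.urlopen(ollama_tags_url(host, port), timeout=timeout) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```

If `list_remote_models("super")` returns the installed models (here, the gpt-oss ones), the network path is fine and the problem is confined to Open WebUI's settings persistence.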

Logs & Screenshots

(screenshot attached to the original GitHub issue)

Additional Information

No response

GiteaMirror added the bug label 2026-04-20 00:28:57 -05:00

@tigran123 commented on GitHub (Sep 14, 2025):

Just to add that after I click on Save I get the green message "Ollama API settings updated", but in actual fact they were NOT updated, i.e. it still points to http://localhost:11434.


@Classic298 commented on GitHub (Sep 14, 2025):

have you clicked on Save in the popup AND ALSO on Save on the bottom right in the connection settings?


@tigran123 commented on GitHub (Sep 14, 2025):

When I click on Save in the popup it shows the old value http://localhost:11434 in the connection settings, so clicking on Save in there would appear meaningless. Nevertheless, I have just tried that and no, it does not change anything. It still points to http://localhost:11434.

One possible workaround is to run open-webui serve on the same machine as ollama serve and access it as http://super:8080 -- that should work, but that means having the overhead of open-webui serve on the machine which I wanted to dedicate solely to running the LLMs locally.


@tigran123 commented on GitHub (Sep 14, 2025):

And what I see in the console log is this:

```
2025-09-14 11:55:55.423 | INFO     | open_webui.config:save:212 - Saving 'ENABLE_OLLAMA_API' to the database
2025-09-14 11:55:55.459 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_BASE_URLS' to the database
2025-09-14 11:55:55.484 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_API_CONFIGS' to the database
2025-09-14 11:55:55.501 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_API_CONFIGS' to the database
2025-09-14 11:55:55.519 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:46926 - "POST /ollama/config/update HTTP/1.1" 200
2025-09-14 11:55:55.557 | INFO     | open_webui.routers.ollama:get_all_models:348 - get_all_models()
2025-09-14 11:55:55.560 | ERROR    | open_webui.routers.ollama:send_get_request:106 - Connection error: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]
2025-09-14 11:55:55.562 | ERROR    | open_webui.routers.ollama:send_get_request:106 - Connection error: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]
2025-09-14 11:55:55.563 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:46926 - "GET /api/models?refresh=true HTTP/1.1" 200
```
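For anyone debugging this: the config keys in the log suggest roughly what the POST /ollama/config/update request carries. A hedged sketch of building such a payload (the field names ENABLE_OLLAMA_API, OLLAMA_BASE_URLS, and OLLAMA_API_CONFIGS are taken from the log above; the payload structure itself is an assumption for illustration, not confirmed from the source):

```python
import json


def build_ollama_config(base_urls, enable=True):
    """Assemble a config payload using the key names seen in the server log.

    The nesting of OLLAMA_API_CONFIGS (per-connection settings keyed by index)
    is a guess for illustration only.
    """
    return {
        "ENABLE_OLLAMA_API": enable,
        "OLLAMA_BASE_URLS": list(base_urls),
        "OLLAMA_API_CONFIGS": {str(i): {} for i in range(len(base_urls))},
    }


payload = build_ollama_config(["http://super:11434"])
body = json.dumps(payload)
```

Note that the log shows the server accepting the update (200 on /ollama/config/update) and then immediately failing to connect to localhost:11434, which suggests the old URL list is what actually got persisted.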

@Classic298 commented on GitHub (Sep 14, 2025):

> When I click on Save in the popup it shows the old value http://localhost:11434 in the connection settings

What if you modify the URL directly in the connection settings, without opening the popup, and then click on Save?


@tigran123 commented on GitHub (Sep 14, 2025):

The URL displayed in the connection settings is NOT editable. That is why a gear icon is provided on the right.


@Classic298 commented on GitHub (Sep 14, 2025):

Oh yeah, my bad sorry.

I just tested it and can fully reproduce the issue you describe.
As if the value isn't even being saved.

cc @tjbck


@tigran123 commented on GitHub (Sep 14, 2025):

Ah, I found a workaround! Instead of editing the existing Ollama API endpoint, I can create a new one by clicking on the + and pointing it to the URL http://super:11434, and then it works fine!

(two screenshots attached to the original GitHub issue)

@tigran123 commented on GitHub (Sep 14, 2025):

I have some more information about this bug: it is more general than I thought. Namely, it is impossible to change ANY URL, even one I created myself. E.g. I have just created an OpenAI-compatible endpoint for Groq and mis-typed the URL as https://api.groq.com/openai/v1/chat/completions, and now I cannot change it to the correct URL https://api.groq.com/openai/v1. I get the green message about it being updated, but nothing is updated. So I have to delete it and enter the correct URL from the beginning.


@tjbck commented on GitHub (Sep 15, 2025):

Addressed with e4c864de7e in dev, testing wanted here.

0.6.29 should be released shortly.


@tigran123 commented on GitHub (Sep 19, 2025):

Unable to test, because 0.6.30 is totally broken; it will not even start:

```
$ open-webui 
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /home/tigran/.local/bin/open-webui:8 in <module>                                                 │
│                                                                                                  │
│   5 from open_webui import app                                                                   │
│   6 if __name__ == '__main__':                                                                   │
│   7 │   sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])                         │
│ ❱ 8 │   sys.exit(app())                                                                          │
│   9                                                                                              │
│                                                                                                  │
│ ╭────────────────────────────── locals ───────────────────────────────╮                          │
│ │ app = <typer.main.Typer object at 0x7f639f8db920>                   │                          │
│ │  re = <module 're' from '/usr/local/lib/python3.12/re/__init__.py'> │                          │
│ │ sys = <module 'sys' (built-in)>                                     │                          │
│ ╰─────────────────────────────────────────────────────────────────────╯                          │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:326 in __call__                   │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:309 in __call__                   │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:348 in get_command                │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:330 in get_group                  │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:485 in get_group_from_info        │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:579 in get_command_from_info      │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:555 in                            │
│ get_params_convertors_ctx_param_name_from_function                                               │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/main.py:888 in get_click_param            │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/typer/core.py:423 in __init__                   │
│                                                                                                  │
│ /home/tigran/.local/lib/python3.12/site-packages/click/core.py:2793 in __init__                  │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: Secondary flag is not valid for non-boolean flag.
$ python3.12 --version
Python 3.12.3
$ cat ~/.local/bin/open-webui 
#!/usr/local/bin/python3.12
# -*- coding: utf-8 -*-
import re
import sys
from open_webui import app
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(app())
```

Reference: github-starred/open-webui#18283