[GH-ISSUE #18561] issue: ollama default network address wrong missing http:// prefix - should be http://localhost:11434 #34163

Closed
opened 2026-04-25 08:04:51 -05:00 by GiteaMirror · 2 comments

Originally created by @s-github-2 on GitHub (Oct 23, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18561

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.6.34

Ollama Version (if applicable)

No response

Operating System

Windows 11

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Connection to Ollama should have succeeded after startup.

Actual Behavior

No models from Ollama are listed. I debugged it to the address missing the http:// prefix in the default, which was causing silent connection failures. After adding the prefix, Open WebUI is able to connect to Ollama and list local models.

I am not providing full logs, since they are not needed to verify that this problem exists and the fix is simple.
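For reference, a minimal standalone sketch (not Open WebUI code) showing why a scheme-less address fails: requests cannot select a connection adapter for a URL without an http:// or https:// prefix, which is exactly the InvalidSchema error in the logs below.

```python
import requests

# Without a scheme, requests cannot pick a connection adapter and
# raises InvalidSchema -- the same error shown in the logs below.
try:
    requests.get("localhost:11434/api/tags", timeout=5)
except requests.exceptions.InvalidSchema as exc:
    print(exc)  # No connection adapters were found for 'localhost:11434/api/tags'

# With the scheme, the request goes through (assuming Ollama is running locally).
print(requests.get("http://localhost:11434/api/tags", timeout=5).status_code)
```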

Steps to Reproduce

  1. Start with a pip-based install in a venv.
  2. Run open-webui serve.
  3. Check whether Ollama models are listed; they will not be unless you fix the network path by adding http:// to the Ollama address, since localhost:11434/api/tags does not work but http://localhost:11434/api/tags does.

Logs & Screenshots

requests.exceptions.InvalidSchema: No connection adapters were found for 'localhost:11434/api/tags'
2025-10-23 14:09:45.975 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:55795 - "GET /ollama/api/tags/0 HTTP/1.1" 500
2025-10-23 14:10:31.140 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:55428 - "GET /_app/version.json HTTP/1.1" 200
2025-10-23 14:11:12.579 | ERROR | open_webui.routers.ollama:verify_connection:277 - Client error: localhost:11434/api/version
Traceback (most recent call last):

File "", line 198, in _run_module_as_main
File "", line 88, in _run_code

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 08:04:51 -05:00

@silentoplayz commented on GitHub (Oct 23, 2025):

How is this an issue? Are you using the bundled image with Ollama? Is the Ollama URL that gets automatically prefilled in the Ollama API URL input field incorrect, causing this issue to occur?


@s-github-2 commented on GitHub (Oct 23, 2025):

UPDATE: I just checked my environment variables, and maybe it was the fact that I had
OLLAMA_BASE_URL=localhost:11434
as a variable (can't recall, but probably some other package required this), and maybe that is why the default didn't have the prefix.
Maybe the code should check that a valid URL is listed in OLLAMA_BASE_URL and alert if it's incorrect.

The Ollama URL that is prefilled automatically doesn't have the http:// prefix, which causes the network connection to fail.
Adding the prefix and hitting refresh in the UI connects to Ollama.
Note: the attached image shows AFTER I added the prefix; the default prefill started with localhost: without the http:// prefix.
[Image: Ollama API URL settings after adding the http:// prefix: https://github.com/user-attachments/assets/52812742-c299-458e-a389-1c1dcb15ac59]
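A hedged sketch of the validation suggested above (a hypothetical helper, not part of Open WebUI): normalize OLLAMA_BASE_URL by prepending http:// when the scheme is missing, rather than passing the bare host:port through to the HTTP client.

```python
import os

def normalize_ollama_url(url: str) -> str:
    """Hypothetical helper: make a bare host:port usable as a base URL.

    Not Open WebUI code; it illustrates the check suggested above.
    """
    url = url.strip().rstrip("/")
    if not url.startswith(("http://", "https://")):
        # A scheme-less value such as 'localhost:11434' would otherwise
        # trigger requests.exceptions.InvalidSchema downstream.
        url = "http://" + url
    return url

# e.g. the environment variable mentioned above
print(normalize_ollama_url(os.environ.get("OLLAMA_BASE_URL", "localhost:11434")))
# -> http://localhost:11434
```

Whether the app should silently normalize or loudly reject such a value is a design choice; either would avoid the silent failure described in this issue.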

Reference: github-starred/open-webui#34163