Settings -> Ollama Server URL: Getting rid of pitfalls #138

Closed
opened 2025-11-11 14:07:43 -06:00 by GiteaMirror · 3 comments

Originally created by @bavcol on GitHub (Dec 29, 2023).

I am new to ollama and ollama-webui and I was surprised how easy the default setup (localhost) was.
The next step was to run ollama and ollama-webui on different machines. I got it running eventually. But in retrospect, there were some pitfalls regarding Settings -> Ollama Server URL which made it unnecessarily hard for me (and maybe for others as well).

Describe the solution you'd like

Feature 1: Improve advertisement of the UI-feature Settings -> Ollama Server URL

From [README -> Features](https://github.com/ollama-webui/ollama-webui#features-):
![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/c2adef1b-bda1-4c40-ac30-8b75efad41d5)
and from [README -> Accessing External Ollama on a Different Server](https://github.com/ollama-webui/ollama-webui#accessing-external-ollama-on-a-different-server):
![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/56c405af-1277-41f8-92f9-2b6fae7a857f)
This made me think there was not (yet) a convenient way to change the Ollama Server URL from within the UI and that I had to use the environment variable.
Imo, changing the Ollama Server URL in the UI should be advertised in the README as the preferred way (for the first-time user). Setting the environment variable is worth mentioning as an alternative for the experienced user.

Feature 2: Improve default value of Settings -> Ollama Server URL

![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/f9dacbeb-6551-4989-8b05-0127763b5f50)
I suggest using `http://localhost:11434/api` as the default value. It shows the required URL pattern, hints at the required parts (`http://` and `/api`), and is easy to recognize ("Ah, this is the `localhost:11434` from ollama!").
The current value `/ollama/api` seems pretty random to me.

Feature 3: Get rid of cumbersome required URL parts http:// and /api

I am used to entering `localhost:11434` to check if my local ollama server is running:
![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/f3230ab0-2411-43ca-bb85-91ee05fe095d)
However, `localhost:11434` fails when entered in Settings -> Ollama Server URL:
![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/f63b99e4-9878-4065-8695-4aaf363714bf)
Same for `http://localhost:11434` and `localhost:11434/api`. Only `http://localhost:11434/api` succeeds:
![grafik](https://github.com/ollama-webui/ollama-webui/assets/64075407/397a45b6-c96a-414f-8567-c5b6829b7f8d)

  • Allowing a missing `http://` should be low-hanging fruit: if it's missing, add it (just like the browser does).
  • The user should also not be required to enter the trailing `/api`. Imo it's an implementation detail exposed to the user.
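The two normalization steps above could be sketched in a few lines. This is a hypothetical illustration (the function name, suffix parameter, and defaults are mine, not the project's actual code):

```python
def normalize_ollama_url(raw: str, api_suffix: str = "/api") -> str:
    """Normalize a user-entered Ollama server URL (illustrative sketch).

    Prepends a scheme if missing (like a browser's address bar does)
    and appends the API suffix if it is not already present.
    """
    url = raw.strip().rstrip("/")
    # Add the scheme if the user typed e.g. "localhost:11434".
    if not url.startswith(("http://", "https://")):
        url = "http://" + url
    # Append the API suffix if the user left it off.
    if not url.endswith(api_suffix):
        url = url + api_suffix
    return url


print(normalize_ollama_url("localhost:11434"))  # http://localhost:11434/api
```

With this, all four inputs from the screenshots above would resolve to the same working URL.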

Describe alternatives you've considered

-/-

Additional context

I use Windows 11 + WSL2 + Docker Desktop to run Ollama and Ollama WebUI.
I am a hobby user and don't have much experience with Linux, Docker, or LLMs, but thanks to your awesome projects ollama and ollama-webui I was still able to "chat with my PC" ❤️

I have some experience in Python and Git, so if you agree on my suggested features I might be able to contribute at least some of them myself.

Also, let me know if I should create separate issues for the suggested features. I thought it was a good idea to keep it all in one place, but I'm unsure if that's the right way to do it.


@tjbck commented on GitHub (Dec 29, 2023):

Hi, thanks for the thorough feature request! FYI, Ollama WebUI consists of two primary components: the frontend and the backend (which serves as a reverse proxy, handling static frontend files and additional features). Having the Ollama API URL set to `/ollama/api` (the reverse proxy route) is a deliberate design choice: it allows us to ensure enhanced security by securing the Ollama API route. If you wish to only use the frontend for the webui, you might want to check out our stripped-down version of ollama-webui that's being actively worked on: https://github.com/ollama-webui/ollama-webui-lite

I'll close this as not planned, but I'll update the documentation and the help messages in the webui to reflect your concerns and address any future confusion. Thanks a lot!
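For context, a reverse-proxy route like the `/ollama/api` one described above can be sketched with the standard library alone. This is a minimal, hypothetical illustration (the names, port, and GET-only handling are mine, not the project's actual backend, which would also proxy POST bodies, stream responses, and enforce authentication on the route):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import URLError
from urllib.request import urlopen

OLLAMA_BASE = "http://localhost:11434"  # assumed upstream Ollama address


def rewrite_path(path: str):
    """Map a proxied route like /ollama/api/tags to the upstream URL.

    Returns None for paths outside the proxied route.
    """
    prefix = "/ollama/api/"
    if not path.startswith(prefix):
        return None
    return OLLAMA_BASE + "/api/" + path[len(prefix):]


class OllamaProxy(BaseHTTPRequestHandler):
    """Forward GET requests under /ollama/api/ to the Ollama API."""

    def do_GET(self):
        upstream = rewrite_path(self.path)
        if upstream is None:
            self.send_error(404)
            return
        try:
            with urlopen(upstream) as resp:
                body = resp.read()
            self.send_response(resp.status)
            self.send_header("Content-Type",
                             resp.headers.get("Content-Type",
                                              "application/json"))
            self.end_headers()
            self.wfile.write(body)
        except URLError:
            self.send_error(502, "Ollama server unreachable")


if __name__ == "__main__":
    # Serve the proxy on port 8080; the UI then talks only to /ollama/api.
    HTTPServer(("127.0.0.1", 8080), OllamaProxy).serve_forever()
```

The point of the indirection is that the browser never contacts the Ollama server directly, so the backend can gate the route.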


@bavcol commented on GitHub (Dec 29, 2023):

I will keep an eye on https://github.com/ollama-webui/ollama-webui-lite :)

The updates to Settings -> Ollama API URL and [TROUBLESHOOTING.md](https://github.com/ollama-webui/ollama-webui/blob/main/TROUBLESHOOTING.md) are helping me understand. In the future I will use the `OLLAMA_API_BASE_URL` environment variable.
What about adding another field, Ollama API base URL, to the UI settings, so it can be conveniently set from within the UI like Ollama API URL (instead of having to use the `OLLAMA_API_BASE_URL` environment variable)?


@DailyDisco commented on GitHub (Jan 4, 2025):

This helped a bunch, thanks!


Reference: github-starred/open-webui#138