Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #89] feat: server side API calls #11931
Originally created by @nopoz on GitHub (Nov 11, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/89
It would be great if the API calls from the web UI could be made server-side. Right now, if I have ollama and ollama-webui in a Docker stack, the web UI communicates with the ollama API from outside the stack. Ideally, it would instead communicate with the API inside the stack.
I see this as a possible solution: provide an optional Docker environment variable to make the communication server-side; if configured, it would remove the option to set the API URL on the web configuration page.
This is a great looking project so far! Thanks!
@tjbck commented on GitHub (Nov 15, 2023):
Hi, just merged the dev branch with the backend reverse proxy feature. Please take a look at the setup instructions, as they have also been updated. Let me know if you're experiencing any issues. Thanks!
@nopoz commented on GitHub (Nov 16, 2023):
@tjbck Thanks for the change! While this gets closer to the desired result, I think there might be some problems with this approach.
In order to use `host.docker.internal:host-gateway` (which changes the "Ollama Server URL" to `/ollama/api`), the ollama API port still needs to be exposed outside the container stack. This is problematic, as users might not want to expose the ollama API outside the stack at all. I've included a docker compose example below.

When two services are in the same stack, they can use the built-in Docker DNS to resolve each other's hostnames. So in the example below, ollama-webui can connect directly to ollama using the "ollama" DNS name. For example, if you start an interactive shell on the ollama-webui container, you can reach the ollama API inside the stack at "http://ollama:11434/api" without needing to expose the API port in the compose file.
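The compose example referenced above appears to have been lost in this mirror. A minimal sketch of the setup being described (image names, tags, and the web UI port mapping are illustrative assumptions, not from the original post):

```yaml
version: "3.8"
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # exposes the ollama API outside the stack
    volumes:
      - ollama:/root/.ollama
  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui   # illustrative image name
    ports:
      - "3000:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
volumes:
  ollama:
```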
If I comment out the port being exposed, the webui connection to the API fails:
Possible solutions: provide an optional environment variable, e.g.:

<variable_name>='http://ollama:11434/api'

Thanks!
@nopoz commented on GitHub (Nov 16, 2023):
Here are some test results I ran, in case it helps:

Jump into an interactive shell on the ollama-webui container:

```
docker exec -t -i ollama-webui /bin/bash
```

Install dig, ping and curl:

```
apt update
apt install dnsutils
apt install iputils-ping
apt install curl
```

DNS for ollama resolves as expected, returning the internal IP of the ollama service inside the stack:
pinging ollama works:
curl test to API works:
@tjbck commented on GitHub (Nov 17, 2023):
This looks promising, I'll take a look and update the docker compose + backend files accordingly. In the meantime, PR is always welcome. Thanks!
@tjbck commented on GitHub (Nov 17, 2023):
Now that I'm taking a second look, you should be able to leverage the OLLAMA_API_BASE_URL environment variable to connect to Ollama without exposing the port externally. Could you please try setting the environment variable to

```
OLLAMA_API_BASE_URL=http://ollama:11434/api
```

and see if it functions as you intended? Regardless of your result, I'll still update the docker compose file for enhanced security. Thanks!
@nopoz commented on GitHub (Nov 17, 2023):
Yes, that does seem to work! Thank you.
Here is an example of my compose file I used:
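The compose file itself did not survive the mirror. A plausible sketch of the working configuration based on the thread (image names, tags, and the web UI port mapping are illustrative assumptions): the ollama service exposes no ports, and the web UI reaches it over Docker's internal DNS via OLLAMA_API_BASE_URL.

```yaml
version: "3.8"
services:
  ollama:
    image: ollama/ollama
    # no ports: section -- the API stays inside the stack
    volumes:
      - ollama:/root/.ollama
  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui   # illustrative image name
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434/api
    depends_on:
      - ollama
volumes:
  ollama:
```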