[GH-ISSUE #89] feat: server side API calls #27459

Closed
opened 2026-04-25 02:08:57 -05:00 by GiteaMirror · 6 comments

Originally created by @nopoz on GitHub (Nov 11, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/89

It would be great if the API calls from the web UI could be made server-side. Right now, if I have ollama and ollama-webui in a Docker stack, the web UI communicates with the Ollama API externally, from outside the stack. Ideally it would instead communicate with the API inside the stack.

I see this as a possible solution: provide an optional Docker environment variable that makes the communication server-side and, if configured, removes the option to set the API URL on the web configuration page.
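
Something like this, for example (both variable names here are placeholders, purely to sketch the idea):

```
services:
  ollama-webui:
    environment:
      # placeholder toggle: route all Ollama traffic through the backend
      - SERVER_SIDE_API=true
      # placeholder internal URL, resolved via Docker's stack-internal DNS
      - OLLAMA_INTERNAL_URL=http://ollama:11434/api
```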

This is a great looking project so far! Thanks!


@tjbck commented on GitHub (Nov 15, 2023):

Hi, just merged the dev branch with the backend reverse proxy feature. Please take a look at the setup instructions, as they have also been updated. Let me know if you're experiencing any issues. Thanks!
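
With the proxy in place, the browser talks to the web UI backend, which forwards requests to Ollama. A quick sanity check, as a sketch (assuming the web UI is published on port 8080 as in the compose examples below, and that the proxy is mounted at `/ollama/api` as discussed further down):

```
# list Ollama's models through the webui backend's reverse proxy;
# the browser/client never needs direct network access to Ollama
curl http://localhost:8080/ollama/api/tags
```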


@nopoz commented on GitHub (Nov 16, 2023):

@tjbck Thanks for the change! While this gets closer to the desired result, I think there might be some problems with this approach.

In order to use `- "host.docker.internal:host-gateway"`, which changes the "Ollama Server URL" to `/ollama/api`, the Ollama API port still needs to be exposed outside the container stack. This is problematic, as users might not want to expose the Ollama API outside the stack at all. I've included a Docker Compose example below.

When two services are in the same stack, they can use the built-in Docker DNS to resolve each other's hostnames. So in the example below, ollama-webui can connect directly to ollama using the "ollama" DNS name. For instance, if you start an interactive shell on the ollama-webui container, you can reach the Ollama API inside the stack at `http://ollama:11434/api` without exposing the API port in the compose file.

If I comment out the exposed port, the web UI's connection to the API fails:

```
version: '3.3'
services:
  ollama:
    container_name: ollama
    volumes:
      - '/your/path:/root/.ollama'
    #ports:
    #  - '11434:11434' # optional - for exposing the API outside the container stack
    restart: unless-stopped
    image: ollama/ollama

  ollama-webui:
    container_name: ollama-webui
    extra_hosts:
      - "host.docker.internal:host-gateway"
    #environment:
    #  - OLLAMA_API_BASE_URL='http://ollama.example.com/api' # optional - for when ollama is on a remote host
    depends_on:
      - ollama
    ports:
      - '8080:8080'
    restart: unless-stopped
    image: ghcr.io/ollama-webui/ollama-webui:main
```

Possible solutions:

  1. An environment variable where the internal hostname can be specified, e.g. `<variable_name>='http://ollama:11434/api'` (see the sketch below)
  2. Reworking the existing `OLLAMA_API_BASE_URL` variable so it can leverage the Docker internal DNS
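
For option 1, the compose fragment would be something like this (the variable name is deliberately left as a placeholder):

```
  ollama-webui:
    environment:
      # placeholder name - "ollama" resolves via Docker's stack-internal DNS
      - <variable_name>=http://ollama:11434/api
```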

Thanks!


@nopoz commented on GitHub (Nov 16, 2023):

Here are some test results I ran, in case they help:

Jump into an interactive shell on the ollama-webui container:

`docker exec -t -i ollama-webui /bin/bash`

Install dig, ping and curl:

```
apt update
apt install dnsutils
apt install iputils-ping
apt install curl
```

DNS for ollama resolves as expected, returning the internal IP of the ollama service inside the stack:

```
root@2b0c83a687fd:/app/backend# dig ollama

; <<>> DiG 9.11.5-P4-5.1+deb10u9-Debian <<>> ollama
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 5796
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 0

;; QUESTION SECTION:
;ollama.                                IN      A

;; ANSWER SECTION:
ollama.                 600     IN      A       172.22.0.2

;; Query time: 0 msec
;; SERVER: 127.0.0.11#53(127.0.0.11)
;; WHEN: Thu Nov 16 23:07:35 UTC 2023
;; MSG SIZE  rcvd: 46
```

Pinging ollama works:

```
root@2b0c83a687fd:/app/backend# ping -c 1 ollama
PING ollama (172.22.0.2) 56(84) bytes of data.
64 bytes from ollama.ollama_default (172.22.0.2): icmp_seq=1 ttl=64 time=0.034 ms

--- ollama ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms
```

A curl test to the API works:

```
root@2b0c83a687fd:/app/backend# curl -X POST http://ollama:11434/api/generate -d '{
  "model": "mistral",
  "prompt":"Why is the sky blue?"
}'
<expected json output>
```

@tjbck commented on GitHub (Nov 17, 2023):

This looks promising, I'll take a look and update the docker compose + backend files accordingly. In the meantime, a PR is always welcome. Thanks!


@tjbck commented on GitHub (Nov 17, 2023):

Now that I'm taking a second look, you should be able to leverage the `OLLAMA_API_BASE_URL` environment variable to connect to Ollama without exposing the port externally. Could you please try setting the environment variable to `OLLAMA_API_BASE_URL=http://ollama:11434/api` and see if it functions as you intended?

Regardless of your result, I'll still update the docker compose file for enhanced security. Thanks!


@nopoz commented on GitHub (Nov 17, 2023):

Yes, that does seem to work! Thank you.

Here is the compose file I used:

```
version: '3.3'
services:
  ollama:
    container_name: ollama
    volumes:
      - "/your/path:/root/.ollama"
    restart: unless-stopped
    image: ollama/ollama

  ollama-webui:
    container_name: ollama-webui
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    depends_on:
      - ollama
    ports:
      - 8080:8080
    restart: unless-stopped
    image: ghcr.io/ollama-webui/ollama-webui:main
```
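
For anyone verifying a similar setup, the same in-stack check from the earlier test comment applies (a sketch; curl isn't in the image by default, hence the install step):

```
# open a shell in the webui container
docker exec -t -i ollama-webui /bin/bash
# inside the container: install curl, then hit Ollama over the stack-internal DNS name
apt update && apt install -y curl
curl http://ollama:11434/api/tags   # lists installed models if the connection works
```
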
Reference: github-starred/open-webui#27459