docker compose webui connection issue #15

Closed
opened 2025-11-11 14:02:06 -06:00 by GiteaMirror · 10 comments
Owner

Originally created by @chymian on GitHub (Oct 30, 2023).

Originally assigned to: @tjbck on GitHub.

Describe the bug
The WebUI has connection problems and does not show models when the Ollama server runs in Docker.
This is true whether the WebUI runs in Docker or from the CLI.

To Reproduce
Steps to reproduce the behavior:
docker-compose.yaml

version: '3.6'

services:
  ollama-api:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    image: ollama/ollama:latest
    pull_policy: always
    container_name: ollama
    tty: true # enable colorized logs
    restart: unless-stopped
    environment:
      - OLLAMA_ORIGINS="*"
    ports:
      - 11434:11434
    volumes:
      - /var/lib/ollama:/root
      - /srv/models/ollama:/root/.ollama/models


  ollama-webui:
    restart: unless-stopped
    build:
      context: .
      args:
        OLLAMA_API_BASE_URL: 'http://ollama-api:11434/api'
      dockerfile: Dockerfile
    image: ollama-webui:latest
    container_name: ollama-webui
    extra_hosts:
      - "host.docker.internal:host-gateway"
    ports:
      - 3000:8080

Expected behavior
The WebUI connects to ollama-api via internal Docker routing.

Server:

  • OS: Ubuntu 22.04
  • Browser: ungoogled-chromium

Additional context
During setup and build I have permuted many potential URLs.
All of them (localhost, 0.0.0.0, VPN IP) fail the connection test, except the LAN IP.
But even that shows no models.

ollama list works normally.
curl from another host via the VPN also works.

in the browser console log:
IP: 10.11.1.x is VPN

Access to fetch at 'http://10.11.1.17:11434/api/tags' from origin 'http://gulag:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
start.48f4feda.js:1     GET http://10.11.1.17:11434/api/tags net::ERR_FAILED
window.fetch @ start.48f4feda.js:1
$ @ 2.23ce0af1.js:316
h @ 2.23ce0af1.js:312
M @ 2.23ce0af1.js:316
2.23ce0af1.js:316 TypeError: Failed to fetch
    at window.fetch (start.48f4feda.js:1:1402)
    at $ (2.23ce0af1.js:316:30653)
    at h (2.23ce0af1.js:312:40212)
    at HTMLButtonElement.M (2.23ce0af1.js:316:343)
2.23ce0af1.js:316 null
192.168.178.17:11434/api/tags:1     Failed to load resource: net::ERR_CONNECTION_TIMED_OUT
2.23ce0af1.js:316 TypeError: Failed to fetch
    at window.fetch (start.48f4feda.js:1:1402)
    at $ (2.23ce0af1.js:316:30653)
    at 2.23ce0af1.js:316:26530
    at v (scheduler.c37d1d9b.js:1:101)
    at Array.map (<anonymous>)
    at index.43b4ac03.js:4:2077
    at z (scheduler.c37d1d9b.js:1:1869)
    at _e (index.43b4ac03.js:4:3168)
    at new oe (app.4662c1c7.js:1:5070)
    at Pe (start.48f4feda.js:1:8373)
2.23ce0af1.js:316 null

IP: 192.168.178.x is LAN

start.48f4feda.js:1     GET http://192.168.178.17:11434/tags net::ERR_CONNECTION_TIMED_OUT
window.fetch @ start.48f4feda.js:1
$ @ 2.23ce0af1.js:316
h @ 2.23ce0af1.js:312
M @ 2.23ce0af1.js:316
2.23ce0af1.js:316 TypeError: Failed to fetch
    at window.fetch (start.48f4feda.js:1:1402)
    at $ (2.23ce0af1.js:316:30653)
    at h (2.23ce0af1.js:312:40212)
    at HTMLButtonElement.M (2.23ce0af1.js:316:343)
2.23ce0af1.js:316 null
2.23ce0af1.js:316 []
2.23ce0af1.js:316 2e01853d-2963-421b-8629-e4cfef86baca

@tjbck commented on GitHub (Oct 30, 2023):

Hi, looks like you're facing a CORS error. Could you docker exec into the Ollama container and make sure your environment variable has been registered? Additionally, if both Ollama and Ollama WebUI are running on the same machine, you don't need to add the OLLAMA_API_BASE_URL build argument, nor provide extra_hosts. Keep us updated. Thanks.
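The suggested check can be sketched like this; `ollama` is the container name from the compose file above, and the second command only exercises the same grep filter locally against a fake env dump:

```shell
# Inside the running container (container_name: ollama in the compose file):
#   docker exec ollama env | grep '^OLLAMA'
#
# The same filter, sanity-checked locally against a fake env dump;
# it keeps the two OLLAMA_* lines and drops everything else:
printf 'OLLAMA_ORIGINS="*"\nOLLAMA_HOST=0.0.0.0\nPATH=/usr/bin\n' | grep '^OLLAMA'
```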


@chymian commented on GitHub (Oct 30, 2023):

You mean whether they are in the environment within the container?

env 
…
OLLAMA_ORIGINS="*"
OLLAMA_HOST=0.0.0.0
…

Yes, they are.

I changed the compose file back to no build-args and no extra_hosts, with no change.

If I understand it right: if it's Ollama's CORS blocking, then I should not be able to curl from the same/another host?

And then I don't understand why it's not throwing the CORS error when using the LAN IP.


@tjbck commented on GitHub (Oct 30, 2023):

A CORS error only occurs in a browser environment, and therefore does not affect API calls made with curl.

Also, the second log you provided doesn't seem to include '/api' (it should be 'http://192.168.178.17:11434/api/tags') at the end, which would cause a connection problem, not a CORS error.

If the environment variables were in fact correctly registered with the Ollama container, there shouldn't be an issue. Could you try running both the Ollama and Ollama WebUI containers separately and see if that fixes your issue?


@chymian commented on GitHub (Oct 30, 2023):

Could you try running both Ollama and Ollama WebUI containers separately and see if that fixes your issue?

I did run them separately again, with no change.
I've played around with CORS a bit:

Server (gulag):

  • LAN 192.168.178.17
  • VPN 10.11.1.17

Laptop/browser
only via VPN 10.11.1.11

When I add:
OLLAMA_ORIGINS="http://gulag:*,http://10.11.1.17:*,http://10.11.1.11:*"
and reset the field in the settings to http://gulag:11434/api,

Docker logs:
ollama-api | [GIN] 2023/10/30 - 18:05:24 | 403 | 7.953µs | 10.11.1.11 | OPTIONS "/api/tags"

browser console:
Access to fetch at 'http://gulag:11434/api/tags' from origin 'http://gulag:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled

So, in contrast to the LAN IP, which has no connection (not routed from/to the laptop), the VPN address at least reaches Ollama.
But neither explicit nor implicit CORS allowance works.
Looking up the error message, it seems that Ollama does not respond with the right header?

CORS error only occurs in a browser environment, therefore would not affect api calls using curl.

Ollama is answering if I directly browse 'http://gulag:11434/api/tags' from my laptop.


@chymian commented on GitHub (Oct 30, 2023):

In addition, there is a difference in the logged requests:

Via the browser, a GET is requested and answered:

ollama-api    | [GIN] 2023/10/30 - 18:32:52 | 200 |    1.565883ms |      10.11.1.11 | GET      "/api/tags"

Via the WebUI, an OPTIONS request is sent and blocked by CORS:

ollama-api    | [GIN] 2023/10/30 - 18:32:22 | 403 |       5.663µs |      10.11.1.11 | OPTIONS  "/api/tags"

Don't know whether that helps.


@tjbck commented on GitHub (Oct 30, 2023):

ollama is answering, if I directly browse 'http://gulag:11434/api/tags' from my laptop.

If you're browsing directly to the Ollama server, it will not cause a CORS error because the request is from the same origin.

More info on CORS errors here: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS/Errors

The problem is likely caused by your VPN setup, as most people running the WebUI don't seem to be experiencing your issue.

I did run them seperatey again, with no change.

Please try running Ollama as instructed here (https://github.com/ollama-webui/ollama-webui#accessing-ollama-web-interface-over-lan) from your localhost:

docker run -d -v ollama:/root/.ollama -p 11434:11434 -e OLLAMA_ORIGINS="*" --name ollama ollama/ollama
docker build --build-arg OLLAMA_API_BASE_URL='' -t ollama-webui .
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui

If you're running both Ollama and Ollama WebUI from your localhost, it's guaranteed to work.


@chymian commented on GitHub (Oct 30, 2023):

from the link you provided

Two URLs have the same origin if the protocol, port (if specified), and host are the same for both.

I think there is a misunderstanding:

ollama is answering, if I directly browse 'http://gulag:11434/api/tags' from my laptop.

If you're directly browsing to the Ollama server, it will not cause a CORS error because it would be from the same origin.

  • my laptop only browses via VPN (I'm currently in Asia)
  • the headless server (gulag), a different host, runs the two Docker images (in Germany)

So why would my laptop be same-origin with the server, when they only have the protocol in common?
And then why the CORS error only via the WebUI, and not when browsing the ollama-api directly?
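For what it's worth, the same-origin rule quoted above can be sketched in shell. This is a deliberately simplified parser (it ignores default ports, userinfo, and IPv6 hosts), using URLs from this thread:

```shell
# Extract scheme://host[:port] from a URL (simplified illustration only).
origin() {
  url=$1
  scheme=${url%%://*}    # part before ://
  rest=${url#*://}       # part after ://
  hostport=${rest%%/*}   # strip the path
  printf '%s://%s\n' "$scheme" "$hostport"
}

origin 'http://gulag:3000/'          # the WebUI page  -> http://gulag:3000
origin 'http://gulag:11434/api/tags' # the Ollama API  -> http://gulag:11434
# Same scheme and host, but different ports, so the WebUI's fetch to port
# 11434 is cross-origin and triggers a CORS preflight. Typing the API URL
# into the address bar is a top-level navigation, not a cross-origin fetch,
# which is why browsing it directly never hits CORS.
```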

Please try running Ollama as instructed here from your localhost:

That was where I started, with no success, so I wrote the compose file…

It's late in the night here; I will tackle that again tomorrow.
Thank you very much for your help so far, @tjbck


@tjbck commented on GitHub (Oct 30, 2023):

Good news! I just added a compose.yaml file to the repo and it seems to work!

I just replaced

environment:
      - OLLAMA_ORIGINS="*"

with this:

environment:
      - 'OLLAMA_ORIGINS=*'
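For context on why this change works: in the list form of `environment:`, everything after the first `=` is taken as the literal value, so in the original entry the double quotes likely became part of the value and Ollama compared origins against `"*"` rather than `*`. A minimal local sketch of the difference:

```shell
# Everything after the first `=` in a list-style env entry is the raw value.
bad='OLLAMA_ORIGINS="*"'   # quotes survive into the value: "*"
good='OLLAMA_ORIGINS=*'    # value is just: *
printf 'bad value:  %s\n' "${bad#*=}"
printf 'good value: %s\n' "${good#*=}"
```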

@chymian commented on GitHub (Oct 31, 2023):

I can confirm that it's working.
Thank you very much.

I just loaded codebooga and it was endlessly producing garbage. Is there a way to stop it without restarting the container?


@tjbck commented on GitHub (Nov 2, 2023):

A stop-generation button has been implemented with #48. Thanks.

Reference: github-starred/open-webui#15