Different images, same container #6

Closed
opened 2025-11-11 14:01:41 -06:00 by GiteaMirror · 4 comments

Originally created by @loranger on GitHub (Oct 22, 2023).

Originally assigned to: @tjbck on GitHub.

Hi,

I'm trying to run ollama and (this awesome) ollama-webui side by side using docker compose, but I'm failing.
The aim is to run ollama as an API on a dedicated hostname routed by the traefik reverse proxy, and the UI on another dedicated hostname, also routed by traefik.

Here is my env file. `*.docker` currently resolves locally to 127.0.0.1, since I would like to get this working before using a real public hostname and TLD.

`.env`

```
APP_PROJECT=ollama
APP_DOMAIN=ollama.docker

OLLAMA_HOST=0.0.0.0
OLLAMA_ORIGINS=*
OLLAMA_ENDPOINT="http://api.${APP_DOMAIN}"
```

And here is my docker-compose file. The project folder contains an `ollama` folder (where SSH keys and models are stored) and an `ollama-webui` folder, which is a clone of this repository.

`docker-compose.yml`

```yaml
version: '3'
services:

  ollama:
    container_name: ${APP_PROJECT}-api
    hostname: ${APP_PROJECT}-api
    image: ollama/ollama
    env_file:
      - .env
    volumes:
      - ./ollama:/root/.ollama
    command: serve
    entrypoint: ['ollama']
    labels:
      - "traefik.http.routers.${APP_PROJECT}.rule=Host(`api.${APP_DOMAIN}`)"
      - "traefik.http.services.${APP_PROJECT}-service.loadbalancer.server.port=11434"

  ollama-webui:
    container_name: ${APP_PROJECT}-webui
    image: ${APP_PROJECT}-webui
    build:
      context: ./ollama-webui/
      dockerfile: Dockerfile
    env_file:
      - .env
    labels:
      - "traefik.http.routers.${APP_PROJECT}-webui.rule=Host(`${APP_DOMAIN}`)"
      - "traefik.http.services.${APP_PROJECT}-webui-service.loadbalancer.server.port=3000"

networks:
  default:
    name: traefik-network
    external: true
```

Everything seems to run as I get a response from ollama:

```shell
$ curl http://api.ollama.docker/
Ollama is running
$ curl http://api.ollama.docker/api/tags
{"models":[{"name":"llama2:latest","modified_at":"2023-10-22T09:27:20.059632774Z","size":3791737648,"digest":"7da22eda89ac1040639e351c0407c590221d8bc4f5ccdf580b85408d024904a3"},{"name":"mistral:latest","modified_at":"2023-10-22T09:37:00.598077313Z","size":4108916688,"digest":"8aa307f73b2622af521e8f22d46e4b777123c4df91898dcb2e4079dc8fdf579e"}]}
```

And ollama-webui displays its user interface, but it cannot reach ollama through OLLAMA_ENDPOINT, because it resolves api.ollama.docker to 127.0.0.1:

```
ollama-webui  | http://api.ollama.docker
ollama-webui  | TypeError: fetch failed
ollama-webui  |     at fetch (file:///app/build/shims.js:20346:13)
ollama-webui  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
ollama-webui  |     at async load (file:///app/build/server/chunks/2-91f95e04.js:5:18)
ollama-webui  |     at async load_server_data (file:///app/build/server/index.js:1930:18)
ollama-webui  |     at async file:///app/build/server/index.js:3301:18 {
ollama-webui  |   cause: Error: connect ECONNREFUSED 127.0.0.1:80
ollama-webui  |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1595:16) {
ollama-webui  |     errno: -111,
ollama-webui  |     code: 'ECONNREFUSED',
ollama-webui  |     syscall: 'connect',
ollama-webui  |     address: '127.0.0.1',
ollama-webui  |     port: 80
ollama-webui  |   }
ollama-webui  | }
```

I tried adding the host-gateway directive:

```yaml
  ollama-webui:
    ...
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

but it did not work.
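(For reference, `extra_hosts` only adds an `/etc/hosts` entry pointing at the Docker host; for that route to help, the endpoint would also have to reference that name, and something would actually have to be listening on the chosen port on the host. A hypothetical sketch, assuming Ollama's port 11434 were published on the host:)

```yaml
  ollama-webui:
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      # Hypothetical: only works if port 11434 is actually reachable on the host
      - OLLAMA_ENDPOINT=http://host.docker.internal:11434
```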

I tried different endpoints, using the service name, container_name, or container hostname directly, but nothing works.

```
ollama-webui  | ollama-api
ollama-webui  | TypeError: Failed to parse URL from ollama-api/api/tags
ollama-webui  |     at fetch (file:///app/build/shims.js:20346:13)
ollama-webui  |     at async load (file:///app/build/server/chunks/2-91f95e04.js:5:18)
ollama-webui  |     at async load_server_data (file:///app/build/server/index.js:1930:18)
ollama-webui  |     at async file:///app/build/server/index.js:3301:18 {
ollama-webui  |   [cause]: TypeError: Invalid URL
ollama-webui  |       at new URL (node:internal/url:783:36)
ollama-webui  |       at new Request (file:///app/build/shims.js:13465:22)
ollama-webui  |       at fetch (file:///app/build/shims.js:14461:22)
ollama-webui  |       at fetch (file:///app/build/shims.js:20344:20)
ollama-webui  |       at load (file:///app/build/server/chunks/2-91f95e04.js:5:24)
ollama-webui  |       at load_server_data (file:///app/build/server/index.js:1930:42)
ollama-webui  |       at file:///app/build/server/index.js:3301:24 {
ollama-webui  |     code: 'ERR_INVALID_URL',
ollama-webui  |     input: 'ollama-api/api/tags'
ollama-webui  |   }
ollama-webui  | }
```
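(The `ERR_INVALID_URL` above is expected: WHATWG-style URL parsers require an absolute URL with a scheme, so a bare `ollama-api/api/tags` is rejected before any network call is attempted. The same rule can be illustrated in Python — just an illustration, not the webui's actual Node code:)

```python
from urllib.parse import urlparse

# A bare "host/path" string has no scheme, so URL parsers treat the whole
# thing as a relative path: no host is recognized at all.
bare = urlparse("ollama-api/api/tags")
print(bare.scheme, bare.netloc, bare.path)  # scheme and netloc are empty

# With an explicit scheme, the host part is parsed as expected.
full = urlparse("http://ollama-api/api/tags")
print(full.scheme, full.netloc, full.path)
```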

```
ollama-webui  | http://ollama-api
ollama-webui  | TypeError: fetch failed
ollama-webui  |     at fetch (file:///app/build/shims.js:20346:13)
ollama-webui  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
ollama-webui  |     at async load (file:///app/build/server/chunks/2-91f95e04.js:5:18)
ollama-webui  |     at async load_server_data (file:///app/build/server/index.js:1930:18)
ollama-webui  |     at async file:///app/build/server/index.js:3301:18 {
ollama-webui  |   cause: Error: connect ECONNREFUSED 172.18.0.6:80
ollama-webui  |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1595:16) {
ollama-webui  |     errno: -111,
ollama-webui  |     code: 'ECONNREFUSED',
ollama-webui  |     syscall: 'connect',
ollama-webui  |     address: '172.18.0.6',
ollama-webui  |     port: 80
ollama-webui  |   }
ollama-webui  | }
```
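(Note that every failed attempt above ends up on port 80 — `ECONNREFUSED ...:80` — while the traefik label in the compose file shows Ollama listening on 11434 inside the network. An endpoint naming the service hostname plus that port would avoid both the loopback resolution and the default-port problem; a sketch, reusing the hostname from the compose file above:)

```
OLLAMA_ENDPOINT="http://ollama-api:11434"
```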

Can you help? Do you have any clue for me, please?

For debugging purposes, I run everything locally on my MacBook Pro M1 Max with macOS Sonoma 14.0 and Docker 24.0.6. It will hopefully then run live on Debian 11.8 with Docker 24.0.4.


@coolaj86 commented on GitHub (Oct 22, 2023):

@loranger Go put a thumbs up or comment on these two related issues over at the ollama repo:

- https://github.com/jmorganca/ollama/issues/874
- https://github.com/jmorganca/ollama/issues/875

Also, as soon as #10 is pulled in (which looks likely to happen within a day or two), you will be able to use the build version in your webserver directly with something like this:

```Dockerfile
CMD npm run build
COPY ./build/ ./ollama-webui/
```

It will also rename `OLLAMA_ENDPOINT` to `OLLAMA_API_BASE_URL`, which is the more conventional naming.

Sorry I can't be of more help understanding the specifics of why the Dockerfile isn't behaving as expected, but from the error messages it appears to be related to Docker rather than to this project directly, so you might find more expertise in a Docker group.

Also, I notice a lot of people turning to Docker these days to do things that are otherwise very simple to do and easy to manage. Do you need all of the complexity of Docker? Or would just setting up the two web services behind a simple reverse proxy + HTTPS solution like [`caddy`](https://webinstall.dev/caddy) work better? And if you set that up, could you not just run the setup script from Docker?


@tjbck commented on GitHub (Oct 22, 2023):

@loranger Just merged the static build PR #10 to main, could you please try again with the latest commit and see if that fixes your issue? It introduces breaking changes so your command should be replaced with the following instead:

```sh
docker build --build-arg OLLAMA_API_BASE_URL='http://api.ollama.docker/api' -t ollama-webui .
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```

Thanks!
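(Since the original setup uses docker compose, the `--build-arg` flag above can also be expressed in the compose file's `build` section — a hypothetical fragment, reusing the arg name from the commands above:)

```yaml
  ollama-webui:
    build:
      context: ./ollama-webui/
      args:
        OLLAMA_API_BASE_URL: "http://api.ollama.docker/api"
```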


@loranger commented on GitHub (Oct 22, 2023):

@coolaj86 Count me in; my thumbs are already on those two issues.
Unfortunately, I'm already hosting a few services using docker and traefik, and I'd rather launch a few new containers than host the files locally and open ports, when all I have to do is declare the host using traefik labels. It keeps apps isolated from each other while letting me use different, even incompatible, environments.

@tjbck Thanks for the quick fix. I changed the webui traefik loadbalancer server port from `3000` to `8080`, ran `docker compose build --build-arg OLLAMA_API_BASE_URL='http://api.ollama.docker/api'`, and now everything works flawlessly 👍🏻
I confess I would have preferred to keep the `OLLAMA_API_BASE_URL` environment variable dynamic rather than having to rebuild the whole ollama-webui image each time the URL changes, but in any case, it works very well. Thanks a lot!


@coolaj86 commented on GitHub (Oct 22, 2023):

> I would have preferred to keep the `OLLAMA_API_BASE_URL` environment variable dynamic

If you use a web server, you can set it to `OLLAMA_API_BASE_URL=/api`, where `/api` is the route matched by the reverse proxy to the Ollama host; then it will behave dynamically (which is the default for websites and web browsers).
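(In the traefik setup from this thread, that relative-path approach could be sketched with a router matching the `/api` prefix on the UI's hostname — hypothetical router/service names, assuming traefik v2 rule syntax:)

```yaml
    labels:
      # Route http://ollama.docker/api/... to the ollama service on its port 11434
      - "traefik.http.routers.ollama-api-path.rule=Host(`ollama.docker`) && PathPrefix(`/api`)"
      - "traefik.http.services.ollama-api-path.loadbalancer.server.port=11434"
```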

Reference: github-starred/open-webui#6