mirror of
https://github.com/open-webui/open-webui.git
synced 2026-03-12 01:54:38 -05:00
Different images, same container #6
Originally created by @loranger on GitHub (Oct 22, 2023).
Originally assigned to: @tjbck on GitHub.
Hi,
I'm trying to run ollama and (this awesome) ollama-webui together in the same docker compose setup, but I'm failing.
The aim is to run ollama as an API on a dedicated hostname routed by the traefik reverse proxy, and the UI on another dedicated hostname, also routed by traefik.
Here is my env file (`.env`). `*.docker` is currently resolved locally to 127.0.0.1, since I would like to make it work before using a real public hostname and TLD.
And here is my docker-compose file (`docker-compose.yml`). The project folder contains an ollama folder (where the SSH keys and models are stored) and an ollama-webui folder, which is a clone of this repository.
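The original compose file was not preserved in this mirror. A minimal sketch of what such a setup might look like is below; the hostnames match the ones mentioned in this thread, but the service names, labels, ports, and volume paths are illustrative assumptions, not the author's actual file:

```yaml
# Hypothetical docker-compose.yml sketch: ollama API + web UI behind traefik.
# Hostnames (api.ollama.docker, ollama.docker) come from this thread; all
# labels, ports, and paths are assumptions.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ./ollama:/root/.ollama        # models and keys persisted here
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.ollama-api.rule=Host(`api.ollama.docker`)"
      - "traefik.http.services.ollama-api.loadbalancer.server.port=11434"

  ollama-webui:
    build: ./ollama-webui             # clone of this repository
    depends_on:
      - ollama
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.ollama-webui.rule=Host(`ollama.docker`)"
      - "traefik.http.services.ollama-webui.loadbalancer.server.port=8080"
```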
Everything seems to run, as I get a response from ollama:
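The response itself was not preserved in this mirror; the check was presumably a request against the API hostname, something like the following (the exact command is an assumption, only the hostname comes from the thread):

```shell
# Hypothetical check: hit the Ollama API through the traefik-routed hostname.
# Ollama's root endpoint normally answers with "Ollama is running".
curl http://api.ollama.docker/
```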
And ollama-webui displays its user interface, but it cannot reach ollama via the OLLAMA_ENDPOINT, as it resolves api.ollama.docker to 127.0.0.1 (and inside the container, 127.0.0.1 points to the webui container itself, not to the host).
I tried adding the host-gateway directive, but it did not work.
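The directive referred to is presumably Compose's `extra_hosts` with the special `host-gateway` value, which maps a hostname to the Docker host's address from inside a container. A sketch, assuming it is attached to the webui service:

```yaml
# Hypothetical snippet: make api.ollama.docker resolve to the Docker host
# from inside the ollama-webui container, instead of to 127.0.0.1.
services:
  ollama-webui:
    extra_hosts:
      - "api.ollama.docker:host-gateway"
```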
I tried different endpoints, calling the service name, the container_name, or the container hostname directly, but nothing works.
Can you help? Do you have any clue for me, please?
For debugging purposes, I am running everything locally on my MacBook Pro (M1 Max) with macOS Sonoma 14.0 and Docker 24.0.6. Hopefully it will then run live on Debian 11.8 with Docker 24.0.4.
@coolaj86 commented on GitHub (Oct 22, 2023):
@loranger Go put a thumbs up or comment on these two related issues over at the ollama repo.

Also, as soon as #10 is pulled in (which looks likely to happen within a day or two), you will be able to use the built version in your webserver directly with something like this:

Also, `OLLAMA_ENDPOINT` is changing to `OLLAMA_API_BASE_URL`, which is the more conventional naming.

Sorry I can't be of more help understanding the specifics of why the Dockerfile isn't behaving as expected, but from the error messages it appears to be related to Docker and this project directly, so you might get more expertise in a Docker group.
Also, I notice a lot of people turning to Docker these days to do things that are otherwise very simple to do and easy to manage. Do you need all of the complexity of Docker? Or would just setting up the two web services with a simple reverse proxy + HTTPS solution like caddy work better? And if you set that up, could you not just run the setup script from Docker?

@tjbck commented on GitHub (Oct 22, 2023):
@loranger Just merged the static build PR #10 to main, could you please try again with the latest commit and see if that fixes your issue? It introduces breaking changes, so your command should be replaced with the following instead:
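The replacement command was not preserved in this mirror. Based on the build-arg that @loranger reports using below, it was presumably along these lines; the image name, ports, and run flags are assumptions:

```shell
# Hypothetical sketch of the post-#10 build-and-run commands; the build-arg
# URL comes from later in this thread, everything else is assumed.
docker build --build-arg OLLAMA_API_BASE_URL='http://api.ollama.docker/api' -t ollama-webui .
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```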
Thanks!
@loranger commented on GitHub (Oct 22, 2023):
@coolaj86 Count me in, my thumbs are already on these two issues.
Unfortunately, I'm already hosting a few services using docker and traefik, and I'd rather launch a few new containers than host the files locally and open ports, when all I have to do is declare the host using traefik labels. It keeps apps isolated from each other while allowing me to use different, even incompatible, environments.
@tjbck Thanks for the quick fix. I changed the webui traefik loadbalancer server port from `3000` to `8080`, ran `docker compose build --build-arg OLLAMA_API_BASE_URL='http://api.ollama.docker/api'`, and now everything works flawlessly 👍🏻

I confess I would have preferred to keep the `OLLAMA_API_BASE_URL` environment variable dynamic rather than having to rebuild the whole ollama-webui image each time the URL changes, but in any case, it works very well. Thanks a lot!

@coolaj86 commented on GitHub (Oct 22, 2023):
If you use a web server, you can set it to `OLLAMA_API_BASE_URL=/api`, where `/api` is the route matched by the reverse proxy to the Ollama host; then it will behave dynamically (which is the default for websites and web browsers, since the browser resolves a relative path against the page's own origin).