[GH-ISSUE #2507] Running Ollama on localnetwork #1465

Closed
opened 2026-04-12 11:22:18 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @Jimmys-Code on GitHub (Feb 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2507

I am building a Python AI project inside a Docker container on my Windows PC. I was wondering if I could run the Ollama server on my Mac and connect to it from the PC, from inside that Docker container, and how to actually achieve this. I'm still new to Python and programming, so any help would be much appreciated. Thanks.

GiteaMirror added the question label 2026-04-12 11:22:18 -05:00

@virt-10 commented on GitHub (Feb 15, 2024):

https://github.com/ollama/ollama/blob/main/docs/faq.md

I don't use Docker, but probably something like this for the Ollama container:

```
docker run -d -v ollama:/root/.ollama -e OLLAMA_HOST="0.0.0.0" -p 11434:11434 --name ollama ollama/ollama
```

If you meant allowing Windows Docker to access Ollama, you need to launch Ollama with OLLAMA_HOST="0.0.0.0" and make sure the port is exposed.

In your Windows Docker setup, you may need to create the container with the host network:
https://docs.docker.com/network/
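As a minimal sketch of the client side: the Ollama tools accept `OLLAMA_HOST` either as `host:port` or as a full URL, so a helper like the one below (my own illustration, not part of any Ollama library) can normalize the variable into a base URL before making requests from inside the container. The default port 11434 matches the server's default.

```python
import os
from urllib.parse import urlparse


def ollama_base_url(default_host="127.0.0.1:11434"):
    """Build a base URL for the Ollama server from the OLLAMA_HOST env var.

    Accepts either "host:port" or a full "scheme://host:port" URL and
    falls back to Ollama's default port 11434 when none is given.
    """
    host = os.environ.get("OLLAMA_HOST", default_host)
    if "://" not in host:
        host = "http://" + host
    parsed = urlparse(host)
    port = parsed.port or 11434
    return f"{parsed.scheme}://{parsed.hostname}:{port}"
```

For example, with `OLLAMA_HOST=192.168.1.50` (a hypothetical LAN address for the Mac) this yields `http://192.168.1.50:11434`.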


@mxyng commented on GitHub (Feb 15, 2024):

If I understand the original issue, you want to serve ollama from macOS without Docker and connect to it on Windows inside a container.

First, on your macOS system you need to allow Ollama to accept requests from any address by binding to 0.0.0.0. See the [FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md) for how to do this on macOS.

Then, in your container, set the base URL to the macOS system's IP address. If you're using the Ollama Python or JS client libraries, setting the environment variable `OLLAMA_HOST` is sufficient. If you're using the API directly, make sure requests are being sent to `http://<macos-address>:11434/`
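The direct-API path above can be sketched with only the standard library, assuming a hypothetical LAN address of `192.168.1.50` for the Mac and an example model name; substitute your machine's actual IP and a model you've pulled.

```python
import json
import urllib.request

# Hypothetical LAN address of the Mac running the Ollama server;
# replace with your machine's actual IP address.
MAC_ADDRESS = "192.168.1.50"
BASE_URL = f"http://{MAC_ADDRESS}:11434"


def generate(prompt, model="llama2"):
    """Send a non-streaming request to Ollama's /api/generate endpoint
    and return the generated text."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires the Mac to be reachable and the model pulled):
# print(generate("Why is the sky blue?"))
```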


@hoyyeva commented on GitHub (Mar 11, 2024):

Hey @JAzco4, big thanks for using Ollama for your project! Resolving this for now, but please let us know if you're still facing any issues; we are happy to help!


Reference: github-starred/ollama#1465