[GH-ISSUE #10122] Running Ollama as a k8s STS with external script as entrypoint to load models #6641

Closed
opened 2026-04-12 18:19:31 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @khteh on GitHub (Apr 4, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10122

I managed to run Ollama as a k8s STS. I am using it for a Python LangChain LLM/RAG application. However, the following Dockerfile `ENTRYPOINT` script, which tries to pull a list of models exported as the `MODELS` env var from the k8s STS manifest, runs into a problem. The Dockerfile has the following `ENTRYPOINT` and `CMD`:

```
ENTRYPOINT ["/usr/local/bin/run.sh"]
CMD ["bash"]
```

`run.sh`:

```
#!/bin/bash
set -x
ollama serve&
sleep 10
models="${MODELS//,/ }"
for i in "${models[@]}"; do \
      echo model: $i  \
      ollama pull $i \
    done
```

k8s logs:

```
+ models=llama3.2
/usr/local/bin/run.sh: line 10: syntax error: unexpected end of file
```
<!-- gh-comment-id:2779008943 --> @khteh commented on GitHub (Apr 4, 2025): https://stackoverflow.com/questions/79554394/running-ollama-as-a-k8s-sts-with-external-script-as-entrypoint-to-load-models

@ghmer commented on GitHub (Apr 5, 2025):

This is not an ollama issue, but an error with your script, per

> /usr/local/bin/run.sh: line 10: syntax error: unexpected end of file

IMHO you are missing some semicolons in your for loop.
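For illustration, the original script has two independent problems: the trailing backslashes join the loop body into one malformed line (hence the `unexpected end of file`), and `models` is assigned as a plain string, so `"${models[@]}"` holds a single element. A minimal corrected sketch (not from the thread; `ollama` is stubbed with an `echo` here so the snippet runs even where the binary is absent):

```shell
#!/bin/bash
set -euo pipefail

# Assumption: MODELS is a comma-separated list such as "llama3.2,mistral",
# injected via the k8s STS manifest.

# Stub `ollama` so this sketch can run where the binary is not installed.
if ! command -v ollama >/dev/null 2>&1; then
  ollama() { echo "(stub) ollama $*"; }
fi

ollama serve &
sleep 1  # give the server a moment to start (the original used 10s)

# Split the comma-separated env var into a real bash array; the original
# assigned a plain string, so the loop ran at most once.
IFS=',' read -ra models <<< "${MODELS:-llama3.2}"

# No trailing backslashes: each command ends its own line (or use `;`).
for m in "${models[@]}"; do
  echo "model: $m"
  ollama pull "$m"
done

# Keep the container alive on the backgrounded server process.
wait
```

Note that `wait` at the end also keeps the container's PID 1 attached to the backgrounded `ollama serve`, which the original script did not do.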


@khteh commented on GitHub (Apr 5, 2025):

Yes, the script is fixed and I am able to run it now, but with the quick-and-dirty fix suggested in https://stackoverflow.com/questions/79554394/running-ollama-as-a-k8s-sts-with-external-script-as-entrypoint-to-load-models

A better solution would be what is suggested in the referenced #5385 and #3369
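As an aside, one k8s-native way to preload models without replacing the image's entrypoint at all is a `postStart` lifecycle hook on the STS container. This is a hedged sketch, not necessarily what the referenced issues propose; the `MODELS` value and the 10s wait are illustrative assumptions:

```yaml
# Hypothetical StatefulSet container snippet: the image's default
# entrypoint starts the server, and the hook pulls models afterwards.
containers:
  - name: ollama
    image: ollama/ollama
    env:
      - name: MODELS
        value: "llama3.2"
    lifecycle:
      postStart:
        exec:
          command:
            - /bin/sh
            - -c
            - |
              sleep 10
              for m in $(echo "$MODELS" | tr ',' ' '); do
                ollama pull "$m"
              done
```

This keeps the container's PID 1 as the server process, so no manual backgrounding or `wait` is needed.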


Reference: github-starred/ollama#6641