[GH-ISSUE #3669] Ollama api port confusion #2260

Closed
opened 2026-04-12 12:32:00 -05:00 by GiteaMirror · 5 comments

Originally created by @17Reset on GitHub (Apr 16, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3669

What is the issue?

After I start two Ollama services on the specified ports (8080: llm_api, 8081: emb_api) and add a local GGUF model to each of them, running `curl http://192.168.18.165:8080/api/tags` to view the models inside llm_api also prints the models from emb_api. This is obviously incorrect.

What did you expect to see?

No response

Steps to reproduce

```
OLLAMA_HOST=192.168.18.165:8080 ollama serve
curl http://192.168.18.165:8080/api/create -d '{
  "name": "smaug_34b_q8",
  "modelfile": "FROM /mnt/Athena/Model/Abacus/llm_quantized/smaug_34b_q8.gguf"
}'
```

```
OLLAMA_HOST=192.168.18.165:8081 ollama serve
curl http://192.168.18.165:8081/api/create -d '{
  "name": "bge-large-en",
  "modelfile": "FROM /mnt/Athena/Model/BAAI/repo/bge-large-en-v1.5-gguf/bge-large-en-v1.5-f16.gguf"
}'
```

Are there any recent changes that introduced the issue?

No response

OS

Linux

Architecture

No response

Platform

No response

Ollama version

ollama version is 0.1.31

GPU

Nvidia

GPU info

No response

CPU

Intel

Other software

No response

GiteaMirror added the bug label 2026-04-12 12:32:00 -05:00

@thinkverse commented on GitHub (Apr 16, 2024):

I don't think this would be considered incorrect. Ollama saves models to a shared location. For Linux that is `/usr/share/ollama/.ollama/models`.

There is an `OLLAMA_MODELS` environment variable you can set, which I just saw in a comment; I'm not sure if it's documented anywhere. @jmorganca, could this be used to set the models directory for each service?

https://github.com/ollama/ollama/blob/f335722275d8184836d1b777f356fa7b0a012ede/server/modelpath.go#L102-L107
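As a rough paraphrase of the lookup order the linked snippet implements (a sketch in shell, not the actual Go source; the function name is mine): `OLLAMA_MODELS` wins when set, otherwise the per-user default applies, which is why every instance sees the same models by default.

```shell
# Sketch of the models-directory resolution (hypothetical helper, not
# real ollama code): an explicit OLLAMA_MODELS overrides the default.
models_dir() {
  if [ -n "${OLLAMA_MODELS:-}" ]; then
    printf '%s\n' "$OLLAMA_MODELS"
  else
    printf '%s\n' "$HOME/.ollama/models"
  fi
}

models_dir                                  # prints "$HOME/.ollama/models"
OLLAMA_MODELS=/srv/emb_models models_dir    # prints /srv/emb_models
```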


@thinkverse commented on GitHub (Apr 16, 2024):

Tested setting the `OLLAMA_MODELS` environment variable, and with that each service has its own model directory.

When the services were first created, each `/api/tags` had no models.

<img width="1120" alt="Screenshot 2024-04-16 at 13 40 09" src="https://github.com/ollama/ollama/assets/2221746/96a15b96-f1ce-4247-9226-3f8050e8c3b0">

Afterward, I created a model from Ollama's default shared directory into only one service.

<img width="1206" alt="Screenshot 2024-04-16 at 13 42 38" src="https://github.com/ollama/ollama/assets/2221746/036cf028-018d-47e8-81ec-aca0ca916d83">

And when I later checked `/api/tags`, only one service had a model in its directory.

<img width="1158" alt="Screenshot 2024-04-16 at 13 44 17" src="https://github.com/ollama/ollama/assets/2221746/ac4faae4-a9bc-45cf-8a91-237d8c6004b6">

If this is your desired result, set the `OLLAMA_MODELS` environment variable when creating your service so that each service has its own models directory, for instance:

```bash
OLLAMA_HOST=127.0.0.1:8081 OLLAMA_MODELS=~/Ollama/emb_models ollama serve
```
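Applied to the setup in the original report, that looks roughly like the sketch below (directory names are illustrative, and the `ollama serve` lines are shown as comments since they start long-running servers):

```shell
# Give each instance its own models directory (paths are examples).
BASE="${TMPDIR:-/tmp}/ollama-demo"
mkdir -p "$BASE/llm_models" "$BASE/emb_models"

# Each instance then gets its own bind address *and* its own storage,
# so /api/tags on one port no longer lists the other's models:
#   OLLAMA_HOST=192.168.18.165:8080 OLLAMA_MODELS="$BASE/llm_models" ollama serve
#   OLLAMA_HOST=192.168.18.165:8081 OLLAMA_MODELS="$BASE/emb_models" ollama serve

ls "$BASE"
```

Note that models created under the default shared directory are not visible to an instance pointed at a fresh `OLLAMA_MODELS` path; they would need to be re-created (or copied) there.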

@dims commented on GitHub (Apr 16, 2024):

Duplicate of https://github.com/ollama/ollama/issues/3666


@pdevine commented on GitHub (Apr 16, 2024):

This is expected behaviour. You can use the `OLLAMA_MODELS` env variable to change the location for each instance.
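For Linux installs that run Ollama under systemd, one way to apply this per instance is a drop-in override (a sketch only; the unit name and paths here are assumptions, not from the thread):

```ini
# /etc/systemd/system/ollama-emb.service.d/override.conf  (hypothetical path)
[Service]
Environment="OLLAMA_HOST=127.0.0.1:8081"
Environment="OLLAMA_MODELS=/srv/ollama/emb_models"
```

followed by `systemctl daemon-reload` and a restart of the unit.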


@17Reset commented on GitHub (Apr 17, 2024):

Thanks, I was the one who didn't read the documentation carefully.

Reference: github-starred/ollama#2260