[GH-ISSUE #10136] Where is the log file and how can I configure the location of it? #6650

Closed
opened 2026-04-12 18:20:33 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @khteh on GitHub (Apr 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10136

I checked https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md, but no, the log file is NOT there:

```
root@ollama-0:/# find . -name "*.log" -print
./var/log/bootstrap.log
./var/log/dpkg.log
./var/log/alternatives.log
./var/log/apt/history.log
./var/log/apt/term.log
```

It seems to be logging to the console instead of to a file?

GiteaMirror added the question label 2026-04-12 18:20:33 -05:00

@khteh commented on GitHub (Apr 5, 2025):

I run the ollama container in a local k8s cluster, using `/api/version` for the probes. How can I skip the probe logs?

```
[ollama-0 ollama] [GIN] 2025/04/05 - 03:25:11 | 200 |      41.629µs |   192.168.0.149 | GET      "/api/version"
```

@ghmer commented on GitHub (Apr 5, 2025):

> https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md No, it is NOT there:

From the above link:

> When you run Ollama in a container, the logs go to stdout/stderr in the container

No clue if you can suppress these messages, but I don't think so. If you explained your use case or your issue, someone might help you figure out a solution/workaround.


@khteh commented on GitHub (Apr 5, 2025):

This is not good for an EFK logging stack. Fluentd can filter out logs based on patterns, so this is a feature enhancement request.

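In the meantime, Fluentd can already drop the probe lines before they reach Elasticsearch, using its `grep` filter plugin's exclude rule. A minimal sketch; the tag pattern `kubernetes.**` and record key `log` are assumptions that depend on how your Fluentd/Kubernetes pipeline is configured:

```
<filter kubernetes.**>
  @type grep
  <exclude>
    key log
    pattern /GET\s+"\/api\/version"/
  </exclude>
</filter>
```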

@khteh commented on GitHub (Apr 7, 2026):

https://github.com/ollama/ollama/issues/15387


@rick-github commented on GitHub (Apr 7, 2026):

![Image](https://github.com/user-attachments/assets/729e36e6-524d-41dc-b8e3-5c7debb2bed5)

@khteh commented on GitHub (Apr 7, 2026):

I suggest logging to a physical file so that we have the choice of using a sidecar (Fluentd, for instance) to pick it up for downstream processing of the logs. That way, the logs stay on the filesystem and can be rotated.


@rick-github commented on GitHub (Apr 7, 2026):

In docker, ollama output is logged to a physical file managed by the container manager, and `docker logs` reads from that file. If your container manager doesn't do that, you could add `2>&1 | tee /var/log/ollama.log` to your run script.

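The `tee` approach can also be combined with a `grep -v` filter so the probe lines never reach the file at all. A minimal sketch; the GIN lines are simulated with `printf` here as a stand-in for the real `ollama serve 2>&1` stream, and `/tmp/ollama.log` is a placeholder path:

```shell
# Simulated ollama access log (stand-in for: ollama serve 2>&1).
# Drop health-probe lines, then persist the rest to a rotatable file.
printf '%s\n' \
  '[GIN] 2025/04/05 - 03:25:11 | 200 | 41.629µs | 192.168.0.149 | GET "/api/version"' \
  '[GIN] 2025/04/05 - 03:25:30 | 200 | 1.2s | 192.168.0.149 | POST "/api/generate"' \
  | grep -v '"/api/version"' \
  | tee /tmp/ollama.log
```

With a live stream you would want `grep --line-buffered` and `tee -a` so lines are flushed as they arrive and the file is appended to rather than truncated on restart.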

Reference: github-starred/ollama#6650