[GH-ISSUE #6565] Does ollama have the feature to save model response in log file? #29892

Closed
opened 2026-04-22 09:13:35 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @keezen on GitHub (Aug 30, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6565

OS: Linux
ollama version: 0.3.7-rc5
model: starcoder2:3b

I am deploying ollama for code completion and have set OLLAMA_DEBUG=1, but the log file only saves the model request, not the model response for the completion.
Does ollama have a feature to save the model response in the log file?

Here are the log fragments from ollama serve:

[GIN] 2024/08/30 - 10:28:57 | 200 | 82.985212ms | 127.0.0.1 | POST "/v1/completions"
time=2024-08-30T10:28:57.412+08:00 level=DEBUG source=sched.go:403 msg="context for request finished"
time=2024-08-30T10:28:57.414+08:00 level=DEBUG source=sched.go:334 msg="runner with non-zero duration has gone idle, adding timer" modelPath=/home/kas/.ollama/models/blobs/sha256-28bfdfaeba9f51611c00ed322ba684ce6db076756dbc46643f98a8a748c5199e duration=5m0s
time=2024-08-30T10:28:57.414+08:00 level=DEBUG source=sched.go:352 msg="after processing request finished event" modelPath=/home/kas/.ollama/models/blobs/sha256-28bfdfaeba9f51611c00ed322ba684ce6db076756dbc46643f98a8a748c5199e refCount=0
time=2024-08-30T10:28:59.157+08:00 level=DEBUG source=sched.go:571 msg="evaluating already loaded" model=/home/kas/.ollama/models/blobs/sha256-28bfdfaeba9f51611c00ed322ba684ce6db076756dbc46643f98a8a748c5199e
DEBUG [process_single_task] slot data | n_idle_slots=4 n_processing_slots=0 task_id=619 tid="140186734161920" timestamp=1724984939
time=2024-08-30T10:28:59.160+08:00 level=DEBUG source=routes.go:211 msg="generate request" prompt="xxxx"
time=2024-08-30T10:34:20.685+08:00 level=DEBUG source=sched.go:334 msg="runner with non-zero duration has gone idle, adding timer" modelPath=/home/kas/.ollama/models/blobs/sha256-28bfdfaeba9f51611c00ed322ba684ce6db076756dbc46643f98a8a748c5199e duration=5m0s

GiteaMirror added the feature request label 2026-04-22 09:13:35 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 30, 2024):

ollama doesn't log the response. This is normally done with external passive tools like tcpflow or tcpdump. You could also run a proxy to intercept the traffic and log it:

```shell
# start ollama listening on a different port
OLLAMA_HOST=localhost:11435 ollama serve
# run a process on the original port that forwards traffic to ollama and records it in a file
! [ -p pipe ] && mknod pipe p
nc -ln -p 11434 < pipe | tee -a traffic-in.log | nc localhost 11435 | tee -a traffic-out.log > pipe
```
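If `socat` is installed, the pipe-and-`nc` pair above can be collapsed into a single command (a sketch under the same assumption that ollama was started on port 11435; `socat -v` writes both directions of the forwarded traffic to stderr):

```shell
# listen on the default ollama port and forward to the relocated server;
# -v dumps all traffic passing through to stderr, captured here in a log file
socat -v TCP-LISTEN:11434,reuseaddr,fork TCP:localhost:11435 2> traffic.log
```

Unlike the `nc` version, this handles multiple concurrent connections (`fork`), at the cost of interleaving both directions in one log.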

Author
Owner

@keezen commented on GitHub (Sep 2, 2024):

ollama doesn't log the response. This is normally done with external passive tools like tcpflow or tcpdump. You could also run a proxy to intercept the traffic and log it:

```shell
# start ollama listening on a different port
OLLAMA_HOST=localhost:11435 ollama serve
# run a process on the original port that forwards traffic to ollama and records it in a file
! [ -p pipe ] && mknod pipe p
nc -ln -p 11434 < pipe | tee -a traffic-in.log | nc localhost 11435 | tee -a traffic-out.log > pipe
```

Thanks for your reply!


Reference: github-starred/ollama#29892