[GH-ISSUE #10975] Log request/response payload content #53745

Closed
opened 2026-04-29 04:39:30 -05:00 by GiteaMirror · 2 comments

Originally created by @phucly01 on GitHub (Jun 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10975

I tried to troubleshoot a tool-calling problem and wanted to see the payload contents of the HTTP request. I have turned on DEBUG but it doesn't really help. The log only shows the HTTP status code and URI.

It would be helpful if there were a way to turn on detailed logging where HTTP requests and responses are printed in the log along with their payload contents.

GiteaMirror added the feature request label 2026-04-29 04:39:31 -05:00

@rick-github commented on GitHub (Jun 5, 2025):

Setting `OLLAMA_DEBUG=2` will show the prompt as well.

Logging HTTP requests and responses is usually done with an external tool. For example, `mitmdump`:

```yaml
services:
  ollama-backend:
    image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest}
    volumes:
      - ${OLLAMA_MODELS-./ollama}:/root/.ollama
    environment:
      - OLLAMA_KEEP_ALIVE=${OLLAMA_KEEP_ALIVE--1}
      - OLLAMA_DEBUG=${OLLAMA_DEBUG-2}

  ollama-mitmproxy:
    image: mitmproxy/mitmproxy
    command: [ "/usr/local/bin/mitmdump", "--flow-detail", "4", "--mode", "reverse:http://ollama-backend:11434" ]
    ports:
      - 11434:8080
```

I use docker here as an environment management tool, but this can be run from the command line as well. The downside is that mitmproxy buffers the output, so the streaming effect is lost.

In non-Windows environments, `nc` can be used as a MITM proxy to record traffic:

```sh
# start ollama listening on a different port
OLLAMA_HOST=localhost:11435 ollama serve
# run a process on the original port that forwards traffic to ollama and records it in files
! [ -p pipe ] && mknod pipe p
nc -ln -p 11434 < pipe | tee -a traffic-in.log | nc localhost 11435 | tee -a traffic-out.log > pipe
```
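
The same tee-proxy idea can be sketched in a few lines of stdlib Python, which avoids `nc` portability quirks (some variants reject `-l` with `-p`). This is a minimal sketch, not Ollama functionality; the port numbers and log file names just mirror the `nc` example above:

```python
import socket
import threading

def pump(src, dst, log_path):
    """Copy bytes from src to dst, appending a copy to log_path."""
    with open(log_path, "ab") as log:
        while True:
            data = src.recv(4096)
            if not data:
                break
            log.write(data)
            log.flush()          # keep the log readable while streaming
            dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)  # propagate EOF to the other side
    except OSError:
        pass

def serve(listen_port, upstream_host, upstream_port):
    """Accept clients on listen_port and tee traffic to/from the upstream."""
    server = socket.create_server(("127.0.0.1", listen_port))
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection((upstream_host, upstream_port))
        threading.Thread(target=pump, args=(client, upstream, "traffic-in.log"),
                         daemon=True).start()
        threading.Thread(target=pump, args=(upstream, client, "traffic-out.log"),
                         daemon=True).start()

# serve(11434, "127.0.0.1", 11435)  # uncomment, with ollama on 11435 as above
```

Because each chunk is logged and forwarded as it arrives, the streaming effect is preserved, unlike the buffered mitmproxy output.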

Other alternatives are `tcpdump` and `tcpflow`, which intercept the traffic without having to set up a proxy. The output of those requires some post-processing.
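
One sketch of such post-processing: split a captured HTTP message into headers and body and pretty-print the body when it is JSON. This assumes the capture has already been reassembled into per-message files (e.g. the `traffic-in.log`/`traffic-out.log` above or `tcpflow` per-connection files); streamed chunked responses will not parse as a single JSON document:

```python
import json

def pretty_body(raw: bytes) -> str:
    """Split a captured HTTP message at the header/body boundary and
    pretty-print the body when it parses as JSON."""
    head, _, body = raw.partition(b"\r\n\r\n")
    try:
        return json.dumps(json.loads(body), indent=2)
    except ValueError:
        # not JSON (empty, or a chunked/streamed body): return it as-is
        return body.decode("utf-8", errors="replace")

# e.g. print(pretty_body(open("traffic-in.log", "rb").read()))
```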


@phucly01 commented on GitHub (Jun 6, 2025):

Thanks, will give it a try. This will be a wild goose chase.


Reference: github-starred/ollama#53745