[GH-ISSUE #5848] The logs do not contain the request content sent by the client. #3648

Closed
opened 2026-04-12 14:25:54 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @H9990HH969 on GitHub (Jul 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5848

To facilitate debugging of the program, I need to see the requests sent to the large model from the frontend. However, I've noticed that the request URLs and contents are not visible in the logs. Where can I find them?

I have deployed DBGPT using Docker.


@rick-github commented on GitHub (Jul 22, 2024):

The request body is not logged. You can see the prompt sent to the inference engine by setting `OLLAMA_DEBUG=1` in the server environment and searching for `msg="generate request"` in the logs. If you want the contents of the HTTP traffic you'll have to use external tools. I usually install `tcpflow` in my ollama container image and use that:

```
$ docker compose exec -it ollama apt install -y tcpflow
$ docker compose exec -it ollama tcpflow -c 'src port 11434 or dst port 11434'
```

Other tools like `ngrep` are also useful.
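For a Compose deployment like the one in the question, the debug flag from the comment above can be set in the service environment. A minimal sketch; the service name and port mapping are assumptions based on a typical ollama compose file:

```yaml
services:
  ollama:                 # service name assumed to match the commands above
    image: ollama/ollama
    environment:
      - OLLAMA_DEBUG=1    # enables verbose logging, including generate requests
    ports:
      - "11434:11434"
```

After restarting the service, searching the output of `docker compose logs ollama` for `msg="generate request"` should show the prompts.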


@konstantin1722 commented on GitHub (Jul 23, 2024):

I used Wireshark. I have ollama deployed on a regular PC, so I used a filter like this to inspect how LangChain composes some prompts: `ip.dst == 192.168.1.55 && tcp.dstport == 11434 && http`. By adjusting the filter slightly, you can intercept requests to the Docker container as well.


@dhiltgen commented on GitHub (Aug 1, 2024):

As mentioned above, for privacy reasons we intentionally do not log the contents of requests on the server by default. If you enable debug logging, much more logging is produced, including the prompts.


Reference: github-starred/ollama#3648