[GH-ISSUE #15829] No Context Memory & Tool Calling Failure with Local Gemma4 via Ollama Docker #72148

Open
opened 2026-05-05 03:32:58 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @OrzHex on GitHub (Apr 27, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15829

What is the issue?

I containerized Ollama using Docker and pulled the gemma4:e2b and gemma4:26b models locally. I connect via port 11434. Both cc-haha (https://github.com/NanmiCoder/cc-haha) and claude-code-tudou (https://github.com/AICoderTudou/claude-code-tudou) connect normally: the web interface loads and regular chat works.

However, two critical issues occur:
1. The conversation has no context memory (the model cannot remember previous messages in the chat).
2. Tool calling / function usage is not working.

How can I fix these problems?

Example of the context loss issue:
You: Hi, I am Tom. I am 8 years old.
AI: Hi Tom! Nice to meet you.
You: How old is Tom?
AI: I am gemma4. I do not have an age.
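For context: Ollama's `/api/chat` endpoint is stateless, so the client (here, the cc-haha or claude-code-tudou proxy) must resend the full conversation history with every request; if the proxy only forwards the latest message, the model sees each turn in isolation, which matches the behavior above. A minimal sketch of the history-building logic a client is expected to perform (model name and the assistant reply are placeholders; no request is actually sent):

```python
# Sketch: Ollama's /api/chat is stateless, so the client must resend
# the entire message history on every call. This demonstrates only the
# payload construction; nothing is sent over the network.

def build_request(model, history, new_user_message):
    """Return the JSON payload for the next /api/chat call."""
    messages = history + [{"role": "user", "content": new_user_message}]
    return {"model": model, "messages": messages, "stream": False}

history = []

# Turn 1: the user introduces himself.
payload = build_request("gemma4:e2b", history, "Hi, I am Tom. I am 8 years old.")
# ...POST payload to http://localhost:11434/api/chat and read the reply...
reply = "Hi Tom! Nice to meet you."  # placeholder for the model's actual answer
history = payload["messages"] + [{"role": "assistant", "content": reply}]

# Turn 2: the follow-up question must carry the full history, so the
# model can resolve "Tom" from the earlier turn.
payload = build_request("gemma4:e2b", history, "How old is Tom?")
assert len(payload["messages"]) == 3  # user, assistant, user
```

If the second request reaching Ollama contains only the single "How old is Tom?" message, the bug is in the client/proxy layer, not in Ollama or the model.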

Example of the tool calling issue:
You: Open local file config.json.
AI: I can help you with file-related questions. Please tell me more details.
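For context: tool calling over Ollama's `/api/chat` requires two things: the client must include a `tools` array in the request, and the model's chat template must support tools (whether the gemma4 templates do is not confirmed here). If the proxy strips the `tools` field, or the model lacks tool support, the model falls back to plain-text replies like the one above. A sketch of the expected request shape; `open_file` and its parameters are hypothetical names for illustration, and nothing is sent:

```python
# Sketch: a /api/chat request that advertises a tool to the model.
# The tool name "open_file" and its schema are hypothetical examples;
# this builds the payload only and does not contact a server.

open_file_tool = {
    "type": "function",
    "function": {
        "name": "open_file",  # hypothetical tool name
        "description": "Open a local file and return its contents.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Path of the file to open."},
            },
            "required": ["path"],
        },
    },
}

payload = {
    "model": "gemma4:26b",
    "messages": [{"role": "user", "content": "Open local file config.json."}],
    "tools": [open_file_tool],
    "stream": False,
}

# A tool-capable model is expected to answer with message.tool_calls
# (name plus arguments) instead of free text; the client then executes
# the tool and sends the result back in a "tool" role message.
```

Checking whether the proxied request actually contains the `tools` array (e.g. by logging the request body) would narrow down where tool calling breaks.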

Relevant log output


OS

Docker

GPU

No response

CPU

Intel

Ollama version

0.21.1

GiteaMirror added the bug label 2026-05-05 03:32:58 -05:00

Reference: github-starred/ollama#72148