[GH-ISSUE #9785] Incorrect Context Length Metadata for Gemma 3 Model #6399

Closed
opened 2026-04-12 17:55:29 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @tclm on GitHub (Mar 15, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9785

What is the issue?

I noticed a discrepancy between the context length metadata for the Gemma 3 model in the Ollama model repository and the official documentation from Google.

According to Google's official documentation, Gemma 3 models support a context length of 128K tokens. However, the Ollama model repository lists the context length as only 8K tokens.

This incorrect metadata could mislead users and prevent them from fully utilizing the capabilities of the Gemma 3 model.

Could you please update the context length metadata for the Gemma 3 model to accurately reflect the 128K context window as specified by Google?
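As a possible interim workaround until the metadata is corrected (a sketch, not an official fix — the `gemma3-128k` name here is hypothetical), Ollama's `num_ctx` parameter can override a model's default context window via a custom Modelfile:

```
# Modelfile: build a gemma3 variant with a 128K (131072-token) context window
FROM gemma3
PARAMETER num_ctx 131072
```

The variant can then be built and run with `ollama create gemma3-128k -f Modelfile` followed by `ollama run gemma3-128k`. Note that a larger context window increases memory usage accordingly.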

Thank you for your attention to this matter.

![Image](https://github.com/user-attachments/assets/ce1f2044-c30a-460f-8620-bfda146f53a5)

![Image](https://github.com/user-attachments/assets/890c4ce7-18b4-4261-9d56-83646935f734)

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 17:55:29 -05:00

Reference: github-starred/ollama#6399