[GH-ISSUE #3853] support 128k context length phi3 #2388

Closed
opened 2026-04-12 12:42:22 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @lostmygithubaccount on GitHub (Apr 23, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3853

I might be misinterpreting, but it looks like only the 4k context length phi3 model is supported currently: https://ollama.com/library/phi3

(at least without downloading the weights separately and creating a Modelfile)
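For context, the manual workaround alluded to above would look roughly like this. This is a hedged sketch: the GGUF filename and model name are hypothetical, and `131072` (128k) assumes the weights actually support that window.

```
# Modelfile — path and values are illustrative, not from this thread
FROM ./Phi-3-mini-128k-instruct.gguf
PARAMETER num_ctx 131072
```

Then, assuming the file above: `ollama create phi3-128k -f Modelfile` followed by `ollama run phi3-128k`.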

GiteaMirror added the feature request label 2026-04-12 12:42:22 -05:00

@thinkverse commented on GitHub (Apr 23, 2024):

Ollama has to wait for the upstream llama.cpp backend (https://github.com/ggerganov/llama.cpp/issues/6849#issuecomment-2072860077) to support it first.


@lostmygithubaccount commented on GitHub (Apr 23, 2024):

ah thanks, will follow along there as well

@napa3um commented on GitHub (May 22, 2024):

https://www.reddit.com/r/LocalLLaMA/comments/1cxi14h/phi3_128k_model_support_merged_into_llamacpp/

@mirandadam commented on GitHub (May 22, 2024):

128k context is now available for Phi-3 mini, small, medium, and vision:
https://azure.microsoft.com/en-us/blog/new-models-added-to-the-phi-3-family-available-on-microsoft-azure/

I'm commenting here just to make sure those are taken into consideration, since only mini was available when this issue was first opened.


@hchasens commented on GitHub (May 23, 2024):

> Ollama has to wait for the upstream llama.cpp backend ([ggerganov/llama.cpp#6849 (comment)](https://github.com/ggerganov/llama.cpp/issues/6849#issuecomment-2072860077)) to support it first.

It just got added. With the next Ollama release that pulls from the llama.cpp mainline, we'll get support.


@napa3um commented on GitHub (May 24, 2024):

https://github.com/ollama/ollama/issues/4574#issuecomment-2127189379 - But will this problem go away in 0.1.39? It is still present in 0.1.39-rc1.

@jmorganca commented on GitHub (Jun 4, 2024):

https://ollama.com/library/phi3:mini-128k
https://ollama.com/library/phi3:medium-128k
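For anyone landing here later: given the tags listed above, the 128k variants can presumably be pulled directly, without the Modelfile workaround from the original post.

```shell
# pull and run the 128k-context variants from the Ollama library
ollama pull phi3:mini-128k
ollama run phi3:mini-128k
```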

Reference: github-starred/ollama#2388