[GH-ISSUE #9049] Long system prompt results in the model not answering according to the prompt content. #31650

Closed
opened 2026-04-22 12:18:23 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @netspym on GitHub (Feb 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9049

What is the issue?

Has anyone recently deployed ollama on Ubuntu? I've noticed that no matter which model I use, including qwen, deepseek, and phi4 (fp16 full model), if the system prompt I give it is too long, it doesn't respond according to the content of the system prompt. It only works properly if the system prompt is within 100 characters. Has anyone else encountered this issue? Thanks.

num_ctx is already set to 100000, which should be big enough.
data = {"model": llm_model_name, "messages": messages, "temperature": 0.3, "num_ctx": 1000000, "keep_alive": -1}
data_json = json.dumps(data, ensure_ascii=False)
headers = {'Content-Type': 'application/json'}
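If the request above targets Ollama's native /api/chat endpoint, one likely cause is that num_ctx is being ignored: per the Ollama API, runtime parameters such as num_ctx and temperature belong inside an "options" object, and unknown top-level keys are ignored, so the server falls back to the default context window and a long system prompt can be truncated. A minimal sketch of the corrected payload (the model name and messages below are hypothetical, not from the original report):

```python
import json

# Hypothetical values for illustration; the original issue does not
# include the model name or the messages.
llm_model_name = "qwen2.5"
messages = [
    {"role": "system", "content": "A long system prompt..."},
    {"role": "user", "content": "A question that depends on the system prompt."},
]

def build_chat_payload(model, messages, num_ctx=100000, temperature=0.3):
    """Build a request body for Ollama's native /api/chat endpoint.

    Model parameters such as num_ctx and temperature go inside the
    "options" object; placed at the top level they have no effect.
    """
    return {
        "model": model,
        "messages": messages,
        "keep_alive": -1,
        "options": {"num_ctx": num_ctx, "temperature": temperature},
    }

payload = build_chat_payload(llm_model_name, messages)
data_json = json.dumps(payload, ensure_ascii=False)
```

With this shape, the requested context size actually reaches the model instead of being silently dropped.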

Relevant log output


OS

Linux

GPU

No response

CPU

AMD

Ollama version

0.5.7

GiteaMirror added the bug label 2026-04-22 12:18:23 -05:00

@rick-github commented on GitHub (Feb 12, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will aid in debugging.

Reference: github-starred/ollama#31650