[GH-ISSUE #5660] Ollama 0.2.2 cannot read the system prompt when invoking the API using Python. #50041

Closed
opened 2026-04-28 13:55:45 -05:00 by GiteaMirror · 2 comments

Originally created by @letdo1945 on GitHub (Jul 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5660

Originally assigned to: @jmorganca on GitHub.

What is the issue?

model: qwen2 & glm4
After the Ollama update, when I invoke Ollama through Python, the model is unable to read the system prompt.

```python
import ollama  # official Ollama Python client

def LLM_Process(model, sys_prom, usr_prom):
    messages = [
        {'role': 'user', 'content': usr_prom},
        {'role': 'system', 'content': sys_prom}
    ]
    resp = ollama.chat(model, messages)
    try:
        out = resp['message']['content']
        return out
    except AttributeError:
        # The message may be too long or contain disallowed content
        print("Skipping processing.")
        return None
```
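As a side note, a system message is conventionally placed first in the `messages` list, which also makes a dropped system prompt easier to spot. A minimal sketch of building the chat payload that way (the model name `qwen2` is taken from the report; the actual server call is omitted since it needs a running Ollama instance):

```python
import json

def build_chat_payload(model, sys_prom, usr_prom):
    # System message first, then the user turn, matching the
    # /api/chat request shape used elsewhere in this issue.
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": sys_prom},
            {"role": "user", "content": usr_prom},
        ],
        "stream": False,
    }

payload = build_chat_payload("qwen2", "You speak like a pirate", "Hello!")
print(json.dumps(payload, ensure_ascii=False))
```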

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.2.2

GiteaMirror added the bug label 2026-04-28 13:55:45 -05:00

@rick-github commented on GitHub (Jul 13, 2024):

Concur.

```
$ curl -s ollama:11434/api/version | jq
{
  "version": "0.2.1"
}
$ curl -s ollama:11434/api/chat -d '{"model": "gemma2", "messages": [{"role": "system", "content": "You speak like a pirate"}, {"role": "user", "content": "Hello!"}], "stream": false}' | jq .message
{
  "role": "assistant",
  "content": "Ahoy there, matey!  Shiver me timbers, it be a pleasure to meet ye! What brings ye to these digital waters? 🦜✨"
}
```

```
$ curl -s localhost:11434/api/version | jq
{
  "version": "0.2.2"
}
$ curl -s localhost:11434/api/chat -d '{"model": "gemma2", "messages": [{"role": "system", "content": "You speak like a pirate"}, {"role": "user", "content": "Hello!"}], "stream": false}' | jq .message
{
  "role": "assistant",
  "content": "Hello! 👋  How can I help you today? 😄"
}
```
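The regression in the transcript above can also be flagged programmatically. A rough sketch (a heuristic keyword check, not part of the original report): if the pirate-style system prompt is honored, pirate vocabulary should surface in the reply.

```python
def system_prompt_honored(content, markers=("ahoy", "matey", "shiver")):
    # Heuristic: a "You speak like a pirate" system prompt should
    # produce at least one characteristic pirate word in the reply.
    text = content.lower()
    return any(m in text for m in markers)

print(system_prompt_honored("Ahoy there, matey! Shiver me timbers..."))  # True  (0.2.1 reply)
print(system_prompt_honored("Hello! How can I help you today?"))         # False (0.2.2 reply)
```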

@jmorganca commented on GitHub (Jul 13, 2024):

Hi sorry about this – it has been fixed in https://github.com/ollama/ollama/releases/tag/v0.2.3
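Since the fix shipped in v0.2.3, a quick way to confirm a local install is new enough is to compare the reported version against that release. A sketch, assuming plain `X.Y.Z` strings as returned by `/api/version` in the transcript above:

```python
def version_tuple(v):
    # Parse "0.2.3" (or "v0.2.3") into a comparable tuple of ints.
    return tuple(int(part) for part in v.lstrip("v").split("."))

def has_system_prompt_fix(version):
    # The system-prompt regression was fixed in v0.2.3.
    return version_tuple(version) >= version_tuple("0.2.3")

print(has_system_prompt_fix("0.2.2"))  # False: affected release
print(has_system_prompt_fix("0.2.3"))  # True: contains the fix
```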

Reference: github-starred/ollama#50041