[GH-ISSUE #5012] Seeded API request is returning inconsistent results #28931

Closed
opened 2026-04-22 07:29:24 -05:00 by GiteaMirror · 1 comment

Originally created by @ScreamingHawk on GitHub (Jun 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5012

What is the issue?

When using `seed: 42069` and `temperature: 0.0`, I get consistent results only for the first ~126 characters. When using a seed, I would expect a deterministic result for the entire interaction.

    from ollama import Client, Options
    
    ollama_client = Client(host="http://localhost:11434")
    opts = Options()
    opts["seed"] = 42069
    opts["temperature"] = 0.0
    
    messages = [
        {"role": "user", "content": "Write a description of a cat"},
    ]
    
    res1 = ollama_client.chat(
        model="orca-mini",
        stream=False,
        messages=messages,
        options=opts
    )
    res2 = ollama_client.chat(
        model="orca-mini",
        stream=False,
        messages=messages,
        options=opts
    )
    
    # Fails: the two responses diverge after roughly the first 126 characters
    assert res1["message"]["content"] == res2["message"]["content"]
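To pin down where the two responses diverge, a small helper can report the length of the shared prefix. This is an illustrative diagnostic, not part of the original report; the helper name and the sample strings are placeholders, and in practice you would pass the two chat responses:

```python
def common_prefix_len(a: str, b: str) -> int:
    """Return the number of leading characters shared by a and b."""
    n = 0
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        n += 1
    return n

# Stand-in strings; in the repro above you would call
# common_prefix_len(res1["message"]["content"], res2["message"]["content"])
print(common_prefix_len("The cat sat on the mat", "The cat sat on a rug"))  # -> 15
```

If the reported prefix length is stable across runs (here, consistently ~126), that suggests sampling starts deterministic and drifts at a fixed point rather than varying randomly from the first token.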

OS

Windows

GPU

AMD

CPU

Intel

Ollama version

0.1.42

GiteaMirror added the bug label 2026-04-22 07:29:24 -05:00

@jmorganca commented on GitHub (Jun 14, 2024):

Hi there, sorry about this, I am looking into it. I'll merge with https://github.com/ollama/ollama/issues/4990


Reference: github-starred/ollama#28931