[GH-ISSUE #4293] longtext llama3-gradient bug #28440

Open
opened 2026-04-22 06:37:37 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @bambooqj on GitHub (May 9, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4293

What is the issue?

When I use Ollama for long-text processing, the 'system' prompt is no longer effective; the model produces random output instead. The model is 'llama3-gradient'.

```python
import requests
import ollama

systemmsg = """
Please analyze the type of website based on the 'body' content I provide, and return to me in JSON format with the structure {type:..., why:...}.
"""

def get_webpage_content(url):
    try:
        # Fetch the page; this assignment was missing in the original snippet.
        response = requests.get(url)
        if response.status_code == 200:
            return response.text
        else:
            print(f'Failed to retrieve the webpage. Status code: {response.status_code}')
            return None
    except requests.exceptions.RequestException as e:
        print(f'An error occurred: {e}')
        return None

body = get_webpage_content('http://www.sohu.com')
response = ollama.generate(
    model='llama3-gradient',
    prompt=body,
    format='json',
    options={
        "seed": 123,
        "num_ctx": 32000,
    },
    system=systemmsg,
)
print(response['response'])
```
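One possible factor is that a fetched page can exceed the requested context window, pushing the system prompt out of effect. As a rough mitigation (a sketch, not part of the original report; `chars_per_token` is a crude heuristic, not a real tokenizer), the body can be trimmed to a character budget derived from `num_ctx` before calling `generate`:

```python
def truncate_to_ctx(text, num_ctx, chars_per_token=3):
    """Keep at most num_ctx * chars_per_token characters of text.

    chars_per_token is an assumed rough average; a real tokenizer
    would give an exact count, but this caps obviously oversized input.
    """
    budget = num_ctx * chars_per_token
    return text if len(text) <= budget else text[:budget]

# Hypothetical usage before the generate() call above:
# safe_body = truncate_to_ctx(body, 32000)
```

This does not fix the underlying bug, but it helps distinguish "system prompt ignored on long input" from "input silently overflowed the context window".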

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.34

GiteaMirror added the bug label 2026-04-22 06:37:37 -05:00

Reference: github-starred/ollama#28440