[GH-ISSUE #15517] 0.20.5 Ollama Error: 500 Internal Server Error: memory layout cannot be allocated #71977

Open
opened 2026-05-05 03:13:05 -05:00 by GiteaMirror · 4 comments

Originally created by @Haiwen-Yin on GitHub (Apr 12, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15517

What is the issue?

Error: 500 Internal Server Error: memory layout cannot be allocated

AMD Ryzen AI Max+ 395 128GB (64GB for system/64GB for GPU)

server.log has been uploaded: [server.log](https://github.com/user-attachments/files/26658608/server.log)

Relevant log output


OS

Windows

GPU

AMD

CPU

AMD

Ollama version

0.20.5

GiteaMirror added the bug label 2026-05-05 03:13:05 -05:00

@rick-github commented on GitHub (Apr 12, 2026):

Try setting [`OLLAMA_CONTEXT_LENGTH`](https://docs.ollama.com/context-length) to something smaller than the default of 262144, say 32768.
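
A minimal sketch of that suggestion in PowerShell, assuming the server is launched from a terminal rather than as the installed Windows service; 32768 is just the value suggested above:

```powershell
# Lower the context window, then start the server; the variable only takes
# effect for a server launched after it is set in this session.
$env:OLLAMA_CONTEXT_LENGTH = "32768"
ollama serve
```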


@Haiwen-Yin commented on GitHub (Apr 12, 2026):

I just found that if I unzip [ollama-windows-amd64.zip](https://github.com/ollama/ollama/releases/download/v0.20.5/ollama-windows-amd64.zip) and [ollama-windows-amd64-rocm.zip](https://github.com/ollama/ollama/releases/download/v0.20.5/ollama-windows-amd64-rocm.zip) to a directory, set some parameters (including OLLAMA_CONTEXT_LENGTH=262144), start the Ollama service with `ollama serve`, and then run the model with `ollama run qwen3.5:35b`, it works fine!

If I install Ollama with [OllamaSetup.exe](https://github.com/ollama/ollama/releases/download/v0.20.5/OllamaSetup.exe) instead, the model cannot run!
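
For reference, the manual setup described above looks roughly like this in PowerShell; the extraction directory `C:\ollama-manual` is hypothetical:

```powershell
# Both release zips extracted into one directory, e.g. C:\ollama-manual,
# then the server started from there with the desired context length.
cd C:\ollama-manual
$env:OLLAMA_CONTEXT_LENGTH = "262144"
.\ollama.exe serve

# In a second terminal, run the model against that server:
# .\ollama.exe run qwen3.5:35b
```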


@rick-github commented on GitHub (Apr 12, 2026):

Post the log from the run where ollama works fine.
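
When the server is started manually in a terminal, one way to capture that log is to redirect all output streams to a file (a sketch; `*>` requires PowerShell 3.0 or later):

```powershell
# Write everything ollama serve prints to server.log, then attach the file.
.\ollama.exe serve *> server.log
```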


@PureBlissAK commented on GitHub (Apr 18, 2026):

🤖 Automated Triage & Analysis Report

Issue: #15517
Analyzed: 2026-04-18T18:20:45.329740

Analysis

  • Type: unknown
  • Severity: medium
  • Components: unknown

Implementation Plan

  • Effort: medium
  • Steps:

This issue has been triaged and marked for implementation.

Reference: github-starred/ollama#71977