[GH-ISSUE #13282] Memory layout cannot be allocated #34537

Closed
opened 2026-04-22 18:11:50 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @zhaoyuxin2 on GitHub (Dec 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13282

What is the issue?

We are trying to deploy the qwen3-vl:235b model locally, but we are hitting a "Memory layout cannot be allocated" error. Could you please advise on how to solve this problem? (When we deployed qwen2.5-vl:72b on the same machine, it ran normally and the computer's memory was sufficient.)

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-22 18:11:50 -05:00

Reference: github-starred/ollama#34537