[GH-ISSUE #14974] Qwen 3.5:27b and 35b running locally do not perform agentic abilities in Claude Code. #71687

Closed
opened 2026-05-05 02:20:47 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Alunacoz on GitHub (Mar 20, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14974

What is the issue?

OS: Arch Linux (Cachy OS)
GPU: AMD RX 9070XT
CPU: AMD Ryzen 9800X3D

Steps to reproduce:

  1. Install `ollama-rocm` from the AUR
  2. Run `ollama launch claude --model qwen3.5:27b`
  3. Prompt the model to attempt to do anything that isn't reading files
  4. The prompt will exit and nothing happens. The model stops thinking.

Notes:
`ollama launch claude --model qwen3.5:cloud` DOES work, and can read and write files. It's just that all versions that run locally seem to have this issue.

It looks like it's trying to run commands but fails.

Using the recommended `qwen:3.5` that's not cloud-based also fails. (I assume it's the 9B variant?)

Relevant log output

Example conversation:

27B:

❯ In the root of the project directory, create a file named "test.txt" and write "hello" inside of it.

● Write
  hello

❯ You did not write the file. I do not see it.

● I'll check what files exist in the current directory to understand what's happening.

  ls -la

35b:

 ❯ ollama launch claude --model qwen3.5:35b

pulling manifest
pulling 900dde62fb7e: 100% ▕█████████████████████████████████████████████████████████████▏  23 GB
pulling 7339fa418c9a: 100% ▕█████████████████████████████████████████████████████████████▏  11 KB
pulling 9371364b27a5: 100% ▕█████████████████████████████████████████████████████████████▏   65 B
pulling 606ad9f1ecbc: 100% ▕█████████████████████████████████████████████████████████████▏  482 B
verifying sha256 digest
writing manifest
success

Launching Claude Code with qwen3.5:35b...
 ▐▛███▜▌   Claude Code v2.1.80
▝▜█████▛▘  qwen3.5:35b · API Usage Billing
  ▘▘ ▝▝    ~/Repos/Redacted

  ↑ Opus now defaults to 1M context · 5x more room, same pricing

❯ In the root of the project directory, create a file named "test.txt" and write "hello" inside of it.

● I'll create the test.txt file in the root directory with "hello" inside.

✻ Cooked for 48s

❯ You did not write the file. I do not see it.

● I need to check what happened with the file. Let me look at the current directory and verify what files exist:

  /home/redacted/Repos/Redacted

❯

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.18.2

GiteaMirror added the bug label 2026-05-05 02:20:47 -05:00

@rick-github commented on GitHub (Mar 20, 2026):

Increase the [context window size](https://docs.ollama.com/integrations/claude-code#manual-setup:~:text=Claude%20Code%20requires%20a%20large%20context%20window).


@Alunacoz commented on GitHub (Mar 22, 2026):

That was it; I didn't notice that Ollama defaults to a 4k context window on a 16 GB VRAM GPU!
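For readers hitting the same symptom: a minimal sketch of the fix, assuming the documented `OLLAMA_CONTEXT_LENGTH` environment variable and the Modelfile `PARAMETER num_ctx` option (the `32768` value and the `qwen3.5-27b-32k` model name are illustrative, not from this issue):

```shell
# Option 1: raise the default context window for the whole server.
# Ollama's default is 4096 tokens, which is far too small for
# Claude Code's system prompt and tool definitions.
OLLAMA_CONTEXT_LENGTH=32768 ollama serve

# Option 2: bake a larger context into a derived model via a Modelfile,
# then point Claude Code at that model instead.
cat > Modelfile <<'EOF'
FROM qwen3.5:27b
PARAMETER num_ctx 32768
EOF
ollama create qwen3.5-27b-32k -f Modelfile
ollama launch claude --model qwen3.5-27b-32k
```

Note that a larger context window increases VRAM usage, so on a 16 GB card the model may partially offload to CPU and run slower.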

Reference: github-starred/ollama#71687