[GH-ISSUE #14030] Claude Code fails with 1211 Unknown Model when using Ollama backend #34930

Closed
opened 2026-04-22 18:56:09 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @ag-i on GitHub (Feb 2, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14030

What is the issue?

Ollama is running correctly and can load and run models on the GPU, but Claude Code fails to launch models via Ollama (Codex works fine), returning:

error code: "1211"
message: "Unknown Model"

What works
Ollama daemon starts successfully
Codex with Ollama as the backend works
Models load and run correctly via ollama run with a 64000-token context
GPU acceleration works (NVIDIA)

What fails
Using Claude Code with Ollama as the backend
Model launch fails immediately with Unknown Model (1211)

Environment:
OS: Debian GNU/Linux 13 (trixie)
Kernel: 6.12.63+deb13-amd64
Hardware
CPU: Intel i9-14900k
GPU: NVIDIA GeForce RTX 5090
RAM: 128 GB

Version:
Ollama: 0.15.4
Claude Code: v2.1.19
codex-cli 0.91.0

Model runs correctly via CLI

Screenshots:
https://github.com/user-attachments/assets/ddeee19c-5505-45b9-8d75-361958757b88
https://github.com/user-attachments/assets/771b6d7f-fa1b-4ed5-9847-3d893ee4adee
https://github.com/user-attachments/assets/ca9f59f8-6cb5-4897-b6cc-551004965fca
GiteaMirror added the bug label 2026-04-22 18:56:09 -05:00
Author
Owner

@rick-github commented on GitHub (Feb 2, 2026):

How are you starting Claude Code? Did you configure the environment variables to tell CC how to connect to Ollama? (See https://docs.ollama.com/integrations/claude-code#manual-setup.)
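
The manual setup the linked docs describe boils down to exporting a few Anthropic-client variables before starting Claude Code. A minimal sketch — the exact values here are assumptions for illustration, so cross-check them against the docs page above:

```shell
# Route Claude Code's Anthropic-compatible client to a local Ollama server.
export ANTHROPIC_BASE_URL="http://localhost:11434"    # local Ollama endpoint
export ANTHROPIC_AUTH_TOKEN="ollama"                  # placeholder credential
export ANTHROPIC_MODEL="glm-4.7-flash:latest"         # full tag from `ollama list`
# Then start Claude Code from the same shell so it inherits these:
#   claude
```

If these are set in a different shell (or a config file Claude Code doesn't read), the client falls back to the real Anthropic endpoint.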
Author
Owner

@ag-i commented on GitHub (Feb 2, 2026):

Hi @rick-github
using the CLI commands given in the docs for Claude Code and Codex:

ollama launch codex --config
ollama launch claude --config

and selecting the model from the dropdown menu, as shown in the attached screenshots.
Author
Owner

@rick-github commented on GitHub (Feb 2, 2026):

If you launch Claude (ollama launch claude --config), what's the output when you enter !env | grep ANTH at the > prompt?

Author
Owner

@ag-i commented on GitHub (Feb 2, 2026):

I have checked; all the env vars look good. As the first screenshot shows, Claude Code picks the model id glm-4.7-flash, not glm-4.7-flash:latest.

I installed Codex for testing and it works fine, but Claude Code does not.
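
The suspected mismatch (`glm-4.7-flash` vs `glm-4.7-flash:latest`) can be illustrated with a small hypothetical helper: `ollama run` resolves a bare name by appending `:latest`, so an exact-match comparison only fails when a client skips that normalization. The helper below is an illustration of that behavior, not Ollama's actual code:

```shell
# normalize_tag: append ":latest" when a model name carries no explicit tag,
# mirroring how Ollama resolves bare names like "glm-4.7-flash".
normalize_tag() {
  case "$1" in
    *:*) printf '%s\n' "$1" ;;          # tag already present, keep as-is
    *)   printf '%s:latest\n' "$1" ;;   # bare name -> default tag
  esac
}

normalize_tag glm-4.7-flash          # -> glm-4.7-flash:latest
normalize_tag glm-4.7-flash:latest   # -> glm-4.7-flash:latest (unchanged)
```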
Author
Owner

@rick-github commented on GitHub (Feb 2, 2026):

1211 is an error returned from the Anthropic endpoint, which means Claude Code is not using your ollama endpoint.

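
Since 1211 comes back from Anthropic's API, the practical check is whether the base-URL override actually reached the Claude Code process. A quick shell sanity check — the variable name follows the Claude Code docs, and the default value is an assumption:

```shell
# Decide where requests would go based on the configured base URL. If it is
# unset or points at api.anthropic.com, local Ollama tags will come back as
# "Unknown Model" (1211) from Anthropic's endpoint.
check_endpoint() {
  case "$1" in
    *localhost*|*127.0.0.1*) echo "local Ollama" ;;
    *)                       echo "Anthropic" ;;
  esac
}

check_endpoint "${ANTHROPIC_BASE_URL:-https://api.anthropic.com}"
```

Running `!env | grep ANTH` inside Claude Code, as suggested above, answers the same question from inside the process itself.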

Reference: github-starred/ollama#34930