[PR #14400] [CLOSED] cmd: config update to use native Ollama API for OpenClaw #40530

Closed
opened 2026-04-23 01:24:46 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/14400
Author: @hoyyeva
Created: 2/24/2026
Status: Closed

Base: main ← Head: hoyyeva/openclaw-config


📝 Commits (1)

  • 99e470f cmd: config update to use native Ollama API for OpenClaw

📊 Changes

2 files changed (+2 additions, -2 deletions)

View changed files

📝 cmd/config/openclaw.go (+1 -1)
📝 cmd/config/openclaw_test.go (+1 -1)

📄 Description

Summary

  • Remove /v1 suffix from the OpenClaw provider baseUrl to use Ollama's native API
    (/api/chat) instead of the OpenAI-compatible endpoint (/v1/chat/completions)
  • The api field is already set to "ollama", which expects the base URL without /v1
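For illustration, the resulting provider entry would look roughly like this (a sketch based on the field names mentioned in this PR; the exact structure of openclaw.json is an assumption, and previously the baseUrl would have ended in /v1):

```json
{
  "models": {
    "providers": {
      "ollama": {
        "api": "ollama",
        "baseUrl": "http://127.0.0.1:11434"
      }
    }
  }
}
```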

Context

The OpenClaw config validator rejects the current configuration with:

invalid config at C:\openclaw.json:
  models.providers.ollama.api: invalid input

The api: "ollama" type expects a base URL pointing to the native Ollama endpoint (e.g.
http://127.0.0.1:11434), not the OpenAI-compatible /v1 endpoint. The native API also has
the advantage of properly supporting streaming with tool calls, which the
OpenAI-compatible endpoint drops (openclaw#11828).
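The accompanying test change presumably pins the expected base URL. A minimal sketch of the kind of check involved, in Go (the helper name and logic here are hypothetical, not taken from the actual cmd/config package):

```go
package main

import (
	"fmt"
	"strings"
)

// nativeOllamaBaseURL strips a trailing /v1 so the URL points at the
// native Ollama API rather than the OpenAI-compatible endpoint
// (hypothetical helper for illustration).
func nativeOllamaBaseURL(raw string) string {
	return strings.TrimSuffix(raw, "/v1")
}

func main() {
	got := nativeOllamaBaseURL("http://127.0.0.1:11434/v1")
	fmt.Println(got) // http://127.0.0.1:11434
}
```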


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-23 01:24:46 -05:00

Reference: github-starred/ollama#40530