[GH-ISSUE #15853] docs: add guide for using third-party OpenAI-compatible providers as remote backends #72162

Open
opened 2026-05-05 03:34:28 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @FuturMix on GitHub (Apr 28, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15853

Problem

Ollama supports OpenAI-compatible APIs, but the documentation doesn't cover how to use third-party API gateways as remote backends. Users who want to access cloud models (GPT, Claude, Gemini) alongside local Ollama models have to figure this out themselves.

Proposed Solution

Add a documentation page (e.g., docs/openai-providers.md) that explains how to configure Ollama tools and SDKs to work with OpenAI-compatible API gateways. The guide would cover:

  1. Setting OPENAI_API_BASE / base_url to point to a third-party provider
  2. Code examples in Python and JavaScript using the OpenAI SDK
  3. A pattern for switching between local Ollama and remote providers

This would complement the existing docs/cloud.mdx and docs/api/openai-compatibility.mdx pages.
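To make the proposal concrete, here is a minimal sketch of the pattern the guide would document: one small provider table that points the OpenAI SDK either at Ollama's local OpenAI-compatible endpoint (`http://localhost:11434/v1`) or at a remote gateway via `base_url`. The gateway URL, the `PROVIDERS` table, and the helper names (`resolve`, `chat`) are illustrative placeholders, not an existing API; the model names are examples only.

```python
import os

# Hypothetical provider table. Ollama's local OpenAI-compatible endpoint
# (http://localhost:11434/v1) is real; the gateway URL is a placeholder.
PROVIDERS = {
    "ollama": {
        "base_url": "http://localhost:11434/v1",
        "api_key_env": "OLLAMA_API_KEY",  # Ollama ignores the key; any string works
        "default_model": "llama3.2",
    },
    "gateway": {
        # Honors OPENAI_API_BASE if set, as step 1 of the proposal suggests.
        "base_url": os.environ.get("OPENAI_API_BASE", "https://gateway.example.com/v1"),
        "api_key_env": "OPENAI_API_KEY",
        "default_model": "gpt-4o-mini",
    },
}

def resolve(provider: str) -> dict:
    """Return base_url / api_key / model settings for the chosen backend."""
    cfg = PROVIDERS[provider]
    return {
        "base_url": cfg["base_url"],
        "api_key": os.environ.get(cfg["api_key_env"], "ollama"),
        "model": cfg["default_model"],
    }

def chat(provider: str, prompt: str) -> str:
    """Send one chat message through the selected backend via the OpenAI SDK."""
    # Imported lazily so the selection logic above works without the SDK installed.
    from openai import OpenAI
    cfg = resolve(provider)
    client = OpenAI(base_url=cfg["base_url"], api_key=cfg["api_key"])
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

Switching backends is then a one-word change, e.g. `chat("ollama", "Hello")` versus `chat("gateway", "Hello")`; the same shape translates directly to the JavaScript SDK.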

Why This Is Important

Many Ollama users want a hybrid setup: local models for privacy/cost, cloud models for capability. A clear guide would reduce friction and help users get the most out of Ollama's OpenAI compatibility.

I'm happy to submit a PR with the documentation if you think this would be useful.


Reference: github-starred/ollama#72162