mirror of https://github.com/ollama/ollama.git, synced 2026-03-08 23:04:13 -05:00
docs: format compat docs (#14678)
This commit is contained in:
@@ -12,7 +12,6 @@ To use Ollama with tools that expect the Anthropic API (like Claude Code), set t
 ```shell
 export ANTHROPIC_AUTH_TOKEN=ollama # required but ignored
-export ANTHROPIC_API_KEY="" # required but ignored
 export ANTHROPIC_BASE_URL=http://localhost:11434
 ```

@@ -269,7 +268,7 @@ ollama launch claude --config
 Set the environment variables and run Claude Code:

 ```shell
-ANTHROPIC_AUTH_TOKEN=ollama ANTHROPIC_BASE_URL=http://localhost:11434 ANTHROPIC_API_KEY="" claude --model qwen3-coder
+ANTHROPIC_AUTH_TOKEN=ollama ANTHROPIC_BASE_URL=http://localhost:11434 claude --model qwen3-coder
 ```

 Or set the environment variables in your shell profile:
@@ -277,7 +276,6 @@ Or set the environment variables in your shell profile:
 ```shell
 export ANTHROPIC_AUTH_TOKEN=ollama
 export ANTHROPIC_BASE_URL=http://localhost:11434
-export ANTHROPIC_API_KEY=""
 ```

 Then run Claude Code with any Ollama model:
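Taken together, the hunks above drop the `ANTHROPIC_API_KEY` requirement, so pointing Claude Code at a local Ollama server reduces to two variables. A minimal sketch of the resulting setup (the model name `qwen3-coder` is simply the example the docs use; Ollama's default listen address is assumed):

```shell
# Point Anthropic-compatible tools (like Claude Code) at a local Ollama server.
export ANTHROPIC_AUTH_TOKEN=ollama                # required by the client but ignored by Ollama
export ANTHROPIC_BASE_URL=http://localhost:11434  # Ollama's default listen address

# With these set, Claude Code can be launched against any local model, e.g.:
# claude --model qwen3-coder
```

Putting the two exports in a shell profile makes the setting persistent across sessions, which is what the surrounding docs recommend.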
@@ -6,7 +6,7 @@ Ollama provides compatibility with parts of the [OpenAI API](https://platform.op

 ## Usage

-### Simple `v1/chat/completions` example
+### Simple `/v1/chat/completions` example

 <CodeGroup dropdown>

@@ -57,7 +57,7 @@ curl -X POST http://localhost:11434/v1/chat/completions \

 </CodeGroup>

-### Simple `v1/responses` example
+### Simple `/v1/responses` example

 <CodeGroup dropdown>

@@ -103,7 +103,7 @@ curl -X POST http://localhost:11434/v1/responses \

 </CodeGroup>

-### v1/chat/completions with vision example
+### `/v1/chat/completions` with vision example

 <CodeGroup dropdown>