[GH-ISSUE #6177] run OI with OLLAMA SERVER IN NETWORK #3860

Closed
opened 2026-04-12 14:42:02 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @RM-S2 on GitHub (Aug 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6177

What is the issue?

I am trying to run OI (Open Interpreter) with an Ollama server on another computer in my network. The command I use to start OI is:
interpreter --model ollama/llama3.1 --api_base "http://192.168.3.13:11434" --api_key "fake_key"
but I get an error saying it can't find Ollama on my computer, which I think is wrong because OI should find Ollama on my server computer.
What should I do?
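A quick way to check whether the remote Ollama server is reachable at all (a diagnostic sketch; assumes the server uses Ollama's default API port 11434):

```shell
# From the machine running Open Interpreter, probe the remote Ollama API.
# A JSON list of models means the server is reachable over the network;
# a connection error means Ollama is not listening on that interface.
curl http://192.168.3.13:11434/api/tags
```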

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug label 2026-04-12 14:42:02 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 5, 2024):

By default ollama binds to `localhost`, i.e. `127.0.0.1`. If you want it to be available to other computers, you need to [set](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows) `OLLAMA_HOST=0.0.0.0:11434` in the server environment. Also, ollama version 0.1.32 will not support llama3.1; you need to upgrade to 0.3.0 or newer.

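The fix above amounts to changing the server's bind address via its environment. A sketch of the commands involved on a Windows host (hedged: `setx` persists a user environment variable, and Ollama must be restarted to pick up the change; the IP address is this issue's example):

```shell
# On the Windows machine hosting Ollama (192.168.3.13):
# bind the server to all network interfaces instead of localhost only.
setx OLLAMA_HOST "0.0.0.0:11434"

# Quit Ollama (e.g. from the tray icon) and start it again so the new
# environment variable takes effect, then verify from another machine:
curl http://192.168.3.13:11434/api/tags
```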
Author
Owner

@RM-S2 commented on GitHub (Aug 8, 2024):

It works well now. Thanks a lot.


Reference: github-starred/ollama#3860