[GH-ISSUE #12053] Regression: ollama serve only listens on localhost in v0.11.5/0.11.6 (no longer binds to external interfaces) #33765

Closed
opened 2026-04-22 16:45:36 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @rwellinger on GitHub (Aug 23, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12053

What is the issue?

Since upgrading from v0.11.4 to v0.11.5 (and also v0.11.6), ollama serve only listens on localhost/127.0.0.1.
In v0.11.4 it listened on all interfaces (0.0.0.0), so I could connect via my LAN IP (e.g. 10.0.1.120).
Now external connections from other devices on the network fail.

Steps to Reproduce:

1. Install Ollama v0.11.5 or v0.11.6.

2. Run:

   ollama serve

   or, with the host set explicitly:

   OLLAMA_HOST=0.0.0.0:11434 ollama serve

3. From another machine, try:

   curl http://10.0.1.120:11434/api/tags

   The connection is refused.
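The curl check above can also be reproduced programmatically. A minimal sketch (names and the LAN address are from this report; substitute your own) that probes whether a TCP port is accepting connections:

```python
import socket


def port_open(host: str, port: int = 11434, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, timeout, unreachable, ...
        return False


# With the regression described here (assuming ollama serve is running):
#   port_open("127.0.0.1")   -> True  (loopback still works)
#   port_open("10.0.1.120")  -> False (LAN address refused in v0.11.5/0.11.6)
```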

Expected Behavior:
Server should bind to all interfaces (or at least respect OLLAMA_HOST=0.0.0.0) and allow access via LAN IP.

Actual Behavior:

v0.11.5 / v0.11.6 → only accessible via 127.0.0.1

v0.11.4 → accessible via LAN IP (10.0.1.120)

Environment:

OS: macOS 15.6.1

Architecture: arm64 (M1)

GPU/CPU setup: M1 Max

Notes:
Explicitly setting OLLAMA_HOST=0.0.0.0:11434 does not help in v0.11.5/0.11.6.

This looks like a regression introduced in v0.11.5 and 0.11.6.

Relevant log output

(none provided)
OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.11.6

GiteaMirror added the bug label 2026-04-22 16:45:36 -05:00
Author
Owner

@rwellinger commented on GitHub (Nov 14, 2025):

I can confirm that it looks much better now with 0.12.11:

NAME         ID            SIZE   PROCESSOR  CONTEXT  UNTIL
gpt-oss:20b  17052f91a42e  14 GB  100% GPU   8192     59 minutes from now

It does not use the CPU anymore like it did before.

Thank you guys!!

Reference: github-starred/ollama#33765