[GH-ISSUE #10354] The 0.6.5 .tgz binary returns 404 on /models, but /api/tags works. Can you confirm it's the correct build? #53313

Closed
opened 2026-04-29 02:36:30 -05:00 by GiteaMirror · 2 comments

Originally created by @Just-us-Crash on GitHub (Apr 21, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10354

Description

I'm trying to integrate Ollama 0.6.5 with OpenWebUI and am running into a problem where the `/models` endpoint (introduced in recent versions) returns a `404 Not Found`, but `/api/tags` works and returns model metadata correctly.

This suggests that the server binary in the `.tgz` release is either misbuilt or mislabeled: it appears to behave like a pre-0.6.5 version internally, despite reporting the version as 0.6.5 when running `ollama --version`.


Steps to Reproduce

  1. Download and install `ollama-linux-amd64.tgz` from:
    https://github.com/ollama/ollama/releases/download/v0.6.5/ollama-linux-amd64.tgz
  2. Extract and move the binary to `/usr/local/bin/ollama`
  3. Run:
    `OLLAMA_HOST=0.0.0.0 ollama serve`
  4. In another terminal, run:
    `curl http://localhost:11434/models` → returns `404 Not Found`
    `curl http://localhost:11434/api/tags` → returns expected model list
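
The steps above can be collected into one shell sketch. It needs network access, sudo, and a free port 11434, so it is wrapped in a function rather than run here; the extracted archive layout (`bin/ollama`) is an assumption that may differ between releases.

```shell
#!/bin/sh
# Sketch of the reproduction steps from the report; not invoked here
# because it needs network access, sudo, and a free port.
reproduce() {
  curl -fsSL -o ollama-linux-amd64.tgz \
    "https://github.com/ollama/ollama/releases/download/v0.6.5/ollama-linux-amd64.tgz"
  tar -xzf ollama-linux-amd64.tgz
  sudo mv bin/ollama /usr/local/bin/ollama  # archive layout is an assumption
  OLLAMA_HOST=0.0.0.0 ollama serve &        # step 3: listen on all interfaces
  sleep 2                                   # give the server a moment to start
  curl -i http://localhost:11434/models     # step 4a: reported 404 Not Found
  curl -i http://localhost:11434/api/tags   # step 4b: reported model list
}
```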

Expected Behavior

The `/models` endpoint should be available and return the list of installed models, as expected in Ollama version 0.6.5.


Actual Behavior

  • `/models` returns 404
  • `/api/tags` works
  • `ollama --version` reports: `0.6.5`

System Info

  • OS: Pop!_OS 22.04
  • Install method: Manual from `.tgz` archive (GitHub release)
  • Hardware: RTX 4090, CUDA installed and working
  • Ollama binary path: `/usr/local/bin/ollama`

Request

Can you confirm whether the `.tgz` binary for `v0.6.5` is correct and includes support for the `/models` endpoint? If not, could a fixed release be provided, or is there an alternative method to obtain a clean, fully featured `0.6.5` build?

Thank you!


@JiWonOck commented on GitHub (Apr 21, 2025):

Version 0.6.5 doesn't seem to work as well as you'd think.

When operating offline, an error occurs in a similar context.

I uninstalled the existing Ollama version and tried to upgrade to v0.6.5.

Despite installing the 0.6.5 release that Ollama provides, the message `ollama version is 0.2.1` / `Warning: client version is 0.6.5` appears.


@rick-github commented on GitHub (Apr 21, 2025):

`/models` is not an ollama endpoint. Perhaps you are thinking of `/v1/models`, which is an OpenAI compatibility endpoint.

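In other words, the 404 is expected for a bare `/models` path in any Ollama build; the model-listing routes are `/api/tags` (native) and `/v1/models` (OpenAI-compatible). A minimal sketch of the corrected URLs, assuming the default port:

```shell
# Base URL assumed from the report; the server must actually be running
# for the curl call (commented out here) to return data.
OLLAMA_URL="http://localhost:11434"

echo "GET ${OLLAMA_URL}/api/tags"    # native Ollama model list
echo "GET ${OLLAMA_URL}/v1/models"   # OpenAI-compatible model list
# curl -s "${OLLAMA_URL}/v1/models"  # what OpenWebUI-style clients should hit
```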
Reference: github-starred/ollama#53313