[GH-ISSUE #11629] latest release not working when updated from a previous version #7683

Open
opened 2026-04-12 19:47:29 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @gosocial2 on GitHub (Aug 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11629

What is the issue?

After more than a year of stable performance and reliable updates, the latest version of Ollama completely fails to start. Running any CLI command (e.g. ollama list) results in the error:

Error: ollama server not responding - timed out waiting for server to start
Error: ollama server not responding - timed out waiting for server to start

This behavior is consistent across restarts and reinstalls. The GUI also presents a blank, useless modal, as shown in the attached screenshot — further proof that the frontend is waiting on a backend that never initializes.

This regression was not present in older builds, meaning something introduced in the recent release broke core functionality of the tool. As a long-time user and advocate of Ollama, I find this not only disappointing but reflective of a deeper, systemic issue in software change management.

Why this matters – and a wake-up call:
For over a year, Ollama demonstrated exemplary software stability, and I praised it often. But this release violates the fundamental principle of responsible development:

“If it works, don’t break it.”

Introducing breaking changes that render a tool unusable for loyal users with no rollback or compatibility check shows a disregard for the real-world consequences of poor release hygiene.

I invite you to read this post:
  • https://www.ozar.net/blog/software-development/introducing-software-stability-index
  • [Software Stability Index (SSI) GitHub repo](https://github.com/ozarnet/software-stability-index)
  • [SSI explainer video](https://www.youtube.com/watch?v=VaKVc4rprxg)

SSI is a quantifiable framework to assess whether a development team is increasing or destroying a project’s long-term reliability. Right now, this release would score negative.

Recommendations:

  • ✅ Issue a hotfix or public rollback instructions
  • 🔁 Implement regression and startup health checks in your CI pipeline
  • 🧠 Consider adopting a Stability Index approach to ensure you’re improving—not deteriorating—the project
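The startup health check recommended above can be sketched as a small retry loop. The `/api/version` endpoint and port 11434 are Ollama's stock defaults; the helper name and timings are mine, not from any Ollama tooling:

```shell
# wait_for CMD TRIES: retry CMD once per second until it succeeds or TRIES runs out
wait_for() {
  cmd="$1"; tries="$2"; i=0
  while [ "$i" -lt "$tries" ]; do
    if sh -c "$cmd" >/dev/null 2>&1; then
      return 0          # command succeeded: server answered
    fi
    i=$((i + 1)); sleep 1
  done
  return 1              # server never came up within the budget
}

# usage (assumes the default bind address) — fail CI if the server
# does not answer within 30 seconds of launch:
# wait_for "curl -sf --max-time 2 http://localhost:11434/api/version" 30
```

A check like this in CI, run right after launching the freshly built app, would have caught a release whose backend never starts.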

What might help your team:
If you’re feeling burned out, bored, etc—don’t push changes just to “do something.” It’s better to freeze a stable product than ship risky edits that break trust with your users. Broken releases help nobody, including yourselves.

Relevant log output

Error: ollama server not responding - timed out waiting for server to start
Error: ollama server not responding - timed out waiting for server to start

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

Warning: could not connect to a running Ollama instance
Warning: client version is 0.10.1

GiteaMirror added the bug label 2026-04-12 19:47:29 -05:00
Author
Owner

@gosocial2 commented on GitHub (Aug 3, 2025):

UPDATE: Root cause identified – still a UX regression in the latest version. After some troubleshooting, I’ve identified the cause:

Ollama began auto-starting after the update — despite no explicit opt-in and even though “Open at Login” had been disabled in previous versions — while the external volume containing previously downloaded models was not yet mounted (and it is not configured to be automatically mounted on purpose). Once the volume was mounted manually, the issue resolved.

However, I’d like to emphasize that previous versions handled this scenario gracefully.
They allowed:

  • A clean fallback when no models were available
  • Re-downloading new models to the default directory
  • Or waiting silently until the user mounted the external volume
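A minimal sketch of that graceful fallback, assuming a configurable models directory (`OLLAMA_MODELS` is Ollama's documented environment variable) and the stock default `~/.ollama/models`; the helper name is illustrative, not Ollama's actual code:

```shell
# pick_models_dir: prefer the configured directory, but fall back to the
# default instead of refusing to start when the volume is not mounted
pick_models_dir() {
  preferred="${OLLAMA_MODELS:-$HOME/.ollama/models}"
  if [ -d "$preferred" ]; then
    printf '%s\n' "$preferred"
  else
    # external volume not mounted: warn and use the default location
    echo "warning: $preferred unavailable; using default models dir" >&2
    printf '%s\n' "$HOME/.ollama/models"
  fi
}
```

The point is that an unavailable model directory degrades to an empty library with a warning, rather than blocking server startup entirely.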

Now, the latest version:

  • Fails to launch the backend server entirely
  • Shows no helpful error or fallback mechanism
  • Breaks both CLI and GUI UX (blank window, timeout errors)

This confirms a UX regression — not just an edge case — and deserves proper triage (hence the recommendations in the original issue posting).

Open questions still worth investigating:

  • What happens if a user downloads new models to internal storage and then mounts the external volume with old models?
  • Will Ollama merge the libraries, ignore one, or become unstable?
  • Should users manually set the model path to avoid confusion?
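On the model-path question: `OLLAMA_MODELS` is the documented environment variable for pointing Ollama at an explicit model directory. The mount path below is a hypothetical example, not taken from this report:

```shell
# assumption: /Volumes/ExternalSSD/... is a hypothetical macOS mount point.
# Pointing every install at one explicit directory avoids the library
# splitting between internal storage and the external volume.
export OLLAMA_MODELS="/Volumes/ExternalSSD/ollama/models"
```

Note that a shell `export` only affects processes started from that shell; the GUI app reads its environment from the login session, so it may need the variable set in its launch environment instead.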

Thanks for taking a look — I hope this helps improve resilience and error handling in future versions.

Author
Owner

@TungstenWolframite commented on GitHub (Aug 5, 2025):

Same issue on Windows.
There is no response from any CLI command after updating to the latest Ollama release.
Tried rolling back to previous releases, but the error persists.

Author
Owner

@jimmcslim commented on GitHub (Aug 5, 2025):

I also confirm the same issue identified by @gosocial2 on Mac; my `~/.ollama/models` directory is symlinked to an external SSD. When I first downloaded the new Ollama app, I was presented with a blank screen. After plugging in the external SSD, the app loads as expected.

Author
Owner

@peteretelej commented on GitHub (Aug 7, 2025):

Can confirm similar issue on Windows.
Updating to the new version (installing OllamaSetup.exe) while the previous version is installed results in `Error: ollama server not responding - timed out waiting for server to start` errors. I also get other failures (this is in Git Bash for Windows):

$ ollama pull nomic-embed-text
2025/08/07 15:54:05 WARN Failed to rotate log older=C:\Users\peter\AppData\Local\Ollama\app-1.log newer=C:\Users\peter\AppData\Local\Ollama\app.log error="rename C:\\Users\\peter\\AppData\\Local\\Ollama\\app.log C:\\Users\\peter\\AppData\\Local\\Ollama\\app-1.log: The process cannot access the file because it is being used by another process."
time=2025-08-07T15:54:05.720+03:00 level=INFO source=app_windows.go:272 msg="starting Ollama" app=C:\Users\peter\AppData\Local\Programs\Ollama version=0.11.3 OS=Windows/10.0.26100
time=2025-08-07T15:54:06.012+03:00 level=INFO source=eventloop.go:324 msg="sent focus request to existing instance"
time=2025-08-07T15:54:06.012+03:00 level=INFO source=app_windows.go:79 msg="existing instance found and focused, exiting"
Error: ollama server not responding - timed out waiting for server to start

It then opened the new app's UI but still failed with the timeout error.

Resolved by fully uninstalling previous instance and installing new version afresh.
Re-installing the same new version (with it already installed) does not result in the same failure.

Reference: github-starred/ollama#7683