[GH-ISSUE #6726] "No Healthy Upstream" Error on Multiple Networks and Devices #4238

Closed
opened 2026-04-12 15:10:19 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @shake-hakobyan-rau on GitHub (Sep 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6726

What is the issue?

I am experiencing a recurring issue when attempting to access the Ollama website and download models across multiple networks and computers. The error message "no healthy upstream" appears consistently in the following scenarios:

  1. When I access the Ollama website from different networks and computers, the site fails to load, returning "no healthy upstream."
  2. When trying to install or pull models using the command:
    curl -fsSL https://ollama.com/install.sh | sh
    
    I receive the following error:
    curl: (22) The requested URL returned error: 503
    
  3. Attempting to run a model (llama3.1) using the following command:
    ./ollama run llama3.1
    
    results in:
    pulling manifest
    Error: pull model manifest: 503: no healthy upstream
    

Steps to Reproduce:

  1. Open the Ollama website from multiple devices and different network setups.
  2. Run the installation command:
    curl -fsSL https://ollama.com/install.sh | sh
  3. Attempt to pull or run the llama3.1 model using:
    ./ollama run llama3.1
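Since the same 503 appears on every network and device, the failure is almost certainly on the server side rather than local. A quick way to confirm this is to look only at the HTTP status code the site returns. The sketch below is a hypothetical diagnostic helper (`classify_status` is not part of Ollama):

```shell
# Hypothetical helper (not part of Ollama): map an HTTP status code
# to a rough diagnosis of where the failure lies.
classify_status() {
  case "$1" in
    2??) echo "ok" ;;
    503) echo "server-side outage (no healthy upstream)" ;;
    5??) echo "other server error" ;;
    4??) echo "client-side error" ;;
    *)   echo "no response: check local network/DNS" ;;
  esac
}

# Demo: the status reported in this issue.
classify_status 503
```

To test against the live site, feed it the real status code, e.g. `classify_status "$(curl -s -o /dev/null -w '%{http_code}' https://ollama.com/)"`. If multiple networks all yield 503, there is nothing to fix locally; the outage has to be resolved upstream.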

Expected Outcome:
I expect the website to load correctly and the installation and model-pulling processes to complete without errors.

Actual Outcome:
The website fails to load, and the commands return a "503: no healthy upstream" error across different networks and devices.

Environment:

  • OS: Ubuntu 24.04
  • Networks: Tested on multiple networks (office, phone, server)
  • Devices: Multiple computers

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.3.6

GiteaMirror added the bug label 2026-04-12 15:10:19 -05:00
Author
Owner

@jmorganca commented on GitHub (Sep 10, 2024):

This should be fixed now. Sorry about that!


Reference: github-starred/ollama#4238