[GH-ISSUE #1057] Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": EOF #519

Closed
opened 2026-04-12 10:12:57 -05:00 by GiteaMirror · 2 comments

Originally created by @fabianslife on GitHub (Nov 9, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1057

I am running Ubuntu 20.04 and wanted to try out ollama, but the one-liner does not seem to work.

When installing ollama with `curl https://ollama.ai/install.sh | sh`, everything is OK and the installation runs fine:

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  7650    0  7650    0     0  43220      0 --:--:-- --:--:-- --:--:-- 43465
>>> Downloading ollama...
######################################################################## 100,0%######################################################################### 100,0%
>>> Installing ollama to /usr/local/bin...
[sudo] password for fabian_iki: 
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> NVIDIA GPU installed.

But when I then try to pull a model with `ollama pull llama2`, I get the error:

pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": EOF

I tried it from the system's terminal.
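The failing request can be reproduced outside the ollama binary, which helps separate a network/firewall problem from a client problem. A minimal sketch using Python's standard library (the URL format is taken from the error message above; the helper names are illustrative, not part of ollama):

```python
# Reproduce the manifest GET that `ollama pull` performs, to check whether
# the registry is reachable at all. An HTTP status (even an error status)
# means TLS and the connection worked; the EOF in the original report
# suggests the connection was cut before any response arrived.
import urllib.request
import urllib.error

REGISTRY = "https://registry.ollama.ai"

def manifest_url(model: str, tag: str = "latest") -> str:
    """Build the registry manifest URL seen in the error message."""
    return f"{REGISTRY}/v2/library/{model}/manifests/{tag}"

def check_manifest(model: str, tag: str = "latest") -> None:
    url = manifest_url(model, tag)
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"HTTP {resp.status} from {url}")
    except urllib.error.HTTPError as e:
        print(f"HTTP error {e.code} from {url} (connection itself works)")
    except (urllib.error.URLError, OSError) as e:
        print(f"Connection-level failure, matching the EOF symptom: {e}")
```

Calling `check_manifest("llama2")` from the same machine that fails should show whether the problem is connectivity (connection-level failure) or something specific to the ollama client.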


@ghost commented on GitHub (Nov 30, 2023):

I was having a similar issue with Ubuntu 22.04 in WSL. It seems it was a firewall issue.

While I still have some problems getting ollama to work perfectly, I have seen major improvements after adding a new NetFirewallRule. Pulling models still takes a lot of starting and stopping, but I can eventually pull the model and use it via `ollama run` with great performance.

Open Windows PowerShell as an administrator and run the following:

New-NetFirewallRule -DisplayName "WSL" -Direction Inbound -Action Allow

Hope it helps in your case.


@jmorganca commented on GitHub (Dec 24, 2023):

Merging this with #1036 - thanks!

Reference: github-starred/ollama#519