[GH-ISSUE #10587] ollama Linux and Windows builds fail to pull models #32726

Closed
opened 2026-04-22 14:33:41 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @tipes-git on GitHub (May 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10587

Pulling a model fails in two main ways: 1) timeout; 2) the remote host forcibly closed an existing connection.

![Image](https://github.com/user-attachments/assets/db3834cf-9ad4-4738-83ee-a25cedf819d1)

![Image](https://github.com/user-attachments/assets/bc7c9c16-3dd4-407e-b01d-17620da8633f)

GiteaMirror added the model label 2026-04-22 14:33:41 -05:00

@rick-github commented on GitHub (May 6, 2025):

What does the following return:

```
curl https://registry.ollama.ai/v2/library/llava/manifests/latest
```

@tipes-git commented on GitHub (May 6, 2025):

> What does the following return:
>
> ```
> curl https://registry.ollama.ai/v2/library/llava/manifests/latest
> ```

On Linux it hangs with no response for a long time.
On Windows the connection is reset.

![Image](https://github.com/user-attachments/assets/6c233d0b-9c28-4e54-934e-33d2cc0b0850)

![Image](https://github.com/user-attachments/assets/a169abe9-d88f-43e6-8eff-e1017ad87c10)


@rick-github commented on GitHub (May 6, 2025):

It works for me, so the likely problem is a network issue: firewall, proxy, network connectivity, etc.
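Before digging into firewalls, it can help to confirm that no proxy variables are silently in play on the failing machine; a stale or campus-imposed value causes exactly these symptoms, since both ollama and curl honor the standard proxy environment variables. A minimal local check (a sketch, nothing here contacts the network):

```shell
# Print the proxy-related environment variables, or "<unset>" if absent.
printf '%s\n' \
  "HTTP_PROXY=${HTTP_PROXY-<unset>}" \
  "HTTPS_PROXY=${HTTPS_PROXY-<unset>}" \
  "NO_PROXY=${NO_PROXY-<unset>}"
```

If any of these is set to a proxy you do not recognize, unset it (or set it correctly) before retrying `ollama pull`.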


@tipes-git commented on GitHub (May 6, 2025):

> It works for me, so the likely problem is a network issue: firewall, proxy, network connectivity, etc.

Thank you very much, I will keep trying.


@tipes-git commented on GitHub (May 6, 2025):

> It works for me, so the likely problem is a network issue: firewall, proxy, network connectivity, etc.

I have tried many of the firewall and network fixes posted online for ollama model-pull failures, and none of them solved the problem.
Could it be that I am on a university campus network, whose firewall enforces restrictions I have no permission to change?


@rick-github commented on GitHub (May 6, 2025):

It's possible. You can try pulling a model from a different source:

```
ollama run hf.co/mradermacher/Qwen2.5-0.5B-Instruct-GGUF:Q4_K_M
```

@tipes-git commented on GitHub (May 6, 2025):

> It's possible. You can try pulling a model from a different source:
>
> ```
> ollama run hf.co/mradermacher/Qwen2.5-0.5B-Instruct-GGUF:Q4_K_M
> ```

Thank you, I will give it a try.


@egg1234 commented on GitHub (May 6, 2025):

Your university campus network is most likely blocking some ollama-related subdomains or the IP addresses behind them. Your only real option is to try the MatsuriDayo/nekoray project, enable TUN mode, and force all traffic through the TUN interface; that should do it.
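If an ordinary HTTP(S) proxy that the campus network does not block is available, a lighter-weight option than full TUN routing is pointing only ollama's registry traffic at it via the standard proxy variables. A hedged sketch (the proxy host below is a hypothetical placeholder):

```shell
# Route ollama's registry downloads through a reachable proxy.
# proxy.example.edu:3128 is a placeholder; substitute a proxy you control.
export HTTPS_PROXY=http://proxy.example.edu:3128
ollama pull llava
```

Note that when ollama runs as a systemd service on Linux, the variable has to be set in the service's environment (e.g. via `systemctl edit ollama.service`) rather than in an interactive shell, since the daemon does the downloading.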


@tipes-git commented on GitHub (May 7, 2025):

> Your university campus network is most likely blocking some ollama-related subdomains or the IP addresses behind them. Your only real option is to try the MatsuriDayo/nekoray project, enable TUN mode, and force all traffic through the TUN interface; that should do it.

OK, I'll give it a try.


Reference: github-starred/ollama#32726