[GH-ISSUE #10851] Error Log Translation #7125

Closed
opened 2026-04-12 19:08:00 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @TigerWang95 on GitHub (May 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10851

What is the issue?

I am trying to download a model with Ollama while connected through a VPN in global-proxy mode, but the pull fails. The output is:

pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/huihui_ai/qwen3-abliterated/manifests/4b": read tcp 192.168.31.172:6409->172.67.182.229:443: wsarecv: An existing connection was forcibly closed by the remote host.
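
A minimal troubleshooting sketch for this situation, not an official fix: it assumes the VPN client exposes a local HTTP proxy, and the address http://127.0.0.1:7890 below is a placeholder, not something from this report. It first checks whether the registry is reachable at all, then retries the pull with the proxy set for the current PowerShell session only.

```powershell
# Check basic reachability of the registry endpoint from the error above.
# Use curl.exe (not the PowerShell "curl" alias); -v prints TLS handshake details.
curl.exe -v https://registry.ollama.ai/v2/

# Route the retry through the VPN's local proxy for this session only.
# 127.0.0.1:7890 is a hypothetical proxy address; substitute your VPN client's value.
$env:HTTPS_PROXY = "http://127.0.0.1:7890"
ollama pull huihui_ai/qwen3-abliterated:4b
```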

Relevant log output

```shell
time=2025-05-25T13:23:41.109+08:00 level=INFO source=logging.go:32 msg="ollama app started"
time=2025-05-25T13:23:41.213+08:00 level=INFO source=lifecycle.go:19 msg="app config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\Administrator\\.ollama\\models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:]"
time=2025-05-25T13:23:41.285+08:00 level=INFO source=store.go:96 msg="wrote store: C:\\Users\\Administrator\\AppData\\Local\\Ollama\\config.json"
time=2025-05-25T13:23:41.287+08:00 level=INFO source=store.go:96 msg="wrote store: C:\\Users\\Administrator\\AppData\\Local\\Ollama\\config.json"
time=2025-05-25T13:23:41.290+08:00 level=INFO source=server.go:182 msg="unable to connect to server"
time=2025-05-25T13:23:41.290+08:00 level=INFO source=server.go:141 msg="starting server..."
time=2025-05-25T13:23:41.418+08:00 level=INFO source=server.go:127 msg="started ollama server with pid 14516"
time=2025-05-25T13:23:41.418+08:00 level=INFO source=server.go:129 msg="ollama server logs C:\\Users\\Administrator\\AppData\\Local\\Ollama\\server.log"
time=2025-05-25T13:23:43.563+08:00 level=INFO source=getstarted_windows.go:31 msg="opening getting started terminal with [C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe -noexit -ExecutionPolicy Bypass -nologo -file C:\\Users\\Administrator\\AppData\\Local\\Programs\\Ollama\\ollama_welcome.ps1]"
time=2025-05-25T13:40:48.773+08:00 level=INFO source=lifecycle.go:89 msg="Waiting for ollama server to shutdown..."
time=2025-05-25T13:40:48.785+08:00 level=INFO source=server.go:158 msg="server shutdown with exit code 0"
time=2025-05-25T13:40:48.787+08:00 level=INFO source=lifecycle.go:93 msg="Ollama app exiting"
```
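
Note that the "app config" line above shows HTTPS_PROXY and HTTP_PROXY as empty, so the server process the Ollama app spawns is not using any proxy even while the VPN is active. One possible workaround, sketched below under the same assumption that the VPN client exposes a local HTTP proxy (the address is again a placeholder): persist the variable for the current user, then quit and restart the Ollama app so the relaunched server inherits it.

```powershell
# Hypothetical workaround: persist HTTPS_PROXY for the current user so the
# Ollama tray app, and the server it starts, pick it up after a restart.
# "http://127.0.0.1:7890" is an assumed proxy address; use your VPN client's value.
[Environment]::SetEnvironmentVariable("HTTPS_PROXY", "http://127.0.0.1:7890", "User")
```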

OS

Windows

GPU

AMD

CPU

Intel

Ollama version

0.7.1

GiteaMirror added the bug label 2026-04-12 19:08:00 -05:00

Reference: github-starred/ollama#7125