[GH-ISSUE #13377] Ollama on Windows stuck in loading state when OLLAMA_HOST is set #70893

Closed
opened 2026-05-04 23:24:06 -05:00 by GiteaMirror · 15 comments
Owner

Originally created by @LeonZhu997 on GitHub (Dec 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13377

Originally assigned to: @hoyyeva on GitHub.

What is the issue?

(Screenshot: the Ollama desktop app stuck on the loading screen.)

Why does the Ollama desktop interface on my Windows machine keep loading forever?

Relevant log output


OS

Windows

GPU

RTX 5090

CPU

9950x3D

Ollama version

0.13.1

GiteaMirror added the app, bug, windows, nvidia labels 2026-05-04 23:24:08 -05:00

@dhiltgen commented on GitHub (Dec 8, 2025):

Something may be going wrong during GPU discovery. Quit the desktop app, then run the following in a PowerShell terminal:

```powershell
$env:OLLAMA_DEBUG="2"
ollama serve 2>&1 | % ToString | tee-object serve.log
```

Wait until it reports the "inference compute" line, then press Ctrl+C and share serve.log so we can see what might be going wrong.


@LeonZhu997 commented on GitHub (Dec 9, 2025):

(Screenshot attached.) This is Ollama's serve.log.

@dhiltgen commented on GitHub (Dec 9, 2025):

From the logs, it seems like it completed GPU discovery. I'm not quite sure yet what's causing it to hang.

Does the CLI hang as well?

Can you try a "curl" equivalent in PowerShell and see if that responds?

```powershell
(Invoke-WebRequest -Method POST -Body '{"model":"llama3.2", "prompt":"Why is the sky blue?", "stream": false}' -Uri http://localhost:11434/api/generate).Content | ConvertFrom-Json
```

@LeonZhu997 commented on GitHub (Dec 10, 2025):

(Screenshot attached.) The command-line interface has been working perfectly all along; what really confused me was that the app failed to display and function properly. Thank you for your help. I've finally resolved the issue: it seems to have been caused by an environment-variable problem, and I ultimately fixed it by setting `$env:OLLAMA_HOST="0.0.0.0:11435"`.

@hoyyeva commented on GitHub (Dec 11, 2025):

Hi @LeonZhu997, I am so sorry that you're experiencing this. I'm currently working on a fix for another bug that I suspect may also resolve this issue. The PR is under review right now, and you can check its progress here: https://github.com/ollama/ollama/pull/13159. Hopefully it will be merged soon, and with any luck it will fix your issue as well!


@RafaelForrer commented on GitHub (Dec 13, 2025):

I don't like paying for something that doesn't work. One more incident like this and we'll move our 320 client companies to another cloud service that works correctly and tests its updates before rolling them out. This shouldn't be run like an amateur coding shop. After the update, exposing the server to the network leaves it stuck in a loading loop; the ports are free, all the same as before the update.


@jasperhaak commented on GitHub (Dec 14, 2025):

I had the same issue, and fixed it by entering this command into PowerShell:

```powershell
setx OLLAMA_HOST "127.0.0.1:11434"
```
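
As a sanity check on values like the one above, an `OLLAMA_HOST` string can be split into host and port the same way a client would before connecting. This is only an illustrative sketch, not Ollama's actual parser; the function name and the assumed default port 11434 are mine:

```python
def parse_ollama_host(value: str, default_port: int = 11434) -> tuple[str, int]:
    """Split an OLLAMA_HOST-style string into (host, port).

    Handles "host:port", bare "host", and an optional http(s):// prefix.
    IPv6 bracket syntax is not handled in this sketch.
    """
    value = value.strip().removeprefix("http://").removeprefix("https://")
    host, sep, port = value.rpartition(":")
    if not sep:                      # no port given, e.g. "localhost"
        return (value or "127.0.0.1", default_port)
    return (host or "127.0.0.1", int(port))
```

For example, `parse_ollama_host("0.0.0.0:11435")` yields `("0.0.0.0", 11435)`, matching the value used later in this thread.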


@Maltz42 commented on GitHub (Dec 14, 2025):

@LeonZhu997 Be careful about setting OLLAMA_HOST to 0.0.0.0 - that exposes your Ollama server to your network so other devices can connect to it. I'd try 127.0.0.1 first.
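
A quick way to see the difference: a listener bound to 127.0.0.1 is reachable only over loopback, while one bound to 0.0.0.0 accepts connections on every interface. A minimal Python sketch (the function name is mine, for illustration):

```python
import socket

def bind_demo(host: str) -> int:
    """Bind a listening socket to `host` on an ephemeral port and verify that
    a loopback connection to it succeeds. Returns the chosen port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))              # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]
    # Loopback connections succeed for both 127.0.0.1 and 0.0.0.0 bindings;
    # only the 0.0.0.0 binding is additionally reachable from other machines.
    with socket.create_connection(("127.0.0.1", port), timeout=2):
        pass
    srv.close()
    return port
```

With `bind_demo("127.0.0.1")` the server is invisible to the rest of the network, which is why 127.0.0.1 is the safer default for a local-only Ollama.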


@orhamra commented on GitHub (Feb 19, 2026):

Run this:

```powershell
netsh int ipv4 show excludedportrange protocol=tcp
```

Your Windows or Hyper-V is using that port.

These commands may help:

```powershell
net stop winnat
net start winnat
```
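
If you'd rather check from a script whether anything (a WinNAT port exclusion included) is already holding Ollama's port, a small Python probe works too. A hedged sketch; the helper name is mine:

```python
import socket

def port_is_bindable(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if we can bind host:port, i.e. nothing else is holding it.

    A failed bind (OSError / EADDRINUSE) usually means another process, or a
    reserved port range, already owns the port.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```

For Ollama's default, check `port_is_bindable(11434)` while the server is stopped: `False` would point at exactly the kind of conflict described above.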


@bois6402307 commented on GitHub (Mar 12, 2026):

Same on my PC: the CLI works fine, but the desktop app gets stuck at the loading phase. Note that I installed on a clean install of Windows 11, with no environment variables set.


@ljt6577 commented on GitHub (Mar 16, 2026):

Just sharing another workaround: Windows 11 24H2, RTX 4080, same problem. I solved it with the following commands in Windows cmd (run as administrator):

```
# quit ollama
setx CUDA_VISIBLE_DEVICES "" /M
```


@HAMZAIqbalSharref commented on GitHub (Apr 4, 2026):

Exception 0xc0000005 0x8 0x7ff86ccdcb12 0x7ff86ccdcb12
PC=0x7ff86ccdcb12
signal arrived during external code execution
System.Management.Automation.RemoteException
runtime.cgocall(0x7ff708b9de10, 0xc000049da0)
runtime/cgocall.go:167 +0x3e fp=0xc000049d78 sp=0xc000049d10 pc=0x7ff707a7243e
github.com/ollama/ollama/x/imagegen/mlx._Cfunc_mlx_random_key(0xc0000682e0, 0x19d58e045c1)
_cgo_gotypes.go:1978 +0x50 fp=0xc000049da0 sp=0xc000049d78 pc=0x7ff7081004b0
github.com/ollama/ollama/x/imagegen/mlx.RandomKey.func1(...)
github.com/ollama/ollama/x/imagegen/mlx/mlx.go:1870
github.com/ollama/ollama/x/imagegen/mlx.RandomKey(0x19d58e045c1)
github.com/ollama/ollama/x/imagegen/mlx/mlx.go:1870 +0x5d fp=0xc000049dd8 sp=0xc000049da0 pc=0x7ff7081093dd
github.com/ollama/ollama/x/imagegen/mlx.init.0()
github.com/ollama/ollama/x/imagegen/mlx/mlx.go:1848 +0xa9 fp=0xc000049e28 sp=0xc000049dd8 pc=0x7ff7081091e9
runtime.doInit1(0x7ff709f2b2c0)
runtime/proc.go:7350 +0xdd fp=0xc000049f50 sp=0xc000049e28 pc=0x7ff707a5343d
runtime.doInit(...)
runtime/proc.go:7317
runtime.main()
runtime/proc.go:254 +0x325 fp=0xc000049fe0 sp=0xc000049f50 pc=0x7ff707a44e85
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000049fe8 sp=0xc000049fe0 pc=0x7ff707a7db21
System.Management.Automation.RemoteException
goroutine 2 gp=0xc0000028c0 m=nil [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:435 +0xce fp=0xc000061fa8 sp=0xc000061f88 pc=0x7ff707a7598e
runtime.goparkunlock(...)
runtime/proc.go:441
runtime.forcegchelper()
runtime/proc.go:348 +0xb8 fp=0xc000061fe0 sp=0xc000061fa8 pc=0x7ff707a450f8
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000061fe8 sp=0xc000061fe0 pc=0x7ff707a7db21
created by runtime.init.7 in goroutine 1
runtime/proc.go:336 +0x1a
System.Management.Automation.RemoteException
goroutine 3 gp=0xc000002c40 m=nil [GC sweep wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:435 +0xce fp=0xc000063f80 sp=0xc000063f60 pc=0x7ff707a7598e
runtime.goparkunlock(...)
runtime/proc.go:441
runtime.bgsweep(0xc00006e000)
runtime/mgcsweep.go:276 +0x94 fp=0xc000063fc8 sp=0xc000063f80 pc=0x7ff707a2de74
runtime.gcenable.gowrap1()
runtime/mgc.go:204 +0x25 fp=0xc000063fe0 sp=0xc000063fc8 pc=0x7ff707a22285
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000063fe8 sp=0xc000063fe0 pc=0x7ff707a7db21
created by runtime.gcenable in goroutine 1
runtime/mgc.go:204 +0x66
System.Management.Automation.RemoteException
goroutine 4 gp=0xc000002e00 m=nil [GC scavenge wait]:
runtime.gopark(0xc00006e000?, 0x7ff7094780f0?, 0x1?, 0x0?, 0xc000002e00?)
runtime/proc.go:435 +0xce fp=0xc000075f78 sp=0xc000075f58 pc=0x7ff707a7598e
runtime.goparkunlock(...)
runtime/proc.go:441
runtime.(*scavengerState).park(0x7ff70a045c80)
runtime/mgcscavenge.go:425 +0x49 fp=0xc000075fa8 sp=0xc000075f78 pc=0x7ff707a2b909
runtime.bgscavenge(0xc00006e000)
runtime/mgcscavenge.go:653 +0x3c fp=0xc000075fc8 sp=0xc000075fa8 pc=0x7ff707a2be7c
runtime.gcenable.gowrap2()
runtime/mgc.go:205 +0x25 fp=0xc000075fe0 sp=0xc000075fc8 pc=0x7ff707a22225
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000075fe8 sp=0xc000075fe0 pc=0x7ff707a7db21
created by runtime.gcenable in goroutine 1
runtime/mgc.go:205 +0xa5
System.Management.Automation.RemoteException
goroutine 5 gp=0xc000003340 m=nil [finalizer wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:435 +0xce fp=0xc000077e30 sp=0xc000077e10 pc=0x7ff707a7598e
runtime.runfinq()
runtime/mfinal.go:196 +0x107 fp=0xc000077fe0 sp=0xc000077e30 pc=0x7ff707a21207
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000077fe8 sp=0xc000077fe0 pc=0x7ff707a7db21
created by runtime.createfing in goroutine 1
runtime/mfinal.go:166 +0x3d
rax 0x64
rbx 0x7ff86cda098c
rcx 0xb96971fec1b00000
rdx 0x0
rdi 0x2237bf10860
rsi 0x0
rbp 0xd23276f569
rsp 0xd23276f000
r8 0xffffffffffffffff
r9 0x8101010101010100
r10 0x80fcf8fefcfefefe
r11 0x22371ec5b50
r12 0xffffffffffffffff
r13 0x5d
r14 0x2236c942ac8
r15 0x0
rip 0x7ff86ccdcb12
rflags 0x10202
cs 0x33
fs 0x53
gs 0x2b

This is what I got when I ran the command to check the logs. Can anyone help me resolve this constant-loading issue? I tried to change OLLAMA_HOST but it gave me an error. Kindly help.


@karamuja commented on GitHub (Apr 6, 2026):

> (quoting the crash log and question from @HAMZAIqbalSharref's comment above)

Same problem, did you fix it?


@karamuja commented on GitHub (Apr 6, 2026):

Step 1 — Download the old version directly:
https://github.com/ollama/ollama/releases/download/v0.17.7/OllamaSetup.exe

Step 2 — Run the installer.

Step 3 — Verify it works:

```powershell
ollama --version
```

Step 4 — To prevent it from auto-updating to the broken version, you can disable automatic updates in Ollama's system tray icon settings after installing.

This is a confirmed workaround that works for most Windows users until the Ollama team releases a proper fix.


@HAMZAIqbalSharref commented on GitHub (Apr 11, 2026):

> Step 1 — Download the old version directly: https://github.com/ollama/ollama/releases/download/v0.17.7/OllamaSetup.exe
> Step 2 — Run the installer.
> Step 3 — Verify it works: `ollama --version`
> Step 4 — To prevent it from auto-updating to the broken version, you can disable automatic updates in Ollama's system tray icon settings after installing. This is a confirmed workaround that works for most Windows users until the Ollama team releases a proper fix.

Did that and it worked, let's goooo! Thanks man, really helped me.

Reference: github-starred/ollama#70893