[GH-ISSUE #6839] ollama request llama3.1 fail. #4319

Closed
opened 2026-04-12 15:14:52 -05:00 by GiteaMirror · 5 comments

Originally created by @cswcss on GitHub (Sep 17, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6839

What is the issue?

As the title says, I can't run llama3.1 on Windows 10. It still worked a month ago.
cmd.exe:

```console
C:\Users\123>ollama run llama3.1
2024/09/17 21:53:18 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
2024/09/17 21:53:18 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
Error: Head "http://C:11434/": dial tcp: lookup C: no such host

C:\Users\123>ollama --version
2024/09/17 21:59:13 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
2024/09/17 21:59:13 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
Warning: could not connect to a running Ollama instance
Warning: client version is 0.3.9
```

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-12 15:14:52 -05:00

@rick-github commented on GitHub (Sep 17, 2024):

You have OLLAMA_HOST set incorrectly.

```console
$ OLLAMA_HOST='C:\Users\123\AppData\Local\Programs\Ollama\ollama app.exe'
$ ollama run llama3.1
2024/09/18 09:14:26 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
2024/09/18 09:14:26 config.go:45: WARN invalid port, using default port="\\Users\\123\\AppData\\Local\\Programs\\Ollama\\ollama app.exe" default=11434
Error: Head "http://C:11434/": dial tcp: lookup C: no such host
```
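The mangled URL in that error can be re-created with plain string splitting. This is a sketch in POSIX shell, not Ollama's actual code; it only assumes (as the log output suggests) that the client splits `OLLAMA_HOST` on the first colon and falls back to the default port when the remainder is not numeric:

```shell
# Hypothetical re-creation of the parse: the drive letter before the first
# ":" becomes the hostname, and the Windows path after it is an invalid
# port, so the default 11434 is used instead.
OLLAMA_HOST='C:\Users\123\AppData\Local\Programs\Ollama\ollama app.exe'
host=${OLLAMA_HOST%%:*}        # everything before the first ":" -> "C"
rest=${OLLAMA_HOST#*:}         # the Windows path, which is not a number
case $rest in
  ''|*[!0-9]*) port=11434 ;;   # WARN invalid port, using default
  *)           port=$rest ;;
esac
echo "http://$host:$port/"     # the URL the client then tries to dial
```

So `lookup C: no such host` simply means the drive letter `C` was treated as a hostname.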

You should either unset it, or set it to the IP and port where your ollama server is running:

```console
$ OLLAMA_HOST=http://127.0.0.1:11434
$ ollama -v
ollama version is 0.3.10
```
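The same fix in Windows cmd.exe looks roughly like this — a sketch, assuming the server runs locally on the default port; note that `setx` persists the value but only takes effect in newly opened terminals:

```console
:: Clear the bad value for the current session
set OLLAMA_HOST=

:: ...or point it at the local server explicitly
set OLLAMA_HOST=http://127.0.0.1:11434

:: Persist the setting for future sessions (alternatively, delete the
:: variable entirely in the environment-variables dialog)
setx OLLAMA_HOST http://127.0.0.1:11434
```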

@cswcss commented on GitHub (Sep 21, 2024):

OK, thanks a lot, I will try it!


@cswcss commented on GitHub (Sep 21, 2024):

![Screenshot 2024-09-21 142714](https://github.com/user-attachments/assets/ab2539e5-317c-4b21-81f2-7a0248aac7b2)
Umm, I think I followed these rules for setting environment variables on Windows:

> On Windows, Ollama inherits your user and system environment variables.
>
> 1. First quit Ollama by clicking on it in the task bar.
> 2. Start the Settings (Windows 11) or Control Panel (Windows 10) application and search for environment variables.
> 3. Click on *Edit environment variables for your account*.
> 4. Edit or create a new variable for your user account for `OLLAMA_HOST`, `OLLAMA_MODELS`, etc.
> 5. Click OK/Apply to save.
> 6. Start the Ollama application from the Windows Start menu.

But it still doesn't work.


@rick-github commented on GitHub (Sep 21, 2024):

`OLLAMA_HOST` is wrong.

![ollama_host](https://github.com/user-attachments/assets/9825144c-0fdf-4302-8daf-4bbccb0ca80e)
You should either unset it, or set it to the IP and port where your ollama server is running.
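After removing or correcting the variable and reopening the terminal, a quick sanity check might look like this — a sketch, assuming `curl` is available (it ships with recent Windows 10 builds) and the server listens on the default port:

```console
:: If the variable is unset, cmd.exe echoes the token back unexpanded,
:: which is fine — the client will then use the default address.
C:\Users\123>echo %OLLAMA_HOST%
%OLLAMA_HOST%

:: The server's root endpoint answers with a short status line when it is up.
C:\Users\123>curl http://127.0.0.1:11434
Ollama is running
```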


@cswcss commented on GitHub (Oct 28, 2024):

It finally runs again, thank you so much for helping!

Reference: github-starred/ollama#4319