[GH-ISSUE #1671] Error: connect ECONNREFUSED 127.0.0.1:11434 #26702

Closed
opened 2026-04-22 03:08:49 -05:00 by GiteaMirror · 8 comments

Originally created by @LTtt456c on GitHub (Dec 22, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1671

Hello everyone!
I'm running Ollama in Docker.
The command I used to start it is: docker run -e OLLAMA_HOST=0.0.0.0:11434 -d -v ollama serve -p 11434:11434 --name ollama ollama/ollama
Then I opened chatbot-ollama in VS Code, ran npm run dev, and it reported an error.
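That docker run line looks malformed: "-v ollama serve" is not a valid volume mapping, and "serve" will likely be misread as the image name, so the port publish and volume mount may never apply to the Ollama container. The form the reporter switches to later in this thread matches the usual Docker quick start; a minimal sketch, assuming the stock ollama/ollama image:

```
# Run Ollama in Docker, persisting models in a named volume and publishing port 11434.
# With this form the published port works without overriding OLLAMA_HOST,
# as the reporter later confirms with curl.
docker run -d \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```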

Here is the error log:

PS G:\AI\chatbot-ollama> npm run dev

chatbot-ollama@0.1.0 dev
next dev

▲ Next.js 13.5.6

Local: http://localhost:3000/
✓ Ready in 2.9s
○ Compiling / ...
✓ Compiled / in 3.3s (1652 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
✓ Compiled in 1699ms (1652 modules)
✓ Compiled in 519ms (1652 modules)
✓ Compiled /api/models in 245ms (68 modules)
[TypeError: fetch failed] {
cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
}
✓ Compiled in 620ms (1720 modules)
[TypeError: fetch failed] {
cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
}
[TypeError: fetch failed] {
cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
}
[TypeError: fetch failed] {
cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
}


@BruceMacD commented on GitHub (Dec 22, 2023):

Hi @LTtt456c, at a high level this looks correct. Could you try curling the ollama container directly to check that it's not a UI issue?

curl http://127.0.0.1:11434

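If the container is up, a quick way to check it from the host is to hit the root path and the model-list endpoint directly, assuming the default port mapping (the root-path reply is exactly what the reporter confirms in the next comment):

```
# Liveness check: the root path answers with a plain "Ollama is running"
curl http://127.0.0.1:11434

# /api/tags lists the models the server can see; a UI such as chatbot-ollama
# most likely depends on a call like this succeeding
curl http://127.0.0.1:11434/api/tags
```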

@LTtt456c commented on GitHub (Dec 23, 2023):

Hi @LTtt456c, at a high level this looks correct. Could you try curling the ollama container directly to check that it's not a UI issue?

curl http://127.0.0.1:11434

Hello,
I reinstalled Ollama and Docker today and ran the command docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
Then I ran docker exec -it ollama ollama run llama2 and the model works in my terminal. In a new terminal window, curl http://127.0.0.1:11434 responds with "Ollama is running".


@BruceMacD commented on GitHub (Dec 24, 2023):

Thanks @LTtt456c, it looks like Ollama is accessible in the container in that case. I'd suggest opening an issue with the chatbot-ollama project you are using instead; the problem could be on their side.


@LTtt456c commented on GitHub (Dec 28, 2023):

I found that this may be a Docker bug. After restarting the computer, reopening Docker, and re-running Ollama, the UI could be accessed.


@BruceMacD commented on GitHub (Dec 29, 2023):

Thanks for the update, I'll close this for now as it seems to be an issue outside Ollama. If people continue to encounter this please let me know.


@FarooqAlaulddin commented on GitHub (May 10, 2024):

I just ran into this issue. I was using Ubuntu via the Windows Subsystem for Linux (WSL) and trying to query http://localhost:11434/api/generate from inside Linux while the Ollama server was running in the Ollama app for Windows. That is when ECONNREFUSED happens in my case. I don't think it's related to Node, since I queried the same endpoint from Python inside Linux and the issue persisted. When I ran the application I was working on (a VS Code extension) from Windows instead, it worked.

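In the default (NAT) WSL2 setup this behavior is expected: localhost inside the Linux distro is a separate loopback from the Windows host, so a Windows-side Ollama bound only to 127.0.0.1 is unreachable from Linux. A hedged sketch of one workaround, assuming default WSL2 networking (not the newer mirrored mode):

```
# Inside WSL2, the Windows host is normally the default gateway
WINHOST=$(ip route show default | awk '{print $3}')
curl "http://$WINHOST:11434"

# For this to work, Ollama on the Windows side must listen on more than loopback,
# e.g. start it with the environment variable OLLAMA_HOST=0.0.0.0
# (the Windows firewall may also need to allow inbound connections on 11434).
```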

@srccd-dev commented on GitHub (May 18, 2024):

I found that this may be a Docker bug. After restarting the computer, reopening Docker, and re-running Ollama, the UI could be accessed.

This worked for me as well. I reloaded and rebooted, and it finally connected.


@krmateusiak-hippo commented on GitHub (Jun 14, 2024):

Instead of http://localhost:11434/api, try using http://127.0.0.1:11434/api.

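A plausible reason this swap sometimes helps (not confirmed in this thread): newer Node.js versions resolve localhost verbatim, and if the OS returns the IPv6 ::1 address first while the server only listens on IPv4 127.0.0.1, the connection is refused; the IPv4 literal avoids that mismatch. If you need to point chatbot-ollama at a specific address, it is typically done through an environment variable; the variable name below is an assumption, so verify it against the project's README:

```
# .env.local for chatbot-ollama (variable name assumed -- check the project README)
OLLAMA_HOST=http://127.0.0.1:11434
```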