[GH-ISSUE #1153] CodeGPT extension cannot connect to locally served ollama Error: connect ECONNREFUSED ::1:11434 #583

Closed
opened 2026-04-12 10:17:09 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @wahreChrist on GitHub (Nov 16, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1153

I'm trying to get the CodeGPT extension working with Ollama in VS Code, but it gives me this error in the devtools console:

[Extension Host] No active text editor found.
log.ts:441   ERR [Extension Host] Error: Error: connect ECONNREFUSED ::1:11434
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)

Ollama itself works fine in the CLI, and http://127.0.0.1:11434/ also responds and says it's up and running. I can't figure out why Ollama refuses the connection. Could it have something to do with it not being able to locate the default ssh key? It creates one every time I run `ollama serve`, but then just exits with an error that port 11434 is already taken
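(Editor's note on the "port 11434 is already taken" error: since http://127.0.0.1:11434/ responds, an Ollama server is evidently already listening, so a second `ollama serve` cannot bind the port. The standard-library sketch below is a hedged diagnostic, not anything from Ollama's own tooling; the host and port are just this thread's values. It shows what `localhost` resolves to and which of those addresses actually accept TCP connections.)

```python
import socket

def resolved_addresses(host="localhost", port=11434):
    """Return the addresses `host` resolves to, in resolution order."""
    return [info[4][0]
            for info in socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)]

def port_open(addr, port=11434, timeout=1.0):
    """True if something accepts TCP connections on (addr, port)."""
    try:
        with socket.create_connection((addr, port), timeout=timeout):
            return True
    except OSError:  # ECONNREFUSED, timeout, no route, etc.
        return False

if __name__ == "__main__":
    for addr in resolved_addresses():
        print(addr, "open" if port_open(addr) else "refused")
```

If `localhost` resolves to `::1` before `127.0.0.1`, a Node-based client such as the VS Code extension host may try IPv6 first, which would match the `ECONNREFUSED ::1:11434` in the traceback.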

Author
Owner

@horw commented on GitHub (Nov 16, 2023):

Hello, @wahreChrist, I just tried the Ollama and mistral models, and they worked for me. I have not encountered any errors.

Author
Owner

@wahreChrist commented on GitHub (Nov 16, 2023):

Hello @horw, I know that it's possible; I just don't understand what might be causing this issue in my particular case

Author
Owner

@orkutmuratyilmaz commented on GitHub (Nov 16, 2023):

@wahreChrist, have you checked [this issue](https://github.com/davila7/code-gpt-docs/issues/192)?

Author
Owner

@wahreChrist commented on GitHub (Nov 16, 2023):

@orkutmuratyilmaz just checked now; unfortunately, the solutions mentioned there didn't help me. promptLayer.js wasn't mentioned in my logs, though; it's only complaining about the connection being refused on that port

Author
Owner

@mxyng commented on GitHub (Nov 17, 2023):

> ECONNREFUSED ::1:11434

> http://127.0.0.1:11434/

These addresses are not the same. Ensure CodeGPT is configured correctly with 127.0.0.1:11434. Alternatively, configure ollama with `OLLAMA_HOST=0.0.0.0`
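(Editor's note: the mismatch can be demonstrated with a short standard-library sketch. Nothing below calls Ollama itself; the port is just this thread's value. A server bound only to the IPv4 loopback accepts connections on `127.0.0.1` but refuses them on `::1`, which is exactly the traceback above.)

```python
import socket

def can_connect(addr, port=11434, timeout=1.0):
    """True if a TCP server accepts connections on exactly (addr, port)."""
    family = socket.AF_INET6 if ":" in addr else socket.AF_INET
    try:
        with socket.socket(family, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            s.connect((addr, port))
            return True
    except OSError:  # ECONNREFUSED when nothing listens on that address
        return False

# With a server listening only on 127.0.0.1:11434 (Ollama's default):
#   can_connect("127.0.0.1")  accepts the connection,
#   can_connect("::1")        is refused -- the ::1 in the traceback.
```

The two suggested fixes attack this from opposite sides: pointing CodeGPT at `127.0.0.1:11434` makes the client use the address the server is actually bound to, while `OLLAMA_HOST=0.0.0.0` changes what the server listens on.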

Reference: github-starred/ollama#583