Ollama API endpoint Connection Issue #2451

Closed
opened 2025-11-11 15:07:35 -06:00 by GiteaMirror · 0 comments

Originally created by @Natedorr on GitHub (Oct 25, 2024).

Using the Ollama API endpoint with continue.dev has had issues since the open-webui 0.3.22 release. My config.json for continue.dev looks like this:

```json
{
  "model": "AUTODETECT",
  "title": "Ollama-Bad",
  "provider": "ollama",
  "apiBase": "http://localhost:8080/ollama/",
  "contextLength": 7999,
  "completionOptions": {},
  "requestOptions": {
    "headers": {
      "Authorization": "Bearer sk-xxx"
    }
  }
},
```

This works with open-webui 0.3.21, and it also works when pointing directly at the localhost:11434 port. When I try to chat in the VS Code continue.dev plugin, the models populate properly, but there is no response. It spins and then returns empty. I pulled the following from the VS Code output for 'Continue - LLM Prompt/Completion':

```
==========================================================================
==========================================================================
Settings:
contextLength: 7999
model: nemotron:70b-instruct-q3_K_M
maxTokens: 1024
stop: <|start_header_id|>,<|end_header_id|>,<|eot_id|>
log: undefined

############################################

<user>
Are you working

==========================================================================
==========================================================================
Completion:

```
This error has been happening with multiple versions of the base Ollama (a separate install from open-webui). I also rolled back Continue releases and still had the issue.
I have replicated this issue on Linux and in a Windows WSL environment.
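
To separate the plugin from the proxy, here is a minimal sketch of the two calls I believe Continue is making, run directly against the open-webui endpoint. The routes (`/ollama/api/tags` and `/ollama/api/chat`) and the bearer-key header are assumptions based on my config above rather than something pulled from the Continue source; the model name is the one from the log.

```python
# Minimal repro outside continue.dev (assumptions: open-webui forwards the
# Ollama API under /ollama/, and the same "Bearer sk-xxx" key is accepted).
import requests

BASE = "http://localhost:8080/ollama"
HEADERS = {"Authorization": "Bearer sk-xxx"}

# 1) Model listing -- this is the part that works (models populate in the plugin).
tags = requests.get(f"{BASE}/api/tags", headers=HEADERS, timeout=30)
print(tags.status_code, [m["name"] for m in tags.json().get("models", [])])

# 2) Chat completion -- this is where the plugin spins and then returns empty.
chat = requests.post(
    f"{BASE}/api/chat",
    headers=HEADERS,
    json={
        "model": "nemotron:70b-instruct-q3_K_M",
        "messages": [{"role": "user", "content": "Are you working"}],
        "stream": False,
    },
    timeout=300,
)
print(chat.status_code)
print(chat.text)  # an empty or error body here would point at the proxy rather than the plugin
```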


Reference: github-starred/open-webui#2451