/ollama/api/chat API proxy broken since v0.3.22 #2427

Closed
opened 2025-11-11 15:06:57 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @strobel1x on GitHub (Oct 22, 2024).

Bug Report

Installation Method

Docker compose

Environment

  • Open WebUI Version: v0.3.22 - latest (v0.3.32)

  • Ollama (if applicable): v0.3.14

  • Operating System: Ubuntu 20.04.6 LTS

  • Browser (if applicable): Chrome v130

Expected Behavior:

The Continue.dev plugin chat in VSCode responds to the prompt.

Actual Behavior:

No response appears in the VSCode plugin.

Description

Bug Summary:
The built-in chat API proxy in Open WebUI does not return the generated output of the Ollama chat API.

Reproduction Details

Steps to Reproduce:
1. Install Open WebUI bundled with Ollama (e.g. via Docker).
2. Configure the chat model of the Continue.dev plugin in VSCode as per the documentation:
"apiBase": "http://[open-webui host]/ollama"
3. Open the Continue.dev chat in VSCode and enter any prompt.
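For reference, the Continue.dev model entry would look roughly like the sketch below. Only the apiBase value comes from this report; the model name, title, and host are placeholder assumptions:

```json
{
  "models": [
    {
      "title": "Ollama via Open WebUI (placeholder)",
      "provider": "ollama",
      "model": "llama3.1",
      "apiBase": "http://[open-webui host]/ollama"
    }
  ]
}
```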

Logs and Screenshots

Docker Container Logs:

ollama      | [GIN] 2024/10/22 - 08:23:39 | 200 |  3.415570612s |      XXX.XXX.XXX.XXX | POST     "/api/chat"
open-webui  | ERROR [asyncio] Unclosed client session
open-webui  | client_session: <aiohttp.client.ClientSession object at 0x7fad30b7eb90>
open-webui  | ERROR [asyncio] Unclosed connector
open-webui  | connections: ['[(<aiohttp.client_proto.ResponseHandler object at 0x7fad30bd8910>, 61391.639)]']
open-webui  | connector: <aiohttp.connector.TCPConnector object at 0x7fad30f4b850>
open-webui  | INFO:     XXX.XXX.XXX.XXX:0 - "POST /ollama/api/chat HTTP/1.1" 200 OK
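The "Unclosed client session" errors above hint that the proxy drops its upstream aiohttp session without closing it, rather than relaying the streamed body and then cleaning up. A minimal sketch of the expected lifetime pattern, using a hypothetical stand-in session object instead of a real aiohttp.ClientSession (this is an illustration of the leak pattern, not Open WebUI's actual code):

```python
import asyncio

# Hypothetical stand-in for aiohttp.ClientSession: tracks whether
# close() was ever awaited, which is what aiohttp checks when it
# emits "Unclosed client session" at garbage collection.
class FakeSession:
    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True

async def stream_proxy(session):
    """Relay upstream chunks; close the session only once the stream ends."""
    try:
        # Placeholder chunks standing in for the Ollama /api/chat stream.
        for chunk in (b'{"message": "..."}', b'{"done": true}'):
            yield chunk
    finally:
        # If this cleanup never runs (e.g. the generator is abandoned or
        # the session is simply forgotten), the session leaks -- matching
        # the error in the container logs.
        await session.close()

async def main():
    session = FakeSession()
    chunks = [c async for c in stream_proxy(session)]
    return chunks, session.closed

if __name__ == "__main__":
    chunks, closed = asyncio.run(main())
    print(len(chunks), closed)
```

Consuming the generator to completion triggers the finally block, so the session is closed; a proxy that returns 200 OK to the client while skipping this step would produce exactly the logged warnings with no body delivered.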

Additional Information

When switching back to Open WebUI v0.3.21, the chat API works as expected.
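To rule out the Continue.dev plugin and exercise the proxy directly, the endpoint can be queried with curl. The host and model name are placeholders, and the Authorization header is an assumption (Open WebUI versions that require an API key for the /ollama proxy will reject unauthenticated requests):

```
# Placeholder host/model; $OPEN_WEBUI_API_KEY is assumed to hold a valid key.
curl -s http://[open-webui host]/ollama/api/chat \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -d '{"model": "llama3.1", "messages": [{"role": "user", "content": "hi"}]}'
```

On v0.3.21 this returns the streamed chat response; on the affected versions the request reportedly completes with 200 OK but no body.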

Reference: github-starred/open-webui#2427