Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-22 14:13:08 -05:00
[PR #2731] [MERGED] fix: ollama and openai stream cancellation #7869
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/2731
Author: @cheahjs
Created: 6/2/2024
Status: ✅ Merged
Merged: 6/2/2024
Merged by: @tjbck
Base: dev ← Head: fix/ollama-cancellation

📝 Commits (5)
- 24c35c3 fix: stream defaults to true, return request ID
- 4dd51ba fix: ollama streaming cancellation using aiohttp
- 7f74426 fix: openai streaming cancellation using aiohttp
- b5b2b70 fix: bad payload refactor
- c5ff4c2 Merge branch 'dev' into fix/ollama-cancellation

📊 Changes
7 files changed (+186 additions, -503 deletions)
📝 backend/apps/ollama/main.py (+54 -347)
📝 backend/apps/openai/main.py (+29 -10)
📝 src/lib/apis/ollama/index.ts (+3 -22)
📝 src/lib/components/chat/Chat.svelte (+50 -63)
📝 src/lib/components/chat/ModelSelector/Selector.svelte (+21 -21)
📝 src/lib/components/chat/Settings/Models.svelte (+28 -27)
📝 src/lib/components/workspace/Playground.svelte (+1 -13)

📄 Description
Pull Request Checklist
Before submitting, make sure you've checked the following:
- Targeted the dev branch.

Changelog Entry
Description
There was some wonky behaviour with the previous approach to cancelling Ollama requests, where Response.close() was not being called at the right time (https://github.com/psf/requests/issues/5372 might be related, but didn't dig too deep). Instead, fix cancellation of streaming responses on both Ollama and OpenAI by:

- Switching from requests to aiohttp for making streaming requests
- Calling close() when the response is done
- Using AbortController on the frontend, and closing the connection on the frontend.

The request ID was also broken because it was only sent if stream: true was set in the request, but Ollama's API defaults it to true if absent, and the frontend doesn't send stream: true for streaming chat completions.

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
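The backend side of the cancellation fix described above can be sketched as follows. This is a minimal illustration of the pattern, not the PR's actual code: a stand-in object replaces aiohttp.ClientResponse so the sketch runs without a live server, and the key point is the finally block, which guarantees close() runs on the upstream response when the consumer (the frontend's aborted connection) stops reading mid-stream.

```python
import asyncio


class FakeStreamResponse:
    """Stand-in for an aiohttp.ClientResponse (hypothetical; avoids needing a server)."""

    def __init__(self, chunks):
        self._chunks = chunks
        self.closed = False

    async def iter_chunks(self):
        for chunk in self._chunks:
            await asyncio.sleep(0)  # simulate waiting on the network
            yield chunk

    def close(self):
        self.closed = True


async def stream_proxy(response):
    """Forward upstream chunks to the client; the finally block guarantees
    the upstream response is closed even if the consumer aborts mid-stream."""
    try:
        async for chunk in response.iter_chunks():
            yield chunk
    finally:
        response.close()


async def main():
    upstream = FakeStreamResponse([b"tok1", b"tok2", b"tok3"])
    stream = stream_proxy(upstream)
    first = await stream.__anext__()  # client reads one chunk...
    await stream.aclose()             # ...then disconnects (AbortController fired)
    return first, upstream.closed


first, closed = asyncio.run(main())
print(first, closed)  # b'tok1' True
```

With the old requests-based code, nothing guaranteed the upstream connection was released at this point; driving the stream through an async generator ties the upstream response's lifetime to the client connection.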
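The request-ID bug above comes down to a defaulting mismatch: Ollama's API treats a missing stream field as true, but the old code only returned the request ID when stream: true was explicitly present. A minimal sketch of the fix, using a hypothetical helper name rather than the PR's exact code, is to normalize the payload so both cases take the same path:

```python
def with_stream_default(payload: dict) -> dict:
    """Hypothetical helper (not the PR's exact code): mirror Ollama's
    server-side default of stream=True in the outgoing payload, so the
    request ID is returned whether or not the client set the field."""
    normalized = dict(payload)  # don't mutate the caller's dict
    normalized.setdefault("stream", True)
    return normalized


print(with_stream_default({"model": "llama3"}))
# {'model': 'llama3', 'stream': True}
print(with_stream_default({"model": "llama3", "stream": False}))
# {'model': 'llama3', 'stream': False}
```

An explicit stream: false is left untouched; only the absent case is filled in, matching the server's own default.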