Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #15886] issue: Notes - Chat can't use streaming Model from API. #17706
Originally created by @peuportier on GitHub (Jul 20, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/15886
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
0.6.18
Ollama Version (if applicable)
0.9.6
Operating System
MacOS 15.4.1
Browser (if applicable)
Safari 18.4 (20621.1.15.11.10)
Confirmation
Expected Behavior
The chat should successfully connect to the model and stream responses without encountering a 422 Client Error. In a normal chat there is no problem.
Actual Behavior
Description:
When attempting to use the chat functionality with a model that streams responses within the Notes tool, the system repeatedly returns a 422 Client Error: Unprocessable Entity for the URL https://api.mistral.ai/v1/chat/completions.
Steps to Reproduce
Logs & Screenshots
Additional Information
This issue occurs specifically when trying to stream responses from the chat model.
The error suggests that the request being sent to the API endpoint is not properly formatted or is missing required parameters.
@tjbck commented on GitHub (Jul 20, 2025):
Is this from direct connections?
@peuportier commented on GitHub (Jul 21, 2025):
Hey @tjbck — hope you’re doing well and not too exhausted from all the projects!
Quick heads up:
I created a function and shared it with the community that lets you load Mistral models directly from the Mistral API (using your API key) into OWUI, and then stream responses from there.
Just to clarify:
This code makes a direct connection to the Mistral API using HTTPS requests (via the Python requests library). I’m not sure if you allow this kind of direct connection from the chat side, but wanted to check in and see if that’s okay, or if you have any restrictions around this.
Let me know, and thanks again for all your efforts on Open WebUI. It helps our research a lot.
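For context, a direct streaming connection of the kind described above might look like the following. This is a hedged sketch, not the author's actual function: the endpoint URL and payload shape follow Mistral's public chat completions API, while the helper names (`parse_sse_chunk`, `stream_chat`) are illustrative.

```python
# Sketch: streaming chat completions from the Mistral API over HTTPS with
# the `requests` library, as the comment above describes. Helper names are
# hypothetical; the payload fields follow Mistral's public chat API.
import json
import requests
from typing import Generator, Optional

MISTRAL_API_URL = "https://api.mistral.ai/v1/chat/completions"


def parse_sse_chunk(line: str) -> Optional[str]:
    """Extract the delta text from one 'data: {...}' SSE line, if any."""
    if not line.startswith("data: "):
        return None  # comments, keep-alives, blank lines
    data = line[len("data: "):]
    if data.strip() == "[DONE]":
        return None  # end-of-stream sentinel
    event = json.loads(data)
    delta = event["choices"][0].get("delta", {})
    return delta.get("content")


def stream_chat(api_key: str, model: str, messages: list) -> Generator[str, None, None]:
    """Yield response text chunks from a streaming chat completion."""
    payload = {"model": model, "messages": messages, "stream": True}
    headers = {"Authorization": f"Bearer {api_key}"}
    with requests.post(MISTRAL_API_URL, json=payload, headers=headers, stream=True) as r:
        r.raise_for_status()  # a malformed payload surfaces here, e.g. as a 422
        for raw in r.iter_lines(decode_unicode=True):
            text = parse_sse_chunk(raw or "")
            if text:
                yield text
```

A 422 from `raise_for_status()` in a flow like this typically means the JSON body itself was rejected, which is consistent with the error reported above.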
@rgaricano commented on GitHub (Jul 21, 2025):
It seems like an error due to a malformed request. If you don't want to share the whole function, could you share just the request call that is sent to the Mistral completions endpoint?
@peuportier commented on GitHub (Jul 22, 2025):
@rgaricano No problem, I can share the whole function; nothing to hide here. Thanks for any clue that can help solve the issue.
```python
import os
import json
import requests
import time
from typing import List, Union, Dict, Generator  # Added Generator import
from pydantic import BaseModel, Field
import base64


class Pipe:
    class Valves(BaseModel):
        """Configuration for Mistral API."""
```
Hope this can help.
Thanks again
@rgaricano commented on GitHub (Jul 22, 2025):
Ok, probably it's because in the Notes chat stream is true:
5fbfe2bdca/src/lib/components/notes/NoteEditor/Chat.svelte (L191)
and files are sent:
5fbfe2bdca/src/lib/components/notes/NoteEditor/Chat.svelte (L170-L175)
but the Mistral completion endpoint doesn't support streaming when sending files:
5fbfe2bdca/backend/open_webui/retrieval/loaders/mistral.py (L252-L261)
@rgaricano commented on GitHub (Jul 22, 2025):
Solutions?
Set stream=false in the Notes chat to comply with the Mistral endpoint (not recommended), or do the check in your pipe to avoid this Mistral issue, setting stream=false when there are files in the payload.
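The second workaround above can be sketched in a few lines. This is a minimal illustration, not code from Open WebUI or the pipe in question; the `files` and `stream` payload keys are assumptions based on the thread.

```python
# Sketch of the suggested workaround: inside the pipe, force stream=False
# whenever the payload carries files, since (per the diagnosis above) the
# Mistral chat completions endpoint rejects streaming requests with files.
# The payload keys ("files", "stream") are illustrative assumptions.
def adjust_payload_for_mistral(payload: dict) -> dict:
    """Return a copy of the request payload safe for the Mistral endpoint."""
    fixed = dict(payload)
    if fixed.get("files"):       # Notes chat attaches the note content as files
        fixed["stream"] = False  # Mistral: no streaming when files are present
    return fixed
```

With a guard like this, the Notes chat would fall back to a non-streaming completion only when files are attached, keeping streaming behavior everywhere else.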