Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[GH-ISSUE #15688] issue: /api/v1/chats endpoint stops functioning on Production after about 5-6 chat messages (403 for each request) #56305
Originally created by @G1anduin on GitHub (Jul 13, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/15688
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
0.5.2
Ollama Version (if applicable)
latest
Operating System
macOS Sequoia 15.5 (24F74)
Browser (if applicable)
Arc, Chrome, Firefox (all)
Confirmation
Expected Behavior
The chat should keep working normally beyond the first few messages.
Actual Behavior
Chat endpoints stop functioning (every API request returns 403 Forbidden) after around 5-6 messages, in every chat, without exception; the failure comes sooner when the messages are long.
Steps to Reproduce
Start using a chat. After 5-6 messages, the next POST request (for a submitted prompt) to {BASE_URL}/api/v1/chats/{CHAT_ID} results in 403 Forbidden, and so does every subsequent API request.
Logs & Screenshots
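To help isolate whether the 403 is tied to the session rather than to any one chat, a probe along these lines can watch the endpoint flip from 2xx to 403 as messages accumulate. This is a minimal sketch, not Open WebUI code: `chat_update_url` and `classify_status` are hypothetical helpers, and how the actual POST is sent (and which token is attached) depends on your deployment.

```python
def chat_update_url(base_url: str, chat_id: str) -> str:
    """Build the endpoint URL from the path named in this report."""
    return f"{base_url}/api/v1/chats/{chat_id}"


def classify_status(status_code: int) -> str:
    """Map an HTTP status to a coarse diagnosis for this issue.

    A 403 on this endpoint suggests the request itself is being
    rejected (e.g. an expired or invalidated session token), rather
    than a model or provider error, which would surface differently.
    """
    if status_code == 403:
        return "forbidden: check the session/API token"
    if 200 <= status_code < 300:
        return "ok"
    return f"unexpected status {status_code}"
```

Logging `classify_status` for each chat-update response, alongside a timestamp, would show whether the cutoff correlates with message count, total payload size, or elapsed session time.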
Additional Information
No response
@rgaricano commented on GitHub (Jul 13, 2025):
It seems like a restriction from your provider, perhaps a context limit, a rate limit, a timeout...
Unable to reproduce.
@G1anduin commented on GitHub (Jul 13, 2025):
@rgaricano Thanks. Did you attempt to reproduce this with version 0.5.2 or latest?
@aleximmer commented on GitHub (Jul 13, 2025):
I observe the same problem with all models (Anthropic via custom functions, but also the default OpenAI v1 API) and can confirm it somehow depends on the overall context length: responses break either after a few small messages or after a single large prompt. I am on the latest release, running on Ubuntu 22 within Docker.
@rgaricano commented on GitHub (Jul 13, 2025):