mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #20146] Ask / Explain popups no longer work
Originally created by @druellan on GitHub (Dec 23, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/20146
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
0.6.43
Ollama Version (if applicable)
No response
Operating System
Windows 10 / Ubuntu 24.04.1 LTS
Browser (if applicable)
Chrome / Firefox
Confirmation
Expected Behavior
When I select a fragment of text in the model responses, a popup appears and I can ASK or EXPLAIN the selection. Clicking ASK or EXPLAIN makes the model provide extra information about the selected text.
Actual Behavior
The popup shows up, but trying to use ASK or EXPLAIN results in the error message "An error occurred while fetching the explanation" and nothing happens.
Steps to Reproduce
Logs & Screenshots
The browser network activity shows a call to `/api/chat/completions` returning a 400 server error with the JSON `{"detail":"'NoneType' object has no attribute 'get'"}`. The OWUI console shows only this line:
`2025-12-23 15:23:39.240 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:53721 - "POST /api/chat/completions HTTP/1.1" 400`
Additional Information
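The `detail` string in that 400 response is the text of a Python `AttributeError`: somewhere in the backend, `.get(...)` is being called on a value that is `None`. A minimal sketch reproducing the exact message (the variable name `metadata` is an assumption based on the later comments in this thread, not the actual Open WebUI code):

```python
# Minimal reproduction of the error string seen in the 400 response.
# `metadata` being None stands in for whatever optional dict the
# backend expected in the request payload (an assumption, not the
# actual Open WebUI code path).
metadata = None

try:
    metadata.get("files")
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'get'
```

The server evidently wraps this exception's message into the JSON error body rather than logging a traceback at a visible level, which is why the root cause is hard to spot from the client side.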
I'm using external models via the OpenAI completions API.
Tested locally via Git clone and on a server using Docker.
Tested with different models, but all of them external (no Ollama models).
Might be related to this bug report: https://github.com/open-webui/open-webui/issues/15579
@owui-terminator[bot] commented on GitHub (Dec 23, 2025):
🔍 Similar Issues Found
I found some existing issues that might be related to this one. Please check if any of these are duplicates or contain helpful solutions:
- #19877 • by dotmobo • Dec 11, 2025 • bug
- #19861 • by QuitHub • Dec 10, 2025 • bug
- #20019 • by j63440490 • Dec 17, 2025 • bug
- #19777 • by Yaute7 • Dec 05, 2025 • bug
- #20092 • by VideoRyan • Dec 22, 2025 • bug
- #20046 • by pierrelouisbescond • Dec 19, 2025 • bug
- #20107 • by mengdeer589 • Dec 22, 2025 • bug
- #19864 • by Haervwe • Dec 10, 2025 • bug
- #20059 Chat response is not working • by navilg • Dec 20, 2025 • bug
- #19563 • by naruto7g • Nov 28, 2025 • bug
This comment was generated automatically by a bot. Please react with a 👍 if this comment was helpful, or a 👎 if it was not.
@druellan commented on GitHub (Dec 23, 2025):
The issue was confirmed by another user in Discord.
@Ithanil commented on GitHub (Dec 24, 2025):
I think the following log message is correlated with the issue:
`open_webui.main:chat_completion:1670 - Error processing chat metadata: 'NoneType' object has no attribute 'get'`
It has level DEBUG, which is likely why @druellan didn't see it in his logs, but it matches what he observed in the browser console.
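Given that the failure is in chat metadata handling, a common defensive fix for this class of bug is to normalize a missing metadata object to an empty dict before any `.get()` calls. A sketch of that pattern (the function name and keys are hypothetical, not Open WebUI's actual code):

```python
from typing import Any, Optional


def process_chat_metadata(metadata: Optional[dict]) -> dict[str, Any]:
    # Treat a missing/None metadata object as an empty dict so the
    # .get() calls below cannot raise AttributeError on None.
    metadata = metadata or {}
    return {
        "model": metadata.get("model"),          # hypothetical key
        "features": metadata.get("features", {}),  # hypothetical key
    }
```

With this guard, `process_chat_metadata(None)` returns defaults instead of producing the `'NoneType' object has no attribute 'get'` error reported above.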
EDIT: Likely introduced by f1bf4f20c5
@silentoplayz commented on GitHub (Dec 27, 2025):
I am also able to reproduce this issue on the latest dev commit. Sending a message to the model this way results in a failed request to the `/api/chat/completions` endpoint. I have tested this with my llama.cpp server external connection in Open WebUI.
@Classic298 commented on GitHub (Dec 28, 2025):
fixed by https://github.com/open-webui/open-webui/pull/20212
@Classic298 commented on GitHub (Dec 30, 2025):
fixed in dev