[GH-ISSUE #9419] After updating to v0.5.9, the URL does not work #15499
Originally created by @StruggleJia on GitHub (Feb 5, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9419
Bug Report
The log shows:
INFO: xxxxx:63954 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [open_webui.routers.openai] get_all_models()
ERROR [open_webui.routers.openai] 503, message='Attempt to decode JSON with unexpected mimetype: ', url='http://127.0.0.1:10000/chat/completions'
ERROR [open_webui.routers.openai] 503, message='Service Unavailable', url='http://127.0.0.1:10000/chat/completions'
Traceback (most recent call last):
File "/root/.cache/uv/archive-v0/XQde7WErJLJfjY_PGcZJC/lib/python3.11/site-packages/open_webui/routers/openai.py", line 696, in generate_chat_completion
r.raise_for_status()
File "/root/.cache/uv/archive-v0/XQde7WErJLJfjY_PGcZJC/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 1161, in raise_for_status
raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 503, message='Service Unavailable', url='http://127.0.0.1:10000/chat/completions'
INFO: 211.154.168.125:63954 - "POST /api/chat/completions HTTP/1.1" 400 Bad Request
INFO: 211.154.168.125:63954 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
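The two ERROR lines come from Open WebUI's OpenAI router: the backend at http://127.0.0.1:10000/chat/completions answered with a 503 whose body was not JSON, so aiohttp logs the "unexpected mimetype" message and raise_for_status() raises. A quick way to check which routes the backend actually serves (a sketch only; it assumes curl is available and that the llama.cpp server is still listening on port 10000):

curl -i http://127.0.0.1:10000/v1/models
curl -i http://127.0.0.1:10000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"deepseek-r1","messages":[{"role":"user","content":"hi"}]}'

If the second command returns a JSON completion, the backend speaks the OpenAI chat API under the /v1 prefix, and the base URL configured in Open WebUI needs to include /v1.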
Backend: llama.cpp running deepseek-r1 on port 10000.
In Open WebUI v0.5.7 the request URL was http://127.0.0.1:10000/completions and it worked fine,
but the latest version requests http://127.0.0.1:10000/chat/completions, and the error is:
ERROR [open_webui.routers.openai] 503, message='Attempt to decode JSON with unexpected mimetype: ', url='http://127.0.0.1:10000/chat/completions'
In the settings, the OpenAI URL is http://127.0.0.1:10000.
Thanks.
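A likely fix, assuming the backend is llama.cpp's llama-server (which exposes its OpenAI-compatible endpoints under the /v1 prefix): point Open WebUI's OpenAI connection at http://127.0.0.1:10000/v1 rather than the bare host, either in the Connections settings or via environment variables before launching. A minimal sketch:

export ENABLE_OLLAMA_API=false
export OPENAI_API_BASE_URL=http://127.0.0.1:10000/v1
open-webui serve

With that base URL, Open WebUI's request resolves to http://127.0.0.1:10000/v1/chat/completions instead of http://127.0.0.1:10000/chat/completions.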
@yanyinglin commented on GitHub (Feb 6, 2025):
Same issue.
I've double-checked that the response content-type is appropriate. I am using vLLM with the OpenAI API. How did you fix this issue? @StruggleJia
@theshyPika commented on GitHub (Feb 17, 2025):
One solution: set the following environment variables before starting open-webui serve:
export ENABLE_OLLAMA_API=false
export OPENAI_API_BASE_URL=http://0.0.0.0:11434/v1
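The same variables can be passed to a containerized deployment. A hypothetical docker run, assuming the official image and default port mapping from the project README, and keeping the Ollama port (11434) used in the comment above:

docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e ENABLE_OLLAMA_API=false \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:11434/v1 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

Note the use of host.docker.internal rather than 0.0.0.0: inside the container, 0.0.0.0 would not reach a backend running on the host machine.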