[GH-ISSUE #9419] After updating to v0.5.9, the URL does not work #15499

Closed
opened 2026-04-19 21:41:04 -05:00 by GiteaMirror · 2 comments

Originally created by @StruggleJia on GitHub (Feb 5, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9419

Bug Report

Log output:

```
INFO: xxxxx:63954 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [open_webui.routers.openai] get_all_models()
ERROR [open_webui.routers.openai] 503, message='Attempt to decode JSON with unexpected mimetype: ', url='http://127.0.0.1:10000/chat/completions'
ERROR [open_webui.routers.openai] 503, message='Service Unavailable', url='http://127.0.0.1:10000/chat/completions'
Traceback (most recent call last):
  File "/root/.cache/uv/archive-v0/XQde7WErJLJfjY_PGcZJC/lib/python3.11/site-packages/open_webui/routers/openai.py", line 696, in generate_chat_completion
    r.raise_for_status()
  File "/root/.cache/uv/archive-v0/XQde7WErJLJfjY_PGcZJC/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 1161, in raise_for_status
    raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 503, message='Service Unavailable', url='http://127.0.0.1:10000/chat/completions'
INFO: 211.154.168.125:63954 - "POST /api/chat/completions HTTP/1.1" 400 Bad Request
INFO: 211.154.168.125:63954 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
```

Backend: llama.cpp running deepseek-r1 on port 10000.

In Open WebUI v0.5.7 the request URL was http://127.0.0.1:10000/completions and it worked fine, but the latest version requests http://127.0.0.1:10000/chat/completions and fails with:

```
ERROR [open_webui.routers.openai] 503, message='Attempt to decode JSON with unexpected mimetype: ', url='http://127.0.0.1:10000/chat/completions'
```

In settings, the OpenAI URL is http://127.0.0.1:10000.

Thanks.
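The "unexpected mimetype" error usually means the backend answered with a non-JSON body (e.g. a plain-text 404/503 page) because the requested path does not exist. llama.cpp's OpenAI-compatible server exposes chat completions under the `/v1` prefix, so a base URL configured without `/v1` resolves to a path the server may not serve. A minimal illustrative sketch (`chat_completions_url` is a hypothetical helper, not Open WebUI's actual code) of how the configured base URL determines the final request path:

```python
# Hypothetical sketch: how an OpenAI-compatible client derives the
# chat-completions URL from the configured base URL.
def chat_completions_url(base_url: str) -> str:
    # OpenAI-style clients append the endpoint path to the base URL as-is.
    return base_url.rstrip("/") + "/chat/completions"

# Base configured without the /v1 prefix, as in the settings above:
print(chat_completions_url("http://127.0.0.1:10000"))
# → http://127.0.0.1:10000/chat/completions

# Base including /v1, matching llama.cpp's OpenAI-compatible route:
print(chat_completions_url("http://127.0.0.1:10000/v1"))
# → http://127.0.0.1:10000/v1/chat/completions
```

If the backend only serves `/v1/chat/completions`, setting the OpenAI URL in Open WebUI to `http://127.0.0.1:10000/v1` may resolve the 503.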


@yanyinglin commented on GitHub (Feb 6, 2025):

Same issue.

I've double-checked that the response content-type is appropriate. I'm using vLLM with the OpenAI API. How did you fix this issue? @StruggleJia

![Image](https://github.com/user-attachments/assets/1bea01f3-b834-4f0e-a5b5-e96e8d0db69e)

![Image](https://github.com/user-attachments/assets/2d6d3339-a34d-467f-a5ba-6a95e087e5cb)


@theshyPika commented on GitHub (Feb 17, 2025):

One solution is to set the following environment variables before starting open-webui serve:

```shell
export ENABLE_OLLAMA_API=false
export OPENAI_API_BASE_URL=http://0.0.0.0:11434/v1
```
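The workaround above points Open WebUI at Ollama's OpenAI-compatible endpoint on port 11434. For the original poster's llama.cpp backend on port 10000, the analogous configuration (an untested assumption on my part, not something confirmed in this thread) would be:

```shell
# Assumption: llama.cpp's OpenAI-compatible routes live under /v1,
# so the base URL must include that prefix. Untested adaptation of
# the Ollama workaround above for a backend on port 10000.
export ENABLE_OLLAMA_API=false
export OPENAI_API_BASE_URL=http://127.0.0.1:10000/v1
```

Then start open-webui serve as usual.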

<!-- gh-comment-id:2662320707 --> @theshyPika commented on GitHub (Feb 17, 2025): There's one solution, execute the below setup before start open-webui serve: export ENABLE_OLLAMA_API=false export OPENAI_API_BASE_URL=http://0.0.0.0:11434/v1
Reference: github-starred/open-webui#15499