Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-22 06:02:06 -05:00)
Issue #5699: Ollama proxy for POST /api/show endpoint returns Status 422 with correct parameters
Originally created by @gmacario on GitHub (Jul 4, 2025).
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.15
Ollama Version (if applicable)
v0.9.3
Operating System
Ubuntu 24.04
Browser (if applicable)
N.A.
Confirmation
Expected Behavior
According to the Ollama API documentation at https://ollama.readthedocs.io/en/api/#show-model-information, the endpoint POST /api/show expects the following parameters:

- model: name of the model to show
- verbose: (optional) if set to true, returns full data for verbose response fields

This is confirmed by invoking the Ollama API directly.
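For reference, a direct call with those parameters can be sketched as follows (the Ollama host URL is an assumption; adjust for your setup, and note that the request is only built here, not sent):

```python
import json
import urllib.request

# Assumed local Ollama host; substitute your own.
OLLAMA_URL = "http://localhost:11434/api/show"

# Parameters per the Ollama docs: "model" (required), "verbose" (optional).
payload = {"model": "llama3.2", "verbose": False}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would return the model details as JSON.
```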
According to the section "Ollama API Proxy Support" documented at https://docs.openwebui.com/getting-started/api-endpoints/#-ollama-api-proxy-support, this API endpoint should also be available through Open WebUI at http://<openwebui_host>:<openwebui_port>/ollama/api/show

Actual Behavior
Instead of returning Status 200, the Ollama API POST /api/show, invoked with correct parameters but proxied through Open WebUI, returns Status 422 with the following error. This behaviour breaks many Ollama clients that use the Open WebUI proxy, most notably the Copilot Chat extension for VS Code.
Steps to Reproduce
Log in to the host where Open WebUI (http://localhost:3000) and Ollama (http://10.1.204.21:11434) are running, then inspect an installed model (example: llama3.2) by calling either Ollama directly or the Ollama API proxy provided by Open WebUI (for security reasons I partially obfuscated my API key):
Logs & Screenshots
Results invoking Ollama API directly with the same parameters --> OK (NOTE: I trimmed the contents of model.out for clarity)
Results invoking Ollama API through Open WebUI --> ERROR
Excerpt from docker logs -f open-webui

Additional Information
No response
@gmacario commented on GitHub (Jul 4, 2025):
Ensuring that the Content-Type of the request is "application/json":
Result:
@gmacario commented on GitHub (Jul 4, 2025):
It looks like the API is correctly handled if the parameter model is instead called name:

Result:

Excerpt from docker logs -f open-webui:

@rgaricano commented on GitHub (Jul 4, 2025):
Confirmed: the Open WebUI ollama/api/show endpoint responds to {"name": "llama3.2:latest"} and not to {"model": "llama3.2:latest"}. It should be adapted to match the Ollama API.
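Until the proxy is fixed, affected clients can rename the key before sending. A hypothetical helper sketching this stopgap (the function name is my own, not part of any library):

```python
def adapt_show_payload(payload: dict) -> dict:
    """Rename 'model' to 'name' so the current Open WebUI proxy accepts it.

    This mirrors the behaviour observed above and is only a workaround;
    the proper fix is for the proxy to accept 'model' per the Ollama API.
    """
    adapted = dict(payload)
    if "model" in adapted and "name" not in adapted:
        adapted["name"] = adapted.pop("model")
    return adapted

print(adapt_show_payload({"model": "llama3.2:latest"}))
# → {'name': 'llama3.2:latest'}
```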
@rgaricano commented on GitHub (Jul 4, 2025):
@gmacario: you can see a Swagger page with all endpoints at http://yourOpenWebUI/docs (in dev mode).
@rgaricano commented on GitHub (Jul 4, 2025):
The same applies to other Ollama endpoints, such as unload, delete, or pull.
I'll try to test changing name to model in 59ba21bdf8/backend/open_webui/routers/ollama.py (L636-L637)

@gmacario commented on GitHub (Jul 4, 2025):
I was trying the same in https://github.com/gmacario/open-webui/blob/gmacario-fix-api-show/backend/open_webui/routers/ollama.py but I am still learning how to set up the development environment :-)
@gmacario commented on GitHub (Jul 4, 2025):
It looks like sometime around November 2024 Ollama renamed a parameter (from name to model) in the following API endpoints:

- POST /api/create
- POST /api/show
- DELETE /api/delete
- POST /api/pull
- POST /api/push

At least this is what I have inferred from https://github.com/ollama/ollama/pull/7731/files
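Based on that inference, a proxy that wants to stay compatible with both pre- and post-rename clients could accept either key on those endpoints. A minimal sketch (my own illustration, not the actual Open WebUI code):

```python
# Endpoints where Ollama migrated the parameter from "name" to "model"
# (per the files changed in ollama/ollama PR #7731):
RENAMED_ENDPOINTS = {
    "/api/create", "/api/show", "/api/delete", "/api/pull", "/api/push",
}

def extract_model(path: str, body: dict) -> str:
    """Return the model identifier, accepting both new and legacy key names."""
    if path in RENAMED_ENDPOINTS:
        value = body.get("model") or body.get("name")
        if value is None:
            raise ValueError("request body must contain 'model' (or legacy 'name')")
        return value
    raise ValueError(f"unhandled endpoint: {path}")

print(extract_model("/api/show", {"model": "llama3.2"}))  # → llama3.2
```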
@rgaricano commented on GitHub (Jul 4, 2025):
How do you run Open-WebUI (cloned, Docker, pip)?
(For chatting, it's best to use the Discord channel: https://discord.gg/3jZam3YK)
I do it the easy way: I cloned the repo, then build and start Open-WebUI with backend/start.sh (I have it as a service).
Then if I change something in src I rebuild it (NODE_OPTIONS=--max_old_space_size=12000 npm run build); if the changes are in the backend I just restart Open WebUI.
I follow the code on GitHub and edit my local copy with nano.
@gmacario commented on GitHub (Jul 5, 2025):
Fix via https://github.com/open-webui/open-webui/pull/15527