[GH-ISSUE #22260] issue: attempting to unload a model from ollama while using owui as proxy fails due to incorrect field verification #58345

Closed
opened 2026-05-05 22:58:48 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @Umutayb on GitHub (Mar 5, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22260

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.8.8

Ollama Version (if applicable)

0.17.6

Operating System

Ubuntu 22.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

curl --location --request POST 'http://localhost:11435/api/generate' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "gemma3:4b",
"keep_alive": 0
}'
{"model":"gemma3:4b","created_at":"2026-03-05T10:51:52.284097808Z","response":"","done":true,"done_reason":"unload"}

Actual Behavior

curl --location --request POST 'http://localhost:3000/ollama/api/generate' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "gemma3:4b",
"keep_alive": 0
}'
{"detail":[{"type":"missing","loc":["body","prompt"],"msg":"Field required","input":{"model":"gemma3:4b","keep_alive":0}}]}

See https://ollama.apidog.io/unload-a-model-14808502e0 for the API spec.
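The failure pattern suggests the proxy re-validates the request body against a schema that marks `prompt` as required, while Ollama itself treats `prompt` as optional: a body containing only `model` and `keep_alive: 0` is how the API expresses "unload this model". A minimal sketch of the lenient validation is below; the helper name and error shape are illustrative (modeled on FastAPI's error format), not Open WebUI's actual code.

```python
# Hypothetical sketch, not Open WebUI's actual code: the point is that
# "prompt" must NOT be a required field, so that an unload request
# ({"model": ..., "keep_alive": 0}) passes validation instead of failing
# with {"type": "missing", "loc": ["body", "prompt"], "msg": "Field required"}.

def validate_generate_body(body: dict) -> list[dict]:
    """Return a list of FastAPI-style validation errors (empty list = valid)."""
    errors = []
    if "model" not in body:
        errors.append(
            {"type": "missing", "loc": ["body", "model"], "msg": "Field required"}
        )
    # "prompt" is intentionally optional: omitting it together with
    # keep_alive=0 is the Ollama API's unload-a-model request.
    return errors


unload_request = {"model": "gemma3:4b", "keep_alive": 0}
print(validate_generate_body(unload_request))  # → [] (request accepted)
```

With a strict schema that requires `prompt`, the same body would be rejected before ever reaching Ollama, which matches the 422-style error shown above.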

Steps to Reproduce

Start Open WebUI via Docker.
Add a model (gemma3:4b).
Call the Ollama service directly, via the container started by the Open WebUI Docker Compose setup:

curl --location --request POST 'http://localhost:11435/api/generate' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "gemma3:4b",
"keep_alive": 0
}'

Observe that the request succeeds:
{"model":"gemma3:4b","created_at":"2026-03-05T10:51:52.284097808Z","response":"","done":true,"done_reason":"unload"}

Now do the same, but call Ollama with Open WebUI as the proxy:

curl --location --request POST 'http://localhost:3000/ollama/api/generate' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "gemma3:4b",
"keep_alive": 0
}'

Observe the failure:
{"detail":[{"type":"missing","loc":["body","prompt"],"msg":"Field required","input":{"model":"gemma3:4b","keep_alive":0}}]}

Logs & Screenshots

curl --location --request POST 'http://localhost:3000/ollama/api/generate' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "gemma3:4b",
"keep_alive": 0
}'
{"detail":[{"type":"missing","loc":["body","prompt"],"msg":"Field required","input":{"model":"gemma3:4b","keep_alive":0}}]}

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 22:58:48 -05:00

@tjbck commented on GitHub (Mar 8, 2026):

95b65ff751f91131b633cb128ff2decdd87c4a85

Reference: github-starred/open-webui#58345