Getting a 400 error when using the docker images with ollama bundled #3380

Closed
opened 2025-11-11 15:30:35 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @colinrees on GitHub (Jan 23, 2025).

Installation Method

Installed using the Docker instructions for GPU support with Ollama bundled.

Environment

Latest version of Docker Desktop; I have also tried the dev version.
I have done a fresh install of Docker Desktop and Ollama.
I have also tried with Ollama running locally on the machine, not in Docker.

Confirmation:

  • [X] I have read and followed all the instructions provided in the README.md.
  • [X] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [ ] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Expected to get a response from the LLM.

Actual Behavior:

I receive the following error:
400: 1 validation error for GenerateChatCompletionForm
format
Input should be a valid dictionary [type=dict_type, input_value='json', input_type=str]
For further information visit https://errors.pydantic.dev/2.9/v/dict_type

Description

Reproduction Details

Steps to Reproduce:
Load an LLM and type "hi" into the chat box.

Logs and Screenshots

Docker Container Logs:
INFO: 172.17.0.1:52398 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
2025-01-23 11:29:12 ERROR [open_webui.routers.ollama] 1 validation error for GenerateChatCompletionForm
2025-01-23 11:29:12 format
2025-01-23 11:29:12 Input should be a valid dictionary [type=dict_type, input_value='json', input_type=str]
2025-01-23 11:29:12 For further information visit https://errors.pydantic.dev/2.9/v/dict_type
2025-01-23 11:29:12 Traceback (most recent call last):
2025-01-23 11:29:12 File "/app/backend/open_webui/routers/ollama.py", line 981, in generate_chat_completion
2025-01-23 11:29:12 form_data = GenerateChatCompletionForm(**form_data)
2025-01-23 11:29:12 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-23 11:29:12 File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 212, in __init__
2025-01-23 11:29:12 validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
2025-01-23 11:29:12 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-23 11:29:12 pydantic_core._pydantic_core.ValidationError: 1 validation error for GenerateChatCompletionForm
2025-01-23 11:29:12 format
2025-01-23 11:29:12 Input should be a valid dictionary [type=dict_type, input_value='json', input_type=str]
2025-01-23 11:29:12 For further information visit https://errors.pydantic.dev/2.9/v/dict_type
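The traceback suggests that Open WebUI's form model types the `format` field as a dict only, while the incoming chat request passes the literal string "json" (a value Ollama's own API accepts for JSON-mode output). A minimal sketch reproducing the mismatch, plus a permissive variant that accepts both shapes — `StrictForm` and `PatchedForm` are hypothetical names for illustration, and the real `GenerateChatCompletionForm` has many more fields:

```python
from typing import Optional, Union

from pydantic import BaseModel, ValidationError


class StrictForm(BaseModel):
    """Mirrors the failing field: `format` must be a dict (e.g. a JSON schema)."""
    format: Optional[dict] = None


class PatchedForm(BaseModel):
    """Permissive variant: accepts a JSON-schema dict or the literal string 'json'."""
    format: Optional[Union[dict, str]] = None


if __name__ == "__main__":
    try:
        StrictForm(format="json")
    except ValidationError as e:
        # Same error type as in the container logs: dict_type
        print(e.errors()[0]["type"])  # dict_type

    # The permissive variant accepts both the string and a schema dict.
    print(PatchedForm(format="json").format)
    print(PatchedForm(format={"type": "object"}).format)
```

If the strict typing is indeed the cause, either loosening the field as above or avoiding the JSON-format setting in the request would sidestep the 400.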

Reference: github-starred/open-webui#3380