[GH-ISSUE #14538] [models] Bad Request Error, 500 Server Error: Internal Server Error #35191

Closed
opened 2026-04-22 19:33:21 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @shy20221121 on GitHub (Mar 2, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14538

What is the issue?

[models] Bad Request Error, 500 Server Error: Internal Server Error

The API address can be visited normally.

Image: https://github.com/user-attachments/assets/495c01db-28bf-4299-a34a-2013e4a4b181

However, the embedding model call made when uploading files to the Dify knowledge base throws an error.

Image: https://github.com/user-attachments/assets/dd9980bd-d154-4e6f-95b4-4af8f4bcc2a9

Relevant log output


OS

Linux

GPU

AMD

CPU

Intel

Ollama version

0.17.4

GiteaMirror added the needs more info and bug labels 2026-04-22 19:33:22 -05:00
Author
Owner

@liorgross commented on GitHub (Mar 2, 2026):

I believe it is the same as:
https://github.com/ollama/ollama/issues/13803

I am experiencing the same issue

Author
Owner

@LingFan11 commented on GitHub (Mar 2, 2026):

Thank you for reporting this issue!

This appears to be related to embedding model calls in DIFY knowledge base. Here are some troubleshooting steps:

  1. Check Ollama version - You are using 0.17.4. Please ensure you have the latest version.

  2. Verify the embedding model is available:
    ollama list

  3. Check Ollama server logs:
    journalctl -u ollama --no-pager -n 50

  4. AMD GPU users - There have been reported issues with AMD GPUs. Ensure you have the latest GPU drivers.

This issue seems similar to #13803. Could you provide more details about which embedding model you are using and full error logs?
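One way to gather those error logs is to call the embedding endpoint directly, bypassing Dify, so the raw status code and body from Ollama are visible. This is a minimal sketch assuming the default Ollama port (11434), the `/api/embed` endpoint, and a hypothetical embedding model name `bge-m3` — substitute whichever model the Dify knowledge base is configured to use.

```python
import json
import urllib.error
import urllib.request


def build_embed_request(model, text, host="http://localhost:11434"):
    """Build the POST request for Ollama's /api/embed endpoint."""
    payload = json.dumps({"model": model, "input": text}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/embed",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def probe(model, text="hello world"):
    """Send the request and return (status, body).

    A 500 here reproduces the issue outside Dify, and the response
    body usually carries Ollama's actual error message rather than
    the generic "Internal Server Error" that Dify surfaces.
    """
    req = build_embed_request(model, text)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as e:
        return e.code, e.read().decode()
```

Running `probe("bge-m3")` against the affected server and pasting the returned status and body into this issue would narrow down whether the failure is in the model load, the embedding call, or the Dify integration.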

Author
Owner

@rick-github commented on GitHub (Mar 2, 2026):

@LingFan11 Please don't post AI slop.

Author
Owner

@rick-github commented on GitHub (Mar 2, 2026):

Server logs (https://docs.ollama.com/troubleshooting) will aid in debugging.

Reference: github-starred/ollama#35191