[GH-ISSUE #4587] Choosing Ollama Embedding Model Non-Functional #13664

Closed
opened 2026-04-19 20:19:28 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @C4RP3N0CT3M on GitHub (Aug 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/4587

Bug Report

Installation Method

Docker in WSL2

Environment

  • Open WebUI Version: v0.3.12

  • Ollama: v0.3.4

  • Operating System: Windows 11, WSL2 Ubuntu

  • Browser: Edge

  • I have read and followed all the instructions provided in the README.md.

  • I am on the latest version of both Open WebUI and Ollama.

  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

I expect to be able to use the nomic-embed-text model from Ollama for RAG ingestion.

Actual Behavior:

When using the # command to reference ingested documents, no context reaches the LLM (Codestral in this case); context is passed correctly when using the default sentence-transformer embedding model.

Description

Bug Summary:

When using the # command to reference ingested documents, no context reaches the LLM (Codestral in this case); context is passed correctly when using the default sentence-transformer embedding model.

Reproduction Details

Choose the settings as shown in the picture below.
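When reproducing, one way to check whether the Ollama embedding endpoint itself responds (a sketch assuming Ollama's default port 11434 and that nomic-embed-text has already been pulled) is a direct API call:

```shell
# Pull the embedding model first if needed (assumes a local Ollama install):
#   ollama pull nomic-embed-text

# Query Ollama's embeddings endpoint directly; a healthy setup returns
# a JSON object with a non-empty "embedding" array.
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "test sentence"}'
```

If this returns an error or an empty embedding array, the problem is on the Ollama side rather than in Open WebUI's RAG pipeline.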

Images

![Ollama Embedding Model Not Working](https://github.com/user-attachments/assets/0e960e4a-bd34-4eb3-b805-b2eb2506ed0e)

Author
Owner

@justinh-rahb commented on GitHub (Aug 14, 2024):

There was a bug in the Ollama embedding API endpoint; try updating Ollama to v0.3.6.
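To confirm which Ollama version is actually running after updating (a sketch assuming the default install and port; `/api/version` is part of Ollama's REST API):

```shell
# Report the client binary version:
ollama --version

# Ask the running server for its version; returns JSON such as {"version":"0.3.6"}.
curl -s http://localhost:11434/api/version
```

In a Docker/WSL2 setup the container can lag behind the host binary, so checking the server's reported version, not just the client's, is worthwhile.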


Reference: github-starred/open-webui#13664