[GH-ISSUE #2140] Embedding api returns null (sometimes) #63260

Closed
opened 2026-05-03 12:46:14 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Gal-Lahat on GitHub (Jan 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2140

Originally assigned to: @jmorganca on GitHub.

This is my code (C# .NET):

```cs
string url = "http://localhost:11434/api/embeddings";
string json = "{ \"model\": \"llama2:text\",\"prompt\": \"" + jsonSafeText + "\" }";

// get the response field from the json response
HttpClient client = new HttpClient();
var response = client.PostAsync(url, new StringContent(json, System.Text.Encoding.UTF8, "application/json")).Result;

if (response.StatusCode != System.Net.HttpStatusCode.OK)
{
    Debug.LogError("Error getting embedding for: " + jsonSafeText);
    return new float[0];
}

string responseString = response.Content.ReadAsStringAsync().Result;
```

On about 50% of the calls I get:

`{"embedding":null}`

as the response, with no errors.

The issue persists on all models that I've tested (llama2, llama2:text, mistral, mistral:text).

The first run is always fine, but from the second run onwards it fails randomly with no error.
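A sketch of a more defensive client, assuming the empty-prompt trigger identified later in this thread: it serializes the payload with `System.Text.Json` instead of string concatenation (so quotes or newlines in the prompt can't corrupt the JSON), and checks for a null `embedding` field instead of assuming an array. `GetEmbeddingAsync` is an illustrative name, not part of any API.

```cs
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

static async Task<float[]> GetEmbeddingAsync(HttpClient client, string prompt)
{
    // Empty prompts are the trigger for {"embedding":null} in this thread;
    // skip the round trip entirely.
    if (string.IsNullOrWhiteSpace(prompt))
        return Array.Empty<float>();

    // Let the serializer handle escaping rather than concatenating strings.
    string json = JsonSerializer.Serialize(new { model = "llama2:text", prompt });
    using var content = new StringContent(json, Encoding.UTF8, "application/json");
    var response = await client.PostAsync("http://localhost:11434/api/embeddings", content);

    if (!response.IsSuccessStatusCode)
        return Array.Empty<float>();

    using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    var embedding = doc.RootElement.GetProperty("embedding");

    // The server can return "embedding": null with a 200 status.
    if (embedding.ValueKind == JsonValueKind.Null)
        return Array.Empty<float>();

    var result = new float[embedding.GetArrayLength()];
    int i = 0;
    foreach (var v in embedding.EnumerateArray())
        result[i++] = v.GetSingle();
    return result;
}
```

Using `await` instead of `.Result` also avoids blocking the calling thread, which matters in Unity-style contexts where `Debug.LogError` suggests this code runs.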

GiteaMirror added the bug label 2026-05-03 12:46:14 -05:00

@alpe commented on GitHub (Feb 2, 2024):

I was only able to replicate the issue on my box when the prompt is empty. For example:

```sh
curl -X POST http://localhost:11434/api/embeddings -d "{ \"model\": \"llama2\",\"prompt\": \"\" }"
```

Interestingly, the first call completes with the `{"embedding":null}` response but a second call freezes the instance. 🤷 This is a 🐛. I can open a PR with a simple fix that rejects empty inputs. That should help.
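Until a server-side fix lands, the same guard can be applied client-side. A minimal shell sketch, assuming `jq` is available for safe JSON construction (`make_payload` is a hypothetical helper; the actual `curl` call is left commented out so the snippet is inert without a running server):

```sh
#!/bin/sh
# Build the embeddings payload with jq so special characters in the prompt
# can't corrupt the JSON, and refuse empty prompts (which trigger the null
# embedding and the freeze described above).
make_payload() {
  prompt="$1"
  if [ -z "$prompt" ]; then
    echo "refusing to embed an empty prompt" >&2
    return 1
  fi
  jq -n --arg p "$prompt" '{ model: "llama2", prompt: $p }'
}

make_payload 'why is the sky "blue"?'
# curl -s http://localhost:11434/api/embeddings -d "$(make_payload "$prompt")"
```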

I was running the server on OSX 14.3 with Apple M2.


@jmorganca commented on GitHub (Mar 13, 2024):

This should be fixed now in 0.1.29. Sorry about this!


Reference: github-starred/ollama#63260