[GH-ISSUE #3824] Server error when submitting a request through OpenAI client #2367

Closed
opened 2026-04-12 12:41:05 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @mishushakov on GitHub (Apr 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3824

What is the issue?

Using the Vercel AI SDK and llama2

Request:

const content = [
    { type: 'text', text: prompt },
    { type: 'text', text: page.content },
 ]

const result = await experimental_generateObject({
    model,
    schema,
    messages: [{ role: 'user', content }],
    temperature,
  })

Response:

mish@Mishs-MBP llm-scraper % bun examples/hn.ts                                                            
12 |     statusCode === 429 || // too many requests
13 |     statusCode >= 500),
14 |     // server error
15 |     data
16 |   }) {
17 |     super(message);
         ^
AI_APICallError: json: cannot unmarshal array into Go struct field Message.messages.content of type string
 url: "http://127.0.0.1:11434/v1/chat/completions"

      at new APICallError (/Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider/dist/index.mjs:17:5)
      at /Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider-utils/dist/index.mjs:279:12
12 |     statusCode === 429 || // too many requests
13 |     statusCode >= 500),
14 |     // server error
15 |     data
16 |   }) {
17 |     super(message);
         ^
AI_APICallError: json: cannot unmarshal array into Go struct field Message.messages.content of type string
 url: "http://127.0.0.1:11434/v1/chat/completions"

      at new APICallError (/Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider/dist/index.mjs:17:5)
      at /Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider-utils/dist/index.mjs:279:12

Possibly related to #3690?

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.1.32

GiteaMirror added the bug label 2026-04-12 12:41:05 -05:00

@thinkverse commented on GitHub (Apr 22, 2024):

Ollama's OpenAI compatibility layer doesn't support content arrays, only string content. You can read about the supported request fields in the documentation: https://github.com/ollama/ollama/blob/main/docs/openai.md#supported-request-fields.

If you want to send more messages, you must send them separately.

[
    {
        "role": "system",
        "content": "You are a helpful assistant."
    },
    {
        "role": "user",
        "content": "Hello!"
    }
]
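
One way to adapt, sketched below (this is not from the thread; the helper name and types are illustrative), is to flatten an array of text parts into a single string before building the message, since the endpoint only accepts string content:

```typescript
// Sketch: flatten OpenAI-style text parts into one string, so the message
// satisfies Ollama's OpenAI-compatible endpoint, which (at the time of
// this issue) only accepted string content.
type TextPart = { type: 'text'; text: string };

function flattenContent(parts: TextPart[]): string {
  // Concatenate each part's text, separated by a blank line.
  return parts.map((part) => part.text).join('\n\n');
}

const parts: TextPart[] = [
  { type: 'text', text: 'Summarize this page.' },
  { type: 'text', text: '<html>...</html>' },
];

const message = { role: 'user', content: flattenContent(parts) };
console.log(message.content);
```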

@mishushakov commented on GitHub (Apr 22, 2024):

Alright, will keep that in mind 😉


@mishushakov commented on GitHub (Apr 22, 2024):

I'm getting this now:

mish@Mishs-MBP llm-scraper % bun examples/hn.ts
12 |     statusCode === 429 || // too many requests
13 |     statusCode >= 500),
14 |     // server error
15 |     data
16 |   }) {
17 |     super(message);
         ^
AI_APICallError: json: cannot unmarshal array into Go struct field Message.messages.content of type string
 url: "http://127.0.0.1:11434/v1/chat/completions"

      at new APICallError (/Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider/dist/index.mjs:17:5)
      at /Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider-utils/dist/index.mjs:279:12
12 |     statusCode === 429 || // too many requests
13 |     statusCode >= 500),
14 |     // server error
15 |     data
16 |   }) {
17 |     super(message);
         ^
AI_APICallError: json: cannot unmarshal array into Go struct field Message.messages.content of type string
 url: "http://127.0.0.1:11434/v1/chat/completions"

      at new APICallError (/Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider/dist/index.mjs:17:5)
      at /Users/mish/Documents/Projects/llm-scraper/node_modules/@ai-sdk/provider-utils/dist/index.mjs:279:12

Code:

const result = await experimental_generateObject({
    model,
    schema,
    messages: [{ role: 'assistant', content: '' }, { role: 'user', content: '' }],
    temperature,
 })

@thinkverse commented on GitHub (Apr 22, 2024):

From what I understand of the Vercel AI SDK, it converts messages.content into an array behind the scenes in the convertToLanguageModelPrompt function: https://github.com/vercel/ai/blob/c056579a34a7f71ebce87f0f83c3fc3da20e0a8e/packages/core/core/prompt/convert-to-language-model-prompt.ts#L11.

For instance, the if-statement on lines 34-39 (https://github.com/vercel/ai/blob/c056579a34a7f71ebce87f0f83c3fc3da20e0a8e/packages/core/core/prompt/convert-to-language-model-prompt.ts#L34-L39) checks whether messages.content is a string, and if it is, converts it to an array.

Given that Ollama's OpenAI compatibility doesn't support that yet, Vercel's AI Core sadly isn't compatible either.
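
The normalization described above can be sketched roughly like this (an illustrative reconstruction, not the SDK's actual code):

```typescript
// Rough sketch of the normalization the AI SDK performs: plain string
// content is wrapped into a one-element array of text parts before the
// request is built, which is what Ollama's endpoint then rejected.
type TextPart = { type: 'text'; text: string };
type UserMessage = { role: 'user'; content: string | TextPart[] };

function normalizeContent(message: UserMessage): UserMessage {
  if (typeof message.content === 'string') {
    // A string becomes [{ type: 'text', text: ... }].
    return { role: 'user', content: [{ type: 'text', text: message.content }] };
  }
  return message;
}

const normalized = normalizeContent({ role: 'user', content: 'Hello!' });
// normalized.content is now an array of text parts, not a string.
```

So even a request written with plain string content arrives at the server as a content array, which explains why mishushakov's second attempt failed the same way.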


@mishushakov commented on GitHub (Apr 22, 2024):

I have opened an issue there as well. Since this is not an issue with Ollama per se, I'm closing.
Thank you for all your help 😄


@MaxLeiter commented on GitHub (May 5, 2024):

I believe https://github.com/ollama/ollama/pull/2506 would fix this, along with improving overall OpenAI compatibility.


@mishushakov commented on GitHub (May 6, 2024):

Glad to hear that!

Reference: github-starred/ollama#2367