[GH-ISSUE #4171] Inconsistent or unresponsive response in llama v0.1.33 using llava model #64631

Closed
opened 2026-05-03 18:23:27 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @iwannabewater on GitHub (May 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4171

What is the issue?

Environment:

Operating System: Ubuntu 22.04
Hardware: NVIDIA RTX 4090 GPU and Intel Xeon Gold 6326 CPU
ollama Version: v0.1.33
Model Used: llava:34b-v1.6-q4_0

Description:
I am experiencing issues with the llava model in ollama v0.1.33, where it fails to respond appropriately to queries or provides random and unrelated answers. This problem occurs when attempting to analyze images. For example, after running the model and querying about an image (car.jpg), the expected behavior is a detailed description relevant to the image content. However, the model either does not respond or describes content that does not match the image provided.

Steps to Reproduce:

Start the model using the command: ollama run llava:34b-v1.6-q4_0
At the interactive prompt, ask for an image description, for example: >>> describe it. the path is: car.jpg
Observe that the response is either missing or describes content unrelated to the image.
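As a cross-check independent of the interactive CLI, the same query can be sent through ollama's REST API, where multimodal models such as llava take images as base64-encoded strings in an "images" array rather than as file paths embedded in the prompt. The sketch below only builds the request body for the `/api/generate` endpoint; the endpoint name and field layout follow ollama's published API, but treat the exact shape as an assumption to verify against your installed version.

```python
import base64
import json


def build_generate_payload(model: str, prompt: str, image_bytes: bytes) -> dict:
    """Build a request body for ollama's /api/generate endpoint.

    Multimodal models (e.g. llava) expect images as base64-encoded
    strings in the "images" array, not as file paths in the prompt.
    """
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }


# Illustration only: a fake 1-byte "image" to show the payload shape.
# In practice, read car.jpg from disk and POST this JSON to
# http://localhost:11434/api/generate on a running ollama server.
payload = build_generate_payload(
    "llava:34b-v1.6-q4_0", "Describe this image.", b"\x00"
)
print(json.dumps(payload))
```

If the API call returns a sensible description while the CLI does not, that points at prompt/image handling in the CLI rather than the model weights.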

Expected Behavior:
The model should consistently provide accurate and relevant responses to the image content queries.

Actual Behavior:
Responses are either missing, delayed, or incorrect, significantly hindering project progress where image analysis is crucial.

Additional Context:
This behavior has been consistently reproducible, impacting our ability to efficiently use the model for critical tasks. Any insights or fixes would be greatly appreciated.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

v0.1.33

GiteaMirror added the bug label 2026-05-03 18:23:28 -05:00

@thinkverse commented on GitHub (May 5, 2024):

Possible duplicate of https://github.com/ollama/ollama/issues/4163, fix is being worked on https://github.com/ollama/ollama/pull/4164.


@iwannabewater commented on GitHub (May 5, 2024):

Thank you for your timely reply. If the bug has been fixed, please let me know. Thx!


@jmorganca commented on GitHub (May 5, 2024):

@iwannabewater sorry this happened. Working on fixing it with a new release very soon. In the meantime I'll close this as a duplicate of #4163

Reference: github-starred/ollama#64631