[GH-ISSUE #2779] Feature request: Additional Console Outputs for more efficient logging and debugging #27437

Closed
opened 2026-04-22 04:47:27 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @LumiWasTaken on GitHub (Feb 27, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2779

Originally assigned to: @dhiltgen on GitHub.

Heya, I have a common issue: for example, when using LLaVA 34B on a small-ish GPU with CPU offloading, it sometimes gets stuck.
I can't really trace the issue anywhere. Is it the BLAS batch processing, is it an OOM error, what is it?

```
key clip.vision.image_grid_pinpoints not found in file
key clip.vision.mm_patch_merge_type not found in file
key clip.vision.image_crop_resolution not found in file
```

That's about all I can get out of the log... no metrics, nothing. :/

GiteaMirror added the memory label 2026-04-22 04:47:27 -05:00
Author
Owner

@dhiltgen commented on GitHub (Jul 24, 2024):

It's possible this might be an OOM problem...

If you're still hitting this, please make sure to upgrade to the latest version and share a bit more detail about your setup. How much VRAM does your GPU have? Server logs may help diagnose it as well.
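For reference, a minimal way to get more verbose server logs out of Ollama, assuming a standard Linux install (the `OLLAMA_DEBUG` variable is Ollama's debug switch; the systemd unit name `ollama` below is an assumption for a default package install):

```shell
# Enable debug-level logging, then run the server in the foreground
# so its output goes straight to the terminal.
OLLAMA_DEBUG=1 ollama serve

# On a Linux install managed by systemd, the same logs can instead be
# followed from the running service (unit name assumed to be "ollama"):
journalctl -u ollama -f
```

The debug output should include the GPU/VRAM detection lines and the model-load output, which is where an out-of-memory failure during offload would typically surface.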

Author
Owner

@LumiWasTaken commented on GitHub (Jul 25, 2024):

This has been open for too long and I don't even remember anymore, hence the close. #closed


Reference: github-starred/ollama#27437