[GH-ISSUE #9733] Ollama 0.6.0 with Gemma3: panic: failed to sample token: no tokens to sample from #6363

Closed
opened 2026-04-12 17:52:43 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @chigkim on GitHub (Mar 13, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9733

Originally assigned to: @ParthSareen on GitHub.

What is the issue?

Ollama with Gemma3:27b sometimes throws an error if I include an image in the first message.
panic: failed to sample token: no tokens to sample from
However, if I prime it by first sending a message without an image, getting a response, and then sending the exact same message with the image attached, it works.
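For anyone trying to reproduce, the failing request can be sketched as a single call to Ollama's `/api/chat` endpoint with an image attached to the very first message. This is a sketch, not taken from the report: the image bytes are a placeholder, and a real reproduction would read an actual image file.

```python
import base64
import json

# Placeholder image bytes; substitute a real image file for an actual repro.
image_b64 = base64.b64encode(b"\x89PNG placeholder bytes").decode("ascii")

# First-message payload for Ollama's /api/chat endpoint, with the image
# included in the opening user message (the scenario that triggered the panic).
payload = {
    "model": "gemma3:27b",
    "messages": [
        {
            "role": "user",
            "content": "Describe this image.",
            "images": [image_b64],
        }
    ],
}

print(json.dumps(payload, indent=2))
```

POSTing this body to `http://localhost:11434/api/chat` on 0.6.0 is the shape of request the issue describes; the "priming" workaround corresponds to sending the same payload once without the `images` key first.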

Relevant log output

panic: failed to sample token: no tokens to sample from
The full log with debug enabled is here:
https://pastebin.com/raw/9bMdEtuj

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.6.0

GiteaMirror added the bug label 2026-04-12 17:52:43 -05:00

@ParthSareen commented on GitHub (Mar 13, 2025):

Hey! Sorry you're running into this. Is this running from the default parameters?


@chigkim commented on GitHub (Mar 13, 2025):

Default parameters as in temperature, top_k, etc.? I have temperature 0.1, top_k 64, and top_p 0.95.

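For context, the sampler settings mentioned above map onto the `options` object of an Ollama API request. A minimal sketch of such a request body, using the standard `/api/generate` fields:

```python
import json

# The reported sampler settings, expressed as Ollama request options.
payload = {
    "model": "gemma3:27b",
    "prompt": "Hello",
    "options": {
        "temperature": 0.1,
        "top_k": 64,
        "top_p": 0.95,
    },
}

print(json.dumps(payload, indent=2))
```

Omitting the `options` object entirely makes the server fall back to the model's defaults, which is what "default parameters" refers to in the question above.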

@chigkim commented on GitHub (Mar 18, 2025):

I just upgraded to 0.6.2, and it doesn't seem to crash anymore when I feed both text and image for the first time.
Should I close this or leave it open?


@ParthSareen commented on GitHub (Mar 18, 2025):

> I just upgraded to 0.6.2, and it doesn't seem to crash anymore when I feed both text and image for the first time.
> Should I close this or leave it open?

Awesome 🙏

Let's close it out and I'll follow up on the other thread as well. Thanks for testing so quickly!

Reference: github-starred/ollama#6363