[GH-ISSUE #2286] Codellama70b runs, but Codellama70b-Instruct spins forever after downloading #27077

Closed
opened 2026-04-22 04:00:29 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @ewebgh33 on GitHub (Jan 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2286

Originally assigned to: @bmizerany on GitHub.

Wondering if this is a config issue or something else? I.e., are any of the additional model files that are downloaded alongside the 38 GB main file corrupted in any way?

Ollama is running via WSL on Windows.

`ollama run codellama:70b`
works and gives me code.

`ollama run codellama:70b-instruct`
downloads but shows the spinning dots and never progresses. That is to say, it verifies, prints "removing any unused layers", then "success", but then nothing. I can't prompt; it just sits there and spins.

Exiting and restarting, then trying again: it's already downloaded, so it skips re-downloading... but it spins again indefinitely.

Since 70b vanilla runs, it can't be memory or the GPUs, can it?
I have 2x RTX 4090 and 64 GB of RAM. Sure, 128 GB would be better, but as I said, 70b vanilla runs fine.

Thanks

GiteaMirror added the needs more info and windows labels 2026-04-22 04:00:29 -05:00
Author
Owner

@bmizerany commented on GitHub (Mar 12, 2024):

Are you still seeing this issue with the latest release of Ollama? We are unable to reproduce with a similar environment.

If the issue persists, can you run with `OLLAMA_DEBUG=1` and see if anything stands out in the logs at `~/.ollama/logs`?
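A minimal sketch of that debug workflow, assuming a Linux/WSL shell where the server is started manually with `ollama serve` and the log file is `~/.ollama/logs/server.log` (the exact log filename may vary by version and install method):

```shell
# Enable verbose logging for this server process only:
# the VAR=value prefix sets OLLAMA_DEBUG just for this command.
OLLAMA_DEBUG=1 ollama serve &

# In another terminal (or after the server is up), reproduce the hang
ollama run codellama:70b-instruct

# Inspect the most recent log output for errors or stalls
tail -n 200 ~/.ollama/logs/server.log
```

The `VAR=value command` form avoids exporting the variable into the whole shell session, so normal runs afterwards are unaffected.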

<!-- gh-comment-id:1992517297 -->
Author
Owner

@pdevine commented on GitHub (Jul 19, 2024):

`codellama:70b-instruct` should be working fine right now. I know the issue was from a while ago, but let's go ahead and close it. We can reopen if you're still seeing problems.

<!-- gh-comment-id:2240242079 -->

Reference: github-starred/ollama#27077