[PR #11845] add cuda 13.x support #13633

Closed
opened 2026-04-13 00:31:40 -05:00 by GiteaMirror · 0 comments
Owner

Original Pull Request: https://github.com/ollama/ollama/pull/11845

State: closed
Merged: No


Fix: Ollama cannot use the NVIDIA GPU when running with CUDA 13.0, and fails with the following error:

Error: 500 Internal Server Error: llama runner process has terminated: cudaMalloc failed: out of memory
ggml_gallocr_reserve_n: failed to allocate CUDA0 buffer of size 8891928576
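For context, the buffer size in the log converts to roughly 8.3 GiB, which is the CUDA allocation ggml attempted before failing. A quick sketch (plain arithmetic on the logged value, not part of the fix itself):

```python
# Size taken from the log line:
# "ggml_gallocr_reserve_n: failed to allocate CUDA0 buffer of size 8891928576"
buffer_bytes = 8891928576

# Convert bytes to GiB (2**30 bytes per GiB) to see the VRAM demand.
buffer_gib = buffer_bytes / 2**30
print(f"{buffer_gib:.2f} GiB")  # ~8.28 GiB
```

An allocation of this size will fail on any GPU with less free VRAM than that, which matches the cudaMalloc out-of-memory error above.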
GiteaMirror added the pull-request label 2026-04-13 00:31:40 -05:00
Reference: github-starred/ollama#13633