[GH-ISSUE #2445] Ollama stuck on "CUDA Compute Capability detected: 7.5" #1427

Closed
opened 2026-04-12 11:18:13 -05:00 by GiteaMirror · 2 comments

Originally created by @Rhimzy on GitHub (Feb 11, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2445

Windows 11
Ubuntu WSL

Logs:

> OLLAMA_HOST=127.0.0.1:11435 ollama serve
time=2024-02-11T11:04:49.410+05:30 level=INFO source=images.go:863 msg="total blobs: 0"
time=2024-02-11T11:04:49.410+05:30 level=INFO source=images.go:870 msg="total unused blobs removed: 0"
time=2024-02-11T11:04:49.410+05:30 level=INFO source=routes.go:999 msg="Listening on 127.0.0.1:11435 (version 0.1.24)"
time=2024-02-11T11:04:49.411+05:30 level=INFO source=payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-02-11T11:04:51.905+05:30 level=INFO source=payload_common.go:145 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 rocm_v5 rocm_v6 cpu cuda_v11]"
time=2024-02-11T11:04:51.905+05:30 level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-02-11T11:04:51.905+05:30 level=INFO source=gpu.go:242 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-02-11T11:04:53.334+05:30 level=INFO source=gpu.go:288 msg="Discovered GPU libraries: [/usr/lib/wsl/lib/libnvidia-ml.so.1 /usr/lib/wsl/drivers/nvami.inf_amd64_99c8019dbacde1b2/libnvidia-ml.so.1]"
time=2024-02-11T11:04:54.300+05:30 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
time=2024-02-11T11:04:54.301+05:30 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-11T11:04:54.307+05:30 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 7.5"

And it just gets stuck there.
I'm not very familiar with what should happen after that.


@easp commented on GitHub (Feb 12, 2024):

`ollama serve` just blocks and waits for an API request. What happens if you open another shell window and `ollama run phi`?
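The two-terminal workflow easp describes can be sketched as below. This is a sketch, not official documentation: the non-default `OLLAMA_HOST` is carried over from the logs in this issue (if you start the server on a custom host/port, the client needs the same setting), and `phi` is just the example model easp suggested.

```shell
# Terminal 1: start the server on the custom host/port.
# It blocks here by design; "CUDA Compute Capability detected" being the
# last log line just means it is idle, waiting for API requests.
OLLAMA_HOST=127.0.0.1:11435 ollama serve

# Terminal 2: point the client at the same host/port and run a model.
OLLAMA_HOST=127.0.0.1:11435 ollama run phi
```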


@Rhimzy commented on GitHub (Feb 20, 2024):

> Ollama serve just blocks and waits for an API request. What happens if you open another shell window and `ollama run phi`?

Thanks man, that worked.


Reference: github-starred/ollama#1427