[GH-ISSUE #7692] Getting "no compatible GPUs were discovered" yet I have a GPU #4912

Closed
opened 2026-04-12 15:58:02 -05:00 by GiteaMirror · 2 comments

Originally created by @mosquet on GitHub (Nov 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7692

What is the issue?

When my PC goes to sleep, sometimes the GPU connection is lost.

```
2024/11/15 19:56:13 routes.go:1189: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2024-11-15T19:56:13.362Z level=INFO source=images.go:755 msg="total blobs: 37"
time=2024-11-15T19:56:13.363Z level=INFO source=images.go:762 msg="total unused blobs removed: 0"
time=2024-11-15T19:56:13.365Z level=INFO source=routes.go:1240 msg="Listening on [::]:11434 (version 0.4.1)"
time=2024-11-15T19:56:13.367Z level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[cpu_avx cpu_avx2 cuda_v11 cuda_v12 cpu]"
time=2024-11-15T19:56:13.368Z level=INFO source=gpu.go:221 msg="looking for compatible GPUs"
time=2024-11-15T19:56:13.383Z level=INFO source=gpu.go:386 msg="no compatible GPUs were discovered"
time=2024-11-15T19:56:13.383Z level=INFO source=types.go:123 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="31.1 GiB" available="24.6 GiB"
```

```
Fri Nov 15 19:59:28 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.35.03              Driver Version: 560.35.03      CUDA Version: 12.6     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 3090        Off |   00000000:01:00.0 Off |                  N/A |
|  0%   41C    P5             30W /  350W |     958MiB /  24576MiB |     42%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|  No running processes found                                                             |
+-----------------------------------------------------------------------------------------+
```
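The nvidia-smi output above shows the RTX 3090 is visible to the driver even though ollama fell back to CPU, which suggests the server probed for GPUs before the driver had finished resuming. A quick sanity check (a sketch, assuming ollama runs as the systemd unit named `ollama` that the official Linux installer creates) is to restart the service once the driver answers and re-read the discovery lines from its journal:

```shell
# If the driver responds, restart ollama and re-check its GPU discovery log.
# Assumes the systemd unit name `ollama` (official Linux installer default).
nvidia-smi >/dev/null 2>&1 \
  && sudo systemctl restart ollama \
  && journalctl -u ollama -n 50 --no-pager | grep -i "gpu"
```

If the restarted server then logs the CUDA device instead of "no compatible GPUs were discovered", the problem is timing after resume rather than a broken driver install.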

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.4.1

GiteaMirror added the linux, nvidia, bug labels 2026-04-12 15:58:02 -05:00

@mosquet commented on GitHub (Nov 15, 2024):

Another issue: even after a restart this is not fixed. I just updated to the most recent version.


@dhiltgen commented on GitHub (Nov 19, 2024):

This sounds like a duplicate of #5464.

In #7669 users have found adding a sleep to the startup script may be a viable workaround until we can wire up dependencies to ensure ollama starts after the GPU is fully woken back up and ready.
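The sleep workaround described above can be applied without editing the packaged unit file by using a systemd drop-in. This is a sketch, not an official fix: it assumes the service is named `ollama.service`, and the 10-second delay is an arbitrary guess that may need tuning per machine.

```ini
# /etc/systemd/system/ollama.service.d/wait-for-gpu.conf
# Create with: sudo systemctl edit ollama.service
[Service]
# Give the NVIDIA driver time to finish waking before ollama probes for GPUs.
ExecStartPre=/bin/sleep 10
```

After `sudo systemctl daemon-reload && sudo systemctl restart ollama`, the discovery log should report the GPU if startup timing was the culprit.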


Reference: github-starred/ollama#4912