[GH-ISSUE #1438] Openchat in Ollama #62806

Closed
opened 2026-05-03 10:22:52 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @itscvenk on GitHub (Dec 8, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1438

Hello

NVIDIA drivers and CUDA are installed and working fine. Phew.

How do I verify that Ollama is actually using the GPU while responding? I am using the openchat model.

Thanks a million for Ollama and especially for including the openchat model. Stay blessed & happy folks!

Regards


@pdevine commented on GitHub (Dec 8, 2023):

The `nvidia-smi` command should show which processes are using the GPU. If you do something like `watch -n 0.5 nvidia-smi` it'll update as you're running inference. You can also use `/set verbose` in the REPL to show how many tokens/sec are being generated.

Hope that helps!
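As a minimal sketch of the check described above: on a live machine you would pipe the real command (`nvidia-smi | grep ollama`); here, so the snippet is self-contained, it greps a sample process-table row taken from the output posted later in this thread.

```shell
# Sketch: confirm Ollama is on the GPU by looking for its runner process
# in the nvidia-smi process table. On a live system, replace the sample
# row with the real command:  nvidia-smi | grep ollama
sample='|    0   N/A  N/A      6185      C   ...ld/cuda/bin/ollama-runner     5610MiB |'

if printf '%s\n' "$sample" | grep -q 'ollama'; then
  result="ollama is using the GPU"
else
  result="no ollama process found on the GPU"
fi
echo "$result"
```

If the runner process does not appear while a model is loaded, Ollama is most likely falling back to CPU inference.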


@itscvenk commented on GitHub (Dec 9, 2023):

Yes it did @pdevine

Have several wonderful years ahead

```
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.147.05   Driver Version: 525.147.05   CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            On   | 00000001:00:00.0 Off |                    0 |
| N/A   44C    P0    26W /  70W |   5615MiB / 15360MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      6185      C   ...ld/cuda/bin/ollama-runner     5610MiB |
+-----------------------------------------------------------------------------+
```


Reference: github-starred/ollama#62806