[GH-ISSUE #10619] ollama ps shows GPU 100%, but Task Manager shows no GPU usage #6987

Closed
opened 2026-04-12 18:52:57 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @zhz-hh-boy on GitHub (May 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10619

What is the issue?

R7 7735H+8GB*2+RTX4050Laptop
Screenshot of `ollama ps` at runtime, Task Manager screenshot, and log files attached below:
[Ollama.zip](https://github.com/user-attachments/files/20100623/Ollama.zip)
![Image](https://github.com/user-attachments/assets/abb2d7cb-56a5-4f32-80fc-259a98ed275f)
![Image](https://github.com/user-attachments/assets/99d2472b-9e26-4c80-8e56-6e91d3a8ea92)

Relevant log output


OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.6.8

GiteaMirror added the bug label 2026-04-12 18:52:57 -05:00

@rick-github commented on GitHub (May 8, 2025):

> ollama ps shows GPU 100%, but Task Manager shows no GPU usage

ollama ps does not show GPU 100%. Ollama has loaded 81% of the model into VRAM and 19% into system RAM because it did not think there was enough VRAM to host the whole model. The [server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will show the memory calculations.
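The split described above appears in the PROCESSOR column of `ollama ps` (e.g. `19%/81% CPU/GPU`, or `100% GPU` when the model fits entirely in VRAM). A minimal sketch of parsing that field, assuming the typical column format; the helper name and the sample values are illustrative, not taken from the attached logs:

```python
def parse_processor(field: str) -> dict:
    """Parse a PROCESSOR field such as '19%/81% CPU/GPU' or '100% GPU'
    into a mapping like {'CPU': 19, 'GPU': 81}."""
    percents_part, units_part = field.split()
    percents = percents_part.split("/")
    units = units_part.split("/")
    # Pair each unit with its percentage, stripping the '%' sign.
    return {unit: int(p.rstrip("%")) for unit, p in zip(units, percents)}

# A split like the one in this issue: 19% in system RAM, 81% in VRAM.
print(parse_processor("19%/81% CPU/GPU"))  # {'CPU': 19, 'GPU': 81}
print(parse_processor("100% GPU"))         # {'GPU': 100}
```

Anything other than `100% GPU` means part of the model is running on the CPU, which explains low GPU utilization in Task Manager.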


@mu-dan commented on GitHub (May 9, 2025):

> > ollama ps shows GPU 100%, but Task Manager shows no GPU usage
>
> ollama ps does not show GPU 100%. Ollama has loaded 81% of the model into VRAM and 19% into system RAM because it did not think there was enough VRAM to host the model. The server logs will show the memory calculations.

In that case, is it correct to understand that the hardware of the server running Ollama cannot support this model (insufficient resources)?

Reference: github-starred/ollama#6987