[GH-ISSUE #9093] H800 Support Need #67973

Closed
opened 2026-05-04 12:08:52 -05:00 by GiteaMirror · 6 comments

Originally created by @CimaChongD on GitHub (Feb 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9093

I need to run ollama on an 8-GPU H800 server, but it is currently not supported. I hope to get support for this as soon as possible.

![Image](https://github.com/user-attachments/assets/e6eb8441-d28d-42d3-9410-504269d80149)


@ice6 commented on GitHub (Feb 14, 2025):

@CimaChongD use `vllm`? May I know the price of it? I want to buy one. 😃


@CimaChongD commented on GitHub (Feb 14, 2025):

> @CimaChongD use `vllm`? May I know the price of it? I want to buy one. 😃

More than 2,300,000 Yuan


@rick-github commented on GitHub (Feb 14, 2025):

There's a report [here](https://github.com/ollama/ollama/issues/2826) of ollama running on an H800. What errors are you seeing? What's in the [server log](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues)?


@ice6 commented on GitHub (Feb 15, 2025):

@CimaChongD Thanks for the information. Too expensive for personal usage 😅


@CimaChongD commented on GitHub (Feb 15, 2025):

> There's a report [here](https://github.com/ollama/ollama/issues/2826) of ollama running on an H800. What errors are you seeing? What's in the [server log](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues)?

The log indicated that no compatible graphics card was found, and the screenshots I took at the time were not saved.


@rick-github commented on GitHub (Feb 15, 2025):

No log and no errors means there's little that can be done. Since #2826 indicates that it does work, it would seem to be a configuration problem on the machine and not an ollama issue.
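For anyone hitting the same symptom, the detection failure usually shows up as a line in the server log. Below is a minimal sketch of filtering that log for GPU-detection messages on a systemd-based Linux install; the sample log line and the `/tmp` path are illustrative assumptions, not actual output from the reporter's machine:

```shell
# On a real systemd-managed install you would capture the log with:
#   journalctl -u ollama --no-pager > /tmp/ollama-server.log
# Here we write a hypothetical sample line instead, purely for illustration.
cat > /tmp/ollama-server.log <<'EOF'
time=2025-02-15T00:00:00Z level=INFO msg="no compatible GPUs were discovered"
EOF

# Filter the log for GPU-detection messages (case-insensitive)
grep -i "gpu" /tmp/ollama-server.log
```

If a "no compatible GPUs" line appears, the next things to check are that `nvidia-smi` sees the cards and that the NVIDIA driver version meets ollama's requirements, since #2826 suggests the H800 itself works.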

Reference: github-starred/ollama#67973