[GH-ISSUE #10402] Official RTX 5090 Support #68892

Closed
opened 2026-05-04 15:35:22 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @stanblesk on GitHub (Apr 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10402

Dear developers, I would like to ask when official support for the RTX 5090 will appear. We are planning to purchase them to start working with Ollama, and for now the lack of official support worries us a little.
Thank you in advance for your answer!

GiteaMirror added the model label 2026-05-04 15:35:22 -05:00
Author
Owner

@TheArKaID commented on GitHub (Apr 25, 2025):

I'm able to run Ollama within Docker. Or are you referring to some other kind of "official support"?

![Image](https://github.com/user-attachments/assets/6df868e9-4c4d-4128-8ef7-e249e91da42e)
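For context, the GPU-enabled container setup the commenter describes matches the invocation from Ollama's Docker documentation. This assumes Docker and the NVIDIA Container Toolkit are already installed on the host:

```shell
# Run the Ollama server with all NVIDIA GPUs exposed to the container.
# Requires Docker plus the NVIDIA Container Toolkit on the host.
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```

The `-v ollama:/root/.ollama` volume keeps downloaded models across container restarts, and port 11434 is Ollama's default API port.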

Author
Owner

@FlippingBinary commented on GitHub (Apr 25, 2025):

Same here. My only pain point so far had nothing to do with Ollama directly, but I'll mention it because it was Ollama-adjacent. I was using Kokoro-FastAPI for text to speech in Open-WebUI to read Ollama's output until I upgraded to the 5090 and discovered Kokoro-FastAPI depends on PyTorch 2.6, which doesn't support Blackwell. However, that shouldn't be a problem much longer because PyTorch 2.7 was just released two days ago with support for Blackwell.

In other words, you really shouldn't have a problem.
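The version boundary mentioned above (PyTorch 2.6 lacking Blackwell support, 2.7 adding it) can be expressed as a small version gate. This is an illustrative sketch, not part of any project's API; the 2.7 cutoff is taken from the comment and should be verified against the PyTorch release notes:

```python
# Hypothetical helper: decide whether a given PyTorch version is new
# enough for Blackwell (e.g. RTX 5090) GPUs. Assumes the 2.7 cutoff
# stated in the comment above.
def supports_blackwell(torch_version: str) -> bool:
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    return (major, minor) >= (2, 7)

print(supports_blackwell("2.6.0"))  # False: Blackwell kernels not shipped
print(supports_blackwell("2.7.1"))  # True: first release with Blackwell support
```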


Reference: github-starred/ollama#68892