issue: Open WebUI Docker image not compatible with 5090 GPUs #4953

Closed
opened 2025-11-11 16:07:50 -06:00 by GiteaMirror · 3 comments

Originally created by @LazyDataScientistGit on GitHub (Apr 24, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

0.6.5

Ollama Version (if applicable)

0.6.6

Operating System

Windows

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

I am using Docker, with GPU support:
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda

I get an error message in the Docker console that cuBLAS is not available. From my limited knowledge, I believe the 5090 needs the latest nightly version of PyTorch in order to function.
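One way to confirm this diagnosis (a sketch, assuming the `:cuda` image bundles PyTorch for the Whisper backend and the container is named `open-webui` as in the run command above) is to print which CUDA architectures the bundled PyTorch build was compiled for:

```shell
# Diagnostic sketch: list the CUDA architectures the container's PyTorch
# build supports. The RTX 5090 (Blackwell, compute capability 12.0)
# requires sm_120 in this list; stable wheels of that era stopped at sm_90.
docker exec open-webui python -c \
  "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.get_arch_list())"
```

If `sm_120` is absent from the printed list, cuBLAS kernels cannot run on the 5090 and the error above is expected.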

Actual Behavior

cuBLAS error message when using Whisper models.

Steps to Reproduce

Requires a 5090 GPU. Just run the Docker command above with GPU functionality enabled.

Logs & Screenshots

NA

Additional Information

From my limited knowledge, I believe the 5090 needs the latest nightly version of PyTorch in order to function.
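If a nightly PyTorch is indeed required, a possible workaround (a sketch, not verified on this image; the index URL is PyTorch's standard nightly wheel channel for CUDA 12.8 builds, which include Blackwell/sm_120 kernels) would be to upgrade the bundled torch inside the running container:

```shell
# Possible workaround sketch: replace the container's torch with a
# CUDA 12.8 nightly build that includes sm_120 (Blackwell) support.
# Note this change is lost if the container is recreated.
docker exec open-webui pip install --pre torch \
  --index-url https://download.pytorch.org/whl/nightly/cu128
```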

GiteaMirror added the bug label 2025-11-11 16:07:50 -06:00

@Umutayb commented on GitHub (Apr 27, 2025):

It works on my 5090; perhaps you have to install CUDA 12.8.


@LazyDataScientistGit commented on GitHub (Apr 27, 2025):

I am using Docker, so it should not matter.


@tjbck commented on GitHub (Apr 28, 2025):

Using Docker absolutely does matter; this should be addressed in dev.

Related: #13187

Reference: github-starred/open-webui#4953