Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 07:43:10 -05:00)
Inconsistency in command line regarding "--gpus" #3497
Originally created by @rabinnh on GitHub (Jan 30, 2025).
A note: the README.md page shows:
To run Open WebUI with Nvidia GPU support, use this command:
```
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
But under the heading "With bundled Ollama":
With GPU Support: Utilize GPU resources by running the following command:
```
docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
So the question is: do I need the '=' sign, will either work, or is it actually different depending on whether or not you're running in the same container as Ollama?
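For what it's worth, the Docker CLI parses `--gpus all` and `--gpus=all` identically; the space and the `=` are interchangeable for any long flag. The substantive difference between the two README commands is the image tag (`:cuda` vs. `:ollama`, i.e. with or without bundled Ollama), not the flag spelling. A quick way to confirm GPU passthrough works at all, assuming the NVIDIA drivers and the NVIDIA Container Toolkit are installed on the host (the `nvidia/cuda` image tag below is illustrative):

```shell
# Both spellings request every available GPU via the NVIDIA Container Toolkit
# and are parsed identically by the Docker CLI:
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Equivalent form with "=":
docker run --rm --gpus=all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If either command prints the `nvidia-smi` device table, GPU passthrough itself is working and the choice of `=` is not the problem.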
@wxfred commented on GitHub (Feb 11, 2025):
I'm using the command above, but only the CPU is being used.
@senhao-xu commented on GitHub (Feb 13, 2025):
I encountered the same problem.
@wxfred commented on GitHub (Feb 13, 2025):
Looking at the container's log, it says a compatible GPU was not detected, yet running nvidia-smi inside the container shows my GPU info correctly.
I installed the Windows Ollama bundle directly on the host; it can use my GPU after I installed the CUDA toolkit. Then I restarted my container, but no magic happened.
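When nvidia-smi works inside the container but the app still reports no compatible GPU, the driver is visible but CUDA itself may not be usable from the container. A rough troubleshooting sketch, assuming a Linux host with the NVIDIA Container Toolkit (the image tag and grep patterns are illustrative, not from this thread):

```shell
# 1. Check that Docker knows about the NVIDIA runtime:
docker info | grep -i nvidia

# 2. Check that the toolkit injects the CUDA driver library (libcuda),
#    not just the nvidia-smi binary, into containers:
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 \
  bash -c 'nvidia-smi && ls /usr/lib/x86_64-linux-gnu | grep -i libcuda'

# 3. Inspect the Open WebUI container's startup log for the GPU probe:
docker logs open-webui 2>&1 | grep -i -E 'gpu|cuda'
```

If step 2 fails while nvidia-smi alone succeeds, the container toolkit configuration (e.g. `nvidia-ctk runtime configure`) is the likelier culprit than the `--gpus` flag syntax.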