fix: Apple Silicon test fails pytorch not imported #3205

Closed
opened 2025-11-11 15:25:42 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @richtong on GitHub (Jan 6, 2025).

Bug Report

Installation Method

Installation was pipx install open-webui and also git clone of repo and running npm run dev for frontend and dev.sh in backend

Environment

  • Open WebUI Version: v0.5.4

  • Ollama (if applicable): 0.5.4

  • Operating System: macOS Sequoia 15.2

  • Browser (if applicable): Brave v1.73.104

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

On Apple Silicon, DEVICE_TYPE should be set to "mps" to enable GPU acceleration.

Actual Behavior:

On an Apple Silicon M4 Max, the check for a PyTorch device silently fails with an exception because torch is not loaded; the import of torch currently only happens if CUDA_DEVICE is requested.

Description

Bug Summary:

The current test is shown below, but torch is never imported at this point (the import only happens if USE_CUDA=true; there is no top-level import). The test fails because torch is not loaded, and since the exception is ignored, DEVICE_TYPE stays at "cpu" and the user gets no explanation. Fix is in https://github.com/open-webui/open-webui/pull/8366

try:
  if torch.backends.mps.is_available() and torch.backends.mps.is_built():
     DEVICE_TYPE = "mps"
except Exception:
  pass

The fix (although ugly) is to import torch inside the try block and also to print the actual
exception (logging does not appear to be used in this part of the code, so printing to stdout
is what happens):

try:
  import torch
  if torch.backends.mps.is_available() and torch.backends.mps.is_built():
     DEVICE_TYPE = "mps"
except Exception as e:
  print(f"Apple Silicon test failed {e=}")
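As a standalone sketch of the fix above (the variable name and "cpu" default are taken from the snippets in this report; the rest is illustrative), the corrected logic degrades gracefully even when torch is not installed: DEVICE_TYPE stays at its CPU default and the reason is printed instead of being swallowed:

```python
# Standalone sketch of the proposed fix (hypothetical; mirrors the snippet above).
DEVICE_TYPE = "cpu"  # safe default if detection fails

try:
    # Importing inside the try means a missing torch is caught below
    # instead of raising outside the handler.
    import torch

    if torch.backends.mps.is_available() and torch.backends.mps.is_built():
        DEVICE_TYPE = "mps"
except Exception as e:
    # Report why MPS detection failed rather than silently using the CPU.
    print(f"Apple Silicon test failed {e=}")

print(f"Selected {DEVICE_TYPE=}")
```

Running this on a machine without torch prints the ImportError and keeps DEVICE_TYPE at "cpu", which is exactly the diagnostic the current code hides.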

Reproduction Details

Steps to Reproduce:

  1. On an Apple Silicon machine
  2. pipx install open-webui (or git clone)
  3. open-webui serve &
  4. Go to localhost:8080
  5. Go to Chat and upload a large 200-300 KB document
  6. Observe the CPU and GPU meters in Activity Monitor; CPU goes to 100%

Logs and Screenshots

Browser Console Logs:
N/A

Docker Container Logs:
N/A install from repo or pipx install

Screenshots/Screen Recordings (if applicable):
I can include an Activity Monitor screenshot if you want.

Additional Information

Having two torch imports, one in the Apple Silicon test and one in the CUDA test, is kind of ugly, but it is the minimum code change. You probably just want to wrap the entire test section in a single try/except rather than having two, but I presume you want the smallest rewrite.
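The single-try/except variant suggested above might look like the following sketch. The USE_CUDA environment variable and the CUDA branch are assumptions based on this report, not the actual Open WebUI code:

```python
import os

DEVICE_TYPE = "cpu"  # default when no accelerator is usable

try:
    import torch  # one import covers both the CUDA and the MPS checks

    # Hypothetical env-var gate, named after the USE_CUDA flag mentioned above.
    if os.environ.get("USE_CUDA", "false").lower() == "true" and torch.cuda.is_available():
        DEVICE_TYPE = "cuda"
    elif torch.backends.mps.is_available() and torch.backends.mps.is_built():
        DEVICE_TYPE = "mps"
except Exception as e:
    # One place to report any detection failure instead of two silent handlers.
    print(f"Device detection failed, staying on CPU: {e=}")
```

This keeps a single import and a single failure path, at the cost of a larger diff than the two-import fix.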

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Author
Owner

@richtong commented on GitHub (Jan 18, 2025):

Thanks all!

Reference: github-starred/open-webui#3205