[GH-ISSUE #9317] Unsupported JetPack version detected. GPU may not be supported #68136

Open
opened 2026-05-04 12:37:05 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @fedekrum on GitHub (Feb 24, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9317

Hi.
I boot my Jetson Nano dev Kit in console mode.
Ollama performance was horrible so I checked with jtop utility while using it

<img width="1015" alt="jtop screenshot while running Ollama" src="https://github.com/user-attachments/assets/ac0d707a-f3cf-4b47-bdc2-f0e9a7e23a90" />

Re-installing Ollama, I got this:

```
>>> Installing ollama to /usr/local
>>> Downloading Linux arm64 bundle
######################################################################## 100,0%
**WARNING: Unsupported JetPack version detected.  GPU may not be supported**
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> NVIDIA JetPack ready.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
```
```
cat /etc/nv_tegra_release
# R32 (release), REVISION: 7.6, GCID: 38171779, BOARD: t210ref, EABI: aarch64, DATE: Tue Nov  5 07:46:14 UTC 2024

ollama ps
NAME                ID              SIZE      PROCESSOR          UNTIL
deepseek-r1:1.5b    a42b25d8c10a    2.1 GB    31%/69% CPU/GPU    4 minutes from now
```
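The `nv_tegra_release` output above already tells you which JetPack line the board is on. A minimal sketch of parsing it (the L4T-to-JetPack mapping is NVIDIA's: L4T R32 ships with JetPack 4.6.x, while JetPack 5 moved to L4T R34/R35; the threshold and messages here are illustrative, not the installer's actual code):

```shell
# Infer the JetPack line from an /etc/nv_tegra_release line. This board
# reports L4T R32, which ships with JetPack 4.x; JetPack 5 and newer
# correspond to L4T R34 and above.
line='# R32 (release), REVISION: 7.6, GCID: 38171779, BOARD: t210ref, EABI: aarch64, DATE: Tue Nov  5 07:46:14 UTC 2024'
release=$(printf '%s\n' "$line" | sed -n 's/^# R\([0-9][0-9]*\).*/\1/p')
if [ "$release" -ge 34 ]; then
  verdict="JetPack 5+: GPU should be supported by the installer"
else
  verdict="JetPack 4 or older: installer warns GPU may not be supported"
fi
echo "L4T R$release -> $verdict"
```

On a real board you would feed it `head -1 /etc/nv_tegra_release` instead of the hard-coded sample line.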

Any idea on how to make GPU work with Ollama?
Thanks

### OS

Linux

### GPU

Nvidia

### CPU

Other

### Ollama version

0.5.11

GiteaMirror added the bug label 2026-05-04 12:37:05 -05:00

@easis commented on GitHub (Mar 25, 2025):

+1


@driversti commented on GitHub (Apr 21, 2025):

Hey folks. I've found a workaround when running in Docker. Please take a look at my comment [here](https://github.com/ollama/ollama/issues/9503#issuecomment-2817867166).
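For context on what a Docker-based setup looks like: GPU-enabled containers on Jetson go through the NVIDIA container runtime. This is only an illustrative sketch of the stock Ollama container invocation with that runtime, not the specific workaround from the linked comment:

```shell
# Illustrative: run Ollama in a container on a Jetson board, handing it
# GPU access via the NVIDIA container runtime (--runtime nvidia).
# Port 11434 is Ollama's default API port; the named volume keeps models
# across container restarts.
docker run -d --runtime nvidia \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama
```

Whether this actually uses the GPU still depends on the JetPack/L4T version the container's CUDA libraries target, which is exactly the problem on the original Nano.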


@easis commented on GitHub (Apr 22, 2025):

> Hey folks. I've found a workaround when running in Docker. Please take a look at my comment [here](https://github.com/ollama/ollama/issues/9503#issuecomment-2817867166).

Bear in mind that we're talking about the Jetson Nano (not the Orin one) :)


@ZhuoQian111 commented on GitHub (May 11, 2025):

+1


@ZhuoQian111 commented on GitHub (May 11, 2025):

I have the same issue; my device is a Jetson Nano. I'm not sure if the Jetson Nano supports JetPack.


@ZhuoQian111 commented on GitHub (May 11, 2025):

<img width="745" alt="Image" src="https://github.com/user-attachments/assets/ee141e5e-a867-449a-823f-3c3cd044df67" />
<img width="745" alt="Image" src="https://github.com/user-attachments/assets/8bf8a8a7-be7d-4bce-ac4c-c7039cc2fdd9" />

@ZhuoQian111 commented on GitHub (May 11, 2025):

I have checked the script.

<img width="961" alt="Image" src="https://github.com/user-attachments/assets/6f6e30de-1b9b-4698-9eaa-f340c0ad0df1" />
<img width="945" alt="Image" src="https://github.com/user-attachments/assets/c43eb04d-d6e7-4e6d-badc-f9cca763e80a" />

It only supports JetPack 5 and higher. NVIDIA has officially answered that JetPack 5 will not support the Jetson Nano and that there is no support plan.

<img width="1082" alt="Image" src="https://github.com/user-attachments/assets/fc254fae-359e-4a1a-9b0d-6ba2e1e49a30" />

To solve this issue, we can only hope that Ollama officially adds support for lower JetPack versions 🤣😂 @fedekrum
Reference: github-starred/ollama#68136