[GH-ISSUE #5001] please add support for rk3588 NPU #65201

Open
opened 2026-05-03 19:59:55 -05:00 by GiteaMirror · 12 comments
Owner

Originally created by @mrobinson-opi on GitHub (Jun 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5001

Please add support for rk3588 NPU. Thanks.

GiteaMirror added the feature request label 2026-05-03 19:59:56 -05:00

@wozwdaqian commented on GitHub (Aug 14, 2024):

Please tell me when rk3588 will be supported, thank you very much.

<!-- gh-comment-id:2288073927 -->

@z1qaz1qa commented on GitHub (Aug 26, 2024):

> Please add support for rk3588 NPU. Thanks.

But it needs an rk3588 engineer to support rkllama.

<!-- gh-comment-id:2309552630 -->

@displague commented on GitHub (Nov 14, 2024):

In the meanwhile, check out https://github.com/Pelochus/ezrknn-llm

<!-- gh-comment-id:2476682982 -->

@RonkyTang commented on GitHub (Feb 12, 2025):

> In the meanwhile, check out https://github.com/Pelochus/ezrknn-llm

I hope to use one ollama service to run the deepseek model.

<!-- gh-comment-id:2653994072 -->

@displague commented on GitHub (Feb 12, 2025):

@RonkyTang see https://github.com/airockchip/rknn-llm

<!-- gh-comment-id:2654419130 -->

@Offshore21 commented on GitHub (Feb 12, 2025):

FYI

Have you seen this?

https://huggingface.co/spaces

With deepseek, without tracker…

<!-- gh-comment-id:2654429570 -->

@RonkyTang commented on GitHub (Feb 13, 2025):

> @RonkyTang see https://github.com/airockchip/rknn-llm

Thank you!
But how can this be merged into the ollama service? I want to use only ollama, because I also need to use other models in ollama.

<!-- gh-comment-id:2655404514 -->

@goetzc commented on GitHub (Apr 19, 2025):

Also see https://github.com/NotPunchnox/rkllama

<!-- gh-comment-id:2816444804 -->

@mweinelt commented on GitHub (Dec 5, 2025):

The rocket NPU driver is now available on 6.18.0 mainline.

```
[    6.024524] [drm] Initialized rocket 0.0.0 for rknn on minor 0
[    6.026023] rocket fdab0000.npu: Rockchip NPU core 0 version: 1179210309
[    6.065163] rocket fdac0000.npu: Rockchip NPU core 1 version: 1179210309
[    6.075303] rocket fdad0000.npu: Rockchip NPU core 2 version: 1179210309
```

```
# ls -lah /dev/accel/accel0
crw-rw-rw- 1 root render 261, 0 Dec  5 00:50 /dev/accel/accel0
```

```
# uname -a
Linux haumea 6.18.0 #1-NixOS SMP Sun Nov 30 22:42:10 UTC 2025 aarch64 GNU/Linux
```
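As the output above shows, the mainline `rocket` driver exposes the NPU through the DRM accel subsystem as `/dev/accel/accelN` character devices. A minimal sketch of how a runtime (hypothetically, an ollama backend) could probe for these nodes before deciding whether NPU offload is even possible — this only detects the device nodes, it does not imply ollama can use them today:

```python
import glob
import os


def find_accel_devices(dev_dir="/dev/accel"):
    """Return a sorted list of accel device nodes (e.g. /dev/accel/accel0).

    The DRM accel subsystem (used by the rocket NPU driver, among others)
    creates one node per registered accelerator. An empty list means either
    no accel-capable driver is loaded or the kernel predates the subsystem.
    """
    if not os.path.isdir(dev_dir):
        return []
    return sorted(glob.glob(os.path.join(dev_dir, "accel*")))


if __name__ == "__main__":
    devices = find_accel_devices()
    if devices:
        print("accel device nodes found:", devices)
    else:
        print("no /dev/accel nodes; rocket (or another accel driver) is not loaded")
```

Note that a device node alone is not enough: a userspace backend would still need to speak the driver's ioctl interface and have models compiled for the RK3588 NPU.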
<!-- gh-comment-id:3617255824 -->

@RangerMauve commented on GitHub (Mar 13, 2026):

This would be great to have since these SBCs are getting more popular and more powerful. I'd love to be able to keep my usual ollama flow on my Khadas Edge 2.

<!-- gh-comment-id:4058791545 -->

@RangerMauve commented on GitHub (Mar 14, 2026):

@mweinelt which models have you gotten it to run with? Is it just the ones specifically converted to run on rockchip that work?

<!-- gh-comment-id:4058847127 -->

@mweinelt commented on GitHub (Mar 14, 2026):

I haven't tried running any, sorry.

<!-- gh-comment-id:4058858923 -->

Reference: github-starred/ollama#65201