[GH-ISSUE #3443] May I know whether Ollama support DBRX model? #2123

Closed
opened 2026-04-12 12:21:41 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @OPDEV001 on GitHub (Apr 1, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3443

What model would you like?

I checked https://ollama.com/library but cannot find DBRX in the list.

Can I run the DBRX model on a local machine (CPU and GPU), e.g. `ollama run dbrx-xxx-yyyy`?

Thanks,

GiteaMirror added the model label 2026-04-12 12:21:41 -05:00
Author
Owner

@crmne commented on GitHub (Apr 2, 2024):

duplicate of #3370


Reference: github-starred/ollama#2123