[GH-ISSUE #8650] Request Support for Running Inference Through LM Studio #31364

Closed
opened 2026-04-22 11:45:58 -05:00 by GiteaMirror · 2 comments

Originally created by @joseph777111 on GitHub (Jan 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8650

https://lmstudio.ai
https://github.com/lmstudio-ai/lms

LM Studio is one of the most popular locally run inference platforms and ships with its own inference server. Much like Ollama, LM Studio uses llama.cpp for inference, but it also supports MLX.

Please kindly add support to use Goose with LM Studio as the inference backend. Thanks in advance! 🙏
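For context on what such an integration would involve: LM Studio's local server exposes an OpenAI-compatible REST API (by default at `http://localhost:1234/v1`), so any client that can speak to an OpenAI-style `/v1/chat/completions` endpoint can in principle use LM Studio as a backend. A minimal sketch of building such a request is below; the base URL is LM Studio's documented default, while the `build_chat_request` helper and the `local-model` name are purely illustrative, not part of any existing Goose or LM Studio API.

```python
import json
import urllib.request

# LM Studio's default local server address (OpenAI-compatible API).
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request targeting LM Studio.

    The model name is whatever identifier LM Studio reports for the
    loaded model; "local-model" here is only a placeholder.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{LMSTUDIO_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Sending the request requires a running LM Studio server, so this
    # sketch only constructs it and shows the target URL.
    req = build_chat_request("Hello from Goose!")
    print(req.full_url)
```

Because the endpoint shape matches OpenAI's, a client that already supports a configurable OpenAI-compatible base URL could point at LM Studio without a dedicated provider.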

GiteaMirror added the feature request label 2026-04-22 11:45:58 -05:00

@rick-github commented on GitHub (Jan 29, 2025):

#1730


@pdevine commented on GitHub (Jan 29, 2025):

I haven't tried it, but I believe Goose [can be run w/ Ollama today](https://block.github.io/goose/docs/getting-started/using-goose-free/).


Reference: github-starred/ollama#31364