[GH-ISSUE #3861] Is it possible to start with fastchat and then wrap the interface with fastapi to become like ollama? #28150

Closed
opened 2026-04-22 05:59:46 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @ciaoyizhen on GitHub (Apr 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3861

Ollama supports more than 30 models, but not the one I need. Since I have already deployed another application using FastChat, can I wrap its interface so it looks like Ollama and reuse it that way?
Which endpoints would I need to implement, and where is the interface documentation?
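For what it's worth, since FastChat serves an OpenAI-compatible API, one approach is a thin translation layer that reshapes FastChat's responses into the form Ollama's `POST /api/chat` endpoint returns (the shape is documented in `ollama/docs/api.md`). A minimal sketch of just the response translation — the function name and the abbreviated FastChat payload below are illustrative, not from either project:

```python
import json
from datetime import datetime, timezone

def openai_to_ollama_chat(openai_resp: dict, model: str) -> dict:
    """Map an OpenAI-style chat completion (what FastChat returns)
    onto the non-streaming response shape of Ollama's /api/chat.
    Field names on the Ollama side follow ollama/docs/api.md."""
    choice = openai_resp["choices"][0]
    return {
        "model": model,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "message": {
            "role": choice["message"]["role"],
            "content": choice["message"]["content"],
        },
        "done": True,
    }

if __name__ == "__main__":
    # Abbreviated OpenAI-style payload, as FastChat might return it
    fastchat_resp = {
        "choices": [{"message": {"role": "assistant", "content": "Hello!"}}]
    }
    print(json.dumps(openai_to_ollama_chat(fastchat_resp, "my-model")))
```

A FastAPI route would then just forward the request to FastChat and return this dict; streaming responses and endpoints like `/api/tags` would need the same treatment.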

GiteaMirror added the feature request label 2026-04-22 05:59:46 -05:00

@samyIO commented on GitHub (Apr 24, 2024):

Hey,

you can turn any GGUF model from Hugging Face into an Ollama model.

  1. Download the GGUF file from Hugging Face
  2. Create a Modelfile in Ollama's format and point its FROM statement at the path of your GGUF file
  3. Add the prompt template and stop parameters your model uses
  4. Save the Modelfile and run `ollama create <your-model-name> -f <your-modelfile>`
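The steps above might look like this — the file path, template, and stop tokens are illustrative and depend on your model's chat format:

```
# Modelfile (illustrative values)
FROM ./models/my-model.Q4_K_M.gguf

# Prompt template and stop parameters for the model's chat format
TEMPLATE """[INST] {{ .Prompt }} [/INST]"""
PARAMETER stop "[INST]"
PARAMETER stop "[/INST]"
```

Then run `ollama create my-model -f Modelfile` followed by `ollama run my-model`.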

You can now use the model like any other Ollama model.

I hope this helps, have a nice day!


Reference: github-starred/ollama#28150