[GH-ISSUE #14640] Add Mistral Vibe to ollama launch #9485

Open
opened 2026-04-12 22:24:39 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @MrScratchcat on GitHub (Mar 5, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14640

Originally assigned to: @ParthSareen on GitHub.

It would be very nice to have, and it's very easy to do. You need to edit `~/.vibe/config.toml`: line 1 is `active_model = "devstral-2"`, so you can set the model you want `ollama launch` to start with. Then you need to add a provider:

```toml
[[providers]]
name = "ollama"
api_base = "http://localhost:11434/v1"
api_key_env_var = ""
api_style = "openai"
backend = "generic"
```

The last step is to add the model:

```toml
[[models]]
name = "glm-4.7-flash:latest"
provider = "ollama"
alias = "GLM-4.7-Flash"
temperature = 0.2
input_price = 0.0
output_price = 0.0
```
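
Putting the three steps together, a minimal `~/.vibe/config.toml` might look like the sketch below. This is an assumption about how the pieces combine, not a tested config: it assumes `active_model` accepts the model's `alias`, and that `glm-4.7-flash:latest` has already been pulled into a locally running Ollama instance.

```toml
# Hypothetical combined ~/.vibe/config.toml (sketch, not verified)
# Line 1: point the default model at the alias defined below.
active_model = "GLM-4.7-Flash"

# Register Ollama's OpenAI-compatible endpoint as a provider.
[[providers]]
name = "ollama"
api_base = "http://localhost:11434/v1"
api_key_env_var = ""    # no key needed for a local Ollama server
api_style = "openai"
backend = "generic"

# Map an Ollama model tag to a Vibe model entry.
[[models]]
name = "glm-4.7-flash:latest"
provider = "ollama"
alias = "GLM-4.7-Flash"
temperature = 0.2
input_price = 0.0       # local inference, so zero cost
output_price = 0.0
```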

GiteaMirror added the feature request label 2026-04-12 22:24:39 -05:00

Reference: github-starred/ollama#9485