[PR #7971] [CLOSED] ADD: OLLAMA_LLM_DEFAULT #43824

Closed
opened 2026-04-24 23:24:24 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/7971
Author: @bet0x
Created: 12/6/2024
Status: Closed

Base: main ← Head: patch-1


📝 Commits (1)

95dd3c8 — ADD: OLLAMA_LLM_DEFAULT

📊 Changes

1 file changed (+96 additions, -16 deletions)


📝 cmd/cmd.go (+96 -16)

📄 Description

The addition of OLLAMA_LLM_DEFAULT is a significant improvement over API-based model pulls. While Ollama's API does support model pulling, a default-model environment variable streamlines deployment and reduces operational overhead.

This approach aligns with modern DevOps practices by handling model downloads during server startup. It eliminates the need for separate API calls or scripts, ensuring the required model is always available before the service starts handling requests. For teams running Ollama in containers or orchestrated environments, this means simpler configurations and more reliable deployments.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-24 23:24:24 -05:00

Reference: github-starred/ollama#43824