[PR #5760] [CLOSED] Make llama.cpp's cache_prompt parameter configurable #43163

Closed
opened 2026-04-24 22:50:43 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/5760
Author: @sayap
Created: 7/18/2024
Status: Closed

Base: `main` ← Head: `configurable-cache-prompt`


📝 Commits (1)

  • 80d0656 Make llama.cpp's cache_prompt parameter configurable

📊 Changes

4 files changed (+6 additions, -2 deletions)


📝 api/types.go (+2 -0)
📝 cmd/interactive.go (+1 -0)
📝 llm/server.go (+1 -1)
📝 server/routes_test.go (+2 -1)
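
The one-line change in `llm/server.go` most likely swaps a hardcoded `cache_prompt: true` for a value read from the request options, with `api/types.go` gaining the new option field. The actual diff is not shown in this mirror, so the sketch below only illustrates that shape; the `Options` struct, its field names, and the `buildCompletionRequest` helper are assumptions, not the PR's code.

```go
package llm

// Options is a pared-down stand-in for the request options involved here.
// CachePrompt is the field this PR would add; its exact name and placement
// are assumptions for illustration.
type Options struct {
	Seed        int     `json:"seed"`
	Temperature float32 `json:"temperature"`
	CachePrompt bool    `json:"cache_prompt"`
}

// buildCompletionRequest is a hypothetical helper showing the general shape of
// the change in llm/server.go: the cache_prompt value forwarded to the
// llama.cpp server comes from the user's options instead of being hardcoded.
func buildCompletionRequest(prompt string, opts Options) map[string]any {
	return map[string]any{
		"prompt":       prompt,
		"seed":         opts.Seed,
		"temperature":  opts.Temperature,
		"cache_prompt": opts.CachePrompt, // previously always true
	}
}
```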

📄 Description

This allows prompt caching to be disabled, so that the output is deterministic when the same seed and temperature are set.

Fixes #5321
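
Had this been merged, a client could disable prompt caching per request to get reproducible output. Below is a minimal sketch using ollama's Go API client, assuming the option would be exposed under the key `cache_prompt` (the PR was closed, so this key is not recognized by mainline ollama):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/ollama/ollama/api"
)

func main() {
	client, err := api.ClientFromEnvironment()
	if err != nil {
		log.Fatal(err)
	}

	req := &api.GenerateRequest{
		Model:  "llama3",
		Prompt: "Why is the sky blue?",
		Options: map[string]any{
			"cache_prompt": false, // hypothetical option from this PR
			"seed":         42,
			"temperature":  0.0,
		},
	}

	err = client.Generate(context.Background(), req, func(resp api.GenerateResponse) error {
		fmt.Print(resp.Response)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```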


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-24 22:50:43 -05:00