[PR #4143] [MERGED] omit prompt and generate settings from final response #11396

Closed · opened 2026-04-12 23:29:21 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/4143
Author: @mxyng
Created: 5/3/2024
Status: Merged
Merged: 5/4/2024
Merged by: @mxyng

Base: main ← Head: mxyng/final-response


📝 Commits (1)

  • 44869c5 omit prompt and generate settings from final response

📊 Changes

1 file changed (+0 additions, -2 deletions)


📝 llm/ext_server/server.cpp (+0 -2)

📄 Description

If the input is large, it might overrun the response buffer. There's no need to return the prompt, since the caller already has it.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:29:21 -05:00
Reference: github-starred/ollama#11396