[GH-ISSUE #6778] Would be nice to have a "continue last message" option with the /api/chat endpoint #4273

Closed
opened 2026-04-12 15:12:13 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @hammer-ai on GitHub (Sep 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6778

Hi there, it would be nice to have a "continue last message" option with the `/api/chat` endpoint. Thanks!

GiteaMirror added the feature request label 2026-04-12 15:12:13 -05:00
Author
Owner

@pdevine commented on GitHub (Sep 12, 2024):

@hammer-ai not quite sure what you're asking for here. If you want to continue a conversation, you just pass back the entire `messages` array with the next message that you want to send. Did you mean something different?

Author
Owner

@jmorganca commented on GitHub (Sep 13, 2024):

@hammer-ai this can be done today with a request similar to:

```
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    },
    {
      "role": "assistant",
      "content": "Great question, the sky is blue because it was painted by "
    }
  ]
}'
```

Hope this helps!
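The same idea can be sketched in Python by building the request body by hand. This is a minimal illustration of the prefill technique from the curl example above; the `continue_request` helper is hypothetical, not part of any Ollama client library:

```python
import json


def continue_request(model, messages, partial_assistant):
    """Build an /api/chat request body that asks the model to continue
    a partially written assistant message (a hypothetical helper).

    Appending a trailing assistant turn makes the model complete that
    text instead of starting a fresh reply.
    """
    body = {
        "model": model,
        "messages": messages + [
            {"role": "assistant", "content": partial_assistant}
        ],
    }
    return json.dumps(body)


# Equivalent to the curl example: the model continues the unfinished
# assistant sentence rather than answering from scratch.
payload = continue_request(
    "llama3.1",
    [{"role": "user", "content": "why is the sky blue?"}],
    "Great question, the sky is blue because it was painted by ",
)
```

The resulting `payload` string can be POSTed to `http://localhost:11434/api/chat` with any HTTP client, assuming a local Ollama server is running.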

Author
Owner

@Omega-Centauri-21 commented on GitHub (Feb 25, 2025):

I guess what @hammer-ai was asking for is functionality in Ollama like HuggingFace's [continue_final_message](https://huggingface.co/docs/transformers/main/en/chat_templating#what-does-continuefinalmessage-do).

I too do hope for this!

Thanks!

Author
Owner

@pdevine commented on GitHub (Mar 11, 2025):

@Omega-Centauri-21 This is essentially how the API already works today. Try out @jmorganca's snippet above.

Author
Owner

@Omega-Centauri-21 commented on GitHub (Mar 12, 2025):

@pdevine I have already used this method; I was hoping there is another way so that I don't have to pass the old messages back explicitly. I do have another question: if I keep using this method, won't my message size or prompt token count keep growing, compared to starting from a clean slate?


Reference: github-starred/ollama#4273