[GH-ISSUE #4450] Resume a conversation started in Open Web-UI using ollama command line #64818

Closed
opened 2026-05-03 18:53:30 -05:00 by GiteaMirror · 4 comments

Originally created by @tomav on GitHub (May 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4450

Hi there!
I can't find information on this, let me know if there's actually a way to do this.
Thanks!

GiteaMirror added the feature request label 2026-05-03 18:53:30 -05:00

@zanderlewis commented on GitHub (May 15, 2024):

This wouldn't be an Ollama thing. I would bring this up to Open Web UI itself.


@tomav commented on GitHub (May 15, 2024):

Hi @WolfTheDeveloper, well, my mental model was that it is possible to have conversations with context using the Ollama API in Open Web UI. We can see a `context` parameter in the [Ollama API doc](https://github.com/ollama/ollama/blob/main/docs/api.md#parameters).
The question is how to do the same using the Ollama command line, which is why I created the question here; I can't find an option to pass a `context` parameter to the `ollama run` command.
I don't see this as an Open Web UI thing (it should just be a frontend on top of Ollama and other API-based LLMs).

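The `context` round-trip described above can be sketched in Python. This only shows the shape of the `/api/generate` request and response payloads; the model name, reply text, and token values are illustrative placeholders, not real server output:

```python
import json

# Sketch: how a client threads the `context` value between /api/generate calls.
# The server returns `context` (an opaque array of token IDs) in each response;
# passing it back in the next request continues the conversation.

first_request = {"model": "llama3", "prompt": "Hi, my name is Tom.", "stream": False}

# Suppose the server's response to first_request looked like this
# (the context values are made up for illustration):
first_response = {"response": "Hello Tom! How can I help?", "context": [101, 2042, 7]}

follow_up = {
    "model": "llama3",
    "prompt": "What is my name?",
    "stream": False,
    "context": first_response["context"],  # carries the prior turns forward
}
print(json.dumps(follow_up))
```

Any client that saves the latest `context` array (to a file, a database, etc.) can resume the conversation later by including it in the next request, which is essentially what a web frontend does per chat thread.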

@mxyng commented on GitHub (May 15, 2024):

Open WebUI and the Ollama CLI are two distinct applications implementing frontends for the Ollama API. While you can definitely use `context` with `/api/generate` and `messages` with `/api/chat` to implement chat history, it's an implementation detail of the application to support persistence or cross-application chat history.

The Ollama CLI has a feature which saves its conversation history into a Modelfile through the `MESSAGE` command. This can be used to persist and reload the session. I'm not sure if Open WebUI uses this feature.

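The `MESSAGE` mechanism mentioned above can be sketched as a Modelfile (the base model and the message contents here are illustrative):

```
FROM llama3
MESSAGE user Hi, my name is Tom.
MESSAGE assistant Hello Tom! How can I help you?
```

Assuming this is saved as `Modelfile`, something like `ollama create saved-chat -f Modelfile` would build a model that starts every `ollama run saved-chat` session with that history already in place. In the interactive `ollama run` prompt (at least in versions around the time of this thread), the `/save <name>` command persists the current session this way, and it can be resumed with `ollama run <name>`.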

@tomav commented on GitHub (May 26, 2024):

Thanks @mxyng, crystal clear.

Reference: github-starred/ollama#64818