[GH-ISSUE #1751] [FEATURE] add more options while chatting like /bye (e.g /clear_context or /new_chat) #1002

Closed
opened 2026-04-12 10:42:50 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @tikendraw on GitHub (Dec 31, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1751

Originally assigned to: @pdevine on GitHub.

While chatting with the model, you do not necessarily need the previous context, or you may simply want a new chat. Currently there are no options for this other than cancelling the chat and restarting it.

So, similar to the `/bye` option, there could be other options to make the LLM easier to use:

  • `/clear_context` or `/no_context`: to not use the above context
  • `/new_chat`: to initialize a new chat

or any other option that may be useful for the user.

GiteaMirror added the feature request label 2026-04-12 10:42:50 -05:00

@rgaidot commented on GitHub (Jan 5, 2024):

Maybe it's up to the front end to manage the context? Otherwise sessions will have to be introduced in Ollama? No?

Ollama (`/api/chat`) <-> _YourAPI_ (context w/ messages) <-> **Front** (context w/ messages)


@rgaidot commented on GitHub (Jan 5, 2024):

IMO - So currently, I don't think it's up to Ollama to make this secret sauce.


@tikendraw commented on GitHub (Jan 5, 2024):

Generally, these LLM APIs take a memory or context argument in order to continue the chat; simply not passing the previous data will clear the context.
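The point above can be sketched against Ollama's `/api/chat` endpoint, which takes the full message history with every request. This is a minimal illustration, assuming a local Ollama server on the default port; the model name `llama2` is illustrative, and the network call is commented out so the sketch stands alone:

```python
# Sketch: /api/chat is stateless per request, so "clearing the context"
# is just sending a fresh messages list instead of appending to the old one.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama2"  # illustrative; substitute any pulled model

def chat(messages):
    """Send the full history; the reply only 'remembers' what we pass in."""
    body = json.dumps({"model": MODEL, "messages": messages, "stream": False})
    req = urllib.request.Request(
        OLLAMA_URL, data=body.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]

history = [{"role": "user", "content": "Hi, remember the number 42."}]
# history.append(chat(history))  # uncomment with a running server

# Clear the context: start a new list rather than appending to the old one.
history = [{"role": "user", "content": "What number did I mention?"}]
# With a fresh list, the model has no way to know -- the context is gone.
```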


@FlippingBinary commented on GitHub (Jan 7, 2024):

> Generally, these LLM APIs take a memory or context argument in order to continue the chat; simply not passing the previous data will clear the context.

How would one do that in the CLI? Right now, I exit with `/bye`, manually delete the history file (`~/.ollama/history`), then load the model again. It would be quite nice if there were a command to clear the context, like `/reset` or `/new_chat`.


@pdevine commented on GitHub (Jan 7, 2024):

@FlippingBinary If you just want the history to not be updated, you can use the `/set nohistory` command. I actually have a change to clear the context as well, but I'm still not super happy with it. I'll be taking a look at this this coming week though.


@FlippingBinary commented on GitHub (Jan 7, 2024):

@pdevine Thank you for the tip. I just tested that feature a bit more and now realize that I conflated context and history. Now I know that I have no use for history, but there doesn't seem to be a persistent way to disable it. I tried `ln -s /dev/null ~/.ollama/history`, but ollama just deleted the symbolic link and recreated the text file during the next session. There appears to be no way to prevent it from being created and growing because, somewhat ironically, it records the `/set nohistory` command in the history, so the history file grows with duplicate lines of `/set nohistory` if you run that command at the start of each session.

It would be nice to have a persistent `nohistory` setting or even a command-line argument, but I suppose that's not all that important because it doesn't affect the context of future sessions. For resetting the context, closing the session with `/bye` and opening again works well enough for the time being.


@easp commented on GitHub (Jan 8, 2024):

> IMO - So currently, I don't think it's up to Ollama to make this secret sauce.

@rgaidot OP seems to be talking about adding commands to the ollama CLI. This CLI is provided in this repo as part of the Ollama project. How is their request out-of-scope for the project?


@rgaidot commented on GitHub (Jan 8, 2024):

@easp At first, I thought it was for API (not CLI), hence my first reply


@pdevine commented on GitHub (Jan 25, 2024):

With `0.1.21` you'll be able to type `/load <model>`, which will clear your context. You can also use `/save <model>`, which will save your conversation up until that point so you can reload it later.

I _think_ that meets the spirit of this issue, so I'm going to go ahead and close it. Please feel free to reopen!
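The resulting workflow in the interactive CLI might look like this (`>>>` is Ollama's interactive prompt; the model names are illustrative, and the annotations describe the behavior stated above rather than actual CLI output):

```
>>> /save my-session     # snapshot the conversation so far as a new model
>>> /load llama2         # reload the base model, which clears the context
>>> /load my-session     # ...or pick the saved conversation back up later
```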


@FlippingBinary commented on GitHub (Jan 26, 2024):

The new `/load <model>` command works well! Perhaps now it would be possible to make `/load` with no parameter behave like `/load <model>` where `<model>` is the name of the most recently loaded model? I think that would fully meet the spirit of this issue.

Reference: github-starred/ollama#1002