issue: Follow-up queries not generated in temporary chats #5544

Closed
opened 2025-11-11 16:23:57 -06:00 by GiteaMirror · 5 comments
Owner

Originally created by @silentoplayz on GitHub (Jun 15, 2025).

Originally assigned to: @tjbck on GitHub.

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.14

Ollama Version (if applicable)

v0.9.0

Operating System

Edition: Windows 11 Pro | Version: 24H2 | OS Build: 26100.4351 | Windows Feature Experience Pack: 1000.26100.107.0

Browser (if applicable)

LibreWolf v135.0.1-1 (Firefox)

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When interacting with a model in a temporary chat, after the model finishes generating its response, a list of suggested follow-up queries (e.g., "Tell me more about X", "What are the implications of Y?") should appear below the response, similar to how they appear in non-temporary chats.

Actual Behavior

In temporary chats, follow-up queries are not generated or displayed after the model completes its response. This feature works correctly in non-temporary chats.

Steps to Reproduce

  1. Initiate a Temporary Chat:
  • Click the "New Chat" button in the top left corner.
  • Select a model from the model dropdown.
  2. Send a Query:
  • In the chat input box, type a simple question, e.g., "What is the capital of France?".
  • Press Enter or click the send button.
  3. Observe Behavior:
  • The model will generate the response ("The capital of France is Paris.").
  • Expected: Below the response, suggested follow-up queries should appear.
  • Actual: After the model's response has finished generating, wait up to 30 seconds: no follow-up queries are generated or displayed. The chat simply ends with the model's response.
    ![Image](https://github.com/user-attachments/assets/be5920d6-c32b-4a34-bf58-d6e37461c76f)
  4. Compare with a Saved Chat (optional, to confirm the feature works elsewhere):
  • Click "New Chat" next to the model name dropdown, which starts a new, empty chat.
  • Ask another question: "What is the capital of France?".
  • Observe that in this saved chat, follow-up queries are correctly generated and displayed after the response.

This issue consistently reproduces in temporary chats across different models and query types.

Logs & Screenshots

![Image](https://github.com/user-attachments/assets/a10405f0-038a-4fc7-89eb-32199d97c34a)
![Image](https://github.com/user-attachments/assets/fcf0c1f0-fc50-4e61-9168-aa92e6d0d2cd)

Additional Information

  • The issue appears to be specific to "temporary" chats, which do not get a unique chat ID in the URL and are not saved in the sidebar.
  • Follow-up query generation works as expected in "saved" chats (those initiated by clicking "New Chat" next to the model dropdown, or by selecting an existing chat from the sidebar).
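For anyone triaging: the chat-ID observation above suggests the follow-up generation path may be gated on a persisted chat ID, which temporary chats never receive. The sketch below is purely hypothetical TypeScript, not Open WebUI source (the `Chat` type and both function names are invented), illustrating how such a guard would skip temporary chats and what removing the gate would look like.

```typescript
// Hypothetical sketch (not actual Open WebUI code): a chat-ID guard that
// would reproduce the behavior reported in this issue.
type Chat = { id: string | null; temporary: boolean };

function shouldGenerateFollowUps(chat: Chat): boolean {
  // Buggy behavior described above: follow-ups are gated on a persisted
  // chat ID, so temporary chats (id === null) never trigger them.
  return chat.id !== null;
}

function shouldGenerateFollowUpsFixed(_chat: Chat): boolean {
  // Behavior after the fix ("Addressed in dev"): generate follow-ups
  // regardless of whether the chat is persisted.
  return true;
}
```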
GiteaMirror added the bug label 2025-11-11 16:23:57 -06:00

@decent-engineer-decent-datascientist commented on GitHub (Jun 15, 2025):

Completely unrelated, but how did you get that extremely cool organization in your models drop down? Specifically the tags per model and the tabs?


@silentoplayz commented on GitHub (Jun 15, 2025):

Completely unrelated, but how did you get that extremely cool organization in your models drop down? Specifically the tags per model and the tabs?

Model tags are the secret sauce behind that clean, organized dropdown!
Tags can be added to any model directly from its edit page in Open WebUI — it’s a breeze!
![Image](https://github.com/user-attachments/assets/d8f23e4f-c043-4432-a8a9-97d020340a0f)
YES — you can even tag custom models from the Workspace section!
![Image](https://github.com/user-attachments/assets/26e8931d-1d6f-445a-9377-d3d9448d17de)

Tags aren’t limited to models alone — they can also be applied to connections!
You can tag any connection added via the Add Connection or Edit Connection modal — including OpenAI API, Ollama API, and Direct Connections.
![Image](https://github.com/user-attachments/assets/b6836396-608c-4d71-b33f-96b0b0436381)

Here’s a real-world example: my GroqCloud connection.
![Image](https://github.com/user-attachments/assets/dd4ad1cf-d211-402f-b45f-10c14a83d691)
I added the tags Free and GroqCloud to the connection itself, not each model. Why? Because doing so automatically adds the same tags to every model that connection provides access to — saving me from repetitive tagging.
From there, I can still manually add extra tags to specific models if needed — like adding a Code tag to models that handle code generation.

And here’s how to tag direct connections too:
![Image](https://github.com/user-attachments/assets/e7cf5bbb-343b-4f74-883c-e83b3a601279)

Now, the horizontal tag bar at the top of the model dropdown?
That’s auto-generated — it appears whenever you’ve tagged models or connections in your Open WebUI instance.

This bar is super useful:
You can scroll through it to see all tags, and clicking any tag filters the dropdown in real time.
![Image](https://github.com/user-attachments/assets/8b106fba-b2f5-4af1-b1f1-076b86396e40)

P.S.: If you're wondering why I decided not to use Prefix ID for connections, it's because doing so would treat each model provided by the connection as a new one, which would require me to manually update ALL model details again. Instead, I manually name each model inside Open WebUI, prefixing the model's name with something like Groq / or Ollama /, which helps me quickly tell whether I'm using a local or external model from the model dropdown. I'll admit, though, that this is counterintuitive compared to how I use tags on connections to cut down on manual work. Also note that if you have added tags to a connection, you will not see those tags applied to the connection's models on the model edit page, but you will see them on those models in the model dropdown.
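The Prefix ID caveat can be illustrated with a tiny sketch. This is hypothetical TypeScript, not Open WebUI source: the `prefix.model` key format and the `customizations` map are invented stand-ins for per-model settings keyed by model ID.

```typescript
// Hypothetical illustration: per-model customizations are stored under the
// model's ID, so introducing a connection-level prefix changes the lookup
// key and previously saved details no longer match.
const customizations = new Map<string, { displayName: string }>();
customizations.set("llama3", { displayName: "Ollama / Llama 3" });

function lookupCustomization(prefixId: string | null, modelId: string) {
  // With a prefix configured, the effective model ID becomes "prefix.model",
  // which no longer matches the key the details were saved under.
  const key = prefixId ? `${prefixId}.${modelId}` : modelId;
  return customizations.get(key);
}
```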


@decent-engineer-decent-datascientist commented on GitHub (Jun 16, 2025):

Model tags are the secret sauce behind that clean, organized dropdown! […]

Thank you so much! This is so clean, can't believe I never leveraged this.


@tjbck commented on GitHub (Oct 2, 2025):

Addressed in dev.


@silentoplayz commented on GitHub (Oct 2, 2025):

Addressed in dev.

Awesome! I have tested with local models, external models, and custom models and can confirm that the issue has been addressed.

Local model in temporary chat:
![Image](https://github.com/user-attachments/assets/ec96aeac-49a4-4de8-a5b9-2611c77b4342)

Custom workspace model with local model as base in temporary chat:
![Image](https://github.com/user-attachments/assets/92b16e81-8fab-4f47-b736-28e5500891bd)

External model in temporary chat:
![Image](https://github.com/user-attachments/assets/e3de8d7f-3e88-4a1d-8896-81e2effe4f14)

Custom workspace model with external model as base in temporary chat:
![Image](https://github.com/user-attachments/assets/d750e416-7ee0-4caa-8cbc-7bd05f56c498)

![Image](https://github.com/user-attachments/assets/942c25bb-adaa-4519-b5fa-af1face6c5a0)

Reference: github-starred/open-webui#5544