[PR #15280] [MERGED] app/ui: fix model picker showing stale model after switching chats #46349

Closed
opened 2026-04-25 01:48:24 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/15280
Author: @matteocelani
Created: 4/3/2026
Status: Merged
Merged: 4/21/2026
Merged by: @hoyyeva

Base: main ← Head: fix/model-picker-stale-on-chat-switch


📝 Commits (2)

  • af5af9d app/ui: fix model picker showing stale model after switching chats
  • fc36408 app/ui: fix two more instances of Model object passed as model name

📊 Changes

1 file changed (+5 additions, -5 deletions)


📝 app/ui/app/src/hooks/useChats.ts (+5 -5)

📄 Description

When switching between chats that use different models, the model picker can get stuck showing the previous chat's model. This happens specifically after streaming: optimistic messages created by the streaming batcher store a Model object in the model field instead of a string. When the restore effect in useSelectedModel reads these cached messages to determine the chat's model, the object/string mismatch causes the comparison and settings update to fail silently.
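The silent failure comes from strict equality between mismatched types: comparing a `Model` object to a model-name string is always `false`, so the restore effect never matches. A minimal sketch (the interface and function names here are illustrative, not the actual `useSelectedModel` code):

```typescript
// Illustrative types; the real shapes live in app/ui/app/src.
interface Model {
  model: string; // the model name, e.g. "llama3"
}

// Optimistic messages from the streaming batcher stored a Model object
// here, while persisted messages store the name string.
type StoredModel = string | Model;

// Simplified stand-in for the comparison the restore effect performs.
function matchesChatModel(stored: StoredModel, name: string): boolean {
  // Strict equality between an object and a string is always false,
  // so optimistic messages never match — the picker keeps the old model.
  return stored === name;
}

const streamed: Model = { model: "llama3" };
console.log(matchesChatModel(streamed, "llama3")); // false — the bug
console.log(matchesChatModel(streamed.model, "llama3")); // true — the fixed shape
```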

The fix passes effectiveModel.model (the name string) instead of effectiveModel (the full object) when constructing optimistic Message instances during streaming.
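The change itself can be sketched as a one-line swap when building the optimistic message (field names here are assumptions based on the PR description, not the exact `useChats.ts` code):

```typescript
// Illustrative shapes; the real Message type lives in app/ui/app/src.
interface Model {
  model: string; // the model name
}

interface Message {
  role: string;
  content: string;
  model: string; // must be the model *name*, not the Model object
}

function makeOptimisticMessage(effectiveModel: Model, content: string): Message {
  return {
    role: "assistant",
    content,
    // Before the fix: model: effectiveModel  — stored the whole object,
    // breaking the string comparison in the restore effect.
    model: effectiveModel.model, // After: pass the name string
  };
}
```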

To reproduce:

  1. Start a chat with model A (e.g. llama3) and send a message
  2. While the response is streaming (or after), switch to a different chat using model B
  3. Switch back to the first chat — the model picker stays on model B

Fixes #14504


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 01:48:24 -05:00

Reference: github-starred/ollama#46349